Jan 20 08:16:02 np0005588919 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 20 08:16:02 np0005588919 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 20 08:16:02 np0005588919 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 20 08:16:02 np0005588919 kernel: BIOS-provided physical RAM map:
Jan 20 08:16:02 np0005588919 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 20 08:16:02 np0005588919 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 20 08:16:02 np0005588919 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 20 08:16:02 np0005588919 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 20 08:16:02 np0005588919 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 20 08:16:02 np0005588919 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 20 08:16:02 np0005588919 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 20 08:16:02 np0005588919 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 20 08:16:02 np0005588919 kernel: NX (Execute Disable) protection: active
Jan 20 08:16:02 np0005588919 kernel: APIC: Static calls initialized
Jan 20 08:16:02 np0005588919 kernel: SMBIOS 2.8 present.
Jan 20 08:16:02 np0005588919 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 20 08:16:02 np0005588919 kernel: Hypervisor detected: KVM
Jan 20 08:16:02 np0005588919 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 20 08:16:02 np0005588919 kernel: kvm-clock: using sched offset of 3902502170 cycles
Jan 20 08:16:02 np0005588919 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 20 08:16:02 np0005588919 kernel: tsc: Detected 2800.000 MHz processor
Jan 20 08:16:02 np0005588919 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 20 08:16:02 np0005588919 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 20 08:16:02 np0005588919 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 20 08:16:02 np0005588919 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 20 08:16:02 np0005588919 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 20 08:16:02 np0005588919 kernel: Using GB pages for direct mapping
Jan 20 08:16:02 np0005588919 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 20 08:16:02 np0005588919 kernel: ACPI: Early table checksum verification disabled
Jan 20 08:16:02 np0005588919 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 20 08:16:02 np0005588919 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 20 08:16:02 np0005588919 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 20 08:16:02 np0005588919 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 20 08:16:02 np0005588919 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 20 08:16:02 np0005588919 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 20 08:16:02 np0005588919 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 20 08:16:02 np0005588919 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 20 08:16:02 np0005588919 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 20 08:16:02 np0005588919 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 20 08:16:02 np0005588919 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 20 08:16:02 np0005588919 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 20 08:16:02 np0005588919 kernel: No NUMA configuration found
Jan 20 08:16:02 np0005588919 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 20 08:16:02 np0005588919 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 20 08:16:02 np0005588919 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 20 08:16:02 np0005588919 kernel: Zone ranges:
Jan 20 08:16:02 np0005588919 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 20 08:16:02 np0005588919 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 20 08:16:02 np0005588919 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 20 08:16:02 np0005588919 kernel:  Device   empty
Jan 20 08:16:02 np0005588919 kernel: Movable zone start for each node
Jan 20 08:16:02 np0005588919 kernel: Early memory node ranges
Jan 20 08:16:02 np0005588919 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 20 08:16:02 np0005588919 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 20 08:16:02 np0005588919 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 20 08:16:02 np0005588919 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 20 08:16:02 np0005588919 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 20 08:16:02 np0005588919 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 20 08:16:02 np0005588919 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 20 08:16:02 np0005588919 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 20 08:16:02 np0005588919 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 20 08:16:02 np0005588919 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 20 08:16:02 np0005588919 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 20 08:16:02 np0005588919 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 20 08:16:02 np0005588919 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 20 08:16:02 np0005588919 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 20 08:16:02 np0005588919 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 20 08:16:02 np0005588919 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 20 08:16:02 np0005588919 kernel: TSC deadline timer available
Jan 20 08:16:02 np0005588919 kernel: CPU topo: Max. logical packages:   8
Jan 20 08:16:02 np0005588919 kernel: CPU topo: Max. logical dies:       8
Jan 20 08:16:02 np0005588919 kernel: CPU topo: Max. dies per package:   1
Jan 20 08:16:02 np0005588919 kernel: CPU topo: Max. threads per core:   1
Jan 20 08:16:02 np0005588919 kernel: CPU topo: Num. cores per package:     1
Jan 20 08:16:02 np0005588919 kernel: CPU topo: Num. threads per package:   1
Jan 20 08:16:02 np0005588919 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 20 08:16:02 np0005588919 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 20 08:16:02 np0005588919 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 20 08:16:02 np0005588919 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 20 08:16:02 np0005588919 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 20 08:16:02 np0005588919 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 20 08:16:02 np0005588919 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 20 08:16:02 np0005588919 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 20 08:16:02 np0005588919 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 20 08:16:02 np0005588919 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 20 08:16:02 np0005588919 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 20 08:16:02 np0005588919 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 20 08:16:02 np0005588919 kernel: Booting paravirtualized kernel on KVM
Jan 20 08:16:02 np0005588919 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 20 08:16:02 np0005588919 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 20 08:16:02 np0005588919 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 20 08:16:02 np0005588919 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 20 08:16:02 np0005588919 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 20 08:16:02 np0005588919 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 20 08:16:02 np0005588919 kernel: random: crng init done
Jan 20 08:16:02 np0005588919 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: Fallback order for Node 0: 0 
Jan 20 08:16:02 np0005588919 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 20 08:16:02 np0005588919 kernel: Policy zone: Normal
Jan 20 08:16:02 np0005588919 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 20 08:16:02 np0005588919 kernel: software IO TLB: area num 8.
Jan 20 08:16:02 np0005588919 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 20 08:16:02 np0005588919 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 20 08:16:02 np0005588919 kernel: ftrace: allocated 194 pages with 3 groups
Jan 20 08:16:02 np0005588919 kernel: Dynamic Preempt: voluntary
Jan 20 08:16:02 np0005588919 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 20 08:16:02 np0005588919 kernel: rcu: 	RCU event tracing is enabled.
Jan 20 08:16:02 np0005588919 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 20 08:16:02 np0005588919 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 20 08:16:02 np0005588919 kernel: 	Rude variant of Tasks RCU enabled.
Jan 20 08:16:02 np0005588919 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 20 08:16:02 np0005588919 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 20 08:16:02 np0005588919 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 20 08:16:02 np0005588919 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 20 08:16:02 np0005588919 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 20 08:16:02 np0005588919 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 20 08:16:02 np0005588919 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 20 08:16:02 np0005588919 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 20 08:16:02 np0005588919 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 20 08:16:02 np0005588919 kernel: Console: colour VGA+ 80x25
Jan 20 08:16:02 np0005588919 kernel: printk: console [ttyS0] enabled
Jan 20 08:16:02 np0005588919 kernel: ACPI: Core revision 20230331
Jan 20 08:16:02 np0005588919 kernel: APIC: Switch to symmetric I/O mode setup
Jan 20 08:16:02 np0005588919 kernel: x2apic enabled
Jan 20 08:16:02 np0005588919 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 20 08:16:02 np0005588919 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 20 08:16:02 np0005588919 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 20 08:16:02 np0005588919 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 20 08:16:02 np0005588919 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 20 08:16:02 np0005588919 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 20 08:16:02 np0005588919 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 20 08:16:02 np0005588919 kernel: Spectre V2 : Mitigation: Retpolines
Jan 20 08:16:02 np0005588919 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 20 08:16:02 np0005588919 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 20 08:16:02 np0005588919 kernel: RETBleed: Mitigation: untrained return thunk
Jan 20 08:16:02 np0005588919 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 20 08:16:02 np0005588919 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 20 08:16:02 np0005588919 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 20 08:16:02 np0005588919 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 20 08:16:02 np0005588919 kernel: x86/bugs: return thunk changed
Jan 20 08:16:02 np0005588919 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 20 08:16:02 np0005588919 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 20 08:16:02 np0005588919 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 20 08:16:02 np0005588919 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 20 08:16:02 np0005588919 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 20 08:16:02 np0005588919 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 20 08:16:02 np0005588919 kernel: Freeing SMP alternatives memory: 40K
Jan 20 08:16:02 np0005588919 kernel: pid_max: default: 32768 minimum: 301
Jan 20 08:16:02 np0005588919 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 20 08:16:02 np0005588919 kernel: landlock: Up and running.
Jan 20 08:16:02 np0005588919 kernel: Yama: becoming mindful.
Jan 20 08:16:02 np0005588919 kernel: SELinux:  Initializing.
Jan 20 08:16:02 np0005588919 kernel: LSM support for eBPF active
Jan 20 08:16:02 np0005588919 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 20 08:16:02 np0005588919 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 20 08:16:02 np0005588919 kernel: ... version:                0
Jan 20 08:16:02 np0005588919 kernel: ... bit width:              48
Jan 20 08:16:02 np0005588919 kernel: ... generic registers:      6
Jan 20 08:16:02 np0005588919 kernel: ... value mask:             0000ffffffffffff
Jan 20 08:16:02 np0005588919 kernel: ... max period:             00007fffffffffff
Jan 20 08:16:02 np0005588919 kernel: ... fixed-purpose events:   0
Jan 20 08:16:02 np0005588919 kernel: ... event mask:             000000000000003f
Jan 20 08:16:02 np0005588919 kernel: signal: max sigframe size: 1776
Jan 20 08:16:02 np0005588919 kernel: rcu: Hierarchical SRCU implementation.
Jan 20 08:16:02 np0005588919 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 20 08:16:02 np0005588919 kernel: smp: Bringing up secondary CPUs ...
Jan 20 08:16:02 np0005588919 kernel: smpboot: x86: Booting SMP configuration:
Jan 20 08:16:02 np0005588919 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 20 08:16:02 np0005588919 kernel: smp: Brought up 1 node, 8 CPUs
Jan 20 08:16:02 np0005588919 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 20 08:16:02 np0005588919 kernel: node 0 deferred pages initialised in 15ms
Jan 20 08:16:02 np0005588919 kernel: Memory: 7763888K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618360K reserved, 0K cma-reserved)
Jan 20 08:16:02 np0005588919 kernel: devtmpfs: initialized
Jan 20 08:16:02 np0005588919 kernel: x86/mm: Memory block size: 128MB
Jan 20 08:16:02 np0005588919 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 20 08:16:02 np0005588919 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 20 08:16:02 np0005588919 kernel: pinctrl core: initialized pinctrl subsystem
Jan 20 08:16:02 np0005588919 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 20 08:16:02 np0005588919 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 20 08:16:02 np0005588919 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 20 08:16:02 np0005588919 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 20 08:16:02 np0005588919 kernel: audit: initializing netlink subsys (disabled)
Jan 20 08:16:02 np0005588919 kernel: audit: type=2000 audit(1768914960.411:1): state=initialized audit_enabled=0 res=1
Jan 20 08:16:02 np0005588919 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 20 08:16:02 np0005588919 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 20 08:16:02 np0005588919 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 20 08:16:02 np0005588919 kernel: cpuidle: using governor menu
Jan 20 08:16:02 np0005588919 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 20 08:16:02 np0005588919 kernel: PCI: Using configuration type 1 for base access
Jan 20 08:16:02 np0005588919 kernel: PCI: Using configuration type 1 for extended access
Jan 20 08:16:02 np0005588919 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 20 08:16:02 np0005588919 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 20 08:16:02 np0005588919 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 20 08:16:02 np0005588919 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 20 08:16:02 np0005588919 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 20 08:16:02 np0005588919 kernel: Demotion targets for Node 0: null
Jan 20 08:16:02 np0005588919 kernel: cryptd: max_cpu_qlen set to 1000
Jan 20 08:16:02 np0005588919 kernel: ACPI: Added _OSI(Module Device)
Jan 20 08:16:02 np0005588919 kernel: ACPI: Added _OSI(Processor Device)
Jan 20 08:16:02 np0005588919 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 20 08:16:02 np0005588919 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 20 08:16:02 np0005588919 kernel: ACPI: Interpreter enabled
Jan 20 08:16:02 np0005588919 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 20 08:16:02 np0005588919 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 20 08:16:02 np0005588919 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 20 08:16:02 np0005588919 kernel: PCI: Using E820 reservations for host bridge windows
Jan 20 08:16:02 np0005588919 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 20 08:16:02 np0005588919 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 20 08:16:02 np0005588919 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [3] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [4] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [5] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [6] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [7] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [8] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [9] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [10] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [11] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [12] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [13] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [14] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [15] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [16] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [17] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [18] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [19] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [20] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [21] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [22] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [23] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [24] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [25] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [26] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [27] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [28] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [29] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [30] registered
Jan 20 08:16:02 np0005588919 kernel: acpiphp: Slot [31] registered
Jan 20 08:16:02 np0005588919 kernel: PCI host bridge to bus 0000:00
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 20 08:16:02 np0005588919 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 20 08:16:02 np0005588919 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 20 08:16:02 np0005588919 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 20 08:16:02 np0005588919 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 20 08:16:02 np0005588919 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 20 08:16:02 np0005588919 kernel: iommu: Default domain type: Translated
Jan 20 08:16:02 np0005588919 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 20 08:16:02 np0005588919 kernel: SCSI subsystem initialized
Jan 20 08:16:02 np0005588919 kernel: ACPI: bus type USB registered
Jan 20 08:16:02 np0005588919 kernel: usbcore: registered new interface driver usbfs
Jan 20 08:16:02 np0005588919 kernel: usbcore: registered new interface driver hub
Jan 20 08:16:02 np0005588919 kernel: usbcore: registered new device driver usb
Jan 20 08:16:02 np0005588919 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 20 08:16:02 np0005588919 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 20 08:16:02 np0005588919 kernel: PTP clock support registered
Jan 20 08:16:02 np0005588919 kernel: EDAC MC: Ver: 3.0.0
Jan 20 08:16:02 np0005588919 kernel: NetLabel: Initializing
Jan 20 08:16:02 np0005588919 kernel: NetLabel:  domain hash size = 128
Jan 20 08:16:02 np0005588919 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 20 08:16:02 np0005588919 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 20 08:16:02 np0005588919 kernel: PCI: Using ACPI for IRQ routing
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 20 08:16:02 np0005588919 kernel: vgaarb: loaded
Jan 20 08:16:02 np0005588919 kernel: clocksource: Switched to clocksource kvm-clock
Jan 20 08:16:02 np0005588919 kernel: VFS: Disk quotas dquot_6.6.0
Jan 20 08:16:02 np0005588919 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 20 08:16:02 np0005588919 kernel: pnp: PnP ACPI init
Jan 20 08:16:02 np0005588919 kernel: pnp: PnP ACPI: found 5 devices
Jan 20 08:16:02 np0005588919 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 20 08:16:02 np0005588919 kernel: NET: Registered PF_INET protocol family
Jan 20 08:16:02 np0005588919 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 20 08:16:02 np0005588919 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 20 08:16:02 np0005588919 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 20 08:16:02 np0005588919 kernel: NET: Registered PF_XDP protocol family
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 20 08:16:02 np0005588919 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 20 08:16:02 np0005588919 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 20 08:16:02 np0005588919 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 72832 usecs
Jan 20 08:16:02 np0005588919 kernel: PCI: CLS 0 bytes, default 64
Jan 20 08:16:02 np0005588919 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 20 08:16:02 np0005588919 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 20 08:16:02 np0005588919 kernel: ACPI: bus type thunderbolt registered
Jan 20 08:16:02 np0005588919 kernel: Trying to unpack rootfs image as initramfs...
Jan 20 08:16:02 np0005588919 kernel: Initialise system trusted keyrings
Jan 20 08:16:02 np0005588919 kernel: Key type blacklist registered
Jan 20 08:16:02 np0005588919 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 20 08:16:02 np0005588919 kernel: zbud: loaded
Jan 20 08:16:02 np0005588919 kernel: integrity: Platform Keyring initialized
Jan 20 08:16:02 np0005588919 kernel: integrity: Machine keyring initialized
Jan 20 08:16:02 np0005588919 kernel: Freeing initrd memory: 87956K
Jan 20 08:16:02 np0005588919 kernel: NET: Registered PF_ALG protocol family
Jan 20 08:16:02 np0005588919 kernel: xor: automatically using best checksumming function   avx       
Jan 20 08:16:02 np0005588919 kernel: Key type asymmetric registered
Jan 20 08:16:02 np0005588919 kernel: Asymmetric key parser 'x509' registered
Jan 20 08:16:02 np0005588919 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 20 08:16:02 np0005588919 kernel: io scheduler mq-deadline registered
Jan 20 08:16:02 np0005588919 kernel: io scheduler kyber registered
Jan 20 08:16:02 np0005588919 kernel: io scheduler bfq registered
Jan 20 08:16:02 np0005588919 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 20 08:16:02 np0005588919 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 20 08:16:02 np0005588919 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 20 08:16:02 np0005588919 kernel: ACPI: button: Power Button [PWRF]
Jan 20 08:16:02 np0005588919 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 20 08:16:02 np0005588919 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 20 08:16:02 np0005588919 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 20 08:16:02 np0005588919 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 20 08:16:02 np0005588919 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 20 08:16:02 np0005588919 kernel: Non-volatile memory driver v1.3
Jan 20 08:16:02 np0005588919 kernel: rdac: device handler registered
Jan 20 08:16:02 np0005588919 kernel: hp_sw: device handler registered
Jan 20 08:16:02 np0005588919 kernel: emc: device handler registered
Jan 20 08:16:02 np0005588919 kernel: alua: device handler registered
Jan 20 08:16:02 np0005588919 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 20 08:16:02 np0005588919 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 20 08:16:02 np0005588919 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 20 08:16:02 np0005588919 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 20 08:16:02 np0005588919 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 20 08:16:02 np0005588919 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 20 08:16:02 np0005588919 kernel: usb usb1: Product: UHCI Host Controller
Jan 20 08:16:02 np0005588919 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 20 08:16:02 np0005588919 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 20 08:16:02 np0005588919 kernel: hub 1-0:1.0: USB hub found
Jan 20 08:16:02 np0005588919 kernel: hub 1-0:1.0: 2 ports detected
Jan 20 08:16:02 np0005588919 kernel: usbcore: registered new interface driver usbserial_generic
Jan 20 08:16:02 np0005588919 kernel: usbserial: USB Serial support registered for generic
Jan 20 08:16:02 np0005588919 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 20 08:16:02 np0005588919 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 20 08:16:02 np0005588919 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 20 08:16:02 np0005588919 kernel: mousedev: PS/2 mouse device common for all mice
Jan 20 08:16:02 np0005588919 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 20 08:16:02 np0005588919 kernel: rtc_cmos 00:04: registered as rtc0
Jan 20 08:16:02 np0005588919 kernel: rtc_cmos 00:04: setting system clock to 2026-01-20T13:16:01 UTC (1768914961)
Jan 20 08:16:02 np0005588919 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 20 08:16:02 np0005588919 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 20 08:16:02 np0005588919 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 20 08:16:02 np0005588919 kernel: usbcore: registered new interface driver usbhid
Jan 20 08:16:02 np0005588919 kernel: usbhid: USB HID core driver
Jan 20 08:16:02 np0005588919 kernel: drop_monitor: Initializing network drop monitor service
Jan 20 08:16:02 np0005588919 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 20 08:16:02 np0005588919 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 20 08:16:02 np0005588919 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 20 08:16:02 np0005588919 kernel: Initializing XFRM netlink socket
Jan 20 08:16:02 np0005588919 kernel: NET: Registered PF_INET6 protocol family
Jan 20 08:16:02 np0005588919 kernel: Segment Routing with IPv6
Jan 20 08:16:02 np0005588919 kernel: NET: Registered PF_PACKET protocol family
Jan 20 08:16:02 np0005588919 kernel: mpls_gso: MPLS GSO support
Jan 20 08:16:02 np0005588919 kernel: IPI shorthand broadcast: enabled
Jan 20 08:16:02 np0005588919 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 20 08:16:02 np0005588919 kernel: AES CTR mode by8 optimization enabled
Jan 20 08:16:02 np0005588919 kernel: sched_clock: Marking stable (1657006660, 145822700)->(1876534820, -73705460)
Jan 20 08:16:02 np0005588919 kernel: registered taskstats version 1
Jan 20 08:16:02 np0005588919 kernel: Loading compiled-in X.509 certificates
Jan 20 08:16:02 np0005588919 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 20 08:16:02 np0005588919 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 20 08:16:02 np0005588919 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 20 08:16:02 np0005588919 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 20 08:16:02 np0005588919 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 20 08:16:02 np0005588919 kernel: Demotion targets for Node 0: null
Jan 20 08:16:02 np0005588919 kernel: page_owner is disabled
Jan 20 08:16:02 np0005588919 kernel: Key type .fscrypt registered
Jan 20 08:16:02 np0005588919 kernel: Key type fscrypt-provisioning registered
Jan 20 08:16:02 np0005588919 kernel: Key type big_key registered
Jan 20 08:16:02 np0005588919 kernel: Key type encrypted registered
Jan 20 08:16:02 np0005588919 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 20 08:16:02 np0005588919 kernel: Loading compiled-in module X.509 certificates
Jan 20 08:16:02 np0005588919 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 20 08:16:02 np0005588919 kernel: ima: Allocated hash algorithm: sha256
Jan 20 08:16:02 np0005588919 kernel: ima: No architecture policies found
Jan 20 08:16:02 np0005588919 kernel: evm: Initialising EVM extended attributes:
Jan 20 08:16:02 np0005588919 kernel: evm: security.selinux
Jan 20 08:16:02 np0005588919 kernel: evm: security.SMACK64 (disabled)
Jan 20 08:16:02 np0005588919 kernel: evm: security.SMACK64EXEC (disabled)
Jan 20 08:16:02 np0005588919 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 20 08:16:02 np0005588919 kernel: evm: security.SMACK64MMAP (disabled)
Jan 20 08:16:02 np0005588919 kernel: evm: security.apparmor (disabled)
Jan 20 08:16:02 np0005588919 kernel: evm: security.ima
Jan 20 08:16:02 np0005588919 kernel: evm: security.capability
Jan 20 08:16:02 np0005588919 kernel: evm: HMAC attrs: 0x1
Jan 20 08:16:02 np0005588919 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 20 08:16:02 np0005588919 kernel: Running certificate verification RSA selftest
Jan 20 08:16:02 np0005588919 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 20 08:16:02 np0005588919 kernel: Running certificate verification ECDSA selftest
Jan 20 08:16:02 np0005588919 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 20 08:16:02 np0005588919 kernel: clk: Disabling unused clocks
Jan 20 08:16:02 np0005588919 kernel: Freeing unused decrypted memory: 2028K
Jan 20 08:16:02 np0005588919 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 20 08:16:02 np0005588919 kernel: Write protecting the kernel read-only data: 30720k
Jan 20 08:16:02 np0005588919 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 20 08:16:02 np0005588919 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 20 08:16:02 np0005588919 kernel: Run /init as init process
Jan 20 08:16:02 np0005588919 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 20 08:16:02 np0005588919 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 20 08:16:02 np0005588919 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 20 08:16:02 np0005588919 kernel: usb 1-1: Manufacturer: QEMU
Jan 20 08:16:02 np0005588919 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 20 08:16:02 np0005588919 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 20 08:16:02 np0005588919 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 20 08:16:02 np0005588919 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 20 08:16:02 np0005588919 systemd: Detected virtualization kvm.
Jan 20 08:16:02 np0005588919 systemd: Detected architecture x86-64.
Jan 20 08:16:02 np0005588919 systemd: Running in initrd.
Jan 20 08:16:02 np0005588919 systemd: No hostname configured, using default hostname.
Jan 20 08:16:02 np0005588919 systemd: Hostname set to <localhost>.
Jan 20 08:16:02 np0005588919 systemd: Initializing machine ID from VM UUID.
Jan 20 08:16:02 np0005588919 systemd: Queued start job for default target Initrd Default Target.
Jan 20 08:16:02 np0005588919 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 20 08:16:02 np0005588919 systemd: Reached target Local Encrypted Volumes.
Jan 20 08:16:02 np0005588919 systemd: Reached target Initrd /usr File System.
Jan 20 08:16:02 np0005588919 systemd: Reached target Local File Systems.
Jan 20 08:16:02 np0005588919 systemd: Reached target Path Units.
Jan 20 08:16:02 np0005588919 systemd: Reached target Slice Units.
Jan 20 08:16:02 np0005588919 systemd: Reached target Swaps.
Jan 20 08:16:02 np0005588919 systemd: Reached target Timer Units.
Jan 20 08:16:02 np0005588919 systemd: Listening on D-Bus System Message Bus Socket.
Jan 20 08:16:02 np0005588919 systemd: Listening on Journal Socket (/dev/log).
Jan 20 08:16:02 np0005588919 systemd: Listening on Journal Socket.
Jan 20 08:16:02 np0005588919 systemd: Listening on udev Control Socket.
Jan 20 08:16:02 np0005588919 systemd: Listening on udev Kernel Socket.
Jan 20 08:16:02 np0005588919 systemd: Reached target Socket Units.
Jan 20 08:16:02 np0005588919 systemd: Starting Create List of Static Device Nodes...
Jan 20 08:16:02 np0005588919 systemd: Starting Journal Service...
Jan 20 08:16:02 np0005588919 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 20 08:16:02 np0005588919 systemd: Starting Apply Kernel Variables...
Jan 20 08:16:02 np0005588919 systemd: Starting Create System Users...
Jan 20 08:16:02 np0005588919 systemd: Starting Setup Virtual Console...
Jan 20 08:16:02 np0005588919 systemd: Finished Create List of Static Device Nodes.
Jan 20 08:16:02 np0005588919 systemd: Finished Apply Kernel Variables.
Jan 20 08:16:02 np0005588919 systemd-journald[302]: Journal started
Jan 20 08:16:02 np0005588919 systemd-journald[302]: Runtime Journal (/run/log/journal/870b1f1cf19c477bb282ee6eeba50974) is 8.0M, max 153.6M, 145.6M free.
Jan 20 08:16:02 np0005588919 systemd-sysusers[307]: Creating group 'users' with GID 100.
Jan 20 08:16:02 np0005588919 systemd-sysusers[307]: Creating group 'dbus' with GID 81.
Jan 20 08:16:02 np0005588919 systemd-sysusers[307]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 20 08:16:02 np0005588919 systemd: Finished Create System Users.
Jan 20 08:16:02 np0005588919 systemd: Started Journal Service.
Jan 20 08:16:02 np0005588919 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 20 08:16:02 np0005588919 systemd[1]: Starting Create Volatile Files and Directories...
Jan 20 08:16:02 np0005588919 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 20 08:16:02 np0005588919 systemd[1]: Finished Create Volatile Files and Directories.
Jan 20 08:16:02 np0005588919 systemd[1]: Finished Setup Virtual Console.
Jan 20 08:16:02 np0005588919 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 20 08:16:02 np0005588919 systemd[1]: Starting dracut cmdline hook...
Jan 20 08:16:02 np0005588919 dracut-cmdline[323]: dracut-9 dracut-057-102.git20250818.el9
Jan 20 08:16:02 np0005588919 dracut-cmdline[323]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 20 08:16:02 np0005588919 systemd[1]: Finished dracut cmdline hook.
Jan 20 08:16:02 np0005588919 systemd[1]: Starting dracut pre-udev hook...
Jan 20 08:16:02 np0005588919 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 20 08:16:02 np0005588919 kernel: device-mapper: uevent: version 1.0.3
Jan 20 08:16:02 np0005588919 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 20 08:16:02 np0005588919 kernel: RPC: Registered named UNIX socket transport module.
Jan 20 08:16:02 np0005588919 kernel: RPC: Registered udp transport module.
Jan 20 08:16:02 np0005588919 kernel: RPC: Registered tcp transport module.
Jan 20 08:16:02 np0005588919 kernel: RPC: Registered tcp-with-tls transport module.
Jan 20 08:16:02 np0005588919 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 20 08:16:02 np0005588919 rpc.statd[441]: Version 2.5.4 starting
Jan 20 08:16:02 np0005588919 rpc.statd[441]: Initializing NSM state
Jan 20 08:16:02 np0005588919 rpc.idmapd[446]: Setting log level to 0
Jan 20 08:16:02 np0005588919 systemd[1]: Finished dracut pre-udev hook.
Jan 20 08:16:02 np0005588919 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 20 08:16:02 np0005588919 systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Jan 20 08:16:02 np0005588919 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 20 08:16:02 np0005588919 systemd[1]: Starting dracut pre-trigger hook...
Jan 20 08:16:02 np0005588919 systemd[1]: Finished dracut pre-trigger hook.
Jan 20 08:16:02 np0005588919 systemd[1]: Starting Coldplug All udev Devices...
Jan 20 08:16:02 np0005588919 systemd[1]: Created slice Slice /system/modprobe.
Jan 20 08:16:02 np0005588919 systemd[1]: Starting Load Kernel Module configfs...
Jan 20 08:16:02 np0005588919 systemd[1]: Finished Coldplug All udev Devices.
Jan 20 08:16:02 np0005588919 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 20 08:16:02 np0005588919 systemd[1]: Finished Load Kernel Module configfs.
Jan 20 08:16:02 np0005588919 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 20 08:16:02 np0005588919 systemd[1]: Reached target Network.
Jan 20 08:16:02 np0005588919 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 20 08:16:02 np0005588919 systemd[1]: Starting dracut initqueue hook...
Jan 20 08:16:03 np0005588919 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 20 08:16:03 np0005588919 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 20 08:16:03 np0005588919 kernel: vda: vda1
Jan 20 08:16:03 np0005588919 systemd-udevd[479]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:16:03 np0005588919 kernel: scsi host0: ata_piix
Jan 20 08:16:03 np0005588919 kernel: scsi host1: ata_piix
Jan 20 08:16:03 np0005588919 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 20 08:16:03 np0005588919 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 20 08:16:03 np0005588919 systemd[1]: Mounting Kernel Configuration File System...
Jan 20 08:16:03 np0005588919 systemd[1]: Mounted Kernel Configuration File System.
Jan 20 08:16:03 np0005588919 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 20 08:16:03 np0005588919 systemd[1]: Reached target Initrd Root Device.
Jan 20 08:16:03 np0005588919 systemd[1]: Reached target System Initialization.
Jan 20 08:16:03 np0005588919 systemd[1]: Reached target Basic System.
Jan 20 08:16:03 np0005588919 kernel: ata1: found unknown device (class 0)
Jan 20 08:16:03 np0005588919 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 20 08:16:03 np0005588919 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 20 08:16:03 np0005588919 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 20 08:16:03 np0005588919 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 20 08:16:03 np0005588919 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 20 08:16:03 np0005588919 systemd[1]: Finished dracut initqueue hook.
Jan 20 08:16:03 np0005588919 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 20 08:16:03 np0005588919 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 20 08:16:03 np0005588919 systemd[1]: Reached target Remote File Systems.
Jan 20 08:16:03 np0005588919 systemd[1]: Starting dracut pre-mount hook...
Jan 20 08:16:03 np0005588919 systemd[1]: Finished dracut pre-mount hook.
Jan 20 08:16:03 np0005588919 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 20 08:16:03 np0005588919 systemd-fsck[553]: /usr/sbin/fsck.xfs: XFS file system.
Jan 20 08:16:03 np0005588919 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 20 08:16:03 np0005588919 systemd[1]: Mounting /sysroot...
Jan 20 08:16:03 np0005588919 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 20 08:16:03 np0005588919 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 20 08:16:03 np0005588919 kernel: XFS (vda1): Ending clean mount
Jan 20 08:16:03 np0005588919 systemd[1]: Mounted /sysroot.
Jan 20 08:16:03 np0005588919 systemd[1]: Reached target Initrd Root File System.
Jan 20 08:16:03 np0005588919 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 20 08:16:03 np0005588919 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 20 08:16:03 np0005588919 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 20 08:16:03 np0005588919 systemd[1]: Reached target Initrd File Systems.
Jan 20 08:16:03 np0005588919 systemd[1]: Reached target Initrd Default Target.
Jan 20 08:16:03 np0005588919 systemd[1]: Starting dracut mount hook...
Jan 20 08:16:04 np0005588919 systemd[1]: Finished dracut mount hook.
Jan 20 08:16:04 np0005588919 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 20 08:16:04 np0005588919 rpc.idmapd[446]: exiting on signal 15
Jan 20 08:16:04 np0005588919 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 20 08:16:04 np0005588919 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Network.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Timer Units.
Jan 20 08:16:04 np0005588919 systemd[1]: dbus.socket: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 20 08:16:04 np0005588919 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Initrd Default Target.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Basic System.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Initrd Root Device.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Initrd /usr File System.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Path Units.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Remote File Systems.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Slice Units.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Socket Units.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target System Initialization.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Local File Systems.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Swaps.
Jan 20 08:16:04 np0005588919 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped dracut mount hook.
Jan 20 08:16:04 np0005588919 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped dracut pre-mount hook.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 20 08:16:04 np0005588919 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped dracut initqueue hook.
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped Apply Kernel Variables.
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped Coldplug All udev Devices.
Jan 20 08:16:04 np0005588919 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped dracut pre-trigger hook.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped Setup Virtual Console.
Jan 20 08:16:04 np0005588919 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Closed udev Control Socket.
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Closed udev Kernel Socket.
Jan 20 08:16:04 np0005588919 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped dracut pre-udev hook.
Jan 20 08:16:04 np0005588919 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped dracut cmdline hook.
Jan 20 08:16:04 np0005588919 systemd[1]: Starting Cleanup udev Database...
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 20 08:16:04 np0005588919 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Stopped Create System Users.
Jan 20 08:16:04 np0005588919 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished Cleanup udev Database.
Jan 20 08:16:04 np0005588919 systemd[1]: Reached target Switch Root.
Jan 20 08:16:04 np0005588919 systemd[1]: Starting Switch Root...
Jan 20 08:16:04 np0005588919 systemd[1]: Switching root.
Jan 20 08:16:04 np0005588919 systemd-journald[302]: Journal stopped
Jan 20 08:16:04 np0005588919 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 20 08:16:04 np0005588919 kernel: audit: type=1404 audit(1768914964.358:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 20 08:16:04 np0005588919 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:16:04 np0005588919 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:16:04 np0005588919 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:16:04 np0005588919 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:16:04 np0005588919 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:16:04 np0005588919 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:16:04 np0005588919 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:16:04 np0005588919 kernel: audit: type=1403 audit(1768914964.491:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 20 08:16:04 np0005588919 systemd: Successfully loaded SELinux policy in 136.739ms.
Jan 20 08:16:04 np0005588919 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.750ms.
Jan 20 08:16:04 np0005588919 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 20 08:16:04 np0005588919 systemd: Detected virtualization kvm.
Jan 20 08:16:04 np0005588919 systemd: Detected architecture x86-64.
Jan 20 08:16:04 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:16:04 np0005588919 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd: Stopped Switch Root.
Jan 20 08:16:04 np0005588919 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 20 08:16:04 np0005588919 systemd: Created slice Slice /system/getty.
Jan 20 08:16:04 np0005588919 systemd: Created slice Slice /system/serial-getty.
Jan 20 08:16:04 np0005588919 systemd: Created slice Slice /system/sshd-keygen.
Jan 20 08:16:04 np0005588919 systemd: Created slice User and Session Slice.
Jan 20 08:16:04 np0005588919 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 20 08:16:04 np0005588919 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 20 08:16:04 np0005588919 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 20 08:16:04 np0005588919 systemd: Reached target Local Encrypted Volumes.
Jan 20 08:16:04 np0005588919 systemd: Stopped target Switch Root.
Jan 20 08:16:04 np0005588919 systemd: Stopped target Initrd File Systems.
Jan 20 08:16:04 np0005588919 systemd: Stopped target Initrd Root File System.
Jan 20 08:16:04 np0005588919 systemd: Reached target Local Integrity Protected Volumes.
Jan 20 08:16:04 np0005588919 systemd: Reached target Path Units.
Jan 20 08:16:04 np0005588919 systemd: Reached target rpc_pipefs.target.
Jan 20 08:16:04 np0005588919 systemd: Reached target Slice Units.
Jan 20 08:16:04 np0005588919 systemd: Reached target Swaps.
Jan 20 08:16:04 np0005588919 systemd: Reached target Local Verity Protected Volumes.
Jan 20 08:16:04 np0005588919 systemd: Listening on RPCbind Server Activation Socket.
Jan 20 08:16:04 np0005588919 systemd: Reached target RPC Port Mapper.
Jan 20 08:16:04 np0005588919 systemd: Listening on Process Core Dump Socket.
Jan 20 08:16:04 np0005588919 systemd: Listening on initctl Compatibility Named Pipe.
Jan 20 08:16:04 np0005588919 systemd: Listening on udev Control Socket.
Jan 20 08:16:04 np0005588919 systemd: Listening on udev Kernel Socket.
Jan 20 08:16:04 np0005588919 systemd: Mounting Huge Pages File System...
Jan 20 08:16:04 np0005588919 systemd: Mounting POSIX Message Queue File System...
Jan 20 08:16:04 np0005588919 systemd: Mounting Kernel Debug File System...
Jan 20 08:16:04 np0005588919 systemd: Mounting Kernel Trace File System...
Jan 20 08:16:04 np0005588919 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 20 08:16:04 np0005588919 systemd: Starting Create List of Static Device Nodes...
Jan 20 08:16:04 np0005588919 systemd: Starting Load Kernel Module configfs...
Jan 20 08:16:04 np0005588919 systemd: Starting Load Kernel Module drm...
Jan 20 08:16:04 np0005588919 systemd: Starting Load Kernel Module efi_pstore...
Jan 20 08:16:04 np0005588919 systemd: Starting Load Kernel Module fuse...
Jan 20 08:16:04 np0005588919 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 20 08:16:04 np0005588919 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd: Stopped File System Check on Root Device.
Jan 20 08:16:04 np0005588919 systemd: Stopped Journal Service.
Jan 20 08:16:04 np0005588919 systemd: Starting Journal Service...
Jan 20 08:16:04 np0005588919 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 20 08:16:04 np0005588919 systemd: Starting Generate network units from Kernel command line...
Jan 20 08:16:04 np0005588919 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 20 08:16:04 np0005588919 systemd: Starting Remount Root and Kernel File Systems...
Jan 20 08:16:04 np0005588919 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 20 08:16:04 np0005588919 systemd: Starting Apply Kernel Variables...
Jan 20 08:16:04 np0005588919 kernel: fuse: init (API version 7.37)
Jan 20 08:16:04 np0005588919 systemd: Starting Coldplug All udev Devices...
Jan 20 08:16:04 np0005588919 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 20 08:16:04 np0005588919 systemd: Mounted Huge Pages File System.
Jan 20 08:16:04 np0005588919 systemd: Mounted POSIX Message Queue File System.
Jan 20 08:16:04 np0005588919 systemd: Mounted Kernel Debug File System.
Jan 20 08:16:04 np0005588919 systemd-journald[675]: Journal started
Jan 20 08:16:04 np0005588919 systemd-journald[675]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 20 08:16:04 np0005588919 systemd[1]: Queued start job for default target Multi-User System.
Jan 20 08:16:04 np0005588919 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd: Started Journal Service.
Jan 20 08:16:04 np0005588919 systemd[1]: Mounted Kernel Trace File System.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished Create List of Static Device Nodes.
Jan 20 08:16:04 np0005588919 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished Load Kernel Module configfs.
Jan 20 08:16:04 np0005588919 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 20 08:16:04 np0005588919 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished Load Kernel Module fuse.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished Generate network units from Kernel command line.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 20 08:16:04 np0005588919 systemd[1]: Finished Apply Kernel Variables.
Jan 20 08:16:05 np0005588919 systemd[1]: Mounting FUSE Control File System...
Jan 20 08:16:05 np0005588919 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Rebuild Hardware Database...
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 20 08:16:05 np0005588919 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Load/Save OS Random Seed...
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Create System Users...
Jan 20 08:16:05 np0005588919 systemd[1]: Mounted FUSE Control File System.
Jan 20 08:16:05 np0005588919 systemd-journald[675]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 20 08:16:05 np0005588919 systemd-journald[675]: Received client request to flush runtime journal.
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Load/Save OS Random Seed.
Jan 20 08:16:05 np0005588919 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 20 08:16:05 np0005588919 kernel: ACPI: bus type drm_connector registered
Jan 20 08:16:05 np0005588919 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Load Kernel Module drm.
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Create System Users.
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Coldplug All udev Devices.
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 20 08:16:05 np0005588919 systemd[1]: Reached target Preparation for Local File Systems.
Jan 20 08:16:05 np0005588919 systemd[1]: Reached target Local File Systems.
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 20 08:16:05 np0005588919 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 20 08:16:05 np0005588919 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 20 08:16:05 np0005588919 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Automatic Boot Loader Update...
Jan 20 08:16:05 np0005588919 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Create Volatile Files and Directories...
Jan 20 08:16:05 np0005588919 bootctl[695]: Couldn't find EFI system partition, skipping.
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Automatic Boot Loader Update.
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Create Volatile Files and Directories.
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Security Auditing Service...
Jan 20 08:16:05 np0005588919 systemd[1]: Starting RPC Bind...
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Rebuild Journal Catalog...
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 20 08:16:05 np0005588919 auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 20 08:16:05 np0005588919 auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 20 08:16:05 np0005588919 systemd[1]: Started RPC Bind.
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Rebuild Journal Catalog.
Jan 20 08:16:05 np0005588919 augenrules[706]: /sbin/augenrules: No change
Jan 20 08:16:05 np0005588919 augenrules[721]: No rules
Jan 20 08:16:05 np0005588919 augenrules[721]: enabled 1
Jan 20 08:16:05 np0005588919 augenrules[721]: failure 1
Jan 20 08:16:05 np0005588919 augenrules[721]: pid 701
Jan 20 08:16:05 np0005588919 augenrules[721]: rate_limit 0
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog_limit 8192
Jan 20 08:16:05 np0005588919 augenrules[721]: lost 0
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog 0
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog_wait_time 60000
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog_wait_time_actual 0
Jan 20 08:16:05 np0005588919 augenrules[721]: enabled 1
Jan 20 08:16:05 np0005588919 augenrules[721]: failure 1
Jan 20 08:16:05 np0005588919 augenrules[721]: pid 701
Jan 20 08:16:05 np0005588919 augenrules[721]: rate_limit 0
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog_limit 8192
Jan 20 08:16:05 np0005588919 augenrules[721]: lost 0
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog 4
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog_wait_time 60000
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog_wait_time_actual 0
Jan 20 08:16:05 np0005588919 augenrules[721]: enabled 1
Jan 20 08:16:05 np0005588919 augenrules[721]: failure 1
Jan 20 08:16:05 np0005588919 augenrules[721]: pid 701
Jan 20 08:16:05 np0005588919 augenrules[721]: rate_limit 0
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog_limit 8192
Jan 20 08:16:05 np0005588919 augenrules[721]: lost 0
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog 0
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog_wait_time 60000
Jan 20 08:16:05 np0005588919 augenrules[721]: backlog_wait_time_actual 0
Jan 20 08:16:05 np0005588919 systemd[1]: Started Security Auditing Service.
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Rebuild Hardware Database.
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Update is Completed...
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Update is Completed.
Jan 20 08:16:05 np0005588919 systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Jan 20 08:16:05 np0005588919 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 20 08:16:05 np0005588919 systemd[1]: Reached target System Initialization.
Jan 20 08:16:05 np0005588919 systemd[1]: Started dnf makecache --timer.
Jan 20 08:16:05 np0005588919 systemd[1]: Started Daily rotation of log files.
Jan 20 08:16:05 np0005588919 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 20 08:16:05 np0005588919 systemd[1]: Reached target Timer Units.
Jan 20 08:16:05 np0005588919 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 20 08:16:05 np0005588919 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 20 08:16:05 np0005588919 systemd[1]: Reached target Socket Units.
Jan 20 08:16:05 np0005588919 systemd[1]: Starting D-Bus System Message Bus...
Jan 20 08:16:05 np0005588919 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 20 08:16:05 np0005588919 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Load Kernel Module configfs...
Jan 20 08:16:05 np0005588919 systemd-udevd[741]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:16:05 np0005588919 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Load Kernel Module configfs.
Jan 20 08:16:05 np0005588919 systemd[1]: Started D-Bus System Message Bus.
Jan 20 08:16:05 np0005588919 systemd[1]: Reached target Basic System.
Jan 20 08:16:05 np0005588919 dbus-broker-lau[761]: Ready
Jan 20 08:16:05 np0005588919 systemd[1]: Starting NTP client/server...
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 20 08:16:05 np0005588919 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 20 08:16:05 np0005588919 systemd[1]: Starting IPv4 firewall with iptables...
Jan 20 08:16:05 np0005588919 systemd[1]: Started irqbalance daemon.
Jan 20 08:16:05 np0005588919 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 20 08:16:05 np0005588919 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 08:16:05 np0005588919 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 08:16:05 np0005588919 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 08:16:05 np0005588919 systemd[1]: Reached target sshd-keygen.target.
Jan 20 08:16:05 np0005588919 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 20 08:16:05 np0005588919 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 20 08:16:05 np0005588919 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 20 08:16:05 np0005588919 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 20 08:16:05 np0005588919 systemd[1]: Reached target User and Group Name Lookups.
Jan 20 08:16:05 np0005588919 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 20 08:16:05 np0005588919 systemd[1]: Starting User Login Management...
Jan 20 08:16:05 np0005588919 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 20 08:16:05 np0005588919 chronyd[796]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 20 08:16:05 np0005588919 chronyd[796]: Loaded 0 symmetric keys
Jan 20 08:16:05 np0005588919 chronyd[796]: Using right/UTC timezone to obtain leap second data
Jan 20 08:16:05 np0005588919 chronyd[796]: Loaded seccomp filter (level 2)
Jan 20 08:16:05 np0005588919 systemd[1]: Started NTP client/server.
Jan 20 08:16:05 np0005588919 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 20 08:16:05 np0005588919 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 20 08:16:05 np0005588919 systemd-logind[783]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 20 08:16:05 np0005588919 systemd-logind[783]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 20 08:16:05 np0005588919 kernel: Console: switching to colour dummy device 80x25
Jan 20 08:16:05 np0005588919 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 20 08:16:05 np0005588919 kernel: [drm] features: -context_init
Jan 20 08:16:05 np0005588919 systemd-logind[783]: New seat seat0.
Jan 20 08:16:05 np0005588919 systemd[1]: Started User Login Management.
Jan 20 08:16:05 np0005588919 kernel: [drm] number of scanouts: 1
Jan 20 08:16:05 np0005588919 kernel: [drm] number of cap sets: 0
Jan 20 08:16:05 np0005588919 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 20 08:16:05 np0005588919 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 20 08:16:05 np0005588919 kernel: Console: switching to colour frame buffer device 128x48
Jan 20 08:16:05 np0005588919 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 20 08:16:05 np0005588919 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 20 08:16:05 np0005588919 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 20 08:16:05 np0005588919 kernel: kvm_amd: TSC scaling supported
Jan 20 08:16:05 np0005588919 kernel: kvm_amd: Nested Virtualization enabled
Jan 20 08:16:05 np0005588919 kernel: kvm_amd: Nested Paging enabled
Jan 20 08:16:05 np0005588919 kernel: kvm_amd: LBR virtualization supported
Jan 20 08:16:06 np0005588919 iptables.init[777]: iptables: Applying firewall rules: [  OK  ]
Jan 20 08:16:06 np0005588919 systemd[1]: Finished IPv4 firewall with iptables.
Jan 20 08:16:06 np0005588919 cloud-init[838]: Cloud-init v. 24.4-8.el9 running 'init-local' at Tue, 20 Jan 2026 13:16:06 +0000. Up 6.36 seconds.
Jan 20 08:16:06 np0005588919 systemd[1]: run-cloud\x2dinit-tmp-tmpqx1mrguw.mount: Deactivated successfully.
Jan 20 08:16:06 np0005588919 systemd[1]: Starting Hostname Service...
Jan 20 08:16:06 np0005588919 systemd[1]: Started Hostname Service.
Jan 20 08:16:06 np0005588919 systemd-hostnamed[852]: Hostname set to <np0005588919.novalocal> (static)
Jan 20 08:16:06 np0005588919 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 20 08:16:06 np0005588919 systemd[1]: Reached target Preparation for Network.
Jan 20 08:16:06 np0005588919 systemd[1]: Starting Network Manager...
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0272] NetworkManager (version 1.54.3-2.el9) is starting... (boot:017a3b90-38ab-4863-8e66-991c4844fcc7)
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0278] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0354] manager[0x55ec69009000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0391] hostname: hostname: using hostnamed
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0392] hostname: static hostname changed from (none) to "np0005588919.novalocal"
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0396] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0496] manager[0x55ec69009000]: rfkill: Wi-Fi hardware radio set enabled
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0497] manager[0x55ec69009000]: rfkill: WWAN hardware radio set enabled
Jan 20 08:16:07 np0005588919 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0546] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0548] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0549] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0550] manager: Networking is enabled by state file
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0552] settings: Loaded settings plugin: keyfile (internal)
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0567] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0595] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0608] dhcp: init: Using DHCP client 'internal'
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0611] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0628] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0638] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0650] device (lo): Activation: starting connection 'lo' (f1b29bda-3a6a-4be0-8c9c-6df9359cf4c4)
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0661] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0666] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0724] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0730] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0735] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0739] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0743] device (eth0): carrier: link connected
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0749] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0759] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0767] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 20 08:16:07 np0005588919 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0776] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0777] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0782] manager: NetworkManager state is now CONNECTING
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0785] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0794] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:16:07 np0005588919 systemd[1]: Started Network Manager.
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0802] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:16:07 np0005588919 systemd[1]: Reached target Network.
Jan 20 08:16:07 np0005588919 systemd[1]: Starting Network Manager Wait Online...
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0848] dhcp4 (eth0): state changed new lease, address=38.102.83.169
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0858] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 20 08:16:07 np0005588919 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0881] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:16:07 np0005588919 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0996] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.0998] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.1003] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.1013] device (lo): Activation: successful, device activated.
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.1021] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.1027] manager: NetworkManager state is now CONNECTED_SITE
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.1031] device (eth0): Activation: successful, device activated.
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.1039] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 20 08:16:07 np0005588919 NetworkManager[856]: <info>  [1768914967.1046] manager: startup complete
Jan 20 08:16:07 np0005588919 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 20 08:16:07 np0005588919 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 20 08:16:07 np0005588919 systemd[1]: Reached target NFS client services.
Jan 20 08:16:07 np0005588919 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 20 08:16:07 np0005588919 systemd[1]: Reached target Remote File Systems.
Jan 20 08:16:07 np0005588919 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 20 08:16:07 np0005588919 systemd[1]: Finished Network Manager Wait Online.
Jan 20 08:16:07 np0005588919 systemd[1]: Starting Cloud-init: Network Stage...
Jan 20 08:16:07 np0005588919 cloud-init[916]: Cloud-init v. 24.4-8.el9 running 'init' at Tue, 20 Jan 2026 13:16:07 +0000. Up 7.54 seconds.
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: |  eth0  | True |        38.102.83.169         | 255.255.255.0 | global | fa:16:3e:b5:2a:39 |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: |  eth0  | True | fe80::f816:3eff:feb5:2a39/64 |       .       |  link  | fa:16:3e:b5:2a:39 |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 20 08:16:07 np0005588919 cloud-init[916]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 20 08:16:11 np0005588919 cloud-init[916]: Generating public/private rsa key pair.
Jan 20 08:16:11 np0005588919 cloud-init[916]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 20 08:16:11 np0005588919 cloud-init[916]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 20 08:16:11 np0005588919 cloud-init[916]: The key fingerprint is:
Jan 20 08:16:11 np0005588919 cloud-init[916]: SHA256:oqb1FipA3QDYdW0c9PWThFmXe/VtUdIEP0u5rZRnI7M root@np0005588919.novalocal
Jan 20 08:16:11 np0005588919 cloud-init[916]: The key's randomart image is:
Jan 20 08:16:11 np0005588919 cloud-init[916]: +---[RSA 3072]----+
Jan 20 08:16:11 np0005588919 cloud-init[916]: |.o... .+o.  .+=+*|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |. .. .  +. .oo B+|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |  . o  .  .   ++B|
Jan 20 08:16:11 np0005588919 cloud-init[916]: | . . .        .+X|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |.     . S    oo==|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |.    ...     .++.|
Jan 20 08:16:11 np0005588919 cloud-init[916]: | .  +. .     E.  |
Jan 20 08:16:11 np0005588919 cloud-init[916]: |  .+...          |
Jan 20 08:16:11 np0005588919 cloud-init[916]: |  .. ..          |
Jan 20 08:16:11 np0005588919 cloud-init[916]: +----[SHA256]-----+
Jan 20 08:16:11 np0005588919 cloud-init[916]: Generating public/private ecdsa key pair.
Jan 20 08:16:11 np0005588919 cloud-init[916]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 20 08:16:11 np0005588919 cloud-init[916]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 20 08:16:11 np0005588919 cloud-init[916]: The key fingerprint is:
Jan 20 08:16:11 np0005588919 cloud-init[916]: SHA256:bF791DUbc28TlJd8ya+nF4sItgSmR/HZqSG3bZ7vYoI root@np0005588919.novalocal
Jan 20 08:16:11 np0005588919 cloud-init[916]: The key's randomart image is:
Jan 20 08:16:11 np0005588919 cloud-init[916]: +---[ECDSA 256]---+
Jan 20 08:16:11 np0005588919 cloud-init[916]: |              o.+|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |       .      .*o|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |        o o .  =*|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |       * = +   .@|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |      + S * . .++|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |     . + B o o.oo|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |      . = = o ooo|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |       E o * ....|
Jan 20 08:16:11 np0005588919 cloud-init[916]: |          o +o . |
Jan 20 08:16:11 np0005588919 cloud-init[916]: +----[SHA256]-----+
Jan 20 08:16:11 np0005588919 cloud-init[916]: Generating public/private ed25519 key pair.
Jan 20 08:16:11 np0005588919 cloud-init[916]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 20 08:16:11 np0005588919 cloud-init[916]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 20 08:16:11 np0005588919 cloud-init[916]: The key fingerprint is:
Jan 20 08:16:11 np0005588919 cloud-init[916]: SHA256:Fs+VCqBtNhnhalplMMEUB65JjqLKHDp0zW5usKnNHDA root@np0005588919.novalocal
Jan 20 08:16:11 np0005588919 cloud-init[916]: The key's randomart image is:
Jan 20 08:16:11 np0005588919 cloud-init[916]: +--[ED25519 256]--+
Jan 20 08:16:11 np0005588919 cloud-init[916]: |   oB+=.         |
Jan 20 08:16:11 np0005588919 cloud-init[916]: |   ..B +     .   |
Jan 20 08:16:11 np0005588919 cloud-init[916]: |  . o X o   o    |
Jan 20 08:16:11 np0005588919 cloud-init[916]: | + o * . = o     |
Jan 20 08:16:11 np0005588919 cloud-init[916]: |oE+ *   S +      |
Jan 20 08:16:11 np0005588919 cloud-init[916]: |o.o* o .         |
Jan 20 08:16:11 np0005588919 cloud-init[916]: |o.o.=            |
Jan 20 08:16:11 np0005588919 cloud-init[916]: |* =o.+           |
Jan 20 08:16:11 np0005588919 cloud-init[916]: |o=.++.           |
Jan 20 08:16:11 np0005588919 cloud-init[916]: +----[SHA256]-----+
Jan 20 08:16:11 np0005588919 systemd[1]: Finished Cloud-init: Network Stage.
Jan 20 08:16:11 np0005588919 systemd[1]: Reached target Cloud-config availability.
Jan 20 08:16:11 np0005588919 systemd[1]: Reached target Network is Online.
Jan 20 08:16:11 np0005588919 systemd[1]: Starting Cloud-init: Config Stage...
Jan 20 08:16:11 np0005588919 systemd[1]: Starting Crash recovery kernel arming...
Jan 20 08:16:11 np0005588919 systemd[1]: Starting Notify NFS peers of a restart...
Jan 20 08:16:11 np0005588919 systemd[1]: Starting System Logging Service...
Jan 20 08:16:11 np0005588919 sm-notify[1001]: Version 2.5.4 starting
Jan 20 08:16:11 np0005588919 systemd[1]: Starting OpenSSH server daemon...
Jan 20 08:16:11 np0005588919 systemd[1]: Starting Permit User Sessions...
Jan 20 08:16:11 np0005588919 systemd[1]: Started Notify NFS peers of a restart.
Jan 20 08:16:11 np0005588919 systemd[1]: Started OpenSSH server daemon.
Jan 20 08:16:11 np0005588919 systemd[1]: Finished Permit User Sessions.
Jan 20 08:16:11 np0005588919 systemd[1]: Started Command Scheduler.
Jan 20 08:16:11 np0005588919 systemd[1]: Started Getty on tty1.
Jan 20 08:16:11 np0005588919 systemd[1]: Started Serial Getty on ttyS0.
Jan 20 08:16:11 np0005588919 systemd[1]: Reached target Login Prompts.
Jan 20 08:16:11 np0005588919 rsyslogd[1002]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1002" x-info="https://www.rsyslog.com"] start
Jan 20 08:16:11 np0005588919 systemd[1]: Started System Logging Service.
Jan 20 08:16:11 np0005588919 rsyslogd[1002]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 20 08:16:11 np0005588919 systemd[1]: Reached target Multi-User System.
Jan 20 08:16:11 np0005588919 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 20 08:16:11 np0005588919 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 20 08:16:11 np0005588919 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 20 08:16:11 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 08:16:11 np0005588919 kdumpctl[1015]: kdump: No kdump initial ramdisk found.
Jan 20 08:16:11 np0005588919 kdumpctl[1015]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 20 08:16:12 np0005588919 cloud-init[1126]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Tue, 20 Jan 2026 13:16:11 +0000. Up 12.06 seconds.
Jan 20 08:16:12 np0005588919 systemd[1]: Finished Cloud-init: Config Stage.
Jan 20 08:16:12 np0005588919 systemd[1]: Starting Cloud-init: Final Stage...
Jan 20 08:16:12 np0005588919 dracut[1262]: dracut-057-102.git20250818.el9
Jan 20 08:16:12 np0005588919 cloud-init[1284]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Tue, 20 Jan 2026 13:16:12 +0000. Up 12.49 seconds.
Jan 20 08:16:12 np0005588919 cloud-init[1298]: #############################################################
Jan 20 08:16:12 np0005588919 cloud-init[1301]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 20 08:16:12 np0005588919 dracut[1264]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 20 08:16:12 np0005588919 cloud-init[1309]: 256 SHA256:bF791DUbc28TlJd8ya+nF4sItgSmR/HZqSG3bZ7vYoI root@np0005588919.novalocal (ECDSA)
Jan 20 08:16:12 np0005588919 cloud-init[1316]: 256 SHA256:Fs+VCqBtNhnhalplMMEUB65JjqLKHDp0zW5usKnNHDA root@np0005588919.novalocal (ED25519)
Jan 20 08:16:12 np0005588919 cloud-init[1323]: 3072 SHA256:oqb1FipA3QDYdW0c9PWThFmXe/VtUdIEP0u5rZRnI7M root@np0005588919.novalocal (RSA)
Jan 20 08:16:12 np0005588919 cloud-init[1325]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 20 08:16:12 np0005588919 cloud-init[1327]: #############################################################
Jan 20 08:16:12 np0005588919 cloud-init[1284]: Cloud-init v. 24.4-8.el9 finished at Tue, 20 Jan 2026 13:16:12 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 12.66 seconds
Jan 20 08:16:13 np0005588919 chronyd[796]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Jan 20 08:16:13 np0005588919 chronyd[796]: System clock wrong by 1.122361 seconds
Jan 20 08:16:13 np0005588919 chronyd[796]: System clock was stepped by 1.122361 seconds
Jan 20 08:16:13 np0005588919 chronyd[796]: System clock TAI offset set to 37 seconds
Jan 20 08:16:13 np0005588919 systemd[1]: Finished Cloud-init: Final Stage.
Jan 20 08:16:13 np0005588919 systemd[1]: Reached target Cloud-init target.
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: memstrack is not available
Jan 20 08:16:14 np0005588919 dracut[1264]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 20 08:16:14 np0005588919 dracut[1264]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 20 08:16:15 np0005588919 dracut[1264]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 20 08:16:15 np0005588919 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 20 08:16:15 np0005588919 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 20 08:16:15 np0005588919 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 20 08:16:15 np0005588919 dracut[1264]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 20 08:16:15 np0005588919 dracut[1264]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 20 08:16:15 np0005588919 dracut[1264]: memstrack is not available
Jan 20 08:16:15 np0005588919 dracut[1264]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 20 08:16:15 np0005588919 dracut[1264]: *** Including module: systemd ***
Jan 20 08:16:15 np0005588919 dracut[1264]: *** Including module: fips ***
Jan 20 08:16:15 np0005588919 dracut[1264]: *** Including module: systemd-initrd ***
Jan 20 08:16:15 np0005588919 dracut[1264]: *** Including module: i18n ***
Jan 20 08:16:16 np0005588919 dracut[1264]: *** Including module: drm ***
Jan 20 08:16:16 np0005588919 dracut[1264]: *** Including module: prefixdevname ***
Jan 20 08:16:16 np0005588919 dracut[1264]: *** Including module: kernel-modules ***
Jan 20 08:16:16 np0005588919 kernel: block vda: the capability attribute has been deprecated.
Jan 20 08:16:17 np0005588919 irqbalance[778]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 20 08:16:17 np0005588919 irqbalance[778]: IRQ 25 affinity is now unmanaged
Jan 20 08:16:17 np0005588919 irqbalance[778]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 20 08:16:17 np0005588919 irqbalance[778]: IRQ 31 affinity is now unmanaged
Jan 20 08:16:17 np0005588919 irqbalance[778]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 20 08:16:17 np0005588919 irqbalance[778]: IRQ 28 affinity is now unmanaged
Jan 20 08:16:17 np0005588919 irqbalance[778]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 20 08:16:17 np0005588919 irqbalance[778]: IRQ 32 affinity is now unmanaged
Jan 20 08:16:17 np0005588919 irqbalance[778]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 20 08:16:17 np0005588919 irqbalance[778]: IRQ 30 affinity is now unmanaged
Jan 20 08:16:17 np0005588919 irqbalance[778]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 20 08:16:17 np0005588919 irqbalance[778]: IRQ 29 affinity is now unmanaged
Jan 20 08:16:17 np0005588919 dracut[1264]: *** Including module: kernel-modules-extra ***
Jan 20 08:16:17 np0005588919 dracut[1264]: *** Including module: qemu ***
Jan 20 08:16:17 np0005588919 dracut[1264]: *** Including module: fstab-sys ***
Jan 20 08:16:17 np0005588919 dracut[1264]: *** Including module: rootfs-block ***
Jan 20 08:16:17 np0005588919 dracut[1264]: *** Including module: terminfo ***
Jan 20 08:16:17 np0005588919 dracut[1264]: *** Including module: udev-rules ***
Jan 20 08:16:18 np0005588919 dracut[1264]: Skipping udev rule: 91-permissions.rules
Jan 20 08:16:18 np0005588919 dracut[1264]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 20 08:16:18 np0005588919 dracut[1264]: *** Including module: virtiofs ***
Jan 20 08:16:18 np0005588919 dracut[1264]: *** Including module: dracut-systemd ***
Jan 20 08:16:18 np0005588919 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:16:18 np0005588919 dracut[1264]: *** Including module: usrmount ***
Jan 20 08:16:18 np0005588919 dracut[1264]: *** Including module: base ***
Jan 20 08:16:18 np0005588919 dracut[1264]: *** Including module: fs-lib ***
Jan 20 08:16:18 np0005588919 dracut[1264]: *** Including module: kdumpbase ***
Jan 20 08:16:19 np0005588919 dracut[1264]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 20 08:16:19 np0005588919 dracut[1264]:  microcode_ctl module: mangling fw_dir
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: configuration "intel" is ignored
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 20 08:16:19 np0005588919 dracut[1264]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 20 08:16:19 np0005588919 dracut[1264]: *** Including module: openssl ***
Jan 20 08:16:19 np0005588919 dracut[1264]: *** Including module: shutdown ***
Jan 20 08:16:19 np0005588919 dracut[1264]: *** Including module: squash ***
Jan 20 08:16:19 np0005588919 dracut[1264]: *** Including modules done ***
Jan 20 08:16:19 np0005588919 dracut[1264]: *** Installing kernel module dependencies ***
Jan 20 08:16:20 np0005588919 dracut[1264]: *** Installing kernel module dependencies done ***
Jan 20 08:16:20 np0005588919 dracut[1264]: *** Resolving executable dependencies ***
Jan 20 08:16:21 np0005588919 dracut[1264]: *** Resolving executable dependencies done ***
Jan 20 08:16:22 np0005588919 dracut[1264]: *** Generating early-microcode cpio image ***
Jan 20 08:16:22 np0005588919 dracut[1264]: *** Store current command line parameters ***
Jan 20 08:16:22 np0005588919 dracut[1264]: Stored kernel commandline:
Jan 20 08:16:22 np0005588919 dracut[1264]: No dracut internal kernel commandline stored in the initramfs
Jan 20 08:16:22 np0005588919 dracut[1264]: *** Install squash loader ***
Jan 20 08:16:23 np0005588919 dracut[1264]: *** Squashing the files inside the initramfs ***
Jan 20 08:16:24 np0005588919 dracut[1264]: *** Squashing the files inside the initramfs done ***
Jan 20 08:16:24 np0005588919 dracut[1264]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 20 08:16:24 np0005588919 dracut[1264]: *** Hardlinking files ***
Jan 20 08:16:24 np0005588919 dracut[1264]: *** Hardlinking files done ***
Jan 20 08:16:24 np0005588919 dracut[1264]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 20 08:16:25 np0005588919 kdumpctl[1015]: kdump: kexec: loaded kdump kernel
Jan 20 08:16:25 np0005588919 kdumpctl[1015]: kdump: Starting kdump: [OK]
Jan 20 08:16:25 np0005588919 systemd[1]: Finished Crash recovery kernel arming.
Jan 20 08:16:25 np0005588919 systemd[1]: Startup finished in 1.990s (kernel) + 2.471s (initrd) + 20.223s (userspace) = 24.686s.
Jan 20 08:16:35 np0005588919 systemd[1]: Created slice User Slice of UID 1000.
Jan 20 08:16:35 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 20 08:16:35 np0005588919 systemd-logind[783]: New session 1 of user zuul.
Jan 20 08:16:35 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 20 08:16:35 np0005588919 systemd[1]: Starting User Manager for UID 1000...
Jan 20 08:16:35 np0005588919 systemd[4302]: Queued start job for default target Main User Target.
Jan 20 08:16:35 np0005588919 systemd[4302]: Created slice User Application Slice.
Jan 20 08:16:35 np0005588919 systemd[4302]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 08:16:35 np0005588919 systemd[4302]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 08:16:35 np0005588919 systemd[4302]: Reached target Paths.
Jan 20 08:16:35 np0005588919 systemd[4302]: Reached target Timers.
Jan 20 08:16:35 np0005588919 systemd[4302]: Starting D-Bus User Message Bus Socket...
Jan 20 08:16:35 np0005588919 systemd[4302]: Starting Create User's Volatile Files and Directories...
Jan 20 08:16:35 np0005588919 systemd[4302]: Listening on D-Bus User Message Bus Socket.
Jan 20 08:16:35 np0005588919 systemd[4302]: Reached target Sockets.
Jan 20 08:16:35 np0005588919 systemd[4302]: Finished Create User's Volatile Files and Directories.
Jan 20 08:16:35 np0005588919 systemd[4302]: Reached target Basic System.
Jan 20 08:16:35 np0005588919 systemd[4302]: Reached target Main User Target.
Jan 20 08:16:35 np0005588919 systemd[4302]: Startup finished in 158ms.
Jan 20 08:16:35 np0005588919 systemd[1]: Started User Manager for UID 1000.
Jan 20 08:16:35 np0005588919 systemd[1]: Started Session 1 of User zuul.
Jan 20 08:16:36 np0005588919 python3[4384]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:16:38 np0005588919 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 20 08:16:40 np0005588919 python3[4414]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:16:47 np0005588919 python3[4472]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:16:48 np0005588919 python3[4512]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 20 08:16:51 np0005588919 python3[4538]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/GVaHfRonG9ohfZBeZsfGsPAY5Ua/gRcfFAsYYpV+pfGGgyLPk7GpRkk4pr+e8jNRtdcfblMAicASH+5mJlHBm4eUbFYKtcwEXZXv6pyuCU3Ecns8qj50vHni0ryqqxTyg09WqOLv2u9xctOgas5b8y8tPl7bs2/uwlGFud/NxTxRMamezw0jUgKB9f6nJj6TiaAzomayQwqBx0/0kk8Cc6o4JsrOc92YyIsAjs+grfO5gO6MLYaAFWaCv28+Yvj3G37RUIAILUpORm4vyFNvxLGV+iIKd8ZYqqV6cczJ2tM7MGlfjYz9lTXL7WHkY2Knel8HDycvHH85Ydujv3gyD8d/m+dy4VHhMoU3HR1Syxx5e1GxOjU6NV7ZtEMjYtqE6zUdCNY1zXUU4uGxxPK7dF2Zzx5ODWpS7ssrJVRsLzDPf1YiIyi/g3OHzO95EzucQchqJsVh3MJI8D/C2CjI432eipKKcQAYY9sD9/mpPwBqI0PKwfSGTpsps60NwhM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:16:51 np0005588919 python3[4562]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:16:52 np0005588919 python3[4661]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:16:52 np0005588919 python3[4732]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768915011.712395-252-194271861623093/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=db0b5efc11684c95b5a6c3da9b48c4c5_id_rsa follow=False checksum=3ee7ffdf9f2bde9aa4c9d676d061c45199023a01 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:16:53 np0005588919 python3[4855]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:16:53 np0005588919 python3[4926]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768915013.2588813-307-146443798583455/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=db0b5efc11684c95b5a6c3da9b48c4c5_id_rsa.pub follow=False checksum=c665db3a39036994c79fbfd6a268cbf34e365958 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:16:55 np0005588919 python3[4974]: ansible-ping Invoked with data=pong
Jan 20 08:16:56 np0005588919 python3[4998]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:16:58 np0005588919 python3[5056]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 20 08:16:59 np0005588919 python3[5088]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:16:59 np0005588919 python3[5112]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:00 np0005588919 python3[5136]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:00 np0005588919 python3[5160]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:00 np0005588919 python3[5184]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:00 np0005588919 python3[5208]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:02 np0005588919 python3[5234]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:03 np0005588919 python3[5312]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:17:03 np0005588919 python3[5385]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1768915022.9187856-33-265290447230422/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:04 np0005588919 python3[5433]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:04 np0005588919 python3[5457]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:05 np0005588919 python3[5481]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:05 np0005588919 python3[5505]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:05 np0005588919 python3[5529]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:05 np0005588919 python3[5553]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:06 np0005588919 python3[5577]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:06 np0005588919 python3[5601]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:06 np0005588919 python3[5625]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:07 np0005588919 python3[5649]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:07 np0005588919 python3[5673]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:07 np0005588919 python3[5697]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:07 np0005588919 python3[5721]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:08 np0005588919 python3[5745]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:08 np0005588919 python3[5769]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:08 np0005588919 python3[5793]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:08 np0005588919 python3[5817]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:09 np0005588919 python3[5841]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:09 np0005588919 python3[5865]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:09 np0005588919 python3[5889]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:10 np0005588919 python3[5913]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:10 np0005588919 python3[5937]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:10 np0005588919 python3[5961]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:10 np0005588919 python3[5985]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:11 np0005588919 python3[6009]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:11 np0005588919 python3[6033]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:17:14 np0005588919 python3[6059]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 20 08:17:14 np0005588919 systemd[1]: Starting Time & Date Service...
Jan 20 08:17:14 np0005588919 systemd[1]: Started Time & Date Service.
Jan 20 08:17:14 np0005588919 systemd-timedated[6061]: Changed time zone to 'UTC' (UTC).
Jan 20 08:17:14 np0005588919 python3[6090]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:15 np0005588919 python3[6166]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:17:15 np0005588919 python3[6237]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1768915035.0741973-252-85085486583027/source _original_basename=tmp032vp1re follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:16 np0005588919 python3[6337]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:17:16 np0005588919 python3[6408]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1768915035.9946544-302-84865711002240/source _original_basename=tmpbj4_07ef follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:17 np0005588919 python3[6510]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:17:17 np0005588919 python3[6583]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1768915037.221259-383-160566660996014/source _original_basename=tmpbeaznhtf follow=False checksum=7a82bff5b5e9039ad1ac15f6a7286925b777bf85 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:18 np0005588919 python3[6631]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:17:18 np0005588919 python3[6657]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:17:19 np0005588919 python3[6737]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:17:19 np0005588919 python3[6810]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1768915038.9616592-452-111445218516075/source _original_basename=tmpd8rwe82l follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:20 np0005588919 python3[6861]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-d383-642d-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:17:20 np0005588919 python3[6889]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-d383-642d-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 20 08:17:22 np0005588919 python3[6917]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:17:44 np0005588919 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 20 08:17:48 np0005588919 python3[6945]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:18:43 np0005588919 systemd[4302]: Starting Mark boot as successful...
Jan 20 08:18:43 np0005588919 systemd[4302]: Finished Mark boot as successful.
Jan 20 08:18:49 np0005588919 systemd-logind[783]: Session 1 logged out. Waiting for processes to exit.
Jan 20 08:18:55 np0005588919 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 20 08:18:55 np0005588919 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 20 08:18:55 np0005588919 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 20 08:18:55 np0005588919 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 20 08:18:55 np0005588919 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 20 08:18:55 np0005588919 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 20 08:18:55 np0005588919 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 20 08:18:55 np0005588919 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 20 08:18:55 np0005588919 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 20 08:18:55 np0005588919 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.5815] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 20 08:18:55 np0005588919 systemd-udevd[6948]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.5987] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.6026] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.6031] device (eth1): carrier: link connected
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.6033] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.6040] policy: auto-activating connection 'Wired connection 1' (3c0fe307-c5e5-33c6-a0c4-a240cdee9616)
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.6045] device (eth1): Activation: starting connection 'Wired connection 1' (3c0fe307-c5e5-33c6-a0c4-a240cdee9616)
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.6046] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.6049] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.6054] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:18:55 np0005588919 NetworkManager[856]: <info>  [1768915135.6059] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:18:57 np0005588919 systemd-logind[783]: New session 3 of user zuul.
Jan 20 08:18:57 np0005588919 systemd[1]: Started Session 3 of User zuul.
Jan 20 08:18:57 np0005588919 python3[6978]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-d6d1-215a-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:19:07 np0005588919 python3[7059]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:19:08 np0005588919 python3[7132]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768915147.3703582-155-6186936698920/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=b4d6cf2779d273a35a4489895da20bdddd958aca backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:19:08 np0005588919 python3[7182]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:19:08 np0005588919 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 20 08:19:08 np0005588919 systemd[1]: Stopped Network Manager Wait Online.
Jan 20 08:19:08 np0005588919 systemd[1]: Stopping Network Manager Wait Online...
Jan 20 08:19:08 np0005588919 systemd[1]: Stopping Network Manager...
Jan 20 08:19:08 np0005588919 NetworkManager[856]: <info>  [1768915148.6841] caught SIGTERM, shutting down normally.
Jan 20 08:19:08 np0005588919 NetworkManager[856]: <info>  [1768915148.6855] dhcp4 (eth0): canceled DHCP transaction
Jan 20 08:19:08 np0005588919 NetworkManager[856]: <info>  [1768915148.6855] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:19:08 np0005588919 NetworkManager[856]: <info>  [1768915148.6856] dhcp4 (eth0): state changed no lease
Jan 20 08:19:08 np0005588919 NetworkManager[856]: <info>  [1768915148.6858] manager: NetworkManager state is now CONNECTING
Jan 20 08:19:08 np0005588919 NetworkManager[856]: <info>  [1768915148.6939] dhcp4 (eth1): canceled DHCP transaction
Jan 20 08:19:08 np0005588919 NetworkManager[856]: <info>  [1768915148.6939] dhcp4 (eth1): state changed no lease
Jan 20 08:19:08 np0005588919 NetworkManager[856]: <info>  [1768915148.7011] exiting (success)
Jan 20 08:19:08 np0005588919 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:19:08 np0005588919 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:19:08 np0005588919 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 20 08:19:08 np0005588919 systemd[1]: Stopped Network Manager.
Jan 20 08:19:08 np0005588919 systemd[1]: NetworkManager.service: Consumed 1.295s CPU time, 10.0M memory peak.
Jan 20 08:19:08 np0005588919 systemd[1]: Starting Network Manager...
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.7883] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:017a3b90-38ab-4863-8e66-991c4844fcc7)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.7885] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.7948] manager[0x5610a1ace000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 20 08:19:08 np0005588919 systemd[1]: Starting Hostname Service...
Jan 20 08:19:08 np0005588919 systemd[1]: Started Hostname Service.
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8925] hostname: hostname: using hostnamed
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8926] hostname: static hostname changed from (none) to "np0005588919.novalocal"
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8935] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8942] manager[0x5610a1ace000]: rfkill: Wi-Fi hardware radio set enabled
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8943] manager[0x5610a1ace000]: rfkill: WWAN hardware radio set enabled
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8987] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8988] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8989] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8989] manager: Networking is enabled by state file
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8993] settings: Loaded settings plugin: keyfile (internal)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.8999] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9037] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9052] dhcp: init: Using DHCP client 'internal'
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9057] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9064] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9072] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9085] device (lo): Activation: starting connection 'lo' (f1b29bda-3a6a-4be0-8c9c-6df9359cf4c4)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9095] device (eth0): carrier: link connected
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9104] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9112] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9113] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9124] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9137] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9147] device (eth1): carrier: link connected
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9154] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9162] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (3c0fe307-c5e5-33c6-a0c4-a240cdee9616) (indicated)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9163] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9170] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9181] device (eth1): Activation: starting connection 'Wired connection 1' (3c0fe307-c5e5-33c6-a0c4-a240cdee9616)
Jan 20 08:19:08 np0005588919 systemd[1]: Started Network Manager.
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9190] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9196] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9199] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9202] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9205] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9210] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9212] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9215] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9220] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9230] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9235] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9249] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9254] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9283] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9289] dhcp4 (eth0): state changed new lease, address=38.102.83.169
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9297] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9310] device (lo): Activation: successful, device activated.
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9327] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 20 08:19:08 np0005588919 systemd[1]: Starting Network Manager Wait Online...
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9417] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9442] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9445] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9452] manager: NetworkManager state is now CONNECTED_SITE
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9456] device (eth0): Activation: successful, device activated.
Jan 20 08:19:08 np0005588919 NetworkManager[7192]: <info>  [1768915148.9464] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 20 08:19:09 np0005588919 python3[7267]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-d6d1-215a-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:19:19 np0005588919 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:19:38 np0005588919 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0175] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 20 08:19:54 np0005588919 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:19:54 np0005588919 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0535] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0541] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0547] device (eth1): Activation: successful, device activated.
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0557] manager: startup complete
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0559] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <warn>  [1768915194.0567] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0582] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 20 08:19:54 np0005588919 systemd[1]: Finished Network Manager Wait Online.
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0689] dhcp4 (eth1): canceled DHCP transaction
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0691] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0692] dhcp4 (eth1): state changed no lease
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0716] policy: auto-activating connection 'ci-private-network' (1877dc82-ca8e-52d6-b413-dd9d07823d2d)
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0723] device (eth1): Activation: starting connection 'ci-private-network' (1877dc82-ca8e-52d6-b413-dd9d07823d2d)
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0725] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0732] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0743] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0757] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0810] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0814] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:19:54 np0005588919 NetworkManager[7192]: <info>  [1768915194.0824] device (eth1): Activation: successful, device activated.
Jan 20 08:20:04 np0005588919 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:20:09 np0005588919 systemd[1]: session-3.scope: Deactivated successfully.
Jan 20 08:20:09 np0005588919 systemd[1]: session-3.scope: Consumed 1.763s CPU time.
Jan 20 08:20:09 np0005588919 systemd-logind[783]: Session 3 logged out. Waiting for processes to exit.
Jan 20 08:20:09 np0005588919 systemd-logind[783]: Removed session 3.
Jan 20 08:20:52 np0005588919 systemd-logind[783]: New session 4 of user zuul.
Jan 20 08:20:52 np0005588919 systemd[1]: Started Session 4 of User zuul.
Jan 20 08:20:52 np0005588919 python3[7378]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:20:52 np0005588919 python3[7451]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768915252.179773-373-102212201474035/source _original_basename=tmpn2ooqnbo follow=False checksum=a06d82404ae9ae38c6111e54a4021096121ff7ef backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:20:56 np0005588919 systemd[1]: session-4.scope: Deactivated successfully.
Jan 20 08:20:56 np0005588919 systemd-logind[783]: Session 4 logged out. Waiting for processes to exit.
Jan 20 08:20:56 np0005588919 systemd-logind[783]: Removed session 4.
Jan 20 08:21:43 np0005588919 systemd[4302]: Created slice User Background Tasks Slice.
Jan 20 08:21:43 np0005588919 systemd[4302]: Starting Cleanup of User's Temporary Files and Directories...
Jan 20 08:21:43 np0005588919 systemd[4302]: Finished Cleanup of User's Temporary Files and Directories.
Jan 20 08:26:22 np0005588919 systemd-logind[783]: New session 5 of user zuul.
Jan 20 08:26:22 np0005588919 systemd[1]: Started Session 5 of User zuul.
Jan 20 08:26:22 np0005588919 python3[7510]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-67c0-97af-000000000ca4-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:23 np0005588919 python3[7538]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:23 np0005588919 python3[7565]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:23 np0005588919 python3[7591]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:23 np0005588919 python3[7617]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:24 np0005588919 python3[7643]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:24 np0005588919 python3[7721]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:26:25 np0005588919 python3[7794]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768915584.5801682-365-201103252402798/source _original_basename=tmp5oxa2ha2 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:26:26 np0005588919 python3[7844]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 08:26:26 np0005588919 systemd[1]: Reloading.
Jan 20 08:26:26 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:26:28 np0005588919 python3[7899]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 20 08:26:28 np0005588919 python3[7925]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:29 np0005588919 python3[7953]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:29 np0005588919 python3[7981]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:29 np0005588919 python3[8009]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:30 np0005588919 python3[8036]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-67c0-97af-000000000cab-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:26:30 np0005588919 python3[8066]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 20 08:26:34 np0005588919 systemd[1]: session-5.scope: Deactivated successfully.
Jan 20 08:26:34 np0005588919 systemd[1]: session-5.scope: Consumed 4.518s CPU time.
Jan 20 08:26:34 np0005588919 systemd-logind[783]: Session 5 logged out. Waiting for processes to exit.
Jan 20 08:26:34 np0005588919 systemd-logind[783]: Removed session 5.
Jan 20 08:26:36 np0005588919 systemd-logind[783]: New session 6 of user zuul.
Jan 20 08:26:36 np0005588919 systemd[1]: Started Session 6 of User zuul.
Jan 20 08:26:36 np0005588919 python3[8099]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 20 08:26:44 np0005588919 setsebool[8139]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 20 08:26:44 np0005588919 setsebool[8139]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 20 08:26:57 np0005588919 kernel: SELinux:  Converting 385 SID table entries...
Jan 20 08:26:57 np0005588919 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:26:57 np0005588919 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:26:57 np0005588919 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:26:57 np0005588919 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:26:57 np0005588919 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:26:57 np0005588919 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:26:57 np0005588919 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:27:07 np0005588919 kernel: SELinux:  Converting 388 SID table entries...
Jan 20 08:27:07 np0005588919 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:27:07 np0005588919 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:27:07 np0005588919 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:27:07 np0005588919 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:27:07 np0005588919 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:27:07 np0005588919 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:27:07 np0005588919 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:27:25 np0005588919 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 20 08:27:25 np0005588919 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:27:25 np0005588919 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:27:25 np0005588919 systemd[1]: Reloading.
Jan 20 08:27:25 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:27:25 np0005588919 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:27:30 np0005588919 python3[13062]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-45b7-a25c-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:27:31 np0005588919 kernel: evm: overlay not supported
Jan 20 08:27:31 np0005588919 systemd[4302]: Starting D-Bus User Message Bus...
Jan 20 08:27:31 np0005588919 dbus-broker-launch[13894]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 20 08:27:31 np0005588919 dbus-broker-launch[13894]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 20 08:27:31 np0005588919 systemd[4302]: Started D-Bus User Message Bus.
Jan 20 08:27:31 np0005588919 dbus-broker-lau[13894]: Ready
Jan 20 08:27:31 np0005588919 systemd[4302]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 20 08:27:31 np0005588919 systemd[4302]: Created slice Slice /user.
Jan 20 08:27:31 np0005588919 systemd[4302]: podman-13874.scope: unit configures an IP firewall, but not running as root.
Jan 20 08:27:31 np0005588919 systemd[4302]: (This warning is only shown for the first unit using IP firewalling.)
Jan 20 08:27:31 np0005588919 systemd[4302]: Started podman-13874.scope.
Jan 20 08:27:31 np0005588919 systemd[4302]: Started podman-pause-51d78d5d.scope.
Jan 20 08:27:32 np0005588919 python3[14066]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.233:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.233:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:27:32 np0005588919 python3[14066]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 20 08:27:33 np0005588919 systemd[1]: session-6.scope: Deactivated successfully.
Jan 20 08:27:33 np0005588919 systemd[1]: session-6.scope: Consumed 46.854s CPU time.
Jan 20 08:27:33 np0005588919 systemd-logind[783]: Session 6 logged out. Waiting for processes to exit.
Jan 20 08:27:33 np0005588919 systemd-logind[783]: Removed session 6.
Jan 20 08:27:47 np0005588919 irqbalance[778]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 20 08:27:47 np0005588919 irqbalance[778]: IRQ 27 affinity is now unmanaged
Jan 20 08:27:58 np0005588919 systemd-logind[783]: New session 7 of user zuul.
Jan 20 08:27:58 np0005588919 systemd[1]: Started Session 7 of User zuul.
Jan 20 08:27:58 np0005588919 python3[24286]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCxDwznlRTnwVs4thRw4BwvCKpJwxh+kvQMz22TwtomAycQeWyoRQIrmHPtGJrxAVwBD4Z4rIf2j/gxUloZVamc= zuul@np0005588917.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:27:59 np0005588919 python3[24519]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCxDwznlRTnwVs4thRw4BwvCKpJwxh+kvQMz22TwtomAycQeWyoRQIrmHPtGJrxAVwBD4Z4rIf2j/gxUloZVamc= zuul@np0005588917.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:28:00 np0005588919 python3[24943]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005588919.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 20 08:28:00 np0005588919 python3[25234]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCxDwznlRTnwVs4thRw4BwvCKpJwxh+kvQMz22TwtomAycQeWyoRQIrmHPtGJrxAVwBD4Z4rIf2j/gxUloZVamc= zuul@np0005588917.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 20 08:28:01 np0005588919 python3[25484]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:28:01 np0005588919 python3[25775]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1768915681.0524058-168-160457724055178/source _original_basename=tmprcct7y06 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:28:02 np0005588919 python3[26164]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 20 08:28:02 np0005588919 systemd[1]: Starting Hostname Service...
Jan 20 08:28:02 np0005588919 systemd[1]: Started Hostname Service.
Jan 20 08:28:02 np0005588919 systemd-hostnamed[26290]: Changed pretty hostname to 'compute-1'
Jan 20 08:28:02 np0005588919 systemd-hostnamed[26290]: Hostname set to <compute-1> (static)
Jan 20 08:28:02 np0005588919 NetworkManager[7192]: <info>  [1768915682.9095] hostname: static hostname changed from "np0005588919.novalocal" to "compute-1"
Jan 20 08:28:02 np0005588919 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:28:02 np0005588919 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:28:03 np0005588919 systemd[1]: session-7.scope: Deactivated successfully.
Jan 20 08:28:03 np0005588919 systemd[1]: session-7.scope: Consumed 2.395s CPU time.
Jan 20 08:28:03 np0005588919 systemd-logind[783]: Session 7 logged out. Waiting for processes to exit.
Jan 20 08:28:03 np0005588919 systemd-logind[783]: Removed session 7.
Jan 20 08:28:12 np0005588919 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:28:14 np0005588919 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:28:14 np0005588919 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:28:14 np0005588919 systemd[1]: man-db-cache-update.service: Consumed 57.700s CPU time.
Jan 20 08:28:14 np0005588919 systemd[1]: run-rbd08d54a4cf34a8ab2a2ef74cd4a0100.service: Deactivated successfully.
Jan 20 08:28:32 np0005588919 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 20 08:31:33 np0005588919 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 20 08:31:33 np0005588919 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 20 08:31:33 np0005588919 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 20 08:31:33 np0005588919 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 20 08:33:00 np0005588919 systemd-logind[783]: New session 8 of user zuul.
Jan 20 08:33:00 np0005588919 systemd[1]: Started Session 8 of User zuul.
Jan 20 08:33:01 np0005588919 python3[30015]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:33:03 np0005588919 python3[30131]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:03 np0005588919 python3[30204]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8691642-34066-233922643037694/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:03 np0005588919 python3[30230]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:04 np0005588919 python3[30303]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8691642-34066-233922643037694/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:04 np0005588919 python3[30329]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:04 np0005588919 python3[30402]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8691642-34066-233922643037694/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:05 np0005588919 python3[30428]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:05 np0005588919 python3[30501]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8691642-34066-233922643037694/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:05 np0005588919 python3[30527]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:06 np0005588919 python3[30600]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8691642-34066-233922643037694/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:06 np0005588919 python3[30626]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:07 np0005588919 python3[30699]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8691642-34066-233922643037694/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:07 np0005588919 python3[30725]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:33:07 np0005588919 python3[30798]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768915982.8691642-34066-233922643037694/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:33:20 np0005588919 python3[30846]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:38:20 np0005588919 systemd-logind[783]: Session 8 logged out. Waiting for processes to exit.
Jan 20 08:38:20 np0005588919 systemd[1]: session-8.scope: Deactivated successfully.
Jan 20 08:38:20 np0005588919 systemd[1]: session-8.scope: Consumed 5.699s CPU time.
Jan 20 08:38:20 np0005588919 systemd-logind[783]: Removed session 8.
Jan 20 08:45:43 np0005588919 systemd-logind[783]: New session 9 of user zuul.
Jan 20 08:45:43 np0005588919 systemd[1]: Started Session 9 of User zuul.
Jan 20 08:45:45 np0005588919 python3.9[31050]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:45:46 np0005588919 python3.9[31231]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:45:54 np0005588919 systemd[1]: session-9.scope: Deactivated successfully.
Jan 20 08:45:54 np0005588919 systemd[1]: session-9.scope: Consumed 8.316s CPU time.
Jan 20 08:45:54 np0005588919 systemd-logind[783]: Session 9 logged out. Waiting for processes to exit.
Jan 20 08:45:54 np0005588919 systemd-logind[783]: Removed session 9.
Jan 20 08:46:10 np0005588919 systemd-logind[783]: New session 10 of user zuul.
Jan 20 08:46:10 np0005588919 systemd[1]: Started Session 10 of User zuul.
Jan 20 08:46:11 np0005588919 python3.9[31446]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 20 08:46:12 np0005588919 python3.9[31620]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:46:13 np0005588919 python3.9[31772]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:46:14 np0005588919 python3.9[31925]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:46:15 np0005588919 python3.9[32078]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:46:17 np0005588919 python3.9[32231]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:46:17 np0005588919 python3.9[32354]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1768916776.5429342-178-141990721373655/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:46:18 np0005588919 python3.9[32506]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:46:19 np0005588919 python3.9[32662]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:46:20 np0005588919 python3.9[32814]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:46:21 np0005588919 python3.9[32964]: ansible-ansible.builtin.service_facts Invoked
Jan 20 08:46:27 np0005588919 python3.9[33220]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:46:28 np0005588919 python3.9[33370]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:46:30 np0005588919 python3.9[33524]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:46:31 np0005588919 python3.9[33682]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:46:32 np0005588919 python3.9[33766]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:46:47 np0005588919 irqbalance[778]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 20 08:46:47 np0005588919 irqbalance[778]: IRQ 26 affinity is now unmanaged
Jan 20 08:47:32 np0005588919 systemd[1]: Reloading.
Jan 20 08:47:32 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:47:32 np0005588919 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 20 08:47:33 np0005588919 systemd[1]: Reloading.
Jan 20 08:47:33 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:47:33 np0005588919 systemd[1]: Starting dnf makecache...
Jan 20 08:47:33 np0005588919 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 20 08:47:33 np0005588919 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 20 08:47:33 np0005588919 systemd[1]: Reloading.
Jan 20 08:47:33 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:47:33 np0005588919 dnf[34141]: Failed determining last makecache time.
Jan 20 08:47:33 np0005588919 dnf[34141]: delorean-openstack-barbican-42b4c41831408a8e323 118 kB/s | 3.0 kB     00:00
Jan 20 08:47:33 np0005588919 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 20 08:47:33 np0005588919 dnf[34141]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 135 kB/s | 3.0 kB     00:00
Jan 20 08:47:33 np0005588919 dnf[34141]: delorean-openstack-cinder-1c00d6490d88e436f26ef 143 kB/s | 3.0 kB     00:00
Jan 20 08:47:33 np0005588919 dnf[34141]: delorean-python-stevedore-c4acc5639fd2329372142 155 kB/s | 3.0 kB     00:00
Jan 20 08:47:33 np0005588919 dnf[34141]: delorean-python-cloudkitty-tests-tempest-2c80f8 159 kB/s | 3.0 kB     00:00
Jan 20 08:47:33 np0005588919 dnf[34141]: delorean-os-refresh-config-9bfc52b5049be2d8de61 155 kB/s | 3.0 kB     00:00
Jan 20 08:47:33 np0005588919 dnf[34141]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 166 kB/s | 3.0 kB     00:00
Jan 20 08:47:33 np0005588919 dnf[34141]: delorean-python-designate-tests-tempest-347fdbc 160 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 20 08:47:34 np0005588919 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-openstack-glance-1fd12c29b339f30fe823e 166 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 162 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-openstack-manila-3c01b7181572c95dac462 152 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-python-whitebox-neutron-tests-tempest- 137 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-openstack-octavia-ba397f07a7331190208c 163 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-openstack-watcher-c014f81a8647287f6dcc 127 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-ansible-config_template-5ccaa22121a7ff 154 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 162 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-openstack-swift-dc98a8463506ac520c469a 203 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-python-tempestconf-8515371b7cceebd4282 188 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: delorean-openstack-heat-ui-013accbfd179753bc3f0 201 kB/s | 3.0 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: CentOS Stream 9 - BaseOS                         45 kB/s | 6.4 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: CentOS Stream 9 - AppStream                      60 kB/s | 6.8 kB     00:00
Jan 20 08:47:34 np0005588919 dnf[34141]: CentOS Stream 9 - CRB                            56 kB/s | 6.3 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: CentOS Stream 9 - Extras packages                32 kB/s | 7.3 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: dlrn-antelope-testing                            92 kB/s | 3.0 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: dlrn-antelope-build-deps                        130 kB/s | 3.0 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: centos9-rabbitmq                                108 kB/s | 3.0 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: centos9-storage                                 146 kB/s | 3.0 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: centos9-opstools                                110 kB/s | 3.0 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: NFV SIG OpenvSwitch                             122 kB/s | 3.0 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: repo-setup-centos-appstream                     176 kB/s | 4.4 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: repo-setup-centos-baseos                        167 kB/s | 3.9 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: repo-setup-centos-highavailability              163 kB/s | 3.9 kB     00:00
Jan 20 08:47:35 np0005588919 dnf[34141]: repo-setup-centos-powertools                     32 kB/s | 4.3 kB     00:00
Jan 20 08:47:36 np0005588919 dnf[34141]: Extra Packages for Enterprise Linux 9 - x86_64   27 kB/s |  32 kB     00:01
Jan 20 08:47:37 np0005588919 dnf[34141]: Extra Packages for Enterprise Linux 9 - x86_64   19 MB/s |  20 MB     00:01
Jan 20 08:47:46 np0005588919 dnf[34141]: Metadata cache created.
Jan 20 08:47:46 np0005588919 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 20 08:47:46 np0005588919 systemd[1]: Finished dnf makecache.
Jan 20 08:47:46 np0005588919 systemd[1]: dnf-makecache.service: Consumed 10.014s CPU time.
Jan 20 08:48:37 np0005588919 kernel: SELinux:  Converting 2724 SID table entries...
Jan 20 08:48:37 np0005588919 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:48:37 np0005588919 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:48:37 np0005588919 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:48:37 np0005588919 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:48:37 np0005588919 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:48:37 np0005588919 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:48:37 np0005588919 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:48:37 np0005588919 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 20 08:48:37 np0005588919 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:48:37 np0005588919 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:48:37 np0005588919 systemd[1]: Reloading.
Jan 20 08:48:37 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:48:38 np0005588919 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:48:39 np0005588919 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:48:39 np0005588919 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:48:39 np0005588919 systemd[1]: man-db-cache-update.service: Consumed 1.749s CPU time.
Jan 20 08:48:39 np0005588919 systemd[1]: run-r6a5fbdbfa27648a8b86a5be943ed0c38.service: Deactivated successfully.
Jan 20 08:48:39 np0005588919 python3.9[35472]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:48:41 np0005588919 python3.9[35754]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 20 08:48:42 np0005588919 python3.9[35906]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 20 08:48:46 np0005588919 python3.9[36061]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:48:47 np0005588919 python3.9[36213]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 20 08:48:52 np0005588919 python3.9[36365]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:48:53 np0005588919 python3.9[36519]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:48:55 np0005588919 python3.9[36642]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768916932.69029-667-212216292389921/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:48:58 np0005588919 python3.9[36794]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:48:59 np0005588919 python3.9[36946]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:48:59 np0005588919 python3.9[37099]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:49:01 np0005588919 python3.9[37251]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 20 08:49:01 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 08:49:02 np0005588919 python3.9[37405]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 08:49:03 np0005588919 python3.9[37563]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 20 08:49:04 np0005588919 python3.9[37723]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 20 08:49:05 np0005588919 python3.9[37876]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 08:49:06 np0005588919 python3.9[38034]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 20 08:49:07 np0005588919 python3.9[38186]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:49:09 np0005588919 python3.9[38339]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:49:10 np0005588919 python3.9[38491]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:49:11 np0005588919 python3.9[38614]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768916950.0478392-1025-140779314911048/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:49:12 np0005588919 python3.9[38766]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:49:12 np0005588919 systemd[1]: Starting Load Kernel Modules...
Jan 20 08:49:12 np0005588919 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 20 08:49:12 np0005588919 kernel: Bridge firewalling registered
Jan 20 08:49:12 np0005588919 systemd-modules-load[38770]: Inserted module 'br_netfilter'
Jan 20 08:49:12 np0005588919 systemd[1]: Finished Load Kernel Modules.
Jan 20 08:49:13 np0005588919 python3.9[38926]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:49:14 np0005588919 python3.9[39049]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768916952.7371955-1093-149557128141500/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:49:16 np0005588919 python3.9[39201]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:49:19 np0005588919 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 20 08:49:19 np0005588919 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 20 08:49:19 np0005588919 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:49:19 np0005588919 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:49:19 np0005588919 systemd[1]: Reloading.
Jan 20 08:49:20 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:49:20 np0005588919 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:49:21 np0005588919 python3.9[40737]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:49:22 np0005588919 python3.9[41593]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 20 08:49:23 np0005588919 python3.9[42291]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:49:24 np0005588919 python3.9[43158]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:49:24 np0005588919 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 20 08:49:24 np0005588919 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:49:24 np0005588919 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:49:24 np0005588919 systemd[1]: man-db-cache-update.service: Consumed 5.987s CPU time.
Jan 20 08:49:24 np0005588919 systemd[1]: run-r4c36d9d280bc45419937a582094d62d4.service: Deactivated successfully.
Jan 20 08:49:25 np0005588919 systemd[1]: Starting Authorization Manager...
Jan 20 08:49:25 np0005588919 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 20 08:49:25 np0005588919 polkitd[43590]: Started polkitd version 0.117
Jan 20 08:49:25 np0005588919 systemd[1]: Started Authorization Manager.
Jan 20 08:49:26 np0005588919 python3.9[43760]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:49:26 np0005588919 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 20 08:49:26 np0005588919 systemd[1]: tuned.service: Deactivated successfully.
Jan 20 08:49:26 np0005588919 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 20 08:49:26 np0005588919 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 20 08:49:26 np0005588919 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 20 08:49:27 np0005588919 python3.9[43921]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 20 08:49:31 np0005588919 python3.9[44073]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:49:31 np0005588919 systemd[1]: Reloading.
Jan 20 08:49:31 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:49:32 np0005588919 python3.9[44262]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:49:32 np0005588919 systemd[1]: Reloading.
Jan 20 08:49:32 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:49:33 np0005588919 python3.9[44452]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:49:34 np0005588919 python3.9[44605]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:49:34 np0005588919 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 20 08:49:35 np0005588919 python3.9[44758]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:49:37 np0005588919 python3.9[44920]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:49:38 np0005588919 python3.9[45073]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:49:38 np0005588919 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 20 08:49:38 np0005588919 systemd[1]: Stopped Apply Kernel Variables.
Jan 20 08:49:38 np0005588919 systemd[1]: Stopping Apply Kernel Variables...
Jan 20 08:49:38 np0005588919 systemd[1]: Starting Apply Kernel Variables...
Jan 20 08:49:38 np0005588919 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 20 08:49:38 np0005588919 systemd[1]: Finished Apply Kernel Variables.
Jan 20 08:49:38 np0005588919 systemd[1]: session-10.scope: Deactivated successfully.
Jan 20 08:49:38 np0005588919 systemd[1]: session-10.scope: Consumed 2min 23.427s CPU time.
Jan 20 08:49:38 np0005588919 systemd-logind[783]: Session 10 logged out. Waiting for processes to exit.
Jan 20 08:49:38 np0005588919 systemd-logind[783]: Removed session 10.
Jan 20 08:49:44 np0005588919 systemd-logind[783]: New session 11 of user zuul.
Jan 20 08:49:44 np0005588919 systemd[1]: Started Session 11 of User zuul.
Jan 20 08:49:45 np0005588919 python3.9[45257]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:49:46 np0005588919 python3.9[45413]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 20 08:49:47 np0005588919 python3.9[45566]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 08:49:49 np0005588919 python3.9[45724]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 20 08:49:50 np0005588919 python3.9[45884]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:49:51 np0005588919 python3.9[45968]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 20 08:49:54 np0005588919 python3.9[46132]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:50:05 np0005588919 kernel: SELinux:  Converting 2736 SID table entries...
Jan 20 08:50:05 np0005588919 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:50:05 np0005588919 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:50:05 np0005588919 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:50:05 np0005588919 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:50:05 np0005588919 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:50:05 np0005588919 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:50:05 np0005588919 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:50:05 np0005588919 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 20 08:50:05 np0005588919 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 20 08:50:07 np0005588919 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:50:07 np0005588919 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:50:07 np0005588919 systemd[1]: Reloading.
Jan 20 08:50:07 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:50:07 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:50:07 np0005588919 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:50:08 np0005588919 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:50:08 np0005588919 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:50:08 np0005588919 systemd[1]: run-rd218b33b4a364d0db5cd29f7f5e0ddfd.service: Deactivated successfully.
Jan 20 08:50:12 np0005588919 python3.9[47235]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 08:50:12 np0005588919 systemd[1]: Reloading.
Jan 20 08:50:12 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:50:12 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:50:12 np0005588919 systemd[1]: Starting Open vSwitch Database Unit...
Jan 20 08:50:12 np0005588919 chown[47276]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 20 08:50:12 np0005588919 ovs-ctl[47281]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 20 08:50:12 np0005588919 ovs-ctl[47281]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 20 08:50:12 np0005588919 ovs-ctl[47281]: Starting ovsdb-server [  OK  ]
Jan 20 08:50:12 np0005588919 ovs-vsctl[47330]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 20 08:50:12 np0005588919 ovs-vsctl[47346]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"5ffd4ac3-9266-4927-98ad-20a17782c725\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 20 08:50:12 np0005588919 ovs-ctl[47281]: Configuring Open vSwitch system IDs [  OK  ]
Jan 20 08:50:12 np0005588919 ovs-ctl[47281]: Enabling remote OVSDB managers [  OK  ]
Jan 20 08:50:12 np0005588919 ovs-vsctl[47355]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 20 08:50:12 np0005588919 systemd[1]: Started Open vSwitch Database Unit.
Jan 20 08:50:12 np0005588919 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 20 08:50:12 np0005588919 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 20 08:50:12 np0005588919 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 20 08:50:13 np0005588919 kernel: openvswitch: Open vSwitch switching datapath
Jan 20 08:50:13 np0005588919 ovs-ctl[47399]: Inserting openvswitch module [  OK  ]
Jan 20 08:50:13 np0005588919 ovs-ctl[47368]: Starting ovs-vswitchd [  OK  ]
Jan 20 08:50:13 np0005588919 ovs-vsctl[47416]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 20 08:50:13 np0005588919 ovs-ctl[47368]: Enabling remote OVSDB managers [  OK  ]
Jan 20 08:50:13 np0005588919 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 20 08:50:13 np0005588919 systemd[1]: Starting Open vSwitch...
Jan 20 08:50:13 np0005588919 systemd[1]: Finished Open vSwitch.
Jan 20 08:50:14 np0005588919 python3.9[47568]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:50:15 np0005588919 python3.9[47720]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 20 08:50:16 np0005588919 kernel: SELinux:  Converting 2750 SID table entries...
Jan 20 08:50:16 np0005588919 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 08:50:16 np0005588919 kernel: SELinux:  policy capability open_perms=1
Jan 20 08:50:16 np0005588919 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 08:50:16 np0005588919 kernel: SELinux:  policy capability always_check_network=0
Jan 20 08:50:16 np0005588919 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 08:50:16 np0005588919 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 08:50:16 np0005588919 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 08:50:17 np0005588919 python3.9[47875]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:50:18 np0005588919 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 20 08:50:19 np0005588919 python3.9[48033]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:50:21 np0005588919 python3.9[48186]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:50:23 np0005588919 python3.9[48473]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 20 08:50:24 np0005588919 python3.9[48623]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:50:25 np0005588919 python3.9[48777]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:50:27 np0005588919 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:50:27 np0005588919 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:50:27 np0005588919 systemd[1]: Reloading.
Jan 20 08:50:27 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:50:27 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:50:27 np0005588919 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:50:27 np0005588919 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:50:27 np0005588919 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:50:27 np0005588919 systemd[1]: run-r7dcf85c38a6948f1b4947a7edeb4364f.service: Deactivated successfully.
Jan 20 08:50:28 np0005588919 python3.9[49094]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:50:28 np0005588919 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 20 08:50:28 np0005588919 systemd[1]: Stopped Network Manager Wait Online.
Jan 20 08:50:28 np0005588919 systemd[1]: Stopping Network Manager Wait Online...
Jan 20 08:50:28 np0005588919 systemd[1]: Stopping Network Manager...
Jan 20 08:50:28 np0005588919 NetworkManager[7192]: <info>  [1768917028.5795] caught SIGTERM, shutting down normally.
Jan 20 08:50:28 np0005588919 NetworkManager[7192]: <info>  [1768917028.5814] dhcp4 (eth0): canceled DHCP transaction
Jan 20 08:50:28 np0005588919 NetworkManager[7192]: <info>  [1768917028.5815] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:50:28 np0005588919 NetworkManager[7192]: <info>  [1768917028.5815] dhcp4 (eth0): state changed no lease
Jan 20 08:50:28 np0005588919 NetworkManager[7192]: <info>  [1768917028.5819] manager: NetworkManager state is now CONNECTED_SITE
Jan 20 08:50:28 np0005588919 NetworkManager[7192]: <info>  [1768917028.5896] exiting (success)
Jan 20 08:50:28 np0005588919 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:50:28 np0005588919 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:50:28 np0005588919 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 20 08:50:28 np0005588919 systemd[1]: Stopped Network Manager.
Jan 20 08:50:28 np0005588919 systemd[1]: NetworkManager.service: Consumed 13.061s CPU time, 4.1M memory peak, read 0B from disk, written 14.0K to disk.
Jan 20 08:50:28 np0005588919 systemd[1]: Starting Network Manager...
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.6957] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:017a3b90-38ab-4863-8e66-991c4844fcc7)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.6958] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.7066] manager[0x55e28ae98000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 20 08:50:28 np0005588919 systemd[1]: Starting Hostname Service...
Jan 20 08:50:28 np0005588919 systemd[1]: Started Hostname Service.
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8252] hostname: hostname: using hostnamed
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8253] hostname: static hostname changed from (none) to "compute-1"
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8271] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8278] manager[0x55e28ae98000]: rfkill: Wi-Fi hardware radio set enabled
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8279] manager[0x55e28ae98000]: rfkill: WWAN hardware radio set enabled
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8319] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8335] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8336] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8337] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8339] manager: Networking is enabled by state file
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8343] settings: Loaded settings plugin: keyfile (internal)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8349] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8414] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8434] dhcp: init: Using DHCP client 'internal'
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8441] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8455] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8472] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8496] device (lo): Activation: starting connection 'lo' (f1b29bda-3a6a-4be0-8c9c-6df9359cf4c4)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8510] device (eth0): carrier: link connected
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8520] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8533] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8535] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8556] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8578] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8591] device (eth1): carrier: link connected
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8601] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8618] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (1877dc82-ca8e-52d6-b413-dd9d07823d2d) (indicated)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8620] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8640] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8663] device (eth1): Activation: starting connection 'ci-private-network' (1877dc82-ca8e-52d6-b413-dd9d07823d2d)
Jan 20 08:50:28 np0005588919 systemd[1]: Started Network Manager.
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8674] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8699] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8711] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8721] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8750] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8756] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8760] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8765] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8770] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8780] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8786] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8801] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8825] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8842] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8846] dhcp4 (eth0): state changed new lease, address=38.102.83.169
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8853] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8861] device (lo): Activation: successful, device activated.
Jan 20 08:50:28 np0005588919 systemd[1]: Starting Network Manager Wait Online...
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8888] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8964] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8972] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8981] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8986] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.8990] device (eth1): Activation: successful, device activated.
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.9009] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.9011] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.9015] manager: NetworkManager state is now CONNECTED_SITE
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.9018] device (eth0): Activation: successful, device activated.
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.9025] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 20 08:50:28 np0005588919 NetworkManager[49104]: <info>  [1768917028.9032] manager: startup complete
Jan 20 08:50:28 np0005588919 systemd[1]: Finished Network Manager Wait Online.
Jan 20 08:50:29 np0005588919 python3.9[49321]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:50:34 np0005588919 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 08:50:34 np0005588919 systemd[1]: Starting man-db-cache-update.service...
Jan 20 08:50:34 np0005588919 systemd[1]: Reloading.
Jan 20 08:50:34 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:50:34 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:50:34 np0005588919 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 08:50:35 np0005588919 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 08:50:35 np0005588919 systemd[1]: Finished man-db-cache-update.service.
Jan 20 08:50:35 np0005588919 systemd[1]: run-r671d03ad3b1a4ba9b1e1ebb3c823158b.service: Deactivated successfully.
Jan 20 08:50:36 np0005588919 python3.9[49780]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:50:38 np0005588919 python3.9[49934]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:39 np0005588919 python3.9[50088]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:39 np0005588919 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:50:39 np0005588919 python3.9[50241]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:40 np0005588919 python3.9[50393]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:41 np0005588919 python3.9[50546]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:42 np0005588919 python3.9[50698]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:50:42 np0005588919 python3.9[50821]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917041.6502037-648-273260332264767/.source _original_basename=.cwcydfts follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:43 np0005588919 python3.9[50973]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:44 np0005588919 python3.9[51125]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 20 08:50:45 np0005588919 python3.9[51277]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:50:47 np0005588919 python3.9[51704]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 20 08:50:49 np0005588919 ansible-async_wrapper.py[51879]: Invoked with j766640452522 300 /home/zuul/.ansible/tmp/ansible-tmp-1768917048.2391117-846-90092630877966/AnsiballZ_edpm_os_net_config.py _
Jan 20 08:50:49 np0005588919 ansible-async_wrapper.py[51882]: Starting module and watcher
Jan 20 08:50:49 np0005588919 ansible-async_wrapper.py[51882]: Start watching 51883 (300)
Jan 20 08:50:49 np0005588919 ansible-async_wrapper.py[51883]: Start module (51883)
Jan 20 08:50:49 np0005588919 ansible-async_wrapper.py[51879]: Return async_wrapper task started.
Jan 20 08:50:49 np0005588919 python3.9[51884]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 20 08:50:50 np0005588919 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 20 08:50:50 np0005588919 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 20 08:50:50 np0005588919 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 20 08:50:50 np0005588919 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 20 08:50:50 np0005588919 kernel: cfg80211: failed to load regulatory.db
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.3528] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.3543] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.3969] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.3970] audit: op="connection-add" uuid="b1dcd61b-a54a-4d09-b592-1e42a44a5f87" name="br-ex-br" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.3988] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.3989] audit: op="connection-add" uuid="d9d8c26f-c8c6-4619-be80-65fe1e8ed035" name="br-ex-port" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4002] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4003] audit: op="connection-add" uuid="62009706-568c-4850-95a2-13f3e778a8c3" name="eth1-port" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4016] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4017] audit: op="connection-add" uuid="312f7ee9-fe7f-44fe-9a04-524d4d678983" name="vlan20-port" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4030] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4031] audit: op="connection-add" uuid="402b6a37-e14a-4c91-93ac-70460eb4676d" name="vlan21-port" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4044] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4046] audit: op="connection-add" uuid="a22b3790-47fd-4571-9ebd-90f4fda173a4" name="vlan22-port" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4058] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4060] audit: op="connection-add" uuid="f12405d4-5941-409c-a742-833c3119839b" name="vlan23-port" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4080] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4097] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4098] audit: op="connection-add" uuid="bffb5486-bddd-47a6-b940-1f626fe731a0" name="br-ex-if" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4131] audit: op="connection-update" uuid="1877dc82-ca8e-52d6-b413-dd9d07823d2d" name="ci-private-network" args="connection.port-type,connection.master,connection.timestamp,connection.controller,connection.slave-type,ovs-interface.type,ipv4.routing-rules,ipv4.dns,ipv4.method,ipv4.addresses,ipv4.routes,ipv4.never-default,ovs-external-ids.data,ipv6.routing-rules,ipv6.addr-gen-mode,ipv6.dns,ipv6.method,ipv6.addresses,ipv6.routes" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4147] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4149] audit: op="connection-add" uuid="93b33f82-f072-4e05-bf59-36be1960102b" name="vlan20-if" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4165] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4167] audit: op="connection-add" uuid="f30f8036-b523-4503-b0e1-ac5fe3a30f91" name="vlan21-if" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4183] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4185] audit: op="connection-add" uuid="a01ef58e-1207-40a7-95e9-e2335d145b40" name="vlan22-if" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4201] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4203] audit: op="connection-add" uuid="4e1f04f6-553e-40ff-a5c0-b9478175e86f" name="vlan23-if" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4215] audit: op="connection-delete" uuid="3c0fe307-c5e5-33c6-a0c4-a240cdee9616" name="Wired connection 1" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4227] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4229] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4236] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4240] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (b1dcd61b-a54a-4d09-b592-1e42a44a5f87)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4241] audit: op="connection-activate" uuid="b1dcd61b-a54a-4d09-b592-1e42a44a5f87" name="br-ex-br" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4243] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4244] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4249] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4254] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (d9d8c26f-c8c6-4619-be80-65fe1e8ed035)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4256] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4258] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4263] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4268] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (62009706-568c-4850-95a2-13f3e778a8c3)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4270] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4271] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4277] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4282] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (312f7ee9-fe7f-44fe-9a04-524d4d678983)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4284] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4285] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4291] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4295] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (402b6a37-e14a-4c91-93ac-70460eb4676d)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4296] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4297] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4303] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4307] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (a22b3790-47fd-4571-9ebd-90f4fda173a4)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4310] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4311] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4317] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4321] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (f12405d4-5941-409c-a742-833c3119839b)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4322] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4325] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4327] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4333] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4334] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4337] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4342] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (bffb5486-bddd-47a6-b940-1f626fe731a0)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4343] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4346] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4348] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4350] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4351] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4362] device (eth1): disconnecting for new activation request.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4363] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4366] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4367] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4368] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4370] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4371] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4374] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4377] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (93b33f82-f072-4e05-bf59-36be1960102b)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4378] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4380] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4382] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4383] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4385] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4386] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4388] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4392] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (f30f8036-b523-4503-b0e1-ac5fe3a30f91)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4392] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4395] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4396] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4397] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4399] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4400] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4403] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4406] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (a01ef58e-1207-40a7-95e9-e2335d145b40)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4407] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4409] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4411] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4411] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4414] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <warn>  [1768917051.4414] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4417] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4420] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (4e1f04f6-553e-40ff-a5c0-b9478175e86f)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4421] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4423] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4425] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4426] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4427] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4439] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.addr-gen-mode,ipv6.method" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4440] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4443] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4444] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4455] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4458] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4461] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4463] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4464] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4467] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 kernel: ovs-system: entered promiscuous mode
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4478] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4481] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4482] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4485] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4487] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4490] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4491] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4495] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4497] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4500] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 systemd-udevd[51890]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:50:51 np0005588919 kernel: Timeout policy base is empty
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4502] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4506] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4509] dhcp4 (eth0): canceled DHCP transaction
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4510] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4510] dhcp4 (eth0): state changed no lease
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4511] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4522] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4528] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51885 uid=0 result="fail" reason="Device is not activated"
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4532] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4538] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4545] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4552] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4554] dhcp4 (eth0): state changed new lease, address=38.102.83.169
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4596] device (eth1): disconnecting for new activation request.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4597] audit: op="connection-activate" uuid="1877dc82-ca8e-52d6-b413-dd9d07823d2d" name="ci-private-network" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4652] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51885 uid=0 result="success"
Jan 20 08:50:51 np0005588919 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4731] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4831] device (eth1): Activation: starting connection 'ci-private-network' (1877dc82-ca8e-52d6-b413-dd9d07823d2d)
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4837] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4847] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4852] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4859] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4862] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4867] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4869] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4871] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4872] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4873] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4875] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4889] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4901] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4905] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4909] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4912] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4916] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4920] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4925] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4929] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4934] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4937] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4942] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4945] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 kernel: br-ex: entered promiscuous mode
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4950] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.4954] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5027] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5029] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5035] device (eth1): Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 kernel: vlan22: entered promiscuous mode
Jan 20 08:50:51 np0005588919 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 20 08:50:51 np0005588919 systemd-udevd[51889]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5121] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5130] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 kernel: vlan20: entered promiscuous mode
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5171] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5173] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5178] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5236] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5250] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 kernel: vlan23: entered promiscuous mode
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5278] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 20 08:50:51 np0005588919 systemd-udevd[51891]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5296] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5301] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5313] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5320] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 kernel: vlan21: entered promiscuous mode
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5394] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5407] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5416] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5467] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5474] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5490] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5504] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5513] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5515] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5519] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5526] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5527] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 20 08:50:51 np0005588919 NetworkManager[49104]: <info>  [1768917051.5532] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 20 08:50:52 np0005588919 NetworkManager[49104]: <info>  [1768917052.6640] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51885 uid=0 result="success"
Jan 20 08:50:52 np0005588919 NetworkManager[49104]: <info>  [1768917052.8685] checkpoint[0x55e28ae6c950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 20 08:50:52 np0005588919 NetworkManager[49104]: <info>  [1768917052.8687] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51885 uid=0 result="success"
Jan 20 08:50:52 np0005588919 python3.9[52242]: ansible-ansible.legacy.async_status Invoked with jid=j766640452522.51879 mode=status _async_dir=/root/.ansible_async
Jan 20 08:50:53 np0005588919 NetworkManager[49104]: <info>  [1768917053.2526] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51885 uid=0 result="success"
Jan 20 08:50:53 np0005588919 NetworkManager[49104]: <info>  [1768917053.2546] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51885 uid=0 result="success"
Jan 20 08:50:53 np0005588919 NetworkManager[49104]: <info>  [1768917053.5714] audit: op="networking-control" arg="global-dns-configuration" pid=51885 uid=0 result="success"
Jan 20 08:50:53 np0005588919 NetworkManager[49104]: <info>  [1768917053.5757] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 20 08:50:53 np0005588919 NetworkManager[49104]: <info>  [1768917053.5799] audit: op="networking-control" arg="global-dns-configuration" pid=51885 uid=0 result="success"
Jan 20 08:50:53 np0005588919 NetworkManager[49104]: <info>  [1768917053.5830] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51885 uid=0 result="success"
Jan 20 08:50:53 np0005588919 NetworkManager[49104]: <info>  [1768917053.7125] checkpoint[0x55e28ae6ca20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 20 08:50:53 np0005588919 NetworkManager[49104]: <info>  [1768917053.7128] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51885 uid=0 result="success"
Jan 20 08:50:53 np0005588919 ansible-async_wrapper.py[51883]: Module complete (51883)
Jan 20 08:50:54 np0005588919 ansible-async_wrapper.py[51882]: Done in kid B.
Jan 20 08:50:56 np0005588919 python3.9[52348]: ansible-ansible.legacy.async_status Invoked with jid=j766640452522.51879 mode=status _async_dir=/root/.ansible_async
Jan 20 08:50:58 np0005588919 python3.9[52448]: ansible-ansible.legacy.async_status Invoked with jid=j766640452522.51879 mode=cleanup _async_dir=/root/.ansible_async
Jan 20 08:50:58 np0005588919 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 20 08:50:58 np0005588919 python3.9[52600]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:50:59 np0005588919 python3.9[52726]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917058.345282-927-57926163870785/.source.returncode _original_basename=.i64e69rx follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:00 np0005588919 python3.9[52878]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:51:01 np0005588919 python3.9[53001]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917060.2055712-975-227172950204853/.source.cfg _original_basename=.8p5lpu28 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:02 np0005588919 python3.9[53153]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:51:02 np0005588919 systemd[1]: Reloading Network Manager...
Jan 20 08:51:02 np0005588919 NetworkManager[49104]: <info>  [1768917062.4247] audit: op="reload" arg="0" pid=53157 uid=0 result="success"
Jan 20 08:51:02 np0005588919 NetworkManager[49104]: <info>  [1768917062.4258] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 20 08:51:02 np0005588919 systemd[1]: Reloaded Network Manager.
Jan 20 08:51:02 np0005588919 systemd[1]: session-11.scope: Deactivated successfully.
Jan 20 08:51:02 np0005588919 systemd[1]: session-11.scope: Consumed 54.490s CPU time.
Jan 20 08:51:02 np0005588919 systemd-logind[783]: Session 11 logged out. Waiting for processes to exit.
Jan 20 08:51:02 np0005588919 systemd-logind[783]: Removed session 11.
Jan 20 08:51:08 np0005588919 systemd-logind[783]: New session 12 of user zuul.
Jan 20 08:51:08 np0005588919 systemd[1]: Started Session 12 of User zuul.
Jan 20 08:51:09 np0005588919 python3.9[53341]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:51:10 np0005588919 python3.9[53496]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:51:12 np0005588919 python3.9[53689]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:51:12 np0005588919 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 20 08:51:12 np0005588919 systemd[1]: session-12.scope: Deactivated successfully.
Jan 20 08:51:12 np0005588919 systemd[1]: session-12.scope: Consumed 2.835s CPU time.
Jan 20 08:51:12 np0005588919 systemd-logind[783]: Session 12 logged out. Waiting for processes to exit.
Jan 20 08:51:12 np0005588919 systemd-logind[783]: Removed session 12.
Jan 20 08:51:18 np0005588919 systemd-logind[783]: New session 13 of user zuul.
Jan 20 08:51:18 np0005588919 systemd[1]: Started Session 13 of User zuul.
Jan 20 08:51:19 np0005588919 python3.9[53874]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:51:20 np0005588919 python3.9[54028]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:51:22 np0005588919 python3.9[54184]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:51:22 np0005588919 python3.9[54269]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:51:24 np0005588919 python3.9[54422]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:51:26 np0005588919 python3.9[54617]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:27 np0005588919 python3.9[54769]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:51:27 np0005588919 systemd[1]: var-lib-containers-storage-overlay-compat1421343422-merged.mount: Deactivated successfully.
Jan 20 08:51:28 np0005588919 podman[54770]: 2026-01-20 13:51:28.175347664 +0000 UTC m=+0.906507583 system refresh
Jan 20 08:51:28 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:51:29 np0005588919 python3.9[54932]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:51:29 np0005588919 python3.9[55055]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917088.4544141-198-41018304539283/.source.json follow=False _original_basename=podman_network_config.j2 checksum=58244ca78685d57d88d55cdba14b67b67cf92d49 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:30 np0005588919 python3.9[55207]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:51:31 np0005588919 python3.9[55330]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768917090.164239-243-121851674105802/.source.conf follow=False _original_basename=registries.conf.j2 checksum=88781afee5b5da15b4e5a77559a69fa53d49a457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:51:32 np0005588919 python3.9[55482]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:51:33 np0005588919 python3.9[55634]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:51:34 np0005588919 python3.9[55786]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:51:34 np0005588919 python3.9[55938]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:51:35 np0005588919 python3.9[56090]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:51:38 np0005588919 python3.9[56243]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:51:38 np0005588919 python3.9[56397]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:51:39 np0005588919 python3.9[56549]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:51:40 np0005588919 python3.9[56701]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:51:42 np0005588919 python3.9[56854]: ansible-service_facts Invoked
Jan 20 08:51:42 np0005588919 network[56871]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 08:51:42 np0005588919 network[56872]: 'network-scripts' will be removed from distribution in near future.
Jan 20 08:51:42 np0005588919 network[56873]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 08:51:48 np0005588919 python3.9[57327]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 08:51:50 np0005588919 python3.9[57480]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 20 08:51:52 np0005588919 python3.9[57632]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:51:52 np0005588919 python3.9[57757]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917111.767435-675-57503337461256/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:53 np0005588919 python3.9[57911]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:51:54 np0005588919 python3.9[58036]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917113.284382-721-93064516323419/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:56 np0005588919 python3.9[58190]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:51:58 np0005588919 python3.9[58344]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:51:59 np0005588919 python3.9[58428]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:01 np0005588919 python3.9[58582]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:52:01 np0005588919 python3.9[58666]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:52:01 np0005588919 chronyd[796]: chronyd exiting
Jan 20 08:52:01 np0005588919 systemd[1]: Stopping NTP client/server...
Jan 20 08:52:01 np0005588919 systemd[1]: chronyd.service: Deactivated successfully.
Jan 20 08:52:01 np0005588919 systemd[1]: Stopped NTP client/server.
Jan 20 08:52:01 np0005588919 systemd[1]: Starting NTP client/server...
Jan 20 08:52:02 np0005588919 chronyd[58675]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 20 08:52:02 np0005588919 chronyd[58675]: Frequency -28.387 +/- 0.221 ppm read from /var/lib/chrony/drift
Jan 20 08:52:02 np0005588919 chronyd[58675]: Loaded seccomp filter (level 2)
Jan 20 08:52:02 np0005588919 systemd[1]: Started NTP client/server.
Jan 20 08:52:02 np0005588919 systemd[1]: session-13.scope: Deactivated successfully.
Jan 20 08:52:02 np0005588919 systemd[1]: session-13.scope: Consumed 29.746s CPU time.
Jan 20 08:52:02 np0005588919 systemd-logind[783]: Session 13 logged out. Waiting for processes to exit.
Jan 20 08:52:02 np0005588919 systemd-logind[783]: Removed session 13.
Jan 20 08:52:08 np0005588919 systemd-logind[783]: New session 14 of user zuul.
Jan 20 08:52:08 np0005588919 systemd[1]: Started Session 14 of User zuul.
Jan 20 08:52:09 np0005588919 python3.9[58856]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:10 np0005588919 python3.9[59008]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:11 np0005588919 python3.9[59131]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917129.5197217-63-280393526920088/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:11 np0005588919 systemd[1]: session-14.scope: Deactivated successfully.
Jan 20 08:52:11 np0005588919 systemd[1]: session-14.scope: Consumed 2.076s CPU time.
Jan 20 08:52:11 np0005588919 systemd-logind[783]: Session 14 logged out. Waiting for processes to exit.
Jan 20 08:52:11 np0005588919 systemd-logind[783]: Removed session 14.
Jan 20 08:52:17 np0005588919 systemd-logind[783]: New session 15 of user zuul.
Jan 20 08:52:17 np0005588919 systemd[1]: Started Session 15 of User zuul.
Jan 20 08:52:18 np0005588919 python3.9[59311]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:52:20 np0005588919 python3.9[59467]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:21 np0005588919 python3.9[59642]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:22 np0005588919 python3.9[59765]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1768917141.1259809-84-263204010172042/.source.json _original_basename=.jifsnfn6 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:23 np0005588919 python3.9[59917]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:24 np0005588919 python3.9[60040]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917143.0965374-153-15111787750233/.source _original_basename=.efjns574 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:25 np0005588919 python3.9[60192]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:52:26 np0005588919 python3.9[60344]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:26 np0005588919 python3.9[60467]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768917145.447901-225-272548893833124/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:52:27 np0005588919 python3.9[60619]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:28 np0005588919 python3.9[60742]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768917146.9060292-225-52445573130770/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 08:52:28 np0005588919 python3.9[60894]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:29 np0005588919 python3.9[61046]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:30 np0005588919 python3.9[61169]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917149.242158-336-217506568850933/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:31 np0005588919 python3.9[61321]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:31 np0005588919 python3.9[61444]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917150.6350958-381-45719686309721/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:32 np0005588919 python3.9[61596]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:32 np0005588919 systemd[1]: Reloading.
Jan 20 08:52:33 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:33 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:33 np0005588919 systemd[1]: Reloading.
Jan 20 08:52:33 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:33 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:33 np0005588919 systemd[1]: Starting EDPM Container Shutdown...
Jan 20 08:52:33 np0005588919 systemd[1]: Finished EDPM Container Shutdown.
Jan 20 08:52:34 np0005588919 python3.9[61823]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:35 np0005588919 python3.9[61946]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917153.7644513-450-127006310060755/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:36 np0005588919 python3.9[62098]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:36 np0005588919 python3.9[62221]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917155.298333-495-14797447530989/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:37 np0005588919 python3.9[62373]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:37 np0005588919 systemd[1]: Reloading.
Jan 20 08:52:37 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:38 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:39 np0005588919 systemd[1]: Reloading.
Jan 20 08:52:39 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:39 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:39 np0005588919 systemd[1]: Starting Create netns directory...
Jan 20 08:52:39 np0005588919 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 20 08:52:39 np0005588919 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 20 08:52:39 np0005588919 systemd[1]: Finished Create netns directory.
Jan 20 08:52:40 np0005588919 python3.9[62600]: ansible-ansible.builtin.service_facts Invoked
Jan 20 08:52:40 np0005588919 network[62617]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 08:52:40 np0005588919 network[62618]: 'network-scripts' will be removed from distribution in near future.
Jan 20 08:52:40 np0005588919 network[62619]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 08:52:46 np0005588919 python3.9[62881]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:46 np0005588919 systemd[1]: Reloading.
Jan 20 08:52:46 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:46 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:46 np0005588919 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 20 08:52:46 np0005588919 iptables.init[62922]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 20 08:52:47 np0005588919 iptables.init[62922]: iptables: Flushing firewall rules: [  OK  ]
Jan 20 08:52:47 np0005588919 systemd[1]: iptables.service: Deactivated successfully.
Jan 20 08:52:47 np0005588919 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 20 08:52:47 np0005588919 python3.9[63118]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:49 np0005588919 python3.9[63272]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:52:49 np0005588919 systemd[1]: Reloading.
Jan 20 08:52:49 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:52:49 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:52:49 np0005588919 systemd[1]: Starting Netfilter Tables...
Jan 20 08:52:49 np0005588919 systemd[1]: Finished Netfilter Tables.
Jan 20 08:52:51 np0005588919 python3.9[63464]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:52:52 np0005588919 python3.9[63617]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:53 np0005588919 python3.9[63742]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917172.3422544-702-67871139232021/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:54 np0005588919 python3.9[63895]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:52:54 np0005588919 systemd[1]: Reloading OpenSSH server daemon...
Jan 20 08:52:54 np0005588919 systemd[1]: Reloaded OpenSSH server daemon.
Jan 20 08:52:55 np0005588919 python3.9[64051]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:56 np0005588919 python3.9[64203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:52:56 np0005588919 python3.9[64326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917175.729682-795-88754883137305/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:58 np0005588919 python3.9[64478]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 20 08:52:58 np0005588919 systemd[1]: Starting Time & Date Service...
Jan 20 08:52:58 np0005588919 systemd[1]: Started Time & Date Service.
Jan 20 08:52:59 np0005588919 python3.9[64634]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:52:59 np0005588919 python3.9[64786]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:00 np0005588919 python3.9[64909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917179.4153807-900-255969689895348/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:01 np0005588919 python3.9[65061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:02 np0005588919 python3.9[65184]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917180.8971012-945-70868812061712/.source.yaml _original_basename=.63o7k_i0 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:02 np0005588919 python3.9[65336]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:03 np0005588919 python3.9[65459]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917182.354719-990-174925750156871/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:04 np0005588919 python3.9[65613]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:05 np0005588919 python3.9[65766]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:06 np0005588919 python3[65919]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 20 08:53:07 np0005588919 python3.9[66071]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:07 np0005588919 python3.9[66194]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917186.4221895-1107-163586857766802/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:08 np0005588919 python3.9[66346]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:09 np0005588919 python3.9[66469]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917187.9927852-1152-250036095852991/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:10 np0005588919 python3.9[66621]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:10 np0005588919 python3.9[66744]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917189.4818597-1197-266182474889818/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:11 np0005588919 python3.9[66896]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:12 np0005588919 python3.9[67019]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917190.8865685-1242-59139520834111/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:12 np0005588919 python3.9[67171]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 08:53:13 np0005588919 python3.9[67294]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917192.415442-1287-155976059032591/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:14 np0005588919 python3.9[67448]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:15 np0005588919 python3.9[67600]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:16 np0005588919 python3.9[67759]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:17 np0005588919 python3.9[67912]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:18 np0005588919 python3.9[68064]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:19 np0005588919 python3.9[68216]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 20 08:53:20 np0005588919 python3.9[68369]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 20 08:53:20 np0005588919 systemd[1]: session-15.scope: Deactivated successfully.
Jan 20 08:53:20 np0005588919 systemd[1]: session-15.scope: Consumed 44.106s CPU time.
Jan 20 08:53:20 np0005588919 systemd-logind[783]: Session 15 logged out. Waiting for processes to exit.
Jan 20 08:53:20 np0005588919 systemd-logind[783]: Removed session 15.
Jan 20 08:53:28 np0005588919 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 20 08:53:33 np0005588919 systemd-logind[783]: New session 16 of user zuul.
Jan 20 08:53:33 np0005588919 systemd[1]: Started Session 16 of User zuul.
Jan 20 08:53:33 np0005588919 python3.9[68554]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 20 08:53:34 np0005588919 python3.9[68706]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:53:35 np0005588919 python3.9[68858]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:53:37 np0005588919 python3.9[69010]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDrCUasX8PhlctvvIb2eE6+Z0hELmfczQ6UoBD+mPtCobptr/s786JmwJ3D8nIoKhlCLVSmhRfbqf1Pm45RUPTEtSuaa6HBDy40dZhTXU34X4KbGfKmur2bp9S/1w83ArKvI8inSqqk2qoMx1l7ECkEgeT+GbFwKfYLnbq5OV4Ms3tzl/uFUC/Xzxs2dbXlhozQiSamcO/a6EObErTvR8PrtaOoLFtTiD/I+oN+rkdBPkBc6r0qT4jS7nU1FOlT96meSZHE7Q1n8pxcy9PEc8w9hFdd1Zj8/WcGIdeEJsekuouK1Lut/sofQLZHyUMWJTcnBjx8BsjGx9NjUHPYUWIw+DZo7lT2QurAPNnaX4rp9ciGV2Bdm3ylNoOu3izNvM1JGTw3xRyYrmyxyWv3Euc35JXa0w07Xrqr+6Ckih0WTLU6q3Rlnrc/grpDC821sHrsljerHipJVOCbZB39LvV6wDDBlqfYZzfqID3dIqlVli4eL12J0K7jr7QAlPRhNf0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOG07miJwhzuA/nm0wvGIorydl2xbBiiDhE7PypnJ/jC#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKiJpWtps/bRsuEHfak4zDuqPHKOWFLaEA2h86H7tPlrZHR8okAVZWCmY7keO3Ad1DFyffUtJPKv5OvTK91xGO8=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/3N9PJZXpat0uFh2x2RoV9B0Ih74HU9CPf+g/5HncM7gCVvCpW3CBde1qNDRU2iY9rzpOVPwzi4YzoAUcxB5KAiqZOI9ylmzfiD8JXQ+myLmIRLxHOdXFaEQ4mMp4W+X37hCZ6sdfm6Yqd6eqBuZrM/72ltYoewWBNCG/Hgqzu30L9WC4+BF+iADHT7Qnmvh/cc9U71WxB4h2ikBo1SdGoFCqoez7ajitqx+dw7VWaOtEPliS0LZuDtN3Zt/cBBgxhb/FaAEI3jRP2ej9X0NJW91YxzBygyxiVasslx92g/GmnDFOWVZb5ai/JJsNH6pLTjs25IzvnuWIf8/ZLgZ03zziR4mBLP12CIVF8g1CzaqK1IILDKkjS/dzDiTBefmiQ2+N0i5EEXOgmxchqOqTkFPQg/ar0+0uBPkwzAI0HDk99czhyYHFlO+PhnULVkL1z+XLwHBgOrbNNVQQcJCvady4Gadh66mu1UrLpryNYOgZiugZi67Biha4ZPzPHok=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF41dx3BXAuEvQwQNtbUM7rIrbaOLr5CRvYNdDD+UMr9#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBENFrTpm22/xEaEJMzd7C5WyJttJdK+HK5kxP8/NuvvAQSlLtEulBZnvD/OX5hk3/sDYhPQelj3YsNX1Plw5PJQ=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/dEYtIJ/delwiq9xMMctU8myoGU/TKMiFUM+i3BSaGKrC0rujad6qo1LAtjth5aYbBcgBhxy0UEX0oCruQQgc5qDpPmWHmJiAwdQJaDu6GxTRl3PlXF2u4rd0Rz72DAMuCxPSYedeHU91uL4vlrcD95xONWew2wa9lUuqQWdgj8DtqnB9T895BihDk9vFLXAaoGJcYZVGKJmXR8sOzNTFQxefqstVO0/dfbRUyFd0Ukp5v7rTmLxw0Np5WcGMOg9l/iRzWTopxnTRvXpBoGlFCmzNvTG2uH08dJ4FU5Wk9/iSxonuiVJu9DKs8Tp4EajaA4Y6cEuZiMhhqi7vw6zVCQuCmRBpny6Ub1Ag2CesMYgxwOVJO5cHsKh3BzuPFsh1gMgrrZK7v+qfm2r1rhHlPsCWrcnrtUIZa7gyzdFvHytTh/4uyGMgNpbwxkyCxgSN4PleQy2wvxy/DFW+JxCDzI4jK9LFH5aojzEhUtj+P3E7CXL/wRPxDJdfEU6PhTk=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEa1zL0TUD00vr72wZq3y4rgtSnctWBvs+gME/0/EAsV#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI2WwWe4rQW0CaFwcmci1J5n144T87fcxCH+Y2CVZd5XQ7Cvzlhh1cGNDX81Tng3KgxvKOuz3mdiSCLqx8noiD0=#012 create=True mode=0644 path=/tmp/ansible.fmhhwyw3 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:38 np0005588919 python3.9[69162]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.fmhhwyw3' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:39 np0005588919 python3.9[69316]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.fmhhwyw3 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:40 np0005588919 systemd[1]: session-16.scope: Deactivated successfully.
Jan 20 08:53:40 np0005588919 systemd[1]: session-16.scope: Consumed 4.010s CPU time.
Jan 20 08:53:40 np0005588919 systemd-logind[783]: Session 16 logged out. Waiting for processes to exit.
Jan 20 08:53:40 np0005588919 systemd-logind[783]: Removed session 16.
Jan 20 08:53:46 np0005588919 systemd-logind[783]: New session 17 of user zuul.
Jan 20 08:53:46 np0005588919 systemd[1]: Started Session 17 of User zuul.
Jan 20 08:53:47 np0005588919 python3.9[69496]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:53:49 np0005588919 python3.9[69652]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 20 08:53:50 np0005588919 python3.9[69806]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 08:53:51 np0005588919 python3.9[69959]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:53 np0005588919 python3.9[70112]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:53:54 np0005588919 python3.9[70266]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:53:55 np0005588919 python3.9[70421]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:53:55 np0005588919 systemd[1]: session-17.scope: Deactivated successfully.
Jan 20 08:53:55 np0005588919 systemd[1]: session-17.scope: Consumed 5.609s CPU time.
Jan 20 08:53:55 np0005588919 systemd-logind[783]: Session 17 logged out. Waiting for processes to exit.
Jan 20 08:53:55 np0005588919 systemd-logind[783]: Removed session 17.
Jan 20 08:54:01 np0005588919 systemd-logind[783]: New session 18 of user zuul.
Jan 20 08:54:01 np0005588919 systemd[1]: Started Session 18 of User zuul.
Jan 20 08:54:02 np0005588919 python3.9[70599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:54:03 np0005588919 python3.9[70755]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 08:54:04 np0005588919 python3.9[70839]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 20 08:54:06 np0005588919 python3.9[70990]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:54:08 np0005588919 python3.9[71141]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 08:54:09 np0005588919 python3.9[71291]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:54:09 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 08:54:09 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 08:54:09 np0005588919 python3.9[71442]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 08:54:10 np0005588919 systemd[1]: session-18.scope: Deactivated successfully.
Jan 20 08:54:10 np0005588919 systemd[1]: session-18.scope: Consumed 6.738s CPU time.
Jan 20 08:54:10 np0005588919 systemd-logind[783]: Session 18 logged out. Waiting for processes to exit.
Jan 20 08:54:10 np0005588919 systemd-logind[783]: Removed session 18.
Jan 20 08:54:10 np0005588919 chronyd[58675]: Selected source 23.159.16.194 (pool.ntp.org)
Jan 20 08:54:19 np0005588919 systemd-logind[783]: New session 19 of user zuul.
Jan 20 08:54:19 np0005588919 systemd[1]: Started Session 19 of User zuul.
Jan 20 08:54:26 np0005588919 python3[72208]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:54:28 np0005588919 python3[72303]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 20 08:54:29 np0005588919 python3[72330]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 20 08:54:30 np0005588919 python3[72356]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:54:30 np0005588919 kernel: loop: module loaded
Jan 20 08:54:30 np0005588919 kernel: loop3: detected capacity change from 0 to 14680064
Jan 20 08:54:30 np0005588919 python3[72391]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:54:30 np0005588919 lvm[72394]: PV /dev/loop3 not used.
Jan 20 08:54:30 np0005588919 lvm[72396]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 08:54:30 np0005588919 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 20 08:54:30 np0005588919 lvm[72402]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 20 08:54:30 np0005588919 lvm[72406]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 08:54:30 np0005588919 lvm[72406]: VG ceph_vg0 finished
Jan 20 08:54:30 np0005588919 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 20 08:54:31 np0005588919 python3[72484]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 20 08:54:31 np0005588919 python3[72557]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768917271.0207577-36958-49779570935231/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 08:54:32 np0005588919 python3[72607]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 08:54:32 np0005588919 systemd[1]: Reloading.
Jan 20 08:54:32 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:54:32 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:54:33 np0005588919 systemd[1]: Starting Ceph OSD losetup...
Jan 20 08:54:33 np0005588919 bash[72647]: /dev/loop3: [64513]:4328448 (/var/lib/ceph-osd-0.img)
Jan 20 08:54:33 np0005588919 systemd[1]: Finished Ceph OSD losetup.
Jan 20 08:54:33 np0005588919 lvm[72649]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 08:54:33 np0005588919 lvm[72649]: VG ceph_vg0 finished
Jan 20 08:54:35 np0005588919 python3[72673]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:56:23 np0005588919 systemd-logind[783]: New session 20 of user ceph-admin.
Jan 20 08:56:23 np0005588919 systemd[1]: Created slice User Slice of UID 42477.
Jan 20 08:56:23 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 20 08:56:23 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 20 08:56:23 np0005588919 systemd[1]: Starting User Manager for UID 42477...
Jan 20 08:56:23 np0005588919 systemd[72729]: Queued start job for default target Main User Target.
Jan 20 08:56:23 np0005588919 systemd[72729]: Created slice User Application Slice.
Jan 20 08:56:23 np0005588919 systemd[72729]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 08:56:23 np0005588919 systemd[72729]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 08:56:23 np0005588919 systemd[72729]: Reached target Paths.
Jan 20 08:56:23 np0005588919 systemd[72729]: Reached target Timers.
Jan 20 08:56:23 np0005588919 systemd[72729]: Starting D-Bus User Message Bus Socket...
Jan 20 08:56:23 np0005588919 systemd[72729]: Starting Create User's Volatile Files and Directories...
Jan 20 08:56:23 np0005588919 systemd-logind[783]: New session 22 of user ceph-admin.
Jan 20 08:56:23 np0005588919 systemd[72729]: Listening on D-Bus User Message Bus Socket.
Jan 20 08:56:23 np0005588919 systemd[72729]: Reached target Sockets.
Jan 20 08:56:23 np0005588919 systemd[72729]: Finished Create User's Volatile Files and Directories.
Jan 20 08:56:23 np0005588919 systemd[72729]: Reached target Basic System.
Jan 20 08:56:23 np0005588919 systemd[72729]: Reached target Main User Target.
Jan 20 08:56:23 np0005588919 systemd[72729]: Startup finished in 137ms.
Jan 20 08:56:23 np0005588919 systemd[1]: Started User Manager for UID 42477.
Jan 20 08:56:23 np0005588919 systemd[1]: Started Session 20 of User ceph-admin.
Jan 20 08:56:23 np0005588919 systemd[1]: Started Session 22 of User ceph-admin.
Jan 20 08:56:23 np0005588919 systemd-logind[783]: New session 23 of user ceph-admin.
Jan 20 08:56:23 np0005588919 systemd[1]: Started Session 23 of User ceph-admin.
Jan 20 08:56:24 np0005588919 systemd-logind[783]: New session 24 of user ceph-admin.
Jan 20 08:56:24 np0005588919 systemd[1]: Started Session 24 of User ceph-admin.
Jan 20 08:56:24 np0005588919 systemd-logind[783]: New session 25 of user ceph-admin.
Jan 20 08:56:24 np0005588919 systemd[1]: Started Session 25 of User ceph-admin.
Jan 20 08:56:25 np0005588919 systemd-logind[783]: New session 26 of user ceph-admin.
Jan 20 08:56:25 np0005588919 systemd[1]: Started Session 26 of User ceph-admin.
Jan 20 08:56:25 np0005588919 systemd-logind[783]: New session 27 of user ceph-admin.
Jan 20 08:56:25 np0005588919 systemd[1]: Started Session 27 of User ceph-admin.
Jan 20 08:56:26 np0005588919 systemd-logind[783]: New session 28 of user ceph-admin.
Jan 20 08:56:26 np0005588919 systemd[1]: Started Session 28 of User ceph-admin.
Jan 20 08:56:26 np0005588919 systemd-logind[783]: New session 29 of user ceph-admin.
Jan 20 08:56:26 np0005588919 systemd[1]: Started Session 29 of User ceph-admin.
Jan 20 08:56:27 np0005588919 systemd-logind[783]: New session 30 of user ceph-admin.
Jan 20 08:56:27 np0005588919 systemd[1]: Started Session 30 of User ceph-admin.
Jan 20 08:56:27 np0005588919 systemd-logind[783]: New session 31 of user ceph-admin.
Jan 20 08:56:27 np0005588919 systemd[1]: Started Session 31 of User ceph-admin.
Jan 20 08:56:28 np0005588919 systemd-logind[783]: New session 32 of user ceph-admin.
Jan 20 08:56:28 np0005588919 systemd[1]: Started Session 32 of User ceph-admin.
Jan 20 08:56:28 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:29 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:29 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:30 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:30 np0005588919 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73700 (sysctl)
Jan 20 08:56:30 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:30 np0005588919 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 20 08:56:31 np0005588919 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 20 08:56:31 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:32 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:32 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:35 np0005588919 systemd[1]: var-lib-containers-storage-overlay-compat3630487806-merged.mount: Deactivated successfully.
Jan 20 08:56:36 np0005588919 systemd[1]: var-lib-containers-storage-overlay-compat3630487806-lower\x2dmapped.mount: Deactivated successfully.
Jan 20 08:56:51 np0005588919 podman[73978]: 2026-01-20 13:56:51.911475474 +0000 UTC m=+19.284459775 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:56:52 np0005588919 podman[73978]: 2026-01-20 13:56:52.002921907 +0000 UTC m=+19.375906238 container create 30d6d4718878592c04ec1b37fda453fbb0549787591c936b9528b59f9558defd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lalande, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 20 08:56:52 np0005588919 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 20 08:56:52 np0005588919 systemd[1]: Started libpod-conmon-30d6d4718878592c04ec1b37fda453fbb0549787591c936b9528b59f9558defd.scope.
Jan 20 08:56:52 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:56:52 np0005588919 podman[73978]: 2026-01-20 13:56:52.235854885 +0000 UTC m=+19.608839216 container init 30d6d4718878592c04ec1b37fda453fbb0549787591c936b9528b59f9558defd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lalande, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Jan 20 08:56:52 np0005588919 podman[73978]: 2026-01-20 13:56:52.244401668 +0000 UTC m=+19.617385959 container start 30d6d4718878592c04ec1b37fda453fbb0549787591c936b9528b59f9558defd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lalande, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Jan 20 08:56:52 np0005588919 podman[73978]: 2026-01-20 13:56:52.248496624 +0000 UTC m=+19.621480915 container attach 30d6d4718878592c04ec1b37fda453fbb0549787591c936b9528b59f9558defd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lalande, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True)
Jan 20 08:56:52 np0005588919 condescending_lalande[74041]: 167 167
Jan 20 08:56:52 np0005588919 systemd[1]: libpod-30d6d4718878592c04ec1b37fda453fbb0549787591c936b9528b59f9558defd.scope: Deactivated successfully.
Jan 20 08:56:52 np0005588919 podman[73978]: 2026-01-20 13:56:52.256317146 +0000 UTC m=+19.629301457 container died 30d6d4718878592c04ec1b37fda453fbb0549787591c936b9528b59f9558defd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lalande, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:56:52 np0005588919 systemd[1]: var-lib-containers-storage-overlay-60584fc5fc5354075cd5220742611e57b272b541611b6de8ac2fb6005084a93d-merged.mount: Deactivated successfully.
Jan 20 08:56:52 np0005588919 podman[73978]: 2026-01-20 13:56:52.305137931 +0000 UTC m=+19.678122212 container remove 30d6d4718878592c04ec1b37fda453fbb0549787591c936b9528b59f9558defd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_lalande, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Jan 20 08:56:52 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:52 np0005588919 systemd[1]: libpod-conmon-30d6d4718878592c04ec1b37fda453fbb0549787591c936b9528b59f9558defd.scope: Deactivated successfully.
Jan 20 08:56:52 np0005588919 podman[74063]: 2026-01-20 13:56:52.47932095 +0000 UTC m=+0.053062836 container create d4f96443d4cb6bf4b71fc8b161f3dd6c3a388694adb87ae0863c6fc1b0dd75b6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_meninsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:56:52 np0005588919 systemd[1]: Started libpod-conmon-d4f96443d4cb6bf4b71fc8b161f3dd6c3a388694adb87ae0863c6fc1b0dd75b6.scope.
Jan 20 08:56:52 np0005588919 podman[74063]: 2026-01-20 13:56:52.450459022 +0000 UTC m=+0.024200928 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:56:52 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:56:52 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7fa0c0c32eba4dd7a6d63e9ac9abe90b759712b1cc382819e6301684954ec7a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:56:52 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7fa0c0c32eba4dd7a6d63e9ac9abe90b759712b1cc382819e6301684954ec7a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:56:52 np0005588919 podman[74063]: 2026-01-20 13:56:52.624640503 +0000 UTC m=+0.198382419 container init d4f96443d4cb6bf4b71fc8b161f3dd6c3a388694adb87ae0863c6fc1b0dd75b6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_meninsky, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:56:52 np0005588919 podman[74063]: 2026-01-20 13:56:52.637844387 +0000 UTC m=+0.211586293 container start d4f96443d4cb6bf4b71fc8b161f3dd6c3a388694adb87ae0863c6fc1b0dd75b6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_meninsky, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Jan 20 08:56:52 np0005588919 podman[74063]: 2026-01-20 13:56:52.643449826 +0000 UTC m=+0.217191802 container attach d4f96443d4cb6bf4b71fc8b161f3dd6c3a388694adb87ae0863c6fc1b0dd75b6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_meninsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default)
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]: [
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:    {
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:        "available": false,
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:        "ceph_device": false,
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:        "lsm_data": {},
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:        "lvs": [],
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:        "path": "/dev/sr0",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:        "rejected_reasons": [
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "Has a FileSystem",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "Insufficient space (<5GB)"
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:        ],
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:        "sys_api": {
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "actuators": null,
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "device_nodes": "sr0",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "devname": "sr0",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "human_readable_size": "482.00 KB",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "id_bus": "ata",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "model": "QEMU DVD-ROM",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "nr_requests": "2",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "parent": "/dev/sr0",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "partitions": {},
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "path": "/dev/sr0",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "removable": "1",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "rev": "2.5+",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "ro": "0",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "rotational": "1",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "sas_address": "",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "sas_device_handle": "",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "scheduler_mode": "mq-deadline",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "sectors": 0,
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "sectorsize": "2048",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "size": 493568.0,
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "support_discard": "2048",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "type": "disk",
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:            "vendor": "QEMU"
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:        }
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]:    }
Jan 20 08:56:53 np0005588919 compassionate_meninsky[74079]: ]
Jan 20 08:56:53 np0005588919 systemd[1]: libpod-d4f96443d4cb6bf4b71fc8b161f3dd6c3a388694adb87ae0863c6fc1b0dd75b6.scope: Deactivated successfully.
Jan 20 08:56:53 np0005588919 podman[74063]: 2026-01-20 13:56:53.806325952 +0000 UTC m=+1.380067848 container died d4f96443d4cb6bf4b71fc8b161f3dd6c3a388694adb87ae0863c6fc1b0dd75b6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_meninsky, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef)
Jan 20 08:56:53 np0005588919 systemd[1]: libpod-d4f96443d4cb6bf4b71fc8b161f3dd6c3a388694adb87ae0863c6fc1b0dd75b6.scope: Consumed 1.176s CPU time.
Jan 20 08:56:54 np0005588919 systemd[1]: var-lib-containers-storage-overlay-f7fa0c0c32eba4dd7a6d63e9ac9abe90b759712b1cc382819e6301684954ec7a-merged.mount: Deactivated successfully.
Jan 20 08:56:54 np0005588919 podman[74063]: 2026-01-20 13:56:54.407935307 +0000 UTC m=+1.981677233 container remove d4f96443d4cb6bf4b71fc8b161f3dd6c3a388694adb87ae0863c6fc1b0dd75b6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=compassionate_meninsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef)
Jan 20 08:56:54 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:54 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:56:54 np0005588919 systemd[1]: libpod-conmon-d4f96443d4cb6bf4b71fc8b161f3dd6c3a388694adb87ae0863c6fc1b0dd75b6.scope: Deactivated successfully.
Jan 20 08:57:00 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:00 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:00 np0005588919 podman[76870]: 2026-01-20 13:57:00.814137772 +0000 UTC m=+0.039004577 container create 8c2dc29c862515b902ade34205abf3f9403227ce3081e098d1e4f6bd95034c9f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=epic_banzai, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:00 np0005588919 systemd[1]: Started libpod-conmon-8c2dc29c862515b902ade34205abf3f9403227ce3081e098d1e4f6bd95034c9f.scope.
Jan 20 08:57:00 np0005588919 podman[76870]: 2026-01-20 13:57:00.795433531 +0000 UTC m=+0.020300326 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:00 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:00 np0005588919 podman[76870]: 2026-01-20 13:57:00.926787847 +0000 UTC m=+0.151654712 container init 8c2dc29c862515b902ade34205abf3f9403227ce3081e098d1e4f6bd95034c9f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=epic_banzai, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 20 08:57:00 np0005588919 podman[76870]: 2026-01-20 13:57:00.938587622 +0000 UTC m=+0.163454427 container start 8c2dc29c862515b902ade34205abf3f9403227ce3081e098d1e4f6bd95034c9f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=epic_banzai, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:00 np0005588919 podman[76870]: 2026-01-20 13:57:00.945642732 +0000 UTC m=+0.170509607 container attach 8c2dc29c862515b902ade34205abf3f9403227ce3081e098d1e4f6bd95034c9f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=epic_banzai, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:57:00 np0005588919 epic_banzai[76886]: 167 167
Jan 20 08:57:00 np0005588919 systemd[1]: libpod-8c2dc29c862515b902ade34205abf3f9403227ce3081e098d1e4f6bd95034c9f.scope: Deactivated successfully.
Jan 20 08:57:00 np0005588919 conmon[76886]: conmon 8c2dc29c862515b902ad <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8c2dc29c862515b902ade34205abf3f9403227ce3081e098d1e4f6bd95034c9f.scope/container/memory.events
Jan 20 08:57:00 np0005588919 podman[76870]: 2026-01-20 13:57:00.95084888 +0000 UTC m=+0.175715745 container died 8c2dc29c862515b902ade34205abf3f9403227ce3081e098d1e4f6bd95034c9f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=epic_banzai, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:57:01 np0005588919 podman[76870]: 2026-01-20 13:57:01.006372655 +0000 UTC m=+0.231239460 container remove 8c2dc29c862515b902ade34205abf3f9403227ce3081e098d1e4f6bd95034c9f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=epic_banzai, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:01 np0005588919 systemd[1]: libpod-conmon-8c2dc29c862515b902ade34205abf3f9403227ce3081e098d1e4f6bd95034c9f.scope: Deactivated successfully.
Jan 20 08:57:01 np0005588919 systemd[1]: Reloading.
Jan 20 08:57:01 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:01 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:01 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:01 np0005588919 systemd[1]: Reloading.
Jan 20 08:57:01 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:01 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:01 np0005588919 systemd[1]: Reached target All Ceph clusters and services.
Jan 20 08:57:01 np0005588919 systemd[1]: Reloading.
Jan 20 08:57:01 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:01 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:01 np0005588919 systemd[1]: Reached target Ceph cluster e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:57:01 np0005588919 systemd[1]: Reloading.
Jan 20 08:57:01 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:02 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:02 np0005588919 systemd[1]: Reloading.
Jan 20 08:57:02 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:02 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:02 np0005588919 systemd[1]: Created slice Slice /system/ceph-e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:57:02 np0005588919 systemd[1]: Reached target System Time Set.
Jan 20 08:57:02 np0005588919 systemd[1]: Reached target System Time Synchronized.
Jan 20 08:57:02 np0005588919 systemd[1]: Starting Ceph crash.compute-1 for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:57:02 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:02 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:02 np0005588919 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 20 08:57:02 np0005588919 podman[77148]: 2026-01-20 13:57:02.858648075 +0000 UTC m=+0.083158559 container create 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:02 np0005588919 podman[77148]: 2026-01-20 13:57:02.809324217 +0000 UTC m=+0.033834721 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:03 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef75fe091d0e2aed0c89236ea76d69d37c2fb011f97d328fe735d93708e99020/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:03 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef75fe091d0e2aed0c89236ea76d69d37c2fb011f97d328fe735d93708e99020/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:03 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef75fe091d0e2aed0c89236ea76d69d37c2fb011f97d328fe735d93708e99020/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:03 np0005588919 podman[77148]: 2026-01-20 13:57:03.054551823 +0000 UTC m=+0.279062377 container init 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Jan 20 08:57:03 np0005588919 podman[77148]: 2026-01-20 13:57:03.059324348 +0000 UTC m=+0.283834872 container start 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:57:03 np0005588919 bash[77148]: 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009
Jan 20 08:57:03 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1[77164]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 20 08:57:03 np0005588919 systemd[1]: Started Ceph crash.compute-1 for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:57:03 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1[77164]: 2026-01-20T13:57:03.432+0000 7f203dcb6640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 20 08:57:03 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1[77164]: 2026-01-20T13:57:03.432+0000 7f203dcb6640 -1 AuthRegistry(0x7f2038067440) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 20 08:57:03 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1[77164]: 2026-01-20T13:57:03.434+0000 7f203dcb6640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 20 08:57:03 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1[77164]: 2026-01-20T13:57:03.434+0000 7f203dcb6640 -1 AuthRegistry(0x7f203dcb5000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 20 08:57:03 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1[77164]: 2026-01-20T13:57:03.436+0000 7f20377fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 20 08:57:03 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1[77164]: 2026-01-20T13:57:03.436+0000 7f203dcb6640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 20 08:57:03 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1[77164]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 20 08:57:03 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1[77164]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 20 08:57:04 np0005588919 podman[77320]: 2026-01-20 13:57:04.083143899 +0000 UTC m=+0.055192387 container create 1cb7c5e0d18637127d6f7b19e77ce179c1221a1668b1780d9863b302197ed175 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_brown, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:57:04 np0005588919 systemd[1]: Started libpod-conmon-1cb7c5e0d18637127d6f7b19e77ce179c1221a1668b1780d9863b302197ed175.scope.
Jan 20 08:57:04 np0005588919 podman[77320]: 2026-01-20 13:57:04.061599528 +0000 UTC m=+0.033648056 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:04 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:04 np0005588919 podman[77320]: 2026-01-20 13:57:04.187321084 +0000 UTC m=+0.159369642 container init 1cb7c5e0d18637127d6f7b19e77ce179c1221a1668b1780d9863b302197ed175 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_brown, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:04 np0005588919 podman[77320]: 2026-01-20 13:57:04.199095738 +0000 UTC m=+0.171144246 container start 1cb7c5e0d18637127d6f7b19e77ce179c1221a1668b1780d9863b302197ed175 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_brown, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 08:57:04 np0005588919 podman[77320]: 2026-01-20 13:57:04.203233415 +0000 UTC m=+0.175281993 container attach 1cb7c5e0d18637127d6f7b19e77ce179c1221a1668b1780d9863b302197ed175 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_brown, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:04 np0005588919 recursing_brown[77337]: 167 167
Jan 20 08:57:04 np0005588919 systemd[1]: libpod-1cb7c5e0d18637127d6f7b19e77ce179c1221a1668b1780d9863b302197ed175.scope: Deactivated successfully.
Jan 20 08:57:04 np0005588919 podman[77320]: 2026-01-20 13:57:04.210710467 +0000 UTC m=+0.182759015 container died 1cb7c5e0d18637127d6f7b19e77ce179c1221a1668b1780d9863b302197ed175 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_brown, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:04 np0005588919 systemd[1]: var-lib-containers-storage-overlay-b0f98fee64d0891fc249b02064888198d558c3f4adf045567084032496aa57b2-merged.mount: Deactivated successfully.
Jan 20 08:57:04 np0005588919 podman[77320]: 2026-01-20 13:57:04.270828413 +0000 UTC m=+0.242876931 container remove 1cb7c5e0d18637127d6f7b19e77ce179c1221a1668b1780d9863b302197ed175 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_brown, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507)
Jan 20 08:57:04 np0005588919 systemd[1]: libpod-conmon-1cb7c5e0d18637127d6f7b19e77ce179c1221a1668b1780d9863b302197ed175.scope: Deactivated successfully.
Jan 20 08:57:04 np0005588919 podman[77359]: 2026-01-20 13:57:04.520720921 +0000 UTC m=+0.072590810 container create 6d23387de0c9f8bc9d9fa187afe65e88922cadb99627d9b3dcc35703f024de75 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_greider, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:04 np0005588919 systemd[1]: Started libpod-conmon-6d23387de0c9f8bc9d9fa187afe65e88922cadb99627d9b3dcc35703f024de75.scope.
Jan 20 08:57:04 np0005588919 podman[77359]: 2026-01-20 13:57:04.493732045 +0000 UTC m=+0.045601984 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:04 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:04 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d993914e6abbd4000a150a5a3529483fe179890d31904332e4ef83df61c4772c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:04 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d993914e6abbd4000a150a5a3529483fe179890d31904332e4ef83df61c4772c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:04 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d993914e6abbd4000a150a5a3529483fe179890d31904332e4ef83df61c4772c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:04 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d993914e6abbd4000a150a5a3529483fe179890d31904332e4ef83df61c4772c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:04 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d993914e6abbd4000a150a5a3529483fe179890d31904332e4ef83df61c4772c/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:04 np0005588919 podman[77359]: 2026-01-20 13:57:04.621252373 +0000 UTC m=+0.173122302 container init 6d23387de0c9f8bc9d9fa187afe65e88922cadb99627d9b3dcc35703f024de75 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_greider, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:04 np0005588919 podman[77359]: 2026-01-20 13:57:04.632982985 +0000 UTC m=+0.184852904 container start 6d23387de0c9f8bc9d9fa187afe65e88922cadb99627d9b3dcc35703f024de75 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_greider, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:57:04 np0005588919 podman[77359]: 2026-01-20 13:57:04.638043919 +0000 UTC m=+0.189913898 container attach 6d23387de0c9f8bc9d9fa187afe65e88922cadb99627d9b3dcc35703f024de75 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_greider, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default)
Jan 20 08:57:05 np0005588919 elated_greider[77376]: --> passed data devices: 0 physical, 1 LVM
Jan 20 08:57:05 np0005588919 elated_greider[77376]: --> relative data size: 1.0
Jan 20 08:57:05 np0005588919 elated_greider[77376]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 20 08:57:05 np0005588919 elated_greider[77376]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 562c52e7-0678-4614-81fd-9a9eecf7d0f9
Jan 20 08:57:06 np0005588919 elated_greider[77376]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 20 08:57:06 np0005588919 lvm[77424]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 08:57:06 np0005588919 lvm[77424]: VG ceph_vg0 finished
Jan 20 08:57:06 np0005588919 elated_greider[77376]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Jan 20 08:57:06 np0005588919 elated_greider[77376]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 20 08:57:06 np0005588919 elated_greider[77376]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 20 08:57:06 np0005588919 elated_greider[77376]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:06 np0005588919 elated_greider[77376]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Jan 20 08:57:06 np0005588919 elated_greider[77376]: stderr: got monmap epoch 1
Jan 20 08:57:06 np0005588919 elated_greider[77376]: --> Creating keyring file for osd.1
Jan 20 08:57:06 np0005588919 elated_greider[77376]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Jan 20 08:57:06 np0005588919 elated_greider[77376]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Jan 20 08:57:06 np0005588919 elated_greider[77376]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid 562c52e7-0678-4614-81fd-9a9eecf7d0f9 --setuser ceph --setgroup ceph
Jan 20 08:57:09 np0005588919 elated_greider[77376]: stderr: 2026-01-20T13:57:06.791+0000 7febea47b740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 20 08:57:09 np0005588919 elated_greider[77376]: stderr: 2026-01-20T13:57:06.791+0000 7febea47b740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 20 08:57:09 np0005588919 elated_greider[77376]: stderr: 2026-01-20T13:57:06.792+0000 7febea47b740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 20 08:57:09 np0005588919 elated_greider[77376]: stderr: 2026-01-20T13:57:06.792+0000 7febea47b740 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Jan 20 08:57:09 np0005588919 elated_greider[77376]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 20 08:57:09 np0005588919 elated_greider[77376]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 20 08:57:09 np0005588919 elated_greider[77376]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Jan 20 08:57:09 np0005588919 elated_greider[77376]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:09 np0005588919 elated_greider[77376]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:09 np0005588919 elated_greider[77376]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 20 08:57:09 np0005588919 elated_greider[77376]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 20 08:57:09 np0005588919 elated_greider[77376]: --> ceph-volume lvm activate successful for osd ID: 1
Jan 20 08:57:09 np0005588919 elated_greider[77376]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 20 08:57:09 np0005588919 systemd[1]: libpod-6d23387de0c9f8bc9d9fa187afe65e88922cadb99627d9b3dcc35703f024de75.scope: Deactivated successfully.
Jan 20 08:57:09 np0005588919 systemd[1]: libpod-6d23387de0c9f8bc9d9fa187afe65e88922cadb99627d9b3dcc35703f024de75.scope: Consumed 2.767s CPU time.
Jan 20 08:57:09 np0005588919 podman[78322]: 2026-01-20 13:57:09.794923017 +0000 UTC m=+0.044674478 container died 6d23387de0c9f8bc9d9fa187afe65e88922cadb99627d9b3dcc35703f024de75 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_greider, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:10 np0005588919 systemd[1]: var-lib-containers-storage-overlay-d993914e6abbd4000a150a5a3529483fe179890d31904332e4ef83df61c4772c-merged.mount: Deactivated successfully.
Jan 20 08:57:10 np0005588919 podman[78322]: 2026-01-20 13:57:10.627044089 +0000 UTC m=+0.876795520 container remove 6d23387de0c9f8bc9d9fa187afe65e88922cadb99627d9b3dcc35703f024de75 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elated_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Jan 20 08:57:10 np0005588919 systemd[1]: libpod-conmon-6d23387de0c9f8bc9d9fa187afe65e88922cadb99627d9b3dcc35703f024de75.scope: Deactivated successfully.
Jan 20 08:57:11 np0005588919 podman[78479]: 2026-01-20 13:57:11.348589837 +0000 UTC m=+0.031801343 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:11 np0005588919 podman[78479]: 2026-01-20 13:57:11.460373618 +0000 UTC m=+0.143585084 container create c6dde6b6ef49613f50b605822100fb95d4b8a03ef036108089db487850535e17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_mcclintock, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Jan 20 08:57:11 np0005588919 systemd[1]: Started libpod-conmon-c6dde6b6ef49613f50b605822100fb95d4b8a03ef036108089db487850535e17.scope.
Jan 20 08:57:11 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:11 np0005588919 podman[78479]: 2026-01-20 13:57:11.572768096 +0000 UTC m=+0.255979592 container init c6dde6b6ef49613f50b605822100fb95d4b8a03ef036108089db487850535e17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_mcclintock, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 20 08:57:11 np0005588919 podman[78479]: 2026-01-20 13:57:11.58206985 +0000 UTC m=+0.265281326 container start c6dde6b6ef49613f50b605822100fb95d4b8a03ef036108089db487850535e17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_mcclintock, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:11 np0005588919 podman[78479]: 2026-01-20 13:57:11.58700845 +0000 UTC m=+0.270219946 container attach c6dde6b6ef49613f50b605822100fb95d4b8a03ef036108089db487850535e17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_mcclintock, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:57:11 np0005588919 vigorous_mcclintock[78495]: 167 167
Jan 20 08:57:11 np0005588919 systemd[1]: libpod-c6dde6b6ef49613f50b605822100fb95d4b8a03ef036108089db487850535e17.scope: Deactivated successfully.
Jan 20 08:57:11 np0005588919 podman[78479]: 2026-01-20 13:57:11.592950309 +0000 UTC m=+0.276161765 container died c6dde6b6ef49613f50b605822100fb95d4b8a03ef036108089db487850535e17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_mcclintock, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 20 08:57:11 np0005588919 systemd[1]: var-lib-containers-storage-overlay-604e37ac9edb318c0a0ff38ef10fcf5056c9fc4c6df3cb95ee0b4d09d56e04db-merged.mount: Deactivated successfully.
Jan 20 08:57:11 np0005588919 podman[78479]: 2026-01-20 13:57:11.637011439 +0000 UTC m=+0.320222895 container remove c6dde6b6ef49613f50b605822100fb95d4b8a03ef036108089db487850535e17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_mcclintock, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 20 08:57:11 np0005588919 systemd[1]: libpod-conmon-c6dde6b6ef49613f50b605822100fb95d4b8a03ef036108089db487850535e17.scope: Deactivated successfully.
Jan 20 08:57:11 np0005588919 podman[78517]: 2026-01-20 13:57:11.811758826 +0000 UTC m=+0.029936771 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:12 np0005588919 podman[78517]: 2026-01-20 13:57:12.030197171 +0000 UTC m=+0.248375056 container create 6c30a69903d5af058e342543cd24de227f8a31e771ad012f1bc40e27817ceb99 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_hertz, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 20 08:57:12 np0005588919 systemd[1]: Started libpod-conmon-6c30a69903d5af058e342543cd24de227f8a31e771ad012f1bc40e27817ceb99.scope.
Jan 20 08:57:12 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:12 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c17f21cb9c066d9d6a2ec54488a5d28b3604242d5deb119b744d8620eed0af3d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:12 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c17f21cb9c066d9d6a2ec54488a5d28b3604242d5deb119b744d8620eed0af3d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:12 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c17f21cb9c066d9d6a2ec54488a5d28b3604242d5deb119b744d8620eed0af3d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:12 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c17f21cb9c066d9d6a2ec54488a5d28b3604242d5deb119b744d8620eed0af3d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:12 np0005588919 podman[78517]: 2026-01-20 13:57:12.347945945 +0000 UTC m=+0.566123870 container init 6c30a69903d5af058e342543cd24de227f8a31e771ad012f1bc40e27817ceb99 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_hertz, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:57:12 np0005588919 podman[78517]: 2026-01-20 13:57:12.357465015 +0000 UTC m=+0.575642870 container start 6c30a69903d5af058e342543cd24de227f8a31e771ad012f1bc40e27817ceb99 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_hertz, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:12 np0005588919 podman[78517]: 2026-01-20 13:57:12.3615237 +0000 UTC m=+0.579701665 container attach 6c30a69903d5af058e342543cd24de227f8a31e771ad012f1bc40e27817ceb99 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_hertz, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]: {
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:    "1": [
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:        {
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "devices": [
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "/dev/loop3"
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            ],
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "lv_name": "ceph_lv0",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "lv_size": "7511998464",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=TDt0ds-asam-XQ1t-lT00-aV5E-HrYi-HrQkBt,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=e399cf45-e6b6-5393-99f1-75c601d3f188,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=562c52e7-0678-4614-81fd-9a9eecf7d0f9,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "lv_uuid": "TDt0ds-asam-XQ1t-lT00-aV5E-HrYi-HrQkBt",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "name": "ceph_lv0",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "tags": {
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.block_uuid": "TDt0ds-asam-XQ1t-lT00-aV5E-HrYi-HrQkBt",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.cephx_lockbox_secret": "",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.cluster_fsid": "e399cf45-e6b6-5393-99f1-75c601d3f188",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.cluster_name": "ceph",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.crush_device_class": "",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.encrypted": "0",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.osd_fsid": "562c52e7-0678-4614-81fd-9a9eecf7d0f9",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.osd_id": "1",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.type": "block",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:                "ceph.vdo": "0"
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            },
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "type": "block",
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:            "vg_name": "ceph_vg0"
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:        }
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]:    ]
Jan 20 08:57:13 np0005588919 sweet_hertz[78535]: }
Jan 20 08:57:13 np0005588919 systemd[1]: libpod-6c30a69903d5af058e342543cd24de227f8a31e771ad012f1bc40e27817ceb99.scope: Deactivated successfully.
Jan 20 08:57:13 np0005588919 podman[78517]: 2026-01-20 13:57:13.116793454 +0000 UTC m=+1.334971339 container died 6c30a69903d5af058e342543cd24de227f8a31e771ad012f1bc40e27817ceb99 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_hertz, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 20 08:57:13 np0005588919 systemd[1]: var-lib-containers-storage-overlay-c17f21cb9c066d9d6a2ec54488a5d28b3604242d5deb119b744d8620eed0af3d-merged.mount: Deactivated successfully.
Jan 20 08:57:13 np0005588919 podman[78517]: 2026-01-20 13:57:13.179611146 +0000 UTC m=+1.397788991 container remove 6c30a69903d5af058e342543cd24de227f8a31e771ad012f1bc40e27817ceb99 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_hertz, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Jan 20 08:57:13 np0005588919 systemd[1]: libpod-conmon-6c30a69903d5af058e342543cd24de227f8a31e771ad012f1bc40e27817ceb99.scope: Deactivated successfully.
Jan 20 08:57:13 np0005588919 podman[78697]: 2026-01-20 13:57:13.884024706 +0000 UTC m=+0.048980349 container create 473c3eda48698ea0394243f288f7faa561f940f7a3621074e5ccc42da7a27f61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_dewdney, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:57:13 np0005588919 systemd[1]: Started libpod-conmon-473c3eda48698ea0394243f288f7faa561f940f7a3621074e5ccc42da7a27f61.scope.
Jan 20 08:57:13 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:13 np0005588919 podman[78697]: 2026-01-20 13:57:13.86161154 +0000 UTC m=+0.026567153 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:13 np0005588919 podman[78697]: 2026-01-20 13:57:13.983475357 +0000 UTC m=+0.148430990 container init 473c3eda48698ea0394243f288f7faa561f940f7a3621074e5ccc42da7a27f61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_dewdney, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 20 08:57:13 np0005588919 podman[78697]: 2026-01-20 13:57:13.990902738 +0000 UTC m=+0.155858371 container start 473c3eda48698ea0394243f288f7faa561f940f7a3621074e5ccc42da7a27f61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_dewdney, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS)
Jan 20 08:57:13 np0005588919 confident_dewdney[78715]: 167 167
Jan 20 08:57:13 np0005588919 systemd[1]: libpod-473c3eda48698ea0394243f288f7faa561f940f7a3621074e5ccc42da7a27f61.scope: Deactivated successfully.
Jan 20 08:57:14 np0005588919 podman[78697]: 2026-01-20 13:57:14.032339963 +0000 UTC m=+0.197295596 container attach 473c3eda48698ea0394243f288f7faa561f940f7a3621074e5ccc42da7a27f61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_dewdney, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Jan 20 08:57:14 np0005588919 podman[78697]: 2026-01-20 13:57:14.033718002 +0000 UTC m=+0.198673655 container died 473c3eda48698ea0394243f288f7faa561f940f7a3621074e5ccc42da7a27f61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_dewdney, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Jan 20 08:57:14 np0005588919 systemd[1]: var-lib-containers-storage-overlay-a61eed5759e507901987df3e92d86758bef62b44b1e7ad3f83617b085f258607-merged.mount: Deactivated successfully.
Jan 20 08:57:14 np0005588919 podman[78697]: 2026-01-20 13:57:14.090540454 +0000 UTC m=+0.255496067 container remove 473c3eda48698ea0394243f288f7faa561f940f7a3621074e5ccc42da7a27f61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=confident_dewdney, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Jan 20 08:57:14 np0005588919 systemd[1]: libpod-conmon-473c3eda48698ea0394243f288f7faa561f940f7a3621074e5ccc42da7a27f61.scope: Deactivated successfully.
Jan 20 08:57:14 np0005588919 podman[78746]: 2026-01-20 13:57:14.456358321 +0000 UTC m=+0.048751584 container create b4d4b812a40cf0094ac25b3e0c1232f1b462f06e197f832b9e9b5e22ce23073e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:14 np0005588919 systemd[1]: Started libpod-conmon-b4d4b812a40cf0094ac25b3e0c1232f1b462f06e197f832b9e9b5e22ce23073e.scope.
Jan 20 08:57:14 np0005588919 podman[78746]: 2026-01-20 13:57:14.436428195 +0000 UTC m=+0.028821438 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:14 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:14 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/222464e56bc2f745f5311a0a24d9c9fbc5ae14208298e9eb0afb18cf88b2c704/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:14 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/222464e56bc2f745f5311a0a24d9c9fbc5ae14208298e9eb0afb18cf88b2c704/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:14 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/222464e56bc2f745f5311a0a24d9c9fbc5ae14208298e9eb0afb18cf88b2c704/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:14 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/222464e56bc2f745f5311a0a24d9c9fbc5ae14208298e9eb0afb18cf88b2c704/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:14 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/222464e56bc2f745f5311a0a24d9c9fbc5ae14208298e9eb0afb18cf88b2c704/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:17 np0005588919 podman[78746]: 2026-01-20 13:57:17.409677984 +0000 UTC m=+3.002071227 container init b4d4b812a40cf0094ac25b3e0c1232f1b462f06e197f832b9e9b5e22ce23073e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate-test, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 20 08:57:17 np0005588919 podman[78746]: 2026-01-20 13:57:17.422451665 +0000 UTC m=+3.014844918 container start b4d4b812a40cf0094ac25b3e0c1232f1b462f06e197f832b9e9b5e22ce23073e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate-test, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Jan 20 08:57:17 np0005588919 podman[78746]: 2026-01-20 13:57:17.43674164 +0000 UTC m=+3.029134873 container attach b4d4b812a40cf0094ac25b3e0c1232f1b462f06e197f832b9e9b5e22ce23073e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate-test, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True)
Jan 20 08:57:18 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate-test[78762]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Jan 20 08:57:18 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate-test[78762]:                            [--no-systemd] [--no-tmpfs]
Jan 20 08:57:18 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate-test[78762]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 20 08:57:18 np0005588919 systemd[1]: libpod-b4d4b812a40cf0094ac25b3e0c1232f1b462f06e197f832b9e9b5e22ce23073e.scope: Deactivated successfully.
Jan 20 08:57:18 np0005588919 podman[78746]: 2026-01-20 13:57:18.155681124 +0000 UTC m=+3.748074377 container died b4d4b812a40cf0094ac25b3e0c1232f1b462f06e197f832b9e9b5e22ce23073e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate-test, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:18 np0005588919 systemd[1]: var-lib-containers-storage-overlay-222464e56bc2f745f5311a0a24d9c9fbc5ae14208298e9eb0afb18cf88b2c704-merged.mount: Deactivated successfully.
Jan 20 08:57:18 np0005588919 podman[78746]: 2026-01-20 13:57:18.230406474 +0000 UTC m=+3.822799737 container remove b4d4b812a40cf0094ac25b3e0c1232f1b462f06e197f832b9e9b5e22ce23073e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate-test, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:18 np0005588919 systemd[1]: libpod-conmon-b4d4b812a40cf0094ac25b3e0c1232f1b462f06e197f832b9e9b5e22ce23073e.scope: Deactivated successfully.
Jan 20 08:57:18 np0005588919 systemd[1]: Reloading.
Jan 20 08:57:18 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:18 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:18 np0005588919 systemd[1]: Reloading.
Jan 20 08:57:18 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:18 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:19 np0005588919 systemd[1]: Starting Ceph osd.1 for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:57:19 np0005588919 podman[78928]: 2026-01-20 13:57:19.455055422 +0000 UTC m=+0.061352701 container create 20f36b600dacba52f75e185f79ca3e063cef685a0d176971a28152745ea1c248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:57:19 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:19 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44589a501efb33d81eb477415101d1de97259449653627fa8e0f90e521224e0e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:19 np0005588919 podman[78928]: 2026-01-20 13:57:19.432928584 +0000 UTC m=+0.039225853 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:19 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44589a501efb33d81eb477415101d1de97259449653627fa8e0f90e521224e0e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:19 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44589a501efb33d81eb477415101d1de97259449653627fa8e0f90e521224e0e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:19 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44589a501efb33d81eb477415101d1de97259449653627fa8e0f90e521224e0e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:19 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44589a501efb33d81eb477415101d1de97259449653627fa8e0f90e521224e0e/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:19 np0005588919 podman[78928]: 2026-01-20 13:57:19.542510823 +0000 UTC m=+0.148808092 container init 20f36b600dacba52f75e185f79ca3e063cef685a0d176971a28152745ea1c248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:19 np0005588919 podman[78928]: 2026-01-20 13:57:19.554777731 +0000 UTC m=+0.161074970 container start 20f36b600dacba52f75e185f79ca3e063cef685a0d176971a28152745ea1c248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 20 08:57:19 np0005588919 podman[78928]: 2026-01-20 13:57:19.559061852 +0000 UTC m=+0.165359111 container attach 20f36b600dacba52f75e185f79ca3e063cef685a0d176971a28152745ea1c248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef)
Jan 20 08:57:20 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate[78942]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 20 08:57:20 np0005588919 bash[78928]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 20 08:57:20 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate[78942]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 20 08:57:20 np0005588919 bash[78928]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 20 08:57:20 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate[78942]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 20 08:57:20 np0005588919 bash[78928]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 20 08:57:20 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate[78942]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 20 08:57:20 np0005588919 bash[78928]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 20 08:57:20 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate[78942]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:20 np0005588919 bash[78928]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:20 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate[78942]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 20 08:57:20 np0005588919 bash[78928]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 20 08:57:20 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate[78942]: --> ceph-volume raw activate successful for osd ID: 1
Jan 20 08:57:20 np0005588919 bash[78928]: --> ceph-volume raw activate successful for osd ID: 1
Jan 20 08:57:20 np0005588919 systemd[1]: libpod-20f36b600dacba52f75e185f79ca3e063cef685a0d176971a28152745ea1c248.scope: Deactivated successfully.
Jan 20 08:57:20 np0005588919 podman[78928]: 2026-01-20 13:57:20.461252413 +0000 UTC m=+1.067549652 container died 20f36b600dacba52f75e185f79ca3e063cef685a0d176971a28152745ea1c248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:57:20 np0005588919 systemd[1]: var-lib-containers-storage-overlay-44589a501efb33d81eb477415101d1de97259449653627fa8e0f90e521224e0e-merged.mount: Deactivated successfully.
Jan 20 08:57:20 np0005588919 podman[78928]: 2026-01-20 13:57:20.517300173 +0000 UTC m=+1.123597412 container remove 20f36b600dacba52f75e185f79ca3e063cef685a0d176971a28152745ea1c248 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1-activate, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Jan 20 08:57:20 np0005588919 podman[79100]: 2026-01-20 13:57:20.715582798 +0000 UTC m=+0.040554462 container create 2681b7d660cda7dd317ff9dc8fefed0c116200491d86f19ae07733e29ca0fc6b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:20 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9c1ea6a997a5906003d4ec712202dea111c0476d00cc9babc7e9d63d48299cc/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:20 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9c1ea6a997a5906003d4ec712202dea111c0476d00cc9babc7e9d63d48299cc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:20 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9c1ea6a997a5906003d4ec712202dea111c0476d00cc9babc7e9d63d48299cc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:20 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9c1ea6a997a5906003d4ec712202dea111c0476d00cc9babc7e9d63d48299cc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:20 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9c1ea6a997a5906003d4ec712202dea111c0476d00cc9babc7e9d63d48299cc/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:20 np0005588919 podman[79100]: 2026-01-20 13:57:20.699663996 +0000 UTC m=+0.024635680 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:20 np0005588919 podman[79100]: 2026-01-20 13:57:20.796172324 +0000 UTC m=+0.121144038 container init 2681b7d660cda7dd317ff9dc8fefed0c116200491d86f19ae07733e29ca0fc6b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:20 np0005588919 podman[79100]: 2026-01-20 13:57:20.805265962 +0000 UTC m=+0.130237636 container start 2681b7d660cda7dd317ff9dc8fefed0c116200491d86f19ae07733e29ca0fc6b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 20 08:57:20 np0005588919 bash[79100]: 2681b7d660cda7dd317ff9dc8fefed0c116200491d86f19ae07733e29ca0fc6b
Jan 20 08:57:20 np0005588919 systemd[1]: Started Ceph osd.1 for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: pidfile_write: ignore empty --pid-file
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bdev(0x557dbec6b800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bdev(0x557dbec6b800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bdev(0x557dbec6b800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bdev(0x557dbec6b800 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bdev(0x557dbfaad800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bdev(0x557dbfaad800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bdev(0x557dbfaad800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bdev(0x557dbfaad800 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Jan 20 08:57:20 np0005588919 ceph-osd[79119]: bdev(0x557dbfaad800 /var/lib/ceph/osd/ceph-1/block) close
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbec6b800 /var/lib/ceph/osd/ceph-1/block) close
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: load: jerasure load: lrc 
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) close
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) close
Jan 20 08:57:21 np0005588919 podman[79279]: 2026-01-20 13:57:21.650018783 +0000 UTC m=+0.036097865 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2ec00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2f400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2f400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2f400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2f400 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluefs mount
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluefs mount shared_bdev_used = 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: RocksDB version: 7.9.2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Git sha 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: DB SUMMARY
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: DB Session ID:  LTNCJ3XTV54YABYKDU5U
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: CURRENT file:  CURRENT
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: IDENTITY file:  IDENTITY
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                         Options.error_if_exists: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.create_if_missing: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                         Options.paranoid_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                                     Options.env: 0x557dbfaffc70
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                                Options.info_log: 0x557dbece8ba0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_file_opening_threads: 16
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                              Options.statistics: (nil)
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.use_fsync: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.max_log_file_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                         Options.allow_fallocate: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.use_direct_reads: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.create_missing_column_families: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                              Options.db_log_dir: 
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                                 Options.wal_dir: db.wal
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.advise_random_on_open: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.write_buffer_manager: 0x557dbfc08460
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                            Options.rate_limiter: (nil)
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.unordered_write: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.row_cache: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                              Options.wal_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.allow_ingest_behind: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.two_write_queues: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.manual_wal_flush: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.wal_compression: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.atomic_flush: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.log_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.allow_data_in_errors: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.db_host_id: __hostname__
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.max_background_jobs: 4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.max_background_compactions: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.max_subcompactions: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.max_open_files: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.bytes_per_sync: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.max_background_flushes: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Compression algorithms supported:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: #011kZSTD supported: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: #011kXpressCompression supported: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: #011kBZip2Compression supported: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: #011kLZ4Compression supported: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: #011kZlibCompression supported: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: #011kLZ4HCCompression supported: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: #011kSnappyCompression supported: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece8600)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecdedd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece8600)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecdedd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:21 np0005588919 podman[79279]: 2026-01-20 13:57:21.939487264 +0000 UTC m=+0.325566266 container create 798485aa26b97fa02fd02e0443b78d91c203b27c2c6839d74176afd14ea270f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_liskov, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True)
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece8600)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecdedd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece8600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557dbecdedd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece8600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557dbecdedd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece8600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557dbecdedd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece8600)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecdedd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece85c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecde430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece85c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecde430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece85c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecde430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: e16c086c-0403-48f5-8de8-0b24deda1c99
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917441958069, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917441958253, "job": 1, "event": "recovery_finished"}
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: freelist init
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: freelist _read_cfg
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bluefs umount
Jan 20 08:57:21 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2f400 /var/lib/ceph/osd/ceph-1/block) close
Jan 20 08:57:21 np0005588919 systemd[1]: Started libpod-conmon-798485aa26b97fa02fd02e0443b78d91c203b27c2c6839d74176afd14ea270f2.scope.
Jan 20 08:57:22 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:22 np0005588919 podman[79279]: 2026-01-20 13:57:22.100265124 +0000 UTC m=+0.486344156 container init 798485aa26b97fa02fd02e0443b78d91c203b27c2c6839d74176afd14ea270f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_liskov, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 20 08:57:22 np0005588919 podman[79279]: 2026-01-20 13:57:22.114308523 +0000 UTC m=+0.500387525 container start 798485aa26b97fa02fd02e0443b78d91c203b27c2c6839d74176afd14ea270f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_liskov, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 20 08:57:22 np0005588919 podman[79279]: 2026-01-20 13:57:22.118415629 +0000 UTC m=+0.504494631 container attach 798485aa26b97fa02fd02e0443b78d91c203b27c2c6839d74176afd14ea270f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_liskov, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:22 np0005588919 systemd[1]: libpod-798485aa26b97fa02fd02e0443b78d91c203b27c2c6839d74176afd14ea270f2.scope: Deactivated successfully.
Jan 20 08:57:22 np0005588919 dazzling_liskov[79493]: 167 167
Jan 20 08:57:22 np0005588919 podman[79279]: 2026-01-20 13:57:22.12621046 +0000 UTC m=+0.512289492 container died 798485aa26b97fa02fd02e0443b78d91c203b27c2c6839d74176afd14ea270f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_liskov, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Jan 20 08:57:22 np0005588919 conmon[79493]: conmon 798485aa26b97fa02fd0 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-798485aa26b97fa02fd02e0443b78d91c203b27c2c6839d74176afd14ea270f2.scope/container/memory.events
Jan 20 08:57:22 np0005588919 systemd[1]: var-lib-containers-storage-overlay-aa27d61ce42fb7930eae9fb413cebcc5e1a20b929b138fb96e278d428a39f94b-merged.mount: Deactivated successfully.
Jan 20 08:57:22 np0005588919 podman[79279]: 2026-01-20 13:57:22.18330659 +0000 UTC m=+0.569385622 container remove 798485aa26b97fa02fd02e0443b78d91c203b27c2c6839d74176afd14ea270f2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_liskov, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True)
Jan 20 08:57:22 np0005588919 systemd[1]: libpod-conmon-798485aa26b97fa02fd02e0443b78d91c203b27c2c6839d74176afd14ea270f2.scope: Deactivated successfully.
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2f400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2f400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2f400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bdev(0x557dbfb2f400 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bluefs mount
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bluefs mount shared_bdev_used = 4718592
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: RocksDB version: 7.9.2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Git sha 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: DB SUMMARY
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: DB Session ID:  LTNCJ3XTV54YABYKDU5V
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: CURRENT file:  CURRENT
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: IDENTITY file:  IDENTITY
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                         Options.error_if_exists: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.create_if_missing: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                         Options.paranoid_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                                     Options.env: 0x557dbed2a3f0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                                Options.info_log: 0x557dbecc5580
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_file_opening_threads: 16
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                              Options.statistics: (nil)
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.use_fsync: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.max_log_file_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                         Options.allow_fallocate: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.use_direct_reads: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.create_missing_column_families: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                              Options.db_log_dir: 
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                                 Options.wal_dir: db.wal
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.advise_random_on_open: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.write_buffer_manager: 0x557dbfc08960
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                            Options.rate_limiter: (nil)
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.unordered_write: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.row_cache: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                              Options.wal_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.allow_ingest_behind: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.two_write_queues: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.manual_wal_flush: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.wal_compression: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.atomic_flush: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.log_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.allow_data_in_errors: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.db_host_id: __hostname__
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.max_background_jobs: 4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.max_background_compactions: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.max_subcompactions: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.max_open_files: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.bytes_per_sync: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.max_background_flushes: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Compression algorithms supported:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: 	kZSTD supported: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: 	kXpressCompression supported: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: 	kBZip2Compression supported: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: 	kLZ4Compression supported: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: 	kZlibCompression supported: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: 	kSnappyCompression supported: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece9220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecdef30
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece9220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecdef30
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece9220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecdef30
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
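As an aside on the compaction settings dumped above: with `level_compaction_dynamic_level_bytes: 0`, RocksDB sizes levels statically from `max_bytes_for_level_base` (1 GiB here), `max_bytes_for_level_multiplier` (8), and `num_levels` (7): L1's target is the base, and each deeper level's target is the previous one times the multiplier. A minimal sketch (not part of the log) reproducing that arithmetic from the logged values:

```python
# Per-level capacity targets implied by the options logged above, assuming
# static leveled sizing (level_compaction_dynamic_level_bytes = 0):
#   L1 = max_bytes_for_level_base
#   L(n+1) = L(n) * max_bytes_for_level_multiplier
base = 1073741824        # Options.max_bytes_for_level_base (1 GiB)
multiplier = 8.0         # Options.max_bytes_for_level_multiplier
num_levels = 7           # Options.num_levels (L0 is governed by file-count triggers)

targets = {}
size = base
for level in range(1, num_levels):
    targets[level] = int(size)
    size *= multiplier

for level, cap in sorted(targets.items()):
    print(f"L{level}: {cap / 2**30:.0f} GiB")
```

This yields 1 GiB for L1 up through 32768 GiB for L6; L0 has no byte target and is instead driven by `level0_file_num_compaction_trigger` and the slowdown/stop triggers shown in the log.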
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece9220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecdef30
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece9220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557dbecdef30
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece9220)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557dbecdef30#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece9220)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557dbecdef30#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece9100)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557dbecdf610#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece9100)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557dbecdf610#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:           Options.merge_operator: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557dbece9100)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557dbecdf610#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.write_buffer_size: 16777216
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.max_write_buffer_number: 64
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.compression: LZ4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: e16c086c-0403-48f5-8de8-0b24deda1c99
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917442238257, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917442245904, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917442, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e16c086c-0403-48f5-8de8-0b24deda1c99", "db_session_id": "LTNCJ3XTV54YABYKDU5V", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917442249300, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917442, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e16c086c-0403-48f5-8de8-0b24deda1c99", "db_session_id": "LTNCJ3XTV54YABYKDU5V", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917442252771, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917442, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e16c086c-0403-48f5-8de8-0b24deda1c99", "db_session_id": "LTNCJ3XTV54YABYKDU5V", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917442254656, "job": 1, "event": "recovery_finished"}
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x557dbfa74700
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: DB pointer 0x557dbfbf1a00
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 460.80 MB usag
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: _get_class not permitted to load lua
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: _get_class not permitted to load sdk
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: _get_class not permitted to load test_remote_reads
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: osd.1 0 load_pgs
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: osd.1 0 load_pgs opened 0 pgs
Jan 20 08:57:22 np0005588919 ceph-osd[79119]: osd.1 0 log_to_monitors true
Jan 20 08:57:22 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1[79115]: 2026-01-20T13:57:22.293+0000 7f3ee29c6740 -1 osd.1 0 log_to_monitors true
Jan 20 08:57:22 np0005588919 podman[79732]: 2026-01-20 13:57:22.388208732 +0000 UTC m=+0.047942471 container create 33aecd00f9cdb8c73d0b4d1d314f72dd210e22b92d3d74f7dde96166c717d143 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_shtern, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 20 08:57:22 np0005588919 systemd[1]: Started libpod-conmon-33aecd00f9cdb8c73d0b4d1d314f72dd210e22b92d3d74f7dde96166c717d143.scope.
Jan 20 08:57:22 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:22 np0005588919 podman[79732]: 2026-01-20 13:57:22.368539254 +0000 UTC m=+0.028273013 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:22 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc44f2cc50956cf4157c958606b473f20838a87b3354e637707830180568cc3f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:22 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc44f2cc50956cf4157c958606b473f20838a87b3354e637707830180568cc3f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:22 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc44f2cc50956cf4157c958606b473f20838a87b3354e637707830180568cc3f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:22 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc44f2cc50956cf4157c958606b473f20838a87b3354e637707830180568cc3f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:22 np0005588919 podman[79732]: 2026-01-20 13:57:22.479314896 +0000 UTC m=+0.139048625 container init 33aecd00f9cdb8c73d0b4d1d314f72dd210e22b92d3d74f7dde96166c717d143 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_shtern, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:57:22 np0005588919 podman[79732]: 2026-01-20 13:57:22.485231064 +0000 UTC m=+0.144964783 container start 33aecd00f9cdb8c73d0b4d1d314f72dd210e22b92d3d74f7dde96166c717d143 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_shtern, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:57:22 np0005588919 podman[79732]: 2026-01-20 13:57:22.488136767 +0000 UTC m=+0.147870516 container attach 33aecd00f9cdb8c73d0b4d1d314f72dd210e22b92d3d74f7dde96166c717d143 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_shtern, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:23 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 20 08:57:23 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 20 08:57:23 np0005588919 silly_shtern[79749]: {
Jan 20 08:57:23 np0005588919 silly_shtern[79749]:    "562c52e7-0678-4614-81fd-9a9eecf7d0f9": {
Jan 20 08:57:23 np0005588919 silly_shtern[79749]:        "ceph_fsid": "e399cf45-e6b6-5393-99f1-75c601d3f188",
Jan 20 08:57:23 np0005588919 silly_shtern[79749]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Jan 20 08:57:23 np0005588919 silly_shtern[79749]:        "osd_id": 1,
Jan 20 08:57:23 np0005588919 silly_shtern[79749]:        "osd_uuid": "562c52e7-0678-4614-81fd-9a9eecf7d0f9",
Jan 20 08:57:23 np0005588919 silly_shtern[79749]:        "type": "bluestore"
Jan 20 08:57:23 np0005588919 silly_shtern[79749]:    }
Jan 20 08:57:23 np0005588919 silly_shtern[79749]: }
Jan 20 08:57:23 np0005588919 systemd[1]: libpod-33aecd00f9cdb8c73d0b4d1d314f72dd210e22b92d3d74f7dde96166c717d143.scope: Deactivated successfully.
Jan 20 08:57:23 np0005588919 podman[79732]: 2026-01-20 13:57:23.4566774 +0000 UTC m=+1.116411139 container died 33aecd00f9cdb8c73d0b4d1d314f72dd210e22b92d3d74f7dde96166c717d143 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_shtern, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3)
Jan 20 08:57:23 np0005588919 systemd[1]: var-lib-containers-storage-overlay-fc44f2cc50956cf4157c958606b473f20838a87b3354e637707830180568cc3f-merged.mount: Deactivated successfully.
Jan 20 08:57:23 np0005588919 ceph-osd[79119]: osd.1 0 done with init, starting boot process
Jan 20 08:57:23 np0005588919 ceph-osd[79119]: osd.1 0 start_boot
Jan 20 08:57:23 np0005588919 ceph-osd[79119]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 20 08:57:23 np0005588919 ceph-osd[79119]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 20 08:57:23 np0005588919 ceph-osd[79119]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 20 08:57:23 np0005588919 ceph-osd[79119]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 20 08:57:23 np0005588919 ceph-osd[79119]: osd.1 0  bench count 12288000 bsize 4 KiB
Jan 20 08:57:23 np0005588919 podman[79732]: 2026-01-20 13:57:23.714557565 +0000 UTC m=+1.374291284 container remove 33aecd00f9cdb8c73d0b4d1d314f72dd210e22b92d3d74f7dde96166c717d143 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_shtern, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 20 08:57:23 np0005588919 systemd[1]: libpod-conmon-33aecd00f9cdb8c73d0b4d1d314f72dd210e22b92d3d74f7dde96166c717d143.scope: Deactivated successfully.
Jan 20 08:57:25 np0005588919 podman[80002]: 2026-01-20 13:57:25.214240175 +0000 UTC m=+0.106401932 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 20 08:57:25 np0005588919 podman[80002]: 2026-01-20 13:57:25.446621212 +0000 UTC m=+0.338783049 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Jan 20 08:57:26 np0005588919 podman[80189]: 2026-01-20 13:57:26.314987175 +0000 UTC m=+0.068284139 container create e7f91c176f0fd720cd174799d4ab365e48a5b9f6b5e036907334a14954598b20 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_villani, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:26 np0005588919 podman[80189]: 2026-01-20 13:57:26.272742976 +0000 UTC m=+0.026039950 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:26 np0005588919 systemd[1]: Started libpod-conmon-e7f91c176f0fd720cd174799d4ab365e48a5b9f6b5e036907334a14954598b20.scope.
Jan 20 08:57:26 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:26 np0005588919 podman[80189]: 2026-01-20 13:57:26.466557794 +0000 UTC m=+0.219854748 container init e7f91c176f0fd720cd174799d4ab365e48a5b9f6b5e036907334a14954598b20 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_villani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:26 np0005588919 podman[80189]: 2026-01-20 13:57:26.47468076 +0000 UTC m=+0.227977724 container start e7f91c176f0fd720cd174799d4ab365e48a5b9f6b5e036907334a14954598b20 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_villani, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 20 08:57:26 np0005588919 quirky_villani[80205]: 167 167
Jan 20 08:57:26 np0005588919 systemd[1]: libpod-e7f91c176f0fd720cd174799d4ab365e48a5b9f6b5e036907334a14954598b20.scope: Deactivated successfully.
Jan 20 08:57:26 np0005588919 podman[80189]: 2026-01-20 13:57:26.496536322 +0000 UTC m=+0.249833276 container attach e7f91c176f0fd720cd174799d4ab365e48a5b9f6b5e036907334a14954598b20 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_villani, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 20 08:57:26 np0005588919 podman[80189]: 2026-01-20 13:57:26.497483481 +0000 UTC m=+0.250780435 container died e7f91c176f0fd720cd174799d4ab365e48a5b9f6b5e036907334a14954598b20 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_villani, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507)
Jan 20 08:57:26 np0005588919 systemd[1]: var-lib-containers-storage-overlay-e26c9cf8a570f8bf28367340bf367f599f908855fc65d27a1322f39c22c0f916-merged.mount: Deactivated successfully.
Jan 20 08:57:26 np0005588919 podman[80189]: 2026-01-20 13:57:26.634121308 +0000 UTC m=+0.387418252 container remove e7f91c176f0fd720cd174799d4ab365e48a5b9f6b5e036907334a14954598b20 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_villani, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:26 np0005588919 systemd[1]: libpod-conmon-e7f91c176f0fd720cd174799d4ab365e48a5b9f6b5e036907334a14954598b20.scope: Deactivated successfully.
Jan 20 08:57:26 np0005588919 podman[80229]: 2026-01-20 13:57:26.881270831 +0000 UTC m=+0.087651385 container create 38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_colden, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:26 np0005588919 podman[80229]: 2026-01-20 13:57:26.836153965 +0000 UTC m=+0.042534569 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:26 np0005588919 systemd[1]: Started libpod-conmon-38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384.scope.
Jan 20 08:57:26 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:26 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9121cbe8dd5a48191ea7c7e128941aa9f7507f60ecfcce118ba8b98c01480a26/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:26 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9121cbe8dd5a48191ea7c7e128941aa9f7507f60ecfcce118ba8b98c01480a26/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:26 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9121cbe8dd5a48191ea7c7e128941aa9f7507f60ecfcce118ba8b98c01480a26/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:26 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9121cbe8dd5a48191ea7c7e128941aa9f7507f60ecfcce118ba8b98c01480a26/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:27 np0005588919 podman[80229]: 2026-01-20 13:57:27.005251145 +0000 UTC m=+0.211631789 container init 38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_colden, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True)
Jan 20 08:57:27 np0005588919 podman[80229]: 2026-01-20 13:57:27.017328041 +0000 UTC m=+0.223708625 container start 38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_colden, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507)
Jan 20 08:57:27 np0005588919 podman[80229]: 2026-01-20 13:57:27.039467281 +0000 UTC m=+0.245847895 container attach 38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_colden, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:57:28 np0005588919 infallible_colden[80245]: [
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:    {
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:        "available": false,
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:        "ceph_device": false,
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:        "lsm_data": {},
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:        "lvs": [],
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:        "path": "/dev/sr0",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:        "rejected_reasons": [
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "Insufficient space (<5GB)",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "Has a FileSystem"
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:        ],
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:        "sys_api": {
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "actuators": null,
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "device_nodes": "sr0",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "devname": "sr0",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "human_readable_size": "482.00 KB",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "id_bus": "ata",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "model": "QEMU DVD-ROM",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "nr_requests": "2",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "parent": "/dev/sr0",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "partitions": {},
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "path": "/dev/sr0",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "removable": "1",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "rev": "2.5+",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "ro": "0",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "rotational": "1",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "sas_address": "",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "sas_device_handle": "",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "scheduler_mode": "mq-deadline",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "sectors": 0,
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "sectorsize": "2048",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "size": 493568.0,
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "support_discard": "2048",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "type": "disk",
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:            "vendor": "QEMU"
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:        }
Jan 20 08:57:28 np0005588919 infallible_colden[80245]:    }
Jan 20 08:57:28 np0005588919 infallible_colden[80245]: ]
Jan 20 08:57:28 np0005588919 systemd[1]: libpod-38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384.scope: Deactivated successfully.
Jan 20 08:57:28 np0005588919 systemd[1]: libpod-38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384.scope: Consumed 1.252s CPU time.
Jan 20 08:57:28 np0005588919 conmon[80245]: conmon 38e2468afbccf13d3bb9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384.scope/container/memory.events
Jan 20 08:57:28 np0005588919 podman[80229]: 2026-01-20 13:57:28.271925748 +0000 UTC m=+1.478306282 container died 38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_colden, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Jan 20 08:57:28 np0005588919 systemd[1]: var-lib-containers-storage-overlay-9121cbe8dd5a48191ea7c7e128941aa9f7507f60ecfcce118ba8b98c01480a26-merged.mount: Deactivated successfully.
Jan 20 08:57:28 np0005588919 podman[80229]: 2026-01-20 13:57:28.513949166 +0000 UTC m=+1.720329690 container remove 38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_colden, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:57:28 np0005588919 systemd[1]: libpod-conmon-38e2468afbccf13d3bb90ba3cb65e5496f86df5a736b1897f6b36725d16bd384.scope: Deactivated successfully.
Jan 20 08:57:29 np0005588919 ceph-osd[79119]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 10.625 iops: 2720.012 elapsed_sec: 1.103
Jan 20 08:57:29 np0005588919 ceph-osd[79119]: log_channel(cluster) log [WRN] : OSD bench result of 2720.011715 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 20 08:57:29 np0005588919 ceph-osd[79119]: osd.1 0 waiting for initial osdmap
Jan 20 08:57:29 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1[79115]: 2026-01-20T13:57:29.878+0000 7f3edf15d640 -1 osd.1 0 waiting for initial osdmap
Jan 20 08:57:29 np0005588919 ceph-osd[79119]: osd.1 11 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 20 08:57:29 np0005588919 ceph-osd[79119]: osd.1 11 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Jan 20 08:57:29 np0005588919 ceph-osd[79119]: osd.1 11 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 20 08:57:29 np0005588919 ceph-osd[79119]: osd.1 11 check_osdmap_features require_osd_release unknown -> reef
Jan 20 08:57:29 np0005588919 ceph-osd[79119]: osd.1 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 20 08:57:29 np0005588919 ceph-osd[79119]: osd.1 11 set_numa_affinity not setting numa affinity
Jan 20 08:57:29 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-osd-1[79115]: 2026-01-20T13:57:29.905+0000 7f3ed9f6e640 -1 osd.1 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 20 08:57:29 np0005588919 ceph-osd[79119]: osd.1 11 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Jan 20 08:57:30 np0005588919 ceph-osd[79119]: osd.1 12 state: booting -> active
Jan 20 08:57:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 12 pg[1.0( empty local-lis/les=0/0 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=12) [1] r=0 lpr=12 pi=[10,12)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:57:31 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 13 pg[1.0( empty local-lis/les=12/13 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=12) [1] r=0 lpr=12 pi=[10,12)/0 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:57:41 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 15 pg[2.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [1] r=0 lpr=15 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:57:41 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 16 pg[2.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [1] r=0 lpr=15 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:57:50 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 24 pg[7.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [1] r=0 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:57:51 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 25 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [1] r=0 lpr=24 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:57:52 np0005588919 podman[81542]: 2026-01-20 13:57:52.961516748 +0000 UTC m=+0.067940058 container create 9e740ffc97a6a2f8d1bea6342169cf035c9f22d01cf13ad54da1db984ea20f66 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_germain, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:53 np0005588919 systemd[1]: Started libpod-conmon-9e740ffc97a6a2f8d1bea6342169cf035c9f22d01cf13ad54da1db984ea20f66.scope.
Jan 20 08:57:53 np0005588919 podman[81542]: 2026-01-20 13:57:52.931388006 +0000 UTC m=+0.037811336 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:53 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:53 np0005588919 podman[81542]: 2026-01-20 13:57:53.056897656 +0000 UTC m=+0.163320996 container init 9e740ffc97a6a2f8d1bea6342169cf035c9f22d01cf13ad54da1db984ea20f66 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_germain, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:53 np0005588919 podman[81542]: 2026-01-20 13:57:53.067789056 +0000 UTC m=+0.174212346 container start 9e740ffc97a6a2f8d1bea6342169cf035c9f22d01cf13ad54da1db984ea20f66 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_germain, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Jan 20 08:57:53 np0005588919 podman[81542]: 2026-01-20 13:57:53.072739225 +0000 UTC m=+0.179162555 container attach 9e740ffc97a6a2f8d1bea6342169cf035c9f22d01cf13ad54da1db984ea20f66 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_germain, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3)
Jan 20 08:57:53 np0005588919 strange_germain[81559]: 167 167
Jan 20 08:57:53 np0005588919 systemd[1]: libpod-9e740ffc97a6a2f8d1bea6342169cf035c9f22d01cf13ad54da1db984ea20f66.scope: Deactivated successfully.
Jan 20 08:57:53 np0005588919 conmon[81559]: conmon 9e740ffc97a6a2f8d1be <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9e740ffc97a6a2f8d1bea6342169cf035c9f22d01cf13ad54da1db984ea20f66.scope/container/memory.events
Jan 20 08:57:53 np0005588919 podman[81542]: 2026-01-20 13:57:53.078709086 +0000 UTC m=+0.185132386 container died 9e740ffc97a6a2f8d1bea6342169cf035c9f22d01cf13ad54da1db984ea20f66 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_germain, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:57:53 np0005588919 systemd[1]: var-lib-containers-storage-overlay-e0651f5915327692c934c5689e2313c3d0ee64d529bdf0a53231a0316b2272c3-merged.mount: Deactivated successfully.
Jan 20 08:57:53 np0005588919 podman[81542]: 2026-01-20 13:57:53.116358606 +0000 UTC m=+0.222781906 container remove 9e740ffc97a6a2f8d1bea6342169cf035c9f22d01cf13ad54da1db984ea20f66 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_germain, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:57:53 np0005588919 systemd[1]: libpod-conmon-9e740ffc97a6a2f8d1bea6342169cf035c9f22d01cf13ad54da1db984ea20f66.scope: Deactivated successfully.
Jan 20 08:57:53 np0005588919 podman[81578]: 2026-01-20 13:57:53.211394103 +0000 UTC m=+0.056824661 container create b1154acdfe9322b30562b7514dad21020ee3d67e2ff9017f43d4c218c0cc2b42 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:57:53 np0005588919 systemd[1]: Started libpod-conmon-b1154acdfe9322b30562b7514dad21020ee3d67e2ff9017f43d4c218c0cc2b42.scope.
Jan 20 08:57:53 np0005588919 podman[81578]: 2026-01-20 13:57:53.18354353 +0000 UTC m=+0.028974118 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:53 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:57:53 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32b20a74809fa1ea554ad9b2cee83694d9e117d940cc96de9346b5a97cb5fae8/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:53 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32b20a74809fa1ea554ad9b2cee83694d9e117d940cc96de9346b5a97cb5fae8/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:53 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32b20a74809fa1ea554ad9b2cee83694d9e117d940cc96de9346b5a97cb5fae8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:53 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32b20a74809fa1ea554ad9b2cee83694d9e117d940cc96de9346b5a97cb5fae8/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:53 np0005588919 podman[81578]: 2026-01-20 13:57:53.31368061 +0000 UTC m=+0.159111198 container init b1154acdfe9322b30562b7514dad21020ee3d67e2ff9017f43d4c218c0cc2b42 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Jan 20 08:57:53 np0005588919 podman[81578]: 2026-01-20 13:57:53.32195537 +0000 UTC m=+0.167385928 container start b1154acdfe9322b30562b7514dad21020ee3d67e2ff9017f43d4c218c0cc2b42 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0)
Jan 20 08:57:53 np0005588919 podman[81578]: 2026-01-20 13:57:53.326000053 +0000 UTC m=+0.171430641 container attach b1154acdfe9322b30562b7514dad21020ee3d67e2ff9017f43d4c218c0cc2b42 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:57:53 np0005588919 systemd[1]: libpod-b1154acdfe9322b30562b7514dad21020ee3d67e2ff9017f43d4c218c0cc2b42.scope: Deactivated successfully.
Jan 20 08:57:53 np0005588919 podman[81578]: 2026-01-20 13:57:53.445361827 +0000 UTC m=+0.290792395 container died b1154acdfe9322b30562b7514dad21020ee3d67e2ff9017f43d4c218c0cc2b42 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:57:53 np0005588919 systemd[1]: var-lib-containers-storage-overlay-32b20a74809fa1ea554ad9b2cee83694d9e117d940cc96de9346b5a97cb5fae8-merged.mount: Deactivated successfully.
Jan 20 08:57:53 np0005588919 podman[81578]: 2026-01-20 13:57:53.500840027 +0000 UTC m=+0.346270575 container remove b1154acdfe9322b30562b7514dad21020ee3d67e2ff9017f43d4c218c0cc2b42 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_gauss, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:57:53 np0005588919 systemd[1]: libpod-conmon-b1154acdfe9322b30562b7514dad21020ee3d67e2ff9017f43d4c218c0cc2b42.scope: Deactivated successfully.
Jan 20 08:57:53 np0005588919 systemd[1]: Reloading.
Jan 20 08:57:53 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:53 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:53 np0005588919 systemd[1]: Reloading.
Jan 20 08:57:53 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:57:53 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:57:54 np0005588919 systemd[1]: Starting Ceph mon.compute-1 for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:57:54 np0005588919 podman[81756]: 2026-01-20 13:57:54.361246199 +0000 UTC m=+0.065608437 container create 8b3e7cd2a573119376e59e1274d49f28e2da731dd446d02544dfe881b8843c0e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Jan 20 08:57:54 np0005588919 podman[81756]: 2026-01-20 13:57:54.326544698 +0000 UTC m=+0.030906976 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:57:54 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42312ecb659a40f1b9c6a0c8ca8b6b71245feb003a67440257ddd2c64cf3289a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:54 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42312ecb659a40f1b9c6a0c8ca8b6b71245feb003a67440257ddd2c64cf3289a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:54 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42312ecb659a40f1b9c6a0c8ca8b6b71245feb003a67440257ddd2c64cf3289a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:54 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42312ecb659a40f1b9c6a0c8ca8b6b71245feb003a67440257ddd2c64cf3289a/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 20 08:57:54 np0005588919 podman[81756]: 2026-01-20 13:57:54.457475103 +0000 UTC m=+0.161837371 container init 8b3e7cd2a573119376e59e1274d49f28e2da731dd446d02544dfe881b8843c0e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-1, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:57:54 np0005588919 podman[81756]: 2026-01-20 13:57:54.468979361 +0000 UTC m=+0.173341589 container start 8b3e7cd2a573119376e59e1274d49f28e2da731dd446d02544dfe881b8843c0e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mon-compute-1, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Jan 20 08:57:54 np0005588919 bash[81756]: 8b3e7cd2a573119376e59e1274d49f28e2da731dd446d02544dfe881b8843c0e
Jan 20 08:57:54 np0005588919 systemd[1]: Started Ceph mon.compute-1 for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: pidfile_write: ignore empty --pid-file
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: load: jerasure load: lrc 
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: RocksDB version: 7.9.2
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Git sha 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: DB SUMMARY
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: DB Session ID:  LFF7G2OZDOU7TKQ8MKAH
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: CURRENT file:  CURRENT
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: IDENTITY file:  IDENTITY
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                         Options.error_if_exists: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                       Options.create_if_missing: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                         Options.paranoid_checks: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                                     Options.env: 0x564d4f6bec40
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                                Options.info_log: 0x564d515aefc0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                Options.max_file_opening_threads: 16
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                              Options.statistics: (nil)
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                               Options.use_fsync: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                       Options.max_log_file_size: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                         Options.allow_fallocate: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                        Options.use_direct_reads: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:          Options.create_missing_column_families: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                              Options.db_log_dir: 
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                                 Options.wal_dir: 
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                   Options.advise_random_on_open: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                    Options.write_buffer_manager: 0x564d515beb40
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                            Options.rate_limiter: (nil)
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                  Options.unordered_write: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                               Options.row_cache: None
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                              Options.wal_filter: None
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.allow_ingest_behind: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.two_write_queues: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.manual_wal_flush: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.wal_compression: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.atomic_flush: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                 Options.log_readahead_size: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.allow_data_in_errors: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.db_host_id: __hostname__
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.max_background_jobs: 2
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.max_background_compactions: -1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.max_subcompactions: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.max_total_wal_size: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                          Options.max_open_files: -1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                          Options.bytes_per_sync: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:       Options.compaction_readahead_size: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                  Options.max_background_flushes: -1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Compression algorithms supported:
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     kZSTD supported: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     kXpressCompression supported: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     kBZip2Compression supported: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     kZSTDNotFinalCompression supported: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     kLZ4Compression supported: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     kZlibCompression supported: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     kLZ4HCCompression supported: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     kSnappyCompression supported: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:           Options.merge_operator: 
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:        Options.compaction_filter: None
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:        Options.compaction_filter_factory: None
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:  Options.sst_partitioner_factory: None
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564d515aec00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x564d515a71f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:        Options.write_buffer_size: 33554432
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:  Options.max_write_buffer_number: 2
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:          Options.compression: NoCompression
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:       Options.prefix_extractor: nullptr
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.num_levels: 7
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                  Options.compression_opts.level: 32767
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:               Options.compression_opts.strategy: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                  Options.compression_opts.enabled: false
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                        Options.arena_block_size: 1048576
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                Options.disable_auto_compactions: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                   Options.inplace_update_support: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                           Options.bloom_locality: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                    Options.max_successive_merges: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                Options.paranoid_file_checks: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                Options.force_consistency_checks: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                Options.report_bg_io_stats: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                               Options.ttl: 2592000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                       Options.enable_blob_files: false
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                           Options.min_blob_size: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                          Options.blob_file_size: 268435456
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb:                Options.blob_file_starting_level: 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1539d774-8a6f-4e48-b253-137c44586344
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917474527291, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917474529485, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917474529620, "job": 1, "event": "recovery_finished"}
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x564d515d0e00
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: DB pointer 0x564d5165a000
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid e399cf45-e6b6-5393-99f1-75c601d3f188
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(???) e0 preinit fsid e399cf45-e6b6-5393-99f1-75c601d3f188
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).mds e1 new map
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 1 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e25 crush map has features 3314933000852226048, adjusting msgr requires
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e25 crush map has features 288514051259236352, adjusting msgr requires
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e25 crush map has features 288514051259236352, adjusting msgr requires
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).osd e25 crush map has features 288514051259236352, adjusting msgr requires
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/3530884063' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: Updating compute-2:/var/lib/ceph/e399cf45-e6b6-5393-99f1-75c601d3f188/config/ceph.client.admin.keyring
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/3880793223' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: Deploying daemon mon.compute-2 on compute-2
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/3880793223' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/3950308669' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/3950308669' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Jan 20 08:57:54 np0005588919 ceph-mon[81775]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 20 08:58:00 np0005588919 ceph-mon[81775]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Jan 20 08:58:00 np0005588919 ceph-mon[81775]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 20 08:58:00 np0005588919 ceph-mon[81775]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 20 08:58:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e26 e26: 2 total, 2 up, 2 in
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e27 e27: 2 total, 2 up, 2 in
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: Deploying daemon mon.compute-1 on compute-1
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-0 calling monitor election
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/3099254653' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-2 calling monitor election
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: Health detail: HEALTH_WARN 6 pool(s) do not have an application enabled
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: [WRN] POOL_APP_NOT_ENABLED: 6 pool(s) do not have an application enabled
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'vms'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'volumes'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'backups'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'images'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'cephfs.cephfs.meta'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'cephfs.cephfs.data'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gunjko", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/3099254653' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gunjko", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: Deploying daemon mgr.compute-2.gunjko on compute-2
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2026-01-20T13:57:53.383953Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864312,os=Linux}
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e28 e28: 2 total, 2 up, 2 in
Jan 20 08:58:03 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 28 pg[2.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=28 pruub=10.162316322s) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active pruub 51.637405396s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:03 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 28 pg[2.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=28 pruub=10.162316322s) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown pruub 51.637405396s@ mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/1076842494' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-0 calling monitor election
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-2 calling monitor election
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-1 calling monitor election
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: Health detail: HEALTH_WARN 5 pool(s) do not have an application enabled
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: [WRN] POOL_APP_NOT_ENABLED: 5 pool(s) do not have an application enabled
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'volumes'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'backups'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'images'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'cephfs.cephfs.meta'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    application not enabled on pool 'cephfs.cephfs.data'
Jan 20 08:58:03 np0005588919 ceph-mon[81775]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e28 _set_new_cache_sizes cache_size:1019933228 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:04 np0005588919 podman[81954]: 2026-01-20 13:58:04.569312076 +0000 UTC m=+0.069375481 container create 8443aa08b83ff9cad5d88a8d4b8aabe85b3dd1fa166d0cf5cc56c6c26726d155 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_williams, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 20 08:58:04 np0005588919 systemd[1]: Started libpod-conmon-8443aa08b83ff9cad5d88a8d4b8aabe85b3dd1fa166d0cf5cc56c6c26726d155.scope.
Jan 20 08:58:04 np0005588919 podman[81954]: 2026-01-20 13:58:04.537758341 +0000 UTC m=+0.037821816 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:04 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:58:04 np0005588919 podman[81954]: 2026-01-20 13:58:04.671722657 +0000 UTC m=+0.171786142 container init 8443aa08b83ff9cad5d88a8d4b8aabe85b3dd1fa166d0cf5cc56c6c26726d155 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_williams, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2)
Jan 20 08:58:04 np0005588919 podman[81954]: 2026-01-20 13:58:04.682098341 +0000 UTC m=+0.182161746 container start 8443aa08b83ff9cad5d88a8d4b8aabe85b3dd1fa166d0cf5cc56c6c26726d155 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_williams, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef)
Jan 20 08:58:04 np0005588919 podman[81954]: 2026-01-20 13:58:04.685645109 +0000 UTC m=+0.185708534 container attach 8443aa08b83ff9cad5d88a8d4b8aabe85b3dd1fa166d0cf5cc56c6c26726d155 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_williams, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 20 08:58:04 np0005588919 hardcore_williams[81970]: 167 167
Jan 20 08:58:04 np0005588919 systemd[1]: libpod-8443aa08b83ff9cad5d88a8d4b8aabe85b3dd1fa166d0cf5cc56c6c26726d155.scope: Deactivated successfully.
Jan 20 08:58:04 np0005588919 podman[81954]: 2026-01-20 13:58:04.692833966 +0000 UTC m=+0.192897391 container died 8443aa08b83ff9cad5d88a8d4b8aabe85b3dd1fa166d0cf5cc56c6c26726d155 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_williams, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:58:04 np0005588919 systemd[1]: var-lib-containers-storage-overlay-ee23155c83f2fddce7465f76b73454ad9f3989d7342da40e2b42fc3ae8ea0388-merged.mount: Deactivated successfully.
Jan 20 08:58:04 np0005588919 podman[81954]: 2026-01-20 13:58:04.737730856 +0000 UTC m=+0.237794291 container remove 8443aa08b83ff9cad5d88a8d4b8aabe85b3dd1fa166d0cf5cc56c6c26726d155 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hardcore_williams, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:04 np0005588919 systemd[1]: libpod-conmon-8443aa08b83ff9cad5d88a8d4b8aabe85b3dd1fa166d0cf5cc56c6c26726d155.scope: Deactivated successfully.
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e29 e29: 2 total, 2 up, 2 in
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1f( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1b( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1e( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.a( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.9( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.8( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.7( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1d( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.6( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.4( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.2( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.5( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.3( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.b( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.c( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.d( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.e( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.f( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.10( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.12( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.11( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.13( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1c( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.16( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.14( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.15( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.17( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.18( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.19( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1a( empty local-lis/les=15/16 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1b( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1e( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1f( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.a( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.9( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.8( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.7( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.4( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.6( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1d( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.5( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.3( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.2( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.b( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.d( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.c( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.e( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.0( empty local-lis/les=28/29 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.f( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.10( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.13( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.12( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.16( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.15( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.14( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.17( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1c( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.19( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.1a( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.11( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 29 pg[2.18( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=15/15 les/c/f=16/16/0 sis=28) [1] r=0 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/1076842494' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.oweoeg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.oweoeg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: Deploying daemon mgr.compute-1.oweoeg on compute-1
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:58:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:58:04 np0005588919 systemd[1]: Reloading.
Jan 20 08:58:04 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:04 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:05 np0005588919 systemd[1]: Reloading.
Jan 20 08:58:05 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:05 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:05 np0005588919 systemd[1]: Starting Ceph mgr.compute-1.oweoeg for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:58:05 np0005588919 podman[82116]: 2026-01-20 13:58:05.685942607 +0000 UTC m=+0.063353730 container create 04b4edd0953680a3226067bf6924e194df45e64acbb7953c52825a4a8b3eb2ba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Jan 20 08:58:05 np0005588919 podman[82116]: 2026-01-20 13:58:05.654459613 +0000 UTC m=+0.031870826 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0da49ae4e8627d0dacb828dd5a41fd0ab0e7ac1613d0580fb5e813bc96d1ddc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0da49ae4e8627d0dacb828dd5a41fd0ab0e7ac1613d0580fb5e813bc96d1ddc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0da49ae4e8627d0dacb828dd5a41fd0ab0e7ac1613d0580fb5e813bc96d1ddc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0da49ae4e8627d0dacb828dd5a41fd0ab0e7ac1613d0580fb5e813bc96d1ddc/merged/var/lib/ceph/mgr/ceph-compute-1.oweoeg supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e30 e30: 2 total, 2 up, 2 in
Jan 20 08:58:05 np0005588919 podman[82116]: 2026-01-20 13:58:05.775065195 +0000 UTC m=+0.152476408 container init 04b4edd0953680a3226067bf6924e194df45e64acbb7953c52825a4a8b3eb2ba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:05 np0005588919 podman[82116]: 2026-01-20 13:58:05.783166951 +0000 UTC m=+0.160578104 container start 04b4edd0953680a3226067bf6924e194df45e64acbb7953c52825a4a8b3eb2ba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:05 np0005588919 bash[82116]: 04b4edd0953680a3226067bf6924e194df45e64acbb7953c52825a4a8b3eb2ba
Jan 20 08:58:05 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/1913464166' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 20 08:58:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:58:05 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/1913464166' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 20 08:58:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:58:05 np0005588919 systemd[1]: Started Ceph mgr.compute-1.oweoeg for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:58:05 np0005588919 ceph-mgr[82135]: set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:58:05 np0005588919 ceph-mgr[82135]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Jan 20 08:58:05 np0005588919 ceph-mgr[82135]: pidfile_write: ignore empty --pid-file
Jan 20 08:58:05 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'alerts'
Jan 20 08:58:06 np0005588919 ceph-mgr[82135]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 20 08:58:06 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'balancer'
Jan 20 08:58:06 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:06.326+0000 7fae9b308140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 20 08:58:06 np0005588919 ceph-mgr[82135]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 20 08:58:06 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'cephadm'
Jan 20 08:58:06 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:06.562+0000 7fae9b308140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 20 08:58:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e31 e31: 2 total, 2 up, 2 in
Jan 20 08:58:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 20 08:58:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 20 08:58:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:58:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e32 e32: 2 total, 2 up, 2 in
Jan 20 08:58:08 np0005588919 ceph-mon[81775]: Deploying daemon crash.compute-2 on compute-2
Jan 20 08:58:08 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/4079761379' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 20 08:58:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:08 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/4079761379' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 20 08:58:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:58:08 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'crash'
Jan 20 08:58:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e33 e33: 2 total, 2 up, 2 in
Jan 20 08:58:09 np0005588919 ceph-mgr[82135]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 20 08:58:09 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'dashboard'
Jan 20 08:58:09 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:09.033+0000 7fae9b308140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 20 08:58:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 08:58:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 08:58:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020053189 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 32 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=32 pruub=13.407303810s) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active pruub 60.937648773s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=32 pruub=13.407303810s) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown pruub 60.937648773s@ mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.c( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.d( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.e( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.f( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.12( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.13( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.10( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.11( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.16( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.17( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.14( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.15( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.1a( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.18( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.1b( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.19( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.1c( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.1d( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.1e( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.1f( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.2( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.3( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.6( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.7( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.4( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.5( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.8( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.a( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.9( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.b( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 33 pg[7.1( empty local-lis/les=24/25 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:10 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/3413961177' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 20 08:58:10 np0005588919 ceph-mon[81775]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 20 08:58:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e34 e34: 2 total, 2 up, 2 in
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.1c( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.1f( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.1d( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.12( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.11( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.10( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.16( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.17( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.14( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.b( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.8( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.15( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.9( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.e( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.6( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.a( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.5( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.4( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.7( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.d( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.1( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.0( empty local-lis/les=32/34 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.2( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.f( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.13( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.1e( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.c( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.19( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.1b( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.1a( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.3( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 34 pg[7.18( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=24/24 les/c/f=25/25/0 sis=32) [1] r=0 lpr=32 pi=[24,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e35 e35: 3 total, 2 up, 3 in
Jan 20 08:58:10 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'devicehealth'
Jan 20 08:58:10 np0005588919 ceph-mgr[82135]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 20 08:58:10 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'diskprediction_local'
Jan 20 08:58:10 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:10.699+0000 7fae9b308140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 20 08:58:11 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 20 08:58:11 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 20 08:58:11 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]:  from numpy import show_config as show_numpy_config
Jan 20 08:58:11 np0005588919 ceph-mgr[82135]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 20 08:58:11 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:11.241+0000 7fae9b308140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 20 08:58:11 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'influx'
Jan 20 08:58:11 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/3413961177' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 20 08:58:11 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.102:0/3257799028' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "36aff1f5-bc44-4633-b417-95c5b1ee6391"}]: dispatch
Jan 20 08:58:11 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "36aff1f5-bc44-4633-b417-95c5b1ee6391"}]: dispatch
Jan 20 08:58:11 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "36aff1f5-bc44-4633-b417-95c5b1ee6391"}]': finished
Jan 20 08:58:11 np0005588919 ceph-mgr[82135]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 20 08:58:11 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'insights'
Jan 20 08:58:11 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:11.490+0000 7fae9b308140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 20 08:58:11 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'iostat'
Jan 20 08:58:11 np0005588919 ceph-mgr[82135]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 20 08:58:11 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:11.972+0000 7fae9b308140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 20 08:58:11 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'k8sevents'
Jan 20 08:58:12 np0005588919 ceph-mon[81775]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 20 08:58:12 np0005588919 ceph-mon[81775]: Cluster is now healthy
Jan 20 08:58:13 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.1 deep-scrub starts
Jan 20 08:58:13 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.1 deep-scrub ok
Jan 20 08:58:13 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'localpool'
Jan 20 08:58:13 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'mds_autoscaler'
Jan 20 08:58:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e35 _set_new_cache_sizes cache_size:1020054712 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:14 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'mirroring'
Jan 20 08:58:14 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'nfs'
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/1467956015' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/1467956015' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Jan 20 08:58:15 np0005588919 ceph-mgr[82135]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 20 08:58:15 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'orchestrator'
Jan 20 08:58:15 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:15.635+0000 7fae9b308140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 20 08:58:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e36 e36: 3 total, 2 up, 3 in
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.1d( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.625294685s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.022346497s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.1d( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.625168800s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.022346497s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.13( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.626079559s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023300171s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.10( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.625190735s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.022499084s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.13( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.625969887s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023300171s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.10( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.625130653s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.022499084s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.14( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.624791145s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.022750854s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.14( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.624705315s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.022750854s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.13( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.096986771s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.494857788s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.10( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.096913338s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.495040894s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.10( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.096845627s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.495040894s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.13( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.096667290s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.494857788s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.a( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.624776840s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023109436s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.a( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.624733925s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023109436s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.e( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.095741272s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.494155884s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.e( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.095705986s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.494155884s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.b( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.624183655s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.022796631s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.b( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.624153137s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.022796631s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.c( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.095242500s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.494079590s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.c( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.095215797s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.494079590s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.8( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.624043465s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023010254s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.d( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.095067978s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.494064331s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.8( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623994827s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023010254s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.9( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623971939s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023063660s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.d( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.094990730s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.494064331s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.9( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623941422s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023063660s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.e( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623802185s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023086548s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.e( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623771667s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023086548s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.6( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623663902s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023101807s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.6( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623616219s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023101807s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.1( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.093919754s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.493621826s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.4( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623414993s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023139954s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.1( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.093876839s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.493621826s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.4( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623368263s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023139954s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.4( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.093567848s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.493431091s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.4( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.093485832s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.493431091s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.6( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.093406677s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.493484497s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.3( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623608589s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023704529s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.3( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.623566628s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023704529s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.6( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.093358994s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.493484497s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.2( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.622950554s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023246765s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.2( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.622920990s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023246765s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.9( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.092387199s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.492820740s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.9( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.092342377s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.492820740s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.a( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.086156845s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.486694336s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.f( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.622728348s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023292542s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.a( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.086091995s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.486694336s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.f( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.622667313s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023292542s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.1e( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.622604370s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023323059s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.1e( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.622563362s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023323059s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.18( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.622730255s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023712158s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.18( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.622694969s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023712158s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.1b( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.622620583s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active pruub 64.023681641s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[7.1b( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=36 pruub=10.622578621s) [0] r=-1 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.023681641s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.1e( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.085520744s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.486717224s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.1e( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.085493088s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.486717224s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.1f( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.085291862s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.486648560s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.1f( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.085250854s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.486648560s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.1b( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.084542274s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.486602783s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.1b( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.084483147s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.486602783s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.15( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.091486931s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.495178223s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.15( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.091196060s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.495178223s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.19( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.091099739s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active pruub 66.495681763s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[2.19( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=36 pruub=13.090907097s) [0] r=-1 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.495681763s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.1a( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.1a( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.18( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.1c( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.1b( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.1a( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.1b( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.19( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.18( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.1a( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.1e( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.1c( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.e( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.9( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.f( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.d( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.2( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.3( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.5( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.4( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.7( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.7( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.e( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.5( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.3( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.1( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.2( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.1( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.d( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.1d( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.5( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.c( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.e( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.a( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.d( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.8( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.a( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.9( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.f( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.8( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.c( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.e( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.a( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.10( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.9( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.15( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.11( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.15( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.13( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.14( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.15( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.13( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.17( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.16( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.16( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.10( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[3.15( empty local-lis/les=0/0 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[4.1f( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.11( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.1c( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[6.12( empty local-lis/les=0/0 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 36 pg[5.1f( empty local-lis/les=0/0 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:16 np0005588919 ceph-mgr[82135]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 20 08:58:16 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'osd_perf_query'
Jan 20 08:58:16 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:16.454+0000 7fae9b308140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 20 08:58:16 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/3060958510' entity='client.admin' 
Jan 20 08:58:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:58:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e37 e37: 3 total, 2 up, 3 in
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.1e( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.1f( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.1c( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.10( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.16( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.11( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.14( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.13( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.15( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.12( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.15( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.15( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.13( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.16( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.10( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.11( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.1f( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.e( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.9( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.9( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.f( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.a( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.8( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.c( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.a( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.8( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.d( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.a( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.7( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.5( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.15( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.4( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.d( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.7( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.2( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.5( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.3( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.2( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.1( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.5( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.1( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.3( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.f( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.e( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.9( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.e( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.d( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.e( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.c( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.1c( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.1d( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.1b( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.1a( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.1b( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.19( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.1a( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.1c( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[4.18( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.1a( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[3.1a( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=28/28 les/c/f=29/29/0 sis=36) [1] r=0 lpr=36 pi=[28,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[5.18( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=30/30 les/c/f=31/31/0 sis=36) [1] r=0 lpr=36 pi=[30,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 37 pg[6.17( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=32/32 les/c/f=33/33/0 sis=36) [1] r=0 lpr=36 pi=[32,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:16 np0005588919 ceph-mgr[82135]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 20 08:58:16 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'osd_support'
Jan 20 08:58:16 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:16.747+0000 7fae9b308140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588919 ceph-mgr[82135]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'pg_autoscaler'
Jan 20 08:58:17 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:17.023+0000 7fae9b308140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588919 ceph-mgr[82135]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'progress'
Jan 20 08:58:17 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:17.319+0000 7fae9b308140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588919 ceph-mgr[82135]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'prometheus'
Jan 20 08:58:17 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:17.572+0000 7fae9b308140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 20 08:58:17 np0005588919 ceph-mon[81775]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 20 08:58:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:17 np0005588919 ceph-mon[81775]: Saving service ingress.rgw.default spec with placement count:2
Jan 20 08:58:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 20 08:58:18 np0005588919 ceph-mgr[82135]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 20 08:58:18 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'rbd_support'
Jan 20 08:58:18 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:18.661+0000 7fae9b308140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 20 08:58:18 np0005588919 ceph-mon[81775]: Deploying daemon osd.2 on compute-2
Jan 20 08:58:18 np0005588919 ceph-mgr[82135]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 20 08:58:18 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'restful'
Jan 20 08:58:18 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:18.980+0000 7fae9b308140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 20 08:58:19 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.3 deep-scrub starts
Jan 20 08:58:19 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.3 deep-scrub ok
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e2 new map
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e2 print_map#012e2#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:19.644841+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e38 e38: 3 total, 2 up, 3 in
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 20 08:58:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:19 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'rgw'
Jan 20 08:58:20 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Jan 20 08:58:20 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Jan 20 08:58:20 np0005588919 ceph-mgr[82135]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 20 08:58:20 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'rook'
Jan 20 08:58:20 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:20.479+0000 7fae9b308140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 20 08:58:20 np0005588919 ceph-mon[81775]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 20 08:58:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:21 np0005588919 ceph-mon[81775]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 20 08:58:22 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Jan 20 08:58:22 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Jan 20 08:58:22 np0005588919 ceph-mgr[82135]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 20 08:58:22 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'selftest'
Jan 20 08:58:22 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:22.633+0000 7fae9b308140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 20 08:58:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:22 np0005588919 ceph-mgr[82135]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 20 08:58:22 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:22.897+0000 7fae9b308140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 20 08:58:22 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'snap_schedule'
Jan 20 08:58:23 np0005588919 ceph-mgr[82135]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 20 08:58:23 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:23.189+0000 7fae9b308140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 20 08:58:23 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'stats'
Jan 20 08:58:23 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'status'
Jan 20 08:58:23 np0005588919 ceph-mgr[82135]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 20 08:58:23 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:23.728+0000 7fae9b308140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 20 08:58:23 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'telegraf'
Jan 20 08:58:23 np0005588919 ceph-mon[81775]: from='osd.2 [v2:192.168.122.102:6800/3188109873,v1:192.168.122.102:6801/3188109873]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 20 08:58:23 np0005588919 ceph-mon[81775]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 20 08:58:23 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/2618177133' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 20 08:58:23 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/2618177133' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 20 08:58:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e39 e39: 3 total, 2 up, 3 in
Jan 20 08:58:23 np0005588919 ceph-mgr[82135]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 20 08:58:23 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'telemetry'
Jan 20 08:58:23 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:23.976+0000 7fae9b308140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 20 08:58:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:24 np0005588919 ceph-mgr[82135]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 20 08:58:24 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'test_orchestrator'
Jan 20 08:58:24 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:24.599+0000 7fae9b308140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 20 08:58:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e40 e40: 3 total, 2 up, 3 in
Jan 20 08:58:24 np0005588919 ceph-mon[81775]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 20 08:58:24 np0005588919 ceph-mon[81775]: from='osd.2 [v2:192.168.122.102:6800/3188109873,v1:192.168.122.102:6801/3188109873]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 20 08:58:24 np0005588919 ceph-mon[81775]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 20 08:58:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:25 np0005588919 ceph-mgr[82135]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 20 08:58:25 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'volumes'
Jan 20 08:58:25 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:25.328+0000 7fae9b308140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 20 08:58:25 np0005588919 podman[82395]: 2026-01-20 13:58:25.4329378 +0000 UTC m=+0.110084494 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 20 08:58:25 np0005588919 podman[82395]: 2026-01-20 13:58:25.536360521 +0000 UTC m=+0.213507195 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Jan 20 08:58:25 np0005588919 ceph-mon[81775]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[6.1e( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.579854012s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.409347534s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[7.1f( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=40 pruub=8.188198090s) [] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active pruub 72.017723083s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[6.1c( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583823204s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.413360596s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[6.1e( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.579854012s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.409347534s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[6.1c( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583823204s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413360596s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[7.1f( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=40 pruub=8.188198090s) [] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.017723083s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[6.12( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583716393s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.413505554s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[6.12( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583716393s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413505554s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.15( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583548546s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.413490295s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.15( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583548546s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413490295s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[7.11( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=40 pruub=8.192520142s) [] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active pruub 72.022560120s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[6.17( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.586153030s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.416221619s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[7.11( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=40 pruub=8.192520142s) [] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.022560120s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[6.17( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.586153030s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.416221619s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[4.15( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583463669s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.413619995s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[7.16( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=40 pruub=8.192604065s) [] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active pruub 72.022766113s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[7.16( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=40 pruub=8.192604065s) [] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.022766113s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[4.15( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583463669s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413619995s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.12( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.664838791s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active pruub 74.495071411s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.12( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.664838791s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.495071411s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.e( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583417892s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.413772583s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.e( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583417892s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413772583s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[4.9( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583403587s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.413810730s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[4.9( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583403587s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413810730s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.18( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.665678978s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active pruub 74.496124268s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.18( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.665678978s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.496124268s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.f( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.664505005s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active pruub 74.494987488s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.f( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.664505005s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.494987488s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[4.8( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583350182s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.413917542s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[4.8( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583350182s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413917542s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.11( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583116531s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.413742065s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.11( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583116531s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413742065s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.b( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.663282394s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active pruub 74.494148254s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.b( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.663282394s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.494148254s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[7.5( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=40 pruub=8.192474365s) [] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active pruub 72.023422241s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[5.4( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583220482s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.414077759s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[7.5( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=40 pruub=8.192474365s) [] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.023422241s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[5.4( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583220482s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414077759s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.5( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.662651062s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active pruub 74.493736267s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.5( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.662651062s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.493736267s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.9( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583081245s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.414337158s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[4.1( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583039284s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.414276123s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.9( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583081245s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414337158s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[4.1( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583039284s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414276123s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[5.e( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583056450s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.414367676s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[5.e( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583056450s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414367676s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[4.1f( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.581921577s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.413330078s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.1c( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.663998604s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active pruub 74.495536804s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.1a( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583057404s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.414588928s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.1a( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.583057404s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414588928s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.1c( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.663998604s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.495536804s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[5.1a( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.582934380s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.414543152s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[5.1a( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.582934380s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414543152s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.1d( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.662040710s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active pruub 74.493736267s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.1d( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.582720757s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 78.414451599s@ mbc={}] start_peering_interval up [1] -> [], acting [1] -> [], acting_primary 1 -> -1, up_primary 1 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[2.1d( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=40 pruub=10.662040710s) [] r=-1 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.493736267s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[4.1f( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.581921577s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413330078s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 40 pg[3.1d( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=14.582720757s) [] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414451599s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:26 np0005588919 ceph-mgr[82135]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 20 08:58:26 np0005588919 ceph-mgr[82135]: mgr[py] Loading python module 'zabbix'
Jan 20 08:58:26 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:26.129+0000 7fae9b308140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 20 08:58:26 np0005588919 ceph-mgr[82135]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 20 08:58:26 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mgr-compute-1-oweoeg[82131]: 2026-01-20T13:58:26.374+0000 7fae9b308140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 20 08:58:26 np0005588919 ceph-mgr[82135]: ms_deliver_dispatch: unhandled message 0x559e86979600 mon_map magic: 0 v1 from mon.2 v2:192.168.122.101:3300/0
Jan 20 08:58:26 np0005588919 ceph-mgr[82135]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 08:58:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:28 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/260003973' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 20 08:58:29 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Jan 20 08:58:29 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: Adjusting osd_memory_target on compute-2 to 127.9M
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: Unable to set osd_memory_target on compute-2 to 134209126: error parsing value: Value '134209126' is below minimum 939524096
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: Updating compute-0:/etc/ceph/ceph.conf
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: Updating compute-1:/etc/ceph/ceph.conf
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: Updating compute-2:/etc/ceph/ceph.conf
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: OSD bench result of 6249.450009 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 20 08:58:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Jan 20 08:58:30 np0005588919 ceph-mon[81775]: osd.2 [v2:192.168.122.102:6800/3188109873,v1:192.168.122.102:6801/3188109873] boot
Jan 20 08:58:30 np0005588919 ceph-mon[81775]: Updating compute-2:/var/lib/ceph/e399cf45-e6b6-5393-99f1-75c601d3f188/config/ceph.conf
Jan 20 08:58:30 np0005588919 ceph-mon[81775]: Updating compute-0:/var/lib/ceph/e399cf45-e6b6-5393-99f1-75c601d3f188/config/ceph.conf
Jan 20 08:58:30 np0005588919 ceph-mon[81775]: Updating compute-1:/var/lib/ceph/e399cf45-e6b6-5393-99f1-75c601d3f188/config/ceph.conf
Jan 20 08:58:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[6.1e( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.980747223s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.409347534s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[6.1e( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.980671883s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.409347534s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[4.1f( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.984628677s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413330078s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[4.1f( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.984438896s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413330078s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[6.1c( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.984391212s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413360596s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[6.1c( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.984327316s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413360596s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[2.18( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.066934586s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.496124268s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[2.18( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.066905975s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.496124268s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[6.12( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.984229088s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413505554s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[6.12( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.984197617s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413505554s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[6.17( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.986830711s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.416221619s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[7.11( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41 pruub=3.593162298s) [2] r=-1 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.022560120s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[3.15( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.984086037s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413490295s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[6.17( empty local-lis/les=36/37 n=0 ec=32/22 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.986802101s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.416221619s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[3.15( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.984040260s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413490295s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[7.11( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41 pruub=3.593111753s) [2] r=-1 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.022560120s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[4.15( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.984107971s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413619995s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[4.15( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.984050751s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413619995s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[7.16( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41 pruub=3.593186617s) [2] r=-1 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.022766113s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[7.16( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41 pruub=3.593151093s) [2] r=-1 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.022766113s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41 pruub=3.588076591s) [2] r=-1 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.017723083s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[2.12( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.065394878s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.495071411s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[7.1f( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41 pruub=3.588032246s) [2] r=-1 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.017723083s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[2.12( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.065350533s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.495071411s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[3.11( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983944893s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413742065s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[3.e( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983512878s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413772583s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[4.9( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983522415s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413810730s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[3.e( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983480453s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413772583s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[3.11( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983432770s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413742065s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983484268s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413810730s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[4.8( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983489990s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413917542s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983444214s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.413917542s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[5.4( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983516693s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414077759s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[2.f( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.064367294s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.494987488s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[2.b( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.063500404s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.494148254s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[5.4( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983321190s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414077759s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41 pruub=3.592596769s) [2] r=-1 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.023422241s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[7.5( empty local-lis/les=32/34 n=0 ec=32/24 lis/c=32/32 les/c/f=34/34/0 sis=41 pruub=3.592567205s) [2] r=-1 lpr=41 pi=[32,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 72.023422241s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[2.b( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.063432217s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.494148254s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.062843323s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.493736267s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.062771320s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.493736267s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[2.f( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.064172745s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.494987488s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[4.1( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.983118057s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414276123s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[4.1( empty local-lis/les=36/37 n=0 ec=30/18 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982706070s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414276123s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[3.1a( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982971191s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414588928s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982712746s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414337158s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[3.1a( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982943535s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414588928s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982770920s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414451599s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[3.9( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982671738s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414337158s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[5.e( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982653618s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414367676s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[3.1d( empty local-lis/les=36/37 n=0 ec=28/17 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982739449s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414451599s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[5.e( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982616425s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414367676s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[2.1c( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.063718319s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.495536804s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[2.1c( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.063689232s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.495536804s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982590675s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414543152s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 41 pg[2.1d( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.061774254s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.493736267s@ mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=36/37 n=0 ec=30/20 lis/c=36/36 les/c/f=37/37/0 sis=41 pruub=9.982562065s) [2] r=-1 lpr=41 pi=[36,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 78.414543152s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 42 pg[2.1d( empty local-lis/les=28/29 n=0 ec=28/15 lis/c=28/28 les/c/f=29/29/0 sis=41 pruub=6.061741352s) [2] r=-1 lpr=41 pi=[28,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 74.493736267s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:58:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 08:58:32 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Jan 20 08:58:32 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Jan 20 08:58:33 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Jan 20 08:58:33 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Jan 20 08:58:34 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Jan 20 08:58:34 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Jan 20 08:58:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:37 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Jan 20 08:58:37 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Jan 20 08:58:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.ktpnzt", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 20 08:58:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.ktpnzt", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 20 08:58:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:38 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Jan 20 08:58:38 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Jan 20 08:58:38 np0005588919 ceph-mon[81775]: Deploying daemon rgw.rgw.compute-2.ktpnzt on compute-2
Jan 20 08:58:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:39 np0005588919 podman[83604]: 2026-01-20 13:58:39.97117181 +0000 UTC m=+0.067810801 container create 88a0bb343253b352590ef93cabf3d6a81fa582a2485db1d96ab8e47f9b7f0563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mclaren, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:58:39 np0005588919 systemd[72729]: Starting Mark boot as successful...
Jan 20 08:58:39 np0005588919 systemd[72729]: Finished Mark boot as successful.
Jan 20 08:58:40 np0005588919 podman[83604]: 2026-01-20 13:58:39.936239968 +0000 UTC m=+0.032879019 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:40 np0005588919 systemd[1]: Started libpod-conmon-88a0bb343253b352590ef93cabf3d6a81fa582a2485db1d96ab8e47f9b7f0563.scope.
Jan 20 08:58:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.orkqpg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 20 08:58:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.orkqpg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 20 08:58:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:40 np0005588919 ceph-mon[81775]: Deploying daemon rgw.rgw.compute-1.orkqpg on compute-1
Jan 20 08:58:40 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:58:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Jan 20 08:58:40 np0005588919 podman[83604]: 2026-01-20 13:58:40.113777096 +0000 UTC m=+0.210416107 container init 88a0bb343253b352590ef93cabf3d6a81fa582a2485db1d96ab8e47f9b7f0563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mclaren, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Jan 20 08:58:40 np0005588919 podman[83604]: 2026-01-20 13:58:40.126799943 +0000 UTC m=+0.223438914 container start 88a0bb343253b352590ef93cabf3d6a81fa582a2485db1d96ab8e47f9b7f0563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mclaren, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Jan 20 08:58:40 np0005588919 podman[83604]: 2026-01-20 13:58:40.131518421 +0000 UTC m=+0.228157392 container attach 88a0bb343253b352590ef93cabf3d6a81fa582a2485db1d96ab8e47f9b7f0563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mclaren, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Jan 20 08:58:40 np0005588919 gifted_mclaren[83621]: 167 167
Jan 20 08:58:40 np0005588919 systemd[1]: libpod-88a0bb343253b352590ef93cabf3d6a81fa582a2485db1d96ab8e47f9b7f0563.scope: Deactivated successfully.
Jan 20 08:58:40 np0005588919 podman[83604]: 2026-01-20 13:58:40.135085742 +0000 UTC m=+0.231724713 container died 88a0bb343253b352590ef93cabf3d6a81fa582a2485db1d96ab8e47f9b7f0563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mclaren, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 08:58:40 np0005588919 systemd[1]: var-lib-containers-storage-overlay-78e46cce27f96a9c69d7e33c1c56d4dc8b71cd0755f4a8a844e5a3d399c27cff-merged.mount: Deactivated successfully.
Jan 20 08:58:40 np0005588919 podman[83604]: 2026-01-20 13:58:40.176730594 +0000 UTC m=+0.273369595 container remove 88a0bb343253b352590ef93cabf3d6a81fa582a2485db1d96ab8e47f9b7f0563 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gifted_mclaren, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Jan 20 08:58:40 np0005588919 systemd[1]: libpod-conmon-88a0bb343253b352590ef93cabf3d6a81fa582a2485db1d96ab8e47f9b7f0563.scope: Deactivated successfully.
Jan 20 08:58:40 np0005588919 systemd[1]: Reloading.
Jan 20 08:58:40 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:40 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:40 np0005588919 systemd[1]: Reloading.
Jan 20 08:58:40 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:40 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:40 np0005588919 systemd[1]: Starting Ceph rgw.rgw.compute-1.orkqpg for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:58:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Jan 20 08:58:41 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.102:0/1247667946' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 20 08:58:41 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 20 08:58:41 np0005588919 podman[83766]: 2026-01-20 13:58:41.208203968 +0000 UTC m=+0.055077522 container create b3823476695d3583b002dcb0048f7c2e7afdb2953c94118039f1275878a4001e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-rgw-rgw-compute-1-orkqpg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:58:41 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d79394fd7e44d8a4966f50cadc4a3278e0964309fc67e4f603ebf7a9e34025af/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:41 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d79394fd7e44d8a4966f50cadc4a3278e0964309fc67e4f603ebf7a9e34025af/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:41 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d79394fd7e44d8a4966f50cadc4a3278e0964309fc67e4f603ebf7a9e34025af/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:41 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d79394fd7e44d8a4966f50cadc4a3278e0964309fc67e4f603ebf7a9e34025af/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.orkqpg supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:41 np0005588919 podman[83766]: 2026-01-20 13:58:41.182307219 +0000 UTC m=+0.029180843 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:41 np0005588919 podman[83766]: 2026-01-20 13:58:41.29850598 +0000 UTC m=+0.145379604 container init b3823476695d3583b002dcb0048f7c2e7afdb2953c94118039f1275878a4001e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-rgw-rgw-compute-1-orkqpg, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:58:41 np0005588919 podman[83766]: 2026-01-20 13:58:41.305415036 +0000 UTC m=+0.152288620 container start b3823476695d3583b002dcb0048f7c2e7afdb2953c94118039f1275878a4001e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-rgw-rgw-compute-1-orkqpg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:58:41 np0005588919 bash[83766]: b3823476695d3583b002dcb0048f7c2e7afdb2953c94118039f1275878a4001e
Jan 20 08:58:41 np0005588919 systemd[1]: Started Ceph rgw.rgw.compute-1.orkqpg for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:58:41 np0005588919 radosgw[83787]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:58:41 np0005588919 radosgw[83787]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Jan 20 08:58:41 np0005588919 radosgw[83787]: framework: beast
Jan 20 08:58:41 np0005588919 radosgw[83787]: framework conf key: endpoint, val: 192.168.122.101:8082
Jan 20 08:58:41 np0005588919 radosgw[83787]: init_numa not setting numa affinity
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0) v1
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/2347323994' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.kiggjh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.kiggjh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:42 np0005588919 ceph-mon[81775]: Deploying daemon rgw.rgw.compute-0.kiggjh on compute-0
Jan 20 08:58:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 20 08:58:43 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.102:0/1247667946' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 20 08:58:43 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 20 08:58:43 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.101:0/2347323994' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 20 08:58:43 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/2347323994' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.jyxktq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.jyxktq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: Deploying daemon mds.cephfs.compute-2.jyxktq on compute-2
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:44 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 47 pg[10.0( empty local-lis/les=0/0 n=0 ec=47/47 lis/c=0/0 les/c/f=0/0/0 sis=47) [1] r=0 lpr=47 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:58:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 20 08:58:45 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 48 pg[10.0( empty local-lis/les=47/48 n=0 ec=47/47 lis/c=0/0 les/c/f=0/0/0 sis=47) [1] r=0 lpr=47 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:58:45 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.101:0/2347323994' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:45 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/418792044' entity='client.rgw.rgw.compute-0.kiggjh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:45 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:45 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.102:0/1247667946' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:45 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 20 08:58:45 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/418792044' entity='client.rgw.rgw.compute-0.kiggjh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 20 08:58:45 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 20 08:58:45 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1523026806' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e3 new map
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:19.644841+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.jyxktq{-1:24178} state up:standby seq 1 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e4 new map
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:46.558090+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:creating seq 1 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.znrafi", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.znrafi", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: Deploying daemon mds.cephfs.compute-0.znrafi on compute-0
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/4243338850' entity='client.rgw.rgw.compute-0.kiggjh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.101:0/1523026806' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.102:0/159360274' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:46 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1523026806' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: daemon mds.cephfs.compute-2.jyxktq assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: Cluster is now healthy
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: daemon mds.cephfs.compute-2.jyxktq is now active in filesystem cephfs as rank 0
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/4243338850' entity='client.rgw.rgw.compute-0.kiggjh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/4243338850' entity='client.rgw.rgw.compute-0.kiggjh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.101:0/1523026806' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.102:0/159360274' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.rtofcx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.rtofcx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e5 new map
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:47.570199+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:active seq 2 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.znrafi{-1:14376} state up:standby seq 1 addr [v2:192.168.122.100:6806/2144836821,v1:192.168.122.100:6807/2144836821] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e6 new map
Jan 20 08:58:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e6 print_map#012e6#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:47.570199+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:active seq 2 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.znrafi{-1:14376} state up:standby seq 1 addr [v2:192.168.122.100:6806/2144836821,v1:192.168.122.100:6807/2144836821] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 20 08:58:48 np0005588919 radosgw[83787]: LDAP not started since no server URIs were provided in the configuration.
Jan 20 08:58:48 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-rgw-rgw-compute-1-orkqpg[83783]: 2026-01-20T13:58:48.418+0000 7f0af41b1940 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 20 08:58:48 np0005588919 radosgw[83787]: framework: beast
Jan 20 08:58:48 np0005588919 radosgw[83787]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 20 08:58:48 np0005588919 radosgw[83787]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 20 08:58:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588919 radosgw[83787]: starting handler: beast
Jan 20 08:58:48 np0005588919 podman[83997]: 2026-01-20 13:58:48.455957745 +0000 UTC m=+0.080968901 container create 695f2e5816d99eb87c13bf413a88ea9a65701e362cdc9e838c1e779764f1af4d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_keldysh, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 20 08:58:48 np0005588919 radosgw[83787]: set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:58:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588919 radosgw[83787]: mgrc service_daemon_register rgw.24128 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.orkqpg,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864312,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=8115d0e5-f46a-4d23-887b-99af6a666d4f,zone_name=default,zonegroup_id=1c9817d6-3061-4a20-aeb7-2a830f7cf40e,zonegroup_name=default}
Jan 20 08:58:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588919 systemd[1]: Started libpod-conmon-695f2e5816d99eb87c13bf413a88ea9a65701e362cdc9e838c1e779764f1af4d.scope.
Jan 20 08:58:48 np0005588919 podman[83997]: 2026-01-20 13:58:48.419353271 +0000 UTC m=+0.044364477 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 20 08:58:48 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:58:48 np0005588919 podman[83997]: 2026-01-20 13:58:48.574391726 +0000 UTC m=+0.199402902 container init 695f2e5816d99eb87c13bf413a88ea9a65701e362cdc9e838c1e779764f1af4d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_keldysh, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True)
Jan 20 08:58:48 np0005588919 podman[83997]: 2026-01-20 13:58:48.5895366 +0000 UTC m=+0.214547766 container start 695f2e5816d99eb87c13bf413a88ea9a65701e362cdc9e838c1e779764f1af4d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_keldysh, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 20 08:58:48 np0005588919 podman[83997]: 2026-01-20 13:58:48.594051611 +0000 UTC m=+0.219062777 container attach 695f2e5816d99eb87c13bf413a88ea9a65701e362cdc9e838c1e779764f1af4d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_keldysh, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:58:48 np0005588919 zen_keldysh[84557]: 167 167
Jan 20 08:58:48 np0005588919 systemd[1]: libpod-695f2e5816d99eb87c13bf413a88ea9a65701e362cdc9e838c1e779764f1af4d.scope: Deactivated successfully.
Jan 20 08:58:48 np0005588919 podman[83997]: 2026-01-20 13:58:48.600544984 +0000 UTC m=+0.225556150 container died 695f2e5816d99eb87c13bf413a88ea9a65701e362cdc9e838c1e779764f1af4d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_keldysh, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:58:48 np0005588919 ceph-mon[81775]: Deploying daemon mds.cephfs.compute-1.rtofcx on compute-1
Jan 20 08:58:48 np0005588919 ceph-mon[81775]: from='client.? 192.168.122.100:0/4243338850' entity='client.rgw.rgw.compute-0.kiggjh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 20 08:58:48 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-1.orkqpg' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 20 08:58:48 np0005588919 ceph-mon[81775]: from='client.? ' entity='client.rgw.rgw.compute-2.ktpnzt' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 20 08:58:48 np0005588919 systemd[1]: var-lib-containers-storage-overlay-1aa779a001d507856a640fdfd056952e1868d52dce81f39ac9b68d03a077e8c7-merged.mount: Deactivated successfully.
Jan 20 08:58:48 np0005588919 podman[83997]: 2026-01-20 13:58:48.655025826 +0000 UTC m=+0.280036992 container remove 695f2e5816d99eb87c13bf413a88ea9a65701e362cdc9e838c1e779764f1af4d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_keldysh, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Jan 20 08:58:48 np0005588919 systemd[1]: libpod-conmon-695f2e5816d99eb87c13bf413a88ea9a65701e362cdc9e838c1e779764f1af4d.scope: Deactivated successfully.
Jan 20 08:58:48 np0005588919 systemd[1]: Reloading.
Jan 20 08:58:48 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:48 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:49 np0005588919 systemd[1]: Reloading.
Jan 20 08:58:49 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 08:58:49 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 08:58:49 np0005588919 systemd[1]: Starting Ceph mds.cephfs.compute-1.rtofcx for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 08:58:49 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Jan 20 08:58:49 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Jan 20 08:58:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:49 np0005588919 podman[84702]: 2026-01-20 13:58:49.712299477 +0000 UTC m=+0.051064287 container create 8be169de4b9d5b0033f68969bfb5e00cda2b48e98cf27080327a9a7ac12e432d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mds-cephfs-compute-1-rtofcx, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:58:49 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f8919bc234ca92c0d89cecc7bd66f4a0180e484386127483a8a6cce4b24675/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:49 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f8919bc234ca92c0d89cecc7bd66f4a0180e484386127483a8a6cce4b24675/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:49 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f8919bc234ca92c0d89cecc7bd66f4a0180e484386127483a8a6cce4b24675/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:49 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f8919bc234ca92c0d89cecc7bd66f4a0180e484386127483a8a6cce4b24675/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.rtofcx supports timestamps until 2038 (0x7fffffff)
Jan 20 08:58:49 np0005588919 podman[84702]: 2026-01-20 13:58:49.686336276 +0000 UTC m=+0.025101086 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:58:49 np0005588919 podman[84702]: 2026-01-20 13:58:49.784004598 +0000 UTC m=+0.122769418 container init 8be169de4b9d5b0033f68969bfb5e00cda2b48e98cf27080327a9a7ac12e432d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mds-cephfs-compute-1-rtofcx, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:58:49 np0005588919 podman[84702]: 2026-01-20 13:58:49.793067451 +0000 UTC m=+0.131832221 container start 8be169de4b9d5b0033f68969bfb5e00cda2b48e98cf27080327a9a7ac12e432d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mds-cephfs-compute-1-rtofcx, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:58:49 np0005588919 bash[84702]: 8be169de4b9d5b0033f68969bfb5e00cda2b48e98cf27080327a9a7ac12e432d
Jan 20 08:58:49 np0005588919 systemd[1]: Started Ceph mds.cephfs.compute-1.rtofcx for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 08:58:49 np0005588919 ceph-mds[84722]: set uid:gid to 167:167 (ceph:ceph)
Jan 20 08:58:49 np0005588919 ceph-mds[84722]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Jan 20 08:58:49 np0005588919 ceph-mds[84722]: main not setting numa affinity
Jan 20 08:58:49 np0005588919 ceph-mds[84722]: pidfile_write: ignore empty --pid-file
Jan 20 08:58:49 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-mds-cephfs-compute-1-rtofcx[84718]: starting mds.cephfs.compute-1.rtofcx at 
Jan 20 08:58:49 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Updating MDS map to version 6 from mon.2
Jan 20 08:58:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e7 new map
Jan 20 08:58:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e7 print_map#012e7#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:50.863864+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.znrafi{-1:14376} state up:standby seq 1 addr [v2:192.168.122.100:6806/2144836821,v1:192.168.122.100:6807/2144836821] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.rtofcx{-1:24137} state up:standby seq 1 addr [v2:192.168.122.101:6804/2015191638,v1:192.168.122.101:6805/2015191638] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:51 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Updating MDS map to version 7 from mon.2
Jan 20 08:58:51 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Monitors have assigned me to become a standby.
Jan 20 08:58:51 np0005588919 ceph-mon[81775]: Deploying daemon haproxy.rgw.default.compute-0.nqkboe on compute-0
Jan 20 08:58:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e8 new map
Jan 20 08:58:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e8 print_map#012e8#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:50.863864+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.znrafi{-1:14376} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2144836821,v1:192.168.122.100:6807/2144836821] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.rtofcx{-1:24137} state up:standby seq 1 addr [v2:192.168.122.101:6804/2015191638,v1:192.168.122.101:6805/2015191638] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:53 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Jan 20 08:58:53 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Jan 20 08:58:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:58:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.003000093s ======
Jan 20 08:58:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:58:54.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000093s
Jan 20 08:58:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:54 np0005588919 ceph-mon[81775]: Deploying daemon haproxy.rgw.default.compute-2.cuokcs on compute-2
Jan 20 08:58:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e9 new map
Jan 20 08:58:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).mds e9 print_map#012e9#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-20T13:58:19.644785+0000#012modified#0112026-01-20T13:58:50.863864+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24178}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.jyxktq{0:24178} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/2187119920,v1:192.168.122.102:6805/2187119920] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.znrafi{-1:14376} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2144836821,v1:192.168.122.100:6807/2144836821] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.rtofcx{-1:24137} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2015191638,v1:192.168.122.101:6805/2015191638] compat {c=[1],r=[1],i=[7ff]}]
Jan 20 08:58:54 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Updating MDS map to version 9 from mon.2
Jan 20 08:58:55 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.c scrub starts
Jan 20 08:58:55 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.c scrub ok
Jan 20 08:58:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:58:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:58:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:58:56.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:58:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:58:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:58:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:58:58.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:58:58 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.d scrub starts
Jan 20 08:58:58 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.d scrub ok
Jan 20 08:58:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:58:58 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 20 08:58:58 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 20 08:58:58 np0005588919 ceph-mon[81775]: Deploying daemon keepalived.rgw.default.compute-0.gcjsxe on compute-0
Jan 20 08:58:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:58:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:58:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:58:59.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:58:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:58:59 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Jan 20 08:58:59 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Jan 20 08:59:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:00.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 20 08:59:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:01.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 20 08:59:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:02.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:02 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Jan 20 08:59:02 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Jan 20 08:59:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:03.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:03 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.17 scrub starts
Jan 20 08:59:03 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.17 scrub ok
Jan 20 08:59:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:04 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 20 08:59:04 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 20 08:59:04 np0005588919 ceph-mon[81775]: Deploying daemon keepalived.rgw.default.compute-2.dleeql on compute-2
Jan 20 08:59:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:04.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:05.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:06.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:07.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:07 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Jan 20 08:59:07 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Jan 20 08:59:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:59:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 20 08:59:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:08.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 20 08:59:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:59:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:59:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 20 08:59:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:09.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 20 08:59:09 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Jan 20 08:59:09 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:59:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 20 08:59:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 20 08:59:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:10.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 20 08:59:10 np0005588919 podman[84964]: 2026-01-20 13:59:10.585169546 +0000 UTC m=+0.063886878 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 08:59:10 np0005588919 podman[84964]: 2026-01-20 13:59:10.692571942 +0000 UTC m=+0.171289284 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:59:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 20 08:59:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 20 08:59:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:11.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:11 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Jan 20 08:59:11 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Jan 20 08:59:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 20 08:59:11 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 56 pg[10.0( v 48'48 (0'0,48'48] local-lis/les=47/48 n=8 ec=47/47 lis/c=47/47 les/c/f=48/48/0 sis=56 pruub=13.366131783s) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 48'47 mlcod 48'47 active pruub 122.893173218s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:11 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 56 pg[10.0( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=47/47 lis/c=47/47 les/c/f=48/48/0 sis=56 pruub=13.366131783s) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 48'47 mlcod 0'0 unknown pruub 122.893173218s@ mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:12.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:59:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 20 08:59:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.12( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.7( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.11( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.10( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1f( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1b( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1e( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1d( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1c( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1a( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.19( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.18( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.6( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.5( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.4( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.3( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.b( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.8( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.9( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.a( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.c( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.d( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.e( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.f( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1( v 48'48 (0'0,48'48] local-lis/les=47/48 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.2( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.13( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.14( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.15( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.17( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.16( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.10( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.11( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.12( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1f( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1b( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.7( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1d( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1a( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.19( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.18( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.5( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1e( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.3( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.4( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1c( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.d( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.b( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.6( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.c( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.a( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.e( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.f( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.0( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=47/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 48'47 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.1( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.2( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.9( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.15( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.8( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.14( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.16( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.17( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 57 pg[10.13( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=47/47 les/c/f=48/48/0 sis=56) [1] r=0 lpr=56 pi=[47,56)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:13.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:13 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.1a scrub starts
Jan 20 08:59:13 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.1a scrub ok
Jan 20 08:59:13 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 08:59:13 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:13 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 08:59:13 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:14.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:15.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:16.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:17.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Jan 20 08:59:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.1b( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.944127083s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586280823s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.12( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.944046021s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586219788s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.10( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.937613487s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.579887390s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.12( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943946838s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586219788s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.1b( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943997383s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586280823s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.10( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.937577248s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.579887390s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.11( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943763733s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586166382s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.11( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943738937s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586166382s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.1e( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943623543s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586227417s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.1e( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943590164s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586227417s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.19( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943556786s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586349487s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.19( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943525314s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586349487s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.18( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943629265s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586509705s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.18( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943574905s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586509705s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.5( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943511963s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586524963s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.4( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943505287s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586570740s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.5( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943478584s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586524963s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.4( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943471909s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586570740s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.3( v 57'51 (0'0,57'51] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943395615s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=57'49 lcod 57'50 mlcod 57'50 active pruub 126.586532593s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.3( v 57'51 (0'0,57'51] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943346977s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=57'49 lcod 57'50 mlcod 0'0 unknown NOTIFY pruub 126.586532593s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.8( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943283081s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586616516s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.8( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943231583s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586616516s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.f( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943265915s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586723328s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.f( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943235397s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586723328s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.1( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943240166s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586746216s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.2( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943241119s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586784363s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.1( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943208694s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586746216s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.13( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943148613s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 126.586776733s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.2( v 48'48 (0'0,48'48] local-lis/les=56/57 n=1 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943162918s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586784363s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.13( v 48'48 (0'0,48'48] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943116188s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 126.586776733s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.14( v 57'51 (0'0,57'51] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943144798s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=57'49 lcod 57'50 mlcod 57'50 active pruub 126.586837769s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.14( v 57'51 (0'0,57'51] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943095207s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=57'49 lcod 57'50 mlcod 0'0 unknown NOTIFY pruub 126.586837769s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.15( v 57'51 (0'0,57'51] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.943018913s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=57'49 lcod 57'50 mlcod 57'50 active pruub 126.586799622s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[10.15( v 57'51 (0'0,57'51] local-lis/les=56/57 n=0 ec=56/47 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=10.942932129s) [0] r=-1 lpr=58 pi=[56,58)/1 crt=57'49 lcod 57'50 mlcod 0'0 unknown NOTIFY pruub 126.586799622s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[8.14( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.14( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[8.17( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[8.10( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.12( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.1( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[8.8( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.5( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.f( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.4( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.7( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[8.4( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[8.1b( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 20 08:59:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[8.19( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.1a( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.1b( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.1c( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[8.18( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.1e( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[11.1d( empty local-lis/les=0/0 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 58 pg[8.12( empty local-lis/les=0/0 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Jan 20 08:59:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 20 08:59:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:18.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 20 08:59:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.14( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:59:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:59:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 20 08:59:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.f( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[8.17( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.12( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[8.8( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[8.4( v 44'4 (0'0,44'4] local-lis/les=58/59 n=1 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.1( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.4( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[8.1b( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.5( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.1c( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.1d( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.7( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[8.18( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.1e( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[8.12( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.1b( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[8.14( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[8.10( v 44'4 lc 0'0 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=44'4 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[8.19( v 44'4 (0'0,44'4] local-lis/les=58/59 n=0 ec=54/43 lis/c=54/54 les/c/f=55/55/0 sis=58) [1] r=0 lpr=58 pi=[54,58)/1 crt=44'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 59 pg[11.1a( empty local-lis/les=58/59 n=0 ec=56/49 lis/c=56/56 les/c/f=57/57/0 sis=58) [1] r=0 lpr=58 pi=[56,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:19.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:19 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Jan 20 08:59:19 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Jan 20 08:59:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:20.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:20 np0005588919 ceph-mon[81775]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 20 08:59:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 20 08:59:20 np0005588919 ceph-mon[81775]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 20 08:59:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:20 np0005588919 ceph-mon[81775]: Reconfiguring mgr.compute-0.wookjv (monmap changed)...
Jan 20 08:59:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.wookjv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 20 08:59:20 np0005588919 ceph-mon[81775]: Reconfiguring daemon mgr.compute-0.wookjv on compute-0
Jan 20 08:59:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:21.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:22.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:22 np0005588919 ceph-mon[81775]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 20 08:59:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 20 08:59:22 np0005588919 ceph-mon[81775]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 20 08:59:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 20 08:59:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:23.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:23 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.19 deep-scrub starts
Jan 20 08:59:23 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.19 deep-scrub ok
Jan 20 08:59:23 np0005588919 podman[85255]: 2026-01-20 13:59:23.534996337 +0000 UTC m=+0.048494447 container create d17d57c45b395520539c37b2b0d223622ba8097ccf5c4cd69dc7b62be66f3a9e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_dubinsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Jan 20 08:59:23 np0005588919 systemd[1]: Started libpod-conmon-d17d57c45b395520539c37b2b0d223622ba8097ccf5c4cd69dc7b62be66f3a9e.scope.
Jan 20 08:59:23 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:59:23 np0005588919 podman[85255]: 2026-01-20 13:59:23.515015972 +0000 UTC m=+0.028514132 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:59:23 np0005588919 podman[85255]: 2026-01-20 13:59:23.63047309 +0000 UTC m=+0.143971220 container init d17d57c45b395520539c37b2b0d223622ba8097ccf5c4cd69dc7b62be66f3a9e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_dubinsky, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2)
Jan 20 08:59:23 np0005588919 podman[85255]: 2026-01-20 13:59:23.64103321 +0000 UTC m=+0.154531350 container start d17d57c45b395520539c37b2b0d223622ba8097ccf5c4cd69dc7b62be66f3a9e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_dubinsky, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 20 08:59:23 np0005588919 podman[85255]: 2026-01-20 13:59:23.64614384 +0000 UTC m=+0.159641990 container attach d17d57c45b395520539c37b2b0d223622ba8097ccf5c4cd69dc7b62be66f3a9e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_dubinsky, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 08:59:23 np0005588919 ecstatic_dubinsky[85271]: 167 167
Jan 20 08:59:23 np0005588919 systemd[1]: libpod-d17d57c45b395520539c37b2b0d223622ba8097ccf5c4cd69dc7b62be66f3a9e.scope: Deactivated successfully.
Jan 20 08:59:23 np0005588919 conmon[85271]: conmon d17d57c45b395520539c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d17d57c45b395520539c37b2b0d223622ba8097ccf5c4cd69dc7b62be66f3a9e.scope/container/memory.events
Jan 20 08:59:23 np0005588919 podman[85255]: 2026-01-20 13:59:23.649695621 +0000 UTC m=+0.163193771 container died d17d57c45b395520539c37b2b0d223622ba8097ccf5c4cd69dc7b62be66f3a9e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_dubinsky, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Jan 20 08:59:23 np0005588919 systemd[1]: var-lib-containers-storage-overlay-255be54dc2106e2f65cb406df0c4d761dd87d9ec1f923a35adc787e992629c01-merged.mount: Deactivated successfully.
Jan 20 08:59:23 np0005588919 podman[85255]: 2026-01-20 13:59:23.704222975 +0000 UTC m=+0.217721095 container remove d17d57c45b395520539c37b2b0d223622ba8097ccf5c4cd69dc7b62be66f3a9e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_dubinsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 08:59:23 np0005588919 systemd[1]: libpod-conmon-d17d57c45b395520539c37b2b0d223622ba8097ccf5c4cd69dc7b62be66f3a9e.scope: Deactivated successfully.
Jan 20 08:59:23 np0005588919 ceph-mon[81775]: Reconfiguring osd.0 (monmap changed)...
Jan 20 08:59:23 np0005588919 ceph-mon[81775]: Reconfiguring daemon osd.0 on compute-0
Jan 20 08:59:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 20 08:59:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 20 08:59:24 np0005588919 podman[85406]: 2026-01-20 13:59:24.498685293 +0000 UTC m=+0.060443740 container create 4834c250907bb123c98d3556ee2aca657e499a4733da86cb555a14b0e1b73d74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_goldwasser, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:59:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:24.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:24 np0005588919 systemd[1]: Started libpod-conmon-4834c250907bb123c98d3556ee2aca657e499a4733da86cb555a14b0e1b73d74.scope.
Jan 20 08:59:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:24 np0005588919 podman[85406]: 2026-01-20 13:59:24.470339907 +0000 UTC m=+0.032098434 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:59:24 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:59:24 np0005588919 podman[85406]: 2026-01-20 13:59:24.614463681 +0000 UTC m=+0.176222158 container init 4834c250907bb123c98d3556ee2aca657e499a4733da86cb555a14b0e1b73d74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_goldwasser, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 20 08:59:24 np0005588919 podman[85406]: 2026-01-20 13:59:24.625540387 +0000 UTC m=+0.187298844 container start 4834c250907bb123c98d3556ee2aca657e499a4733da86cb555a14b0e1b73d74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_goldwasser, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:59:24 np0005588919 podman[85406]: 2026-01-20 13:59:24.629648276 +0000 UTC m=+0.191406743 container attach 4834c250907bb123c98d3556ee2aca657e499a4733da86cb555a14b0e1b73d74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_goldwasser, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507)
Jan 20 08:59:24 np0005588919 zealous_goldwasser[85422]: 167 167
Jan 20 08:59:24 np0005588919 systemd[1]: libpod-4834c250907bb123c98d3556ee2aca657e499a4733da86cb555a14b0e1b73d74.scope: Deactivated successfully.
Jan 20 08:59:24 np0005588919 podman[85406]: 2026-01-20 13:59:24.635040524 +0000 UTC m=+0.196798991 container died 4834c250907bb123c98d3556ee2aca657e499a4733da86cb555a14b0e1b73d74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_goldwasser, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 08:59:24 np0005588919 systemd[1]: var-lib-containers-storage-overlay-c93eaef64172c5e28ace8db5a9e042af6207a737dd8b34c09e640960c56d1eef-merged.mount: Deactivated successfully.
Jan 20 08:59:24 np0005588919 podman[85406]: 2026-01-20 13:59:24.68707021 +0000 UTC m=+0.248828677 container remove 4834c250907bb123c98d3556ee2aca657e499a4733da86cb555a14b0e1b73d74 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_goldwasser, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Jan 20 08:59:24 np0005588919 systemd[1]: libpod-conmon-4834c250907bb123c98d3556ee2aca657e499a4733da86cb555a14b0e1b73d74.scope: Deactivated successfully.
Jan 20 08:59:24 np0005588919 ceph-mon[81775]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 20 08:59:24 np0005588919 ceph-mon[81775]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 20 08:59:24 np0005588919 ceph-mon[81775]: Reconfiguring osd.1 (monmap changed)...
Jan 20 08:59:24 np0005588919 ceph-mon[81775]: Reconfiguring daemon osd.1 on compute-1
Jan 20 08:59:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:24 np0005588919 ceph-mon[81775]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 20 08:59:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 20 08:59:24 np0005588919 ceph-mon[81775]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 20 08:59:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:25.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:25 np0005588919 podman[85564]: 2026-01-20 13:59:25.613185912 +0000 UTC m=+0.069408200 container create c4f72d32f375654dfd77a7297ca2f1d66e63a41b28c4791e5a65e8f6db59be1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_hugle, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0)
Jan 20 08:59:25 np0005588919 systemd[1]: Started libpod-conmon-c4f72d32f375654dfd77a7297ca2f1d66e63a41b28c4791e5a65e8f6db59be1a.scope.
Jan 20 08:59:25 np0005588919 podman[85564]: 2026-01-20 13:59:25.58432654 +0000 UTC m=+0.040548888 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 08:59:25 np0005588919 systemd[1]: Started libcrun container.
Jan 20 08:59:25 np0005588919 podman[85564]: 2026-01-20 13:59:25.708240593 +0000 UTC m=+0.164462871 container init c4f72d32f375654dfd77a7297ca2f1d66e63a41b28c4791e5a65e8f6db59be1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_hugle, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:59:25 np0005588919 podman[85564]: 2026-01-20 13:59:25.718773912 +0000 UTC m=+0.174996180 container start c4f72d32f375654dfd77a7297ca2f1d66e63a41b28c4791e5a65e8f6db59be1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_hugle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Jan 20 08:59:25 np0005588919 podman[85564]: 2026-01-20 13:59:25.724092488 +0000 UTC m=+0.180314786 container attach c4f72d32f375654dfd77a7297ca2f1d66e63a41b28c4791e5a65e8f6db59be1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_hugle, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 20 08:59:25 np0005588919 unruffled_hugle[85581]: 167 167
Jan 20 08:59:25 np0005588919 systemd[1]: libpod-c4f72d32f375654dfd77a7297ca2f1d66e63a41b28c4791e5a65e8f6db59be1a.scope: Deactivated successfully.
Jan 20 08:59:25 np0005588919 podman[85564]: 2026-01-20 13:59:25.727555796 +0000 UTC m=+0.183778084 container died c4f72d32f375654dfd77a7297ca2f1d66e63a41b28c4791e5a65e8f6db59be1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_hugle, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:59:25 np0005588919 systemd[1]: var-lib-containers-storage-overlay-3eaefe0693fce7abb33f2eeeb47dba1be60f3be85d42b1d106611815fb906926-merged.mount: Deactivated successfully.
Jan 20 08:59:25 np0005588919 podman[85564]: 2026-01-20 13:59:25.781979297 +0000 UTC m=+0.238201595 container remove c4f72d32f375654dfd77a7297ca2f1d66e63a41b28c4791e5a65e8f6db59be1a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=unruffled_hugle, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 08:59:25 np0005588919 systemd[1]: libpod-conmon-c4f72d32f375654dfd77a7297ca2f1d66e63a41b28c4791e5a65e8f6db59be1a.scope: Deactivated successfully.
Jan 20 08:59:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 20 08:59:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:25 np0005588919 ceph-mon[81775]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 20 08:59:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 20 08:59:25 np0005588919 ceph-mon[81775]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 20 08:59:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 20 08:59:26 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Jan 20 08:59:26 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Jan 20 08:59:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:26.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 20 08:59:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:26 np0005588919 ceph-mon[81775]: Reconfiguring mgr.compute-2.gunjko (monmap changed)...
Jan 20 08:59:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gunjko", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 20 08:59:26 np0005588919 ceph-mon[81775]: Reconfiguring daemon mgr.compute-2.gunjko on compute-2
Jan 20 08:59:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 20 08:59:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:27.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 20 08:59:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 20 08:59:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 20 08:59:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:28.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:28 np0005588919 podman[85773]: 2026-01-20 13:59:28.723533802 +0000 UTC m=+0.086751032 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 08:59:28 np0005588919 podman[85773]: 2026-01-20 13:59:28.843654056 +0000 UTC m=+0.206871256 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Jan 20 08:59:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 20 08:59:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 20 08:59:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:29.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:29 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.e deep-scrub starts
Jan 20 08:59:29 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.e deep-scrub ok
Jan 20 08:59:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 20 08:59:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 20 08:59:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:30.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 20 08:59:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 08:59:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 08:59:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 20 08:59:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:31.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:31 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Jan 20 08:59:31 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Jan 20 08:59:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 20 08:59:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 20 08:59:32 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Jan 20 08:59:32 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Jan 20 08:59:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:32.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 20 08:59:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 20 08:59:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:33.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 20 08:59:33 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 67 pg[9.16( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=67) [1] r=0 lpr=67 pi=[54,67)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:33 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 67 pg[9.6( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=67) [1] r=0 lpr=67 pi=[54,67)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:33 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 67 pg[9.1e( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=67) [1] r=0 lpr=67 pi=[54,67)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:33 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 67 pg[9.e( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=67) [1] r=0 lpr=67 pi=[54,67)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 20 08:59:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 20 08:59:34 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Jan 20 08:59:34 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Jan 20 08:59:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:34.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:35 np0005588919 systemd[1]: session-19.scope: Deactivated successfully.
Jan 20 08:59:35 np0005588919 systemd[1]: session-19.scope: Consumed 10.031s CPU time.
Jan 20 08:59:35 np0005588919 systemd-logind[783]: Session 19 logged out. Waiting for processes to exit.
Jan 20 08:59:35 np0005588919 systemd-logind[783]: Removed session 19.
Jan 20 08:59:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:35.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 20 08:59:36 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 68 pg[9.1e( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=68) [1]/[0] r=-1 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:36 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 68 pg[9.1e( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=68) [1]/[0] r=-1 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:36 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 68 pg[9.6( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=68) [1]/[0] r=-1 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:36 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 68 pg[9.6( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=68) [1]/[0] r=-1 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:36 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 68 pg[9.e( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=68) [1]/[0] r=-1 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:36 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 68 pg[9.e( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=68) [1]/[0] r=-1 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:36 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 68 pg[9.16( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=68) [1]/[0] r=-1 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:36 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 68 pg[9.16( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=68) [1]/[0] r=-1 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:36 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.d scrub starts
Jan 20 08:59:36 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.d scrub ok
Jan 20 08:59:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:36.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 20 08:59:37 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Jan 20 08:59:37 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Jan 20 08:59:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:37.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 20 08:59:38 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 70 pg[9.e( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:38 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 70 pg[9.e( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:38 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 70 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:38 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 70 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:38 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 70 pg[9.1e( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:38 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 70 pg[9.6( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:38 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 70 pg[9.6( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:38 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 70 pg[9.1e( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 08:59:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:38.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 08:59:38 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:38 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 08:59:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 20 08:59:39 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 71 pg[9.1e( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=5 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:39 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 71 pg[9.6( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=6 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:39 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 71 pg[9.e( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=6 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:39 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 71 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=5 ec=54/45 lis/c=68/54 les/c/f=69/55/0 sis=70) [1] r=0 lpr=70 pi=[54,70)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e71 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:39.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:40.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:41.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 20 08:59:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 20 08:59:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:42.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:43 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 20 08:59:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 20 08:59:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:43.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 20 08:59:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 20 08:59:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 20 08:59:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:44.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 20 08:59:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 20 08:59:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 08:59:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:45.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 08:59:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 20 08:59:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 20 08:59:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:46.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:47 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.5 deep-scrub starts
Jan 20 08:59:47 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.5 deep-scrub ok
Jan 20 08:59:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 20 08:59:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 20 08:59:47 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 77 pg[9.1a( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=77) [1] r=0 lpr=77 pi=[54,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:47 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 77 pg[9.a( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=77) [1] r=0 lpr=77 pi=[54,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:47.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 20 08:59:47 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 78 pg[9.a( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=78) [1]/[0] r=-1 lpr=78 pi=[54,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:47 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 78 pg[9.1a( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=78) [1]/[0] r=-1 lpr=78 pi=[54,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:47 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 78 pg[9.1a( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=78) [1]/[0] r=-1 lpr=78 pi=[54,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:47 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 78 pg[9.a( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=78) [1]/[0] r=-1 lpr=78 pi=[54,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 08:59:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 20 08:59:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:48.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 20 08:59:49 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.f scrub starts
Jan 20 08:59:49 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.f scrub ok
Jan 20 08:59:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 08:59:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:49.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 08:59:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 20 08:59:49 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 80 pg[9.a( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=78/54 les/c/f=79/55/0 sis=80) [1] r=0 lpr=80 pi=[54,80)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:49 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 80 pg[9.a( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=78/54 les/c/f=79/55/0 sis=80) [1] r=0 lpr=80 pi=[54,80)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:49 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 80 pg[9.1a( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=78/54 les/c/f=79/55/0 sis=80) [1] r=0 lpr=80 pi=[54,80)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 08:59:49 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 80 pg[9.1a( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=78/54 les/c/f=79/55/0 sis=80) [1] r=0 lpr=80 pi=[54,80)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 08:59:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 08:59:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:50.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 08:59:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 20 08:59:50 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 81 pg[9.1a( v 51'1000 (0'0,51'1000] local-lis/les=80/81 n=5 ec=54/45 lis/c=78/54 les/c/f=79/55/0 sis=80) [1] r=0 lpr=80 pi=[54,80)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:50 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 81 pg[9.a( v 51'1000 (0'0,51'1000] local-lis/les=80/81 n=6 ec=54/45 lis/c=78/54 les/c/f=79/55/0 sis=80) [1] r=0 lpr=80 pi=[54,80)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 08:59:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 08:59:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:51.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 08:59:52 np0005588919 systemd-logind[783]: New session 33 of user zuul.
Jan 20 08:59:52 np0005588919 systemd[1]: Started Session 33 of User zuul.
Jan 20 08:59:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:52.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:53 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Jan 20 08:59:53 np0005588919 python3.9[86228]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 08:59:53 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Jan 20 08:59:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:53.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:54.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 08:59:55 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.7 deep-scrub starts
Jan 20 08:59:55 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.7 deep-scrub ok
Jan 20 08:59:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:55.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:56 np0005588919 python3.9[86442]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 08:59:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:56.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 08:59:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:57.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 08:59:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 20 08:59:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 20 08:59:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 08:59:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:13:59:58.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 08:59:59 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 20 08:59:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 08:59:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 08:59:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 08:59:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:13:59:59.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 20 09:00:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 09:00:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 20 09:00:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:00.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:01 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 20 09:00:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:00:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:01.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:00:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 20 09:00:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 20 09:00:02 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 84 pg[9.d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=84) [1] r=0 lpr=84 pi=[68,84)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:02 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 84 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=84) [1] r=0 lpr=84 pi=[68,84)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:02.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 20 09:00:02 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 85 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=85) [1]/[2] r=-1 lpr=85 pi=[68,85)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:02 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 85 pg[9.1d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=85) [1]/[2] r=-1 lpr=85 pi=[68,85)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:02 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 85 pg[9.d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=85) [1]/[2] r=-1 lpr=85 pi=[68,85)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:02 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 85 pg[9.d( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=85) [1]/[2] r=-1 lpr=85 pi=[68,85)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 20 09:00:03 np0005588919 systemd[1]: session-33.scope: Deactivated successfully.
Jan 20 09:00:03 np0005588919 systemd[1]: session-33.scope: Consumed 8.313s CPU time.
Jan 20 09:00:03 np0005588919 systemd-logind[783]: Session 33 logged out. Waiting for processes to exit.
Jan 20 09:00:03 np0005588919 systemd-logind[783]: Removed session 33.
Jan 20 09:00:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:03.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 20 09:00:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 20 09:00:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 20 09:00:04 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Jan 20 09:00:04 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Jan 20 09:00:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:04.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 20 09:00:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 87 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=85/68 les/c/f=86/69/0 sis=87) [1] r=0 lpr=87 pi=[68,87)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 87 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=85/68 les/c/f=86/69/0 sis=87) [1] r=0 lpr=87 pi=[68,87)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 87 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=85/68 les/c/f=86/69/0 sis=87) [1] r=0 lpr=87 pi=[68,87)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:04 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 87 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=85/68 les/c/f=86/69/0 sis=87) [1] r=0 lpr=87 pi=[68,87)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:05 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Jan 20 09:00:05 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Jan 20 09:00:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:05.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 20 09:00:05 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 88 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=87/88 n=5 ec=54/45 lis/c=85/68 les/c/f=86/69/0 sis=87) [1] r=0 lpr=87 pi=[68,87)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:05 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 88 pg[9.1f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=88) [1] r=0 lpr=88 pi=[64,88)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:05 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 88 pg[9.f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=88) [1] r=0 lpr=88 pi=[64,88)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:05 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 88 pg[9.d( v 51'1000 (0'0,51'1000] local-lis/les=87/88 n=6 ec=54/45 lis/c=85/68 les/c/f=86/69/0 sis=87) [1] r=0 lpr=87 pi=[68,87)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:06 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 20 09:00:06 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 20 09:00:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:00:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:06.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:00:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 20 09:00:06 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 89 pg[9.1f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=89) [1]/[2] r=-1 lpr=89 pi=[64,89)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:06 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 89 pg[9.f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=89) [1]/[2] r=-1 lpr=89 pi=[64,89)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:06 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 89 pg[9.1f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=89) [1]/[2] r=-1 lpr=89 pi=[64,89)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:06 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 89 pg[9.f( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=64/64 les/c/f=65/65/0 sis=89) [1]/[2] r=-1 lpr=89 pi=[64,89)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:07.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 20 09:00:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:08.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 20 09:00:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 91 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=89/64 les/c/f=90/65/0 sis=91) [1] r=0 lpr=91 pi=[64,91)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 91 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=89/64 les/c/f=90/65/0 sis=91) [1] r=0 lpr=91 pi=[64,91)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 91 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=89/64 les/c/f=90/65/0 sis=91) [1] r=0 lpr=91 pi=[64,91)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:09 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 91 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=89/64 les/c/f=90/65/0 sis=91) [1] r=0 lpr=91 pi=[64,91)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:09.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 20 09:00:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 92 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=91/92 n=5 ec=54/45 lis/c=89/64 les/c/f=90/65/0 sis=91) [1] r=0 lpr=91 pi=[64,91)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:10 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 92 pg[9.f( v 51'1000 (0'0,51'1000] local-lis/les=91/92 n=6 ec=54/45 lis/c=89/64 les/c/f=90/65/0 sis=91) [1] r=0 lpr=91 pi=[64,91)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:10.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:11.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 20 09:00:11 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 93 pg[9.10( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=93) [1] r=0 lpr=93 pi=[54,93)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 20 09:00:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 20 09:00:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 94 pg[9.10( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=94) [1]/[0] r=-1 lpr=94 pi=[54,94)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:12 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 94 pg[9.10( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=94) [1]/[0] r=-1 lpr=94 pi=[54,94)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:12.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:13 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 20 09:00:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 20 09:00:13 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 95 pg[9.11( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=95) [1] r=0 lpr=95 pi=[54,95)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:13.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:14 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 20 09:00:14 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 20 09:00:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 20 09:00:14 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 96 pg[9.10( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=94/54 les/c/f=95/55/0 sis=96) [1] r=0 lpr=96 pi=[54,96)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:14 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 96 pg[9.11( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=96) [1]/[0] r=-1 lpr=96 pi=[54,96)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:14 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 96 pg[9.10( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=94/54 les/c/f=95/55/0 sis=96) [1] r=0 lpr=96 pi=[54,96)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:14 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 96 pg[9.11( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=96) [1]/[0] r=-1 lpr=96 pi=[54,96)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:14.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 20 09:00:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 97 pg[9.12( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=97) [1] r=0 lpr=97 pi=[54,97)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:15 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 97 pg[9.10( v 51'1000 (0'0,51'1000] local-lis/les=96/97 n=6 ec=54/45 lis/c=94/54 les/c/f=95/55/0 sis=96) [1] r=0 lpr=96 pi=[54,96)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 20 09:00:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:15.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 20 09:00:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 98 pg[9.12( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=98) [1]/[0] r=-1 lpr=98 pi=[54,98)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 98 pg[9.12( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=54/54 les/c/f=55/55/0 sis=98) [1]/[0] r=-1 lpr=98 pi=[54,98)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 98 pg[9.11( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=96/54 les/c/f=97/55/0 sis=98) [1] r=0 lpr=98 pi=[54,98)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:16 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 98 pg[9.11( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=6 ec=54/45 lis/c=96/54 les/c/f=97/55/0 sis=98) [1] r=0 lpr=98 pi=[54,98)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 20 09:00:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:16.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:17 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Jan 20 09:00:17 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Jan 20 09:00:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 20 09:00:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 99 pg[9.11( v 51'1000 (0'0,51'1000] local-lis/les=98/99 n=6 ec=54/45 lis/c=96/54 les/c/f=97/55/0 sis=98) [1] r=0 lpr=98 pi=[54,98)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:00:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:17.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:00:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 20 09:00:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 100 pg[9.12( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=98/54 les/c/f=99/55/0 sis=100) [1] r=0 lpr=100 pi=[54,100)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:17 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 100 pg[9.12( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=98/54 les/c/f=99/55/0 sis=100) [1] r=0 lpr=100 pi=[54,100)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:18 np0005588919 systemd-logind[783]: New session 34 of user zuul.
Jan 20 09:00:18 np0005588919 systemd[1]: Started Session 34 of User zuul.
Jan 20 09:00:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:18.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 20 09:00:18 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 101 pg[9.12( v 51'1000 (0'0,51'1000] local-lis/les=100/101 n=5 ec=54/45 lis/c=98/54 les/c/f=99/55/0 sis=100) [1] r=0 lpr=100 pi=[54,100)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:19 np0005588919 python3.9[86654]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 20 09:00:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:19.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:20 np0005588919 python3.9[86828]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:00:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:20.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:21 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.d scrub starts
Jan 20 09:00:21 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.d scrub ok
Jan 20 09:00:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 20 09:00:21 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 20 09:00:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:21.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:21 np0005588919 python3.9[86984]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:00:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:22.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 20 09:00:22 np0005588919 python3.9[87137]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:00:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 20 09:00:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 20 09:00:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:23.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:23 np0005588919 python3.9[87291]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:00:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:00:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:24.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:00:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 20 09:00:24 np0005588919 python3.9[87443]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:00:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 20 09:00:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 20 09:00:25 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 104 pg[9.15( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=104) [1] r=0 lpr=104 pi=[68,104)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:25 np0005588919 python3.9[87593]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:00:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:25.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:25 np0005588919 network[87610]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:00:25 np0005588919 network[87611]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:00:25 np0005588919 network[87612]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:00:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:26.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 20 09:00:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 105 pg[9.15( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=105) [1]/[2] r=-1 lpr=105 pi=[68,105)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:26 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 105 pg[9.15( empty local-lis/les=0/0 n=0 ec=54/45 lis/c=68/68 les/c/f=69/69/0 sis=105) [1]/[2] r=-1 lpr=105 pi=[68,105)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 20 09:00:27 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Jan 20 09:00:27 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Jan 20 09:00:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 20 09:00:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 20 09:00:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:27.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:28 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 106 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=5 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=106 pruub=14.820875168s) [2] r=-1 lpr=106 pi=[70,106)/1 crt=51'1000 mlcod 0'0 active pruub 200.749862671s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:28 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 106 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=5 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=106 pruub=14.820657730s) [2] r=-1 lpr=106 pi=[70,106)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 200.749862671s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:00:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:28.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:00:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 20 09:00:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 20 09:00:28 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 107 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=5 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=107) [2]/[1] r=0 lpr=107 pi=[70,107)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:28 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 107 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=105/68 les/c/f=106/69/0 sis=107) [1] r=0 lpr=107 pi=[68,107)/1 luod=0'0 crt=51'1000 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:28 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 107 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=0/0 n=5 ec=54/45 lis/c=105/68 les/c/f=106/69/0 sis=107) [1] r=0 lpr=107 pi=[68,107)/1 crt=51'1000 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:28 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 107 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=5 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=107) [2]/[1] r=0 lpr=107 pi=[70,107)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:00:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:29.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:00:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 20 09:00:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 20 09:00:29 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 108 pg[9.15( v 51'1000 (0'0,51'1000] local-lis/les=107/108 n=5 ec=54/45 lis/c=105/68 les/c/f=106/69/0 sis=107) [1] r=0 lpr=107 pi=[68,107)/1 crt=51'1000 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:29 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 108 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=107/108 n=5 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=107) [2]/[1] async=[2] r=0 lpr=107 pi=[70,107)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:30 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.e scrub starts
Jan 20 09:00:30 np0005588919 python3.9[87872]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:00:30 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.e scrub ok
Jan 20 09:00:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:00:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:30.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:00:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 20 09:00:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 20 09:00:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 109 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=107/108 n=5 ec=54/45 lis/c=107/70 les/c/f=108/71/0 sis=109 pruub=14.965022087s) [2] async=[2] r=-1 lpr=109 pi=[70,109)/1 crt=51'1000 mlcod 51'1000 active pruub 203.624725342s@ mbc={255={}}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:30 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 109 pg[9.16( v 51'1000 (0'0,51'1000] local-lis/les=107/108 n=5 ec=54/45 lis/c=107/70 les/c/f=108/71/0 sis=109 pruub=14.964905739s) [2] r=-1 lpr=109 pi=[70,109)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 203.624725342s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:31 np0005588919 python3.9[88022]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:00:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:31.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 20 09:00:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 20 09:00:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:32.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:32 np0005588919 python3.9[88176]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:00:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 20 09:00:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:33.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:34 np0005588919 python3.9[88334]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:00:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:35.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 20 09:00:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 20 09:00:35 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.a scrub starts
Jan 20 09:00:35 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.a scrub ok
Jan 20 09:00:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 20 09:00:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 20 09:00:35 np0005588919 python3.9[88419]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:00:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:35.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 20 09:00:36 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 112 pg[9.1a( v 51'1000 (0'0,51'1000] local-lis/les=80/81 n=5 ec=54/45 lis/c=80/80 les/c/f=81/81/0 sis=112 pruub=10.714376450s) [0] r=-1 lpr=112 pi=[80,112)/1 crt=51'1000 mlcod 0'0 active pruub 204.693359375s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:36 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 112 pg[9.1a( v 51'1000 (0'0,51'1000] local-lis/les=80/81 n=5 ec=54/45 lis/c=80/80 les/c/f=81/81/0 sis=112 pruub=10.714286804s) [0] r=-1 lpr=112 pi=[80,112)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 204.693359375s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 20 09:00:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:37.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 20 09:00:37 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 113 pg[9.1a( v 51'1000 (0'0,51'1000] local-lis/les=80/81 n=5 ec=54/45 lis/c=80/80 les/c/f=81/81/0 sis=113) [0]/[1] r=0 lpr=113 pi=[80,113)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:37 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 113 pg[9.1a( v 51'1000 (0'0,51'1000] local-lis/les=80/81 n=5 ec=54/45 lis/c=80/80 les/c/f=81/81/0 sis=113) [0]/[1] r=0 lpr=113 pi=[80,113)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 20 09:00:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 20 09:00:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:37.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 20 09:00:37 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 114 pg[9.1a( v 51'1000 (0'0,51'1000] local-lis/les=113/114 n=5 ec=54/45 lis/c=80/80 les/c/f=81/81/0 sis=113) [0]/[1] async=[0] r=0 lpr=113 pi=[80,113)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 20 09:00:38 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 115 pg[9.1a( v 51'1000 (0'0,51'1000] local-lis/les=113/114 n=5 ec=54/45 lis/c=113/80 les/c/f=114/81/0 sis=115 pruub=14.986879349s) [0] async=[0] r=-1 lpr=115 pi=[80,115)/1 crt=51'1000 mlcod 51'1000 active pruub 211.692581177s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:38 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 115 pg[9.1a( v 51'1000 (0'0,51'1000] local-lis/les=113/114 n=5 ec=54/45 lis/c=113/80 les/c/f=114/81/0 sis=115 pruub=14.986777306s) [0] r=-1 lpr=115 pi=[80,115)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 211.692581177s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:00:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:39.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:00:39 np0005588919 podman[88653]: 2026-01-20 14:00:39.487379941 +0000 UTC m=+0.099471148 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 20 09:00:39 np0005588919 podman[88653]: 2026-01-20 14:00:39.587460664 +0000 UTC m=+0.199551861 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 20 09:00:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:39.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 20 09:00:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 20 09:00:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:41.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:00:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:00:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:41.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:43.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:43 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Jan 20 09:00:43 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Jan 20 09:00:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:43.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:00:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:45.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:00:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 20 09:00:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 20 09:00:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:00:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:45.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:00:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 20 09:00:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:47.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.665381) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647665506, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6990, "num_deletes": 255, "total_data_size": 12890158, "memory_usage": 13091344, "flush_reason": "Manual Compaction"}
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647773791, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 7725961, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 254, "largest_seqno": 6995, "table_properties": {"data_size": 7698442, "index_size": 17996, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8453, "raw_key_size": 79151, "raw_average_key_size": 23, "raw_value_size": 7632919, "raw_average_value_size": 2268, "num_data_blocks": 797, "num_entries": 3365, "num_filter_entries": 3365, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 1768917474, "file_creation_time": 1768917647, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 108513 microseconds, and 27677 cpu microseconds.
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.773896) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 7725961 bytes OK
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.773923) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.775584) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.775607) EVENT_LOG_v1 {"time_micros": 1768917647775600, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.775627) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 12853057, prev total WAL file size 12853057, number of live WAL files 2.
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.779645) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(7544KB) 8(1648B)]
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647779762, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 7727609, "oldest_snapshot_seqno": -1}
Jan 20 09:00:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:47.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3114 keys, 7722426 bytes, temperature: kUnknown
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647894712, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 7722426, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7695596, "index_size": 17952, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7813, "raw_key_size": 74973, "raw_average_key_size": 24, "raw_value_size": 7633199, "raw_average_value_size": 2451, "num_data_blocks": 796, "num_entries": 3114, "num_filter_entries": 3114, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768917647, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.894939) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 7722426 bytes
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.897770) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 67.2 rd, 67.1 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(7.4, 0.0 +0.0 blob) out(7.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3370, records dropped: 256 output_compression: NoCompression
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.897791) EVENT_LOG_v1 {"time_micros": 1768917647897782, "job": 4, "event": "compaction_finished", "compaction_time_micros": 115006, "compaction_time_cpu_micros": 32413, "output_level": 6, "num_output_files": 1, "total_output_size": 7722426, "num_input_records": 3370, "num_output_records": 3114, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647899112, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917647899155, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:00:47.779510) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:00:47 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 119 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=87/88 n=5 ec=54/45 lis/c=87/87 les/c/f=88/88/0 sis=119 pruub=14.054931641s) [2] r=-1 lpr=119 pi=[87,119)/1 crt=51'1000 mlcod 0'0 active pruub 219.684005737s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:47 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 119 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=87/88 n=5 ec=54/45 lis/c=87/87 les/c/f=88/88/0 sis=119 pruub=14.054023743s) [2] r=-1 lpr=119 pi=[87,119)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 219.684005737s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 20 09:00:47 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 120 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=87/88 n=5 ec=54/45 lis/c=87/87 les/c/f=88/88/0 sis=120) [2]/[1] r=0 lpr=120 pi=[87,120)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:47 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 120 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=87/88 n=5 ec=54/45 lis/c=87/87 les/c/f=88/88/0 sis=120) [2]/[1] r=0 lpr=120 pi=[87,120)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:48 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.f deep-scrub starts
Jan 20 09:00:48 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.f deep-scrub ok
Jan 20 09:00:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 20 09:00:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 20 09:00:49 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 121 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=120/121 n=5 ec=54/45 lis/c=87/87 les/c/f=88/88/0 sis=120) [2]/[1] async=[2] r=0 lpr=120 pi=[87,120)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:49.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:49.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 20 09:00:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 20 09:00:50 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 122 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=120/121 n=5 ec=54/45 lis/c=120/87 les/c/f=121/88/0 sis=122 pruub=14.988829613s) [2] async=[2] r=-1 lpr=122 pi=[87,122)/1 crt=51'1000 mlcod 51'1000 active pruub 222.735855103s@ mbc={255={}}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:50 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 122 pg[9.1e( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=5 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=122 pruub=8.999602318s) [0] r=-1 lpr=122 pi=[70,122)/1 crt=51'1000 mlcod 0'0 active pruub 216.746673584s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:50 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 122 pg[9.1e( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=5 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=122 pruub=8.999547005s) [0] r=-1 lpr=122 pi=[70,122)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 216.746673584s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:50 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 122 pg[9.1d( v 51'1000 (0'0,51'1000] local-lis/les=120/121 n=5 ec=54/45 lis/c=120/87 les/c/f=121/88/0 sis=122 pruub=14.988019943s) [2] r=-1 lpr=122 pi=[87,122)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 222.735855103s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:50 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.a scrub starts
Jan 20 09:00:50 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.a scrub ok
Jan 20 09:00:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 20 09:00:51 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 123 pg[9.1e( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=5 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=123) [0]/[1] r=0 lpr=123 pi=[70,123)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:51 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 123 pg[9.1e( v 51'1000 (0'0,51'1000] local-lis/les=70/71 n=5 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=123) [0]/[1] r=0 lpr=123 pi=[70,123)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 20 09:00:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:51.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:51.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 20 09:00:52 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 124 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=91/92 n=5 ec=54/45 lis/c=91/91 les/c/f=92/92/0 sis=124 pruub=14.281917572s) [0] r=-1 lpr=124 pi=[91,124)/1 crt=51'1000 mlcod 0'0 active pruub 224.046585083s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:52 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 124 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=91/92 n=5 ec=54/45 lis/c=91/91 les/c/f=92/92/0 sis=124 pruub=14.281862259s) [0] r=-1 lpr=124 pi=[91,124)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 224.046585083s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:52 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 20 09:00:52 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 20 09:00:52 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 124 pg[9.1e( v 51'1000 (0'0,51'1000] local-lis/les=123/124 n=5 ec=54/45 lis/c=70/70 les/c/f=71/71/0 sis=123) [0]/[1] async=[0] r=0 lpr=123 pi=[70,123)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:52 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.c scrub starts
Jan 20 09:00:52 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.c scrub ok
Jan 20 09:00:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 20 09:00:52 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 125 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=91/92 n=5 ec=54/45 lis/c=91/91 les/c/f=92/92/0 sis=125) [0]/[1] r=0 lpr=125 pi=[91,125)/1 crt=51'1000 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:52 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 125 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=91/92 n=5 ec=54/45 lis/c=91/91 les/c/f=92/92/0 sis=125) [0]/[1] r=0 lpr=125 pi=[91,125)/1 crt=51'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 20 09:00:52 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 125 pg[9.1e( v 51'1000 (0'0,51'1000] local-lis/les=123/124 n=5 ec=54/45 lis/c=123/70 les/c/f=124/71/0 sis=125 pruub=15.074364662s) [0] async=[0] r=-1 lpr=125 pi=[70,125)/1 crt=51'1000 mlcod 51'1000 active pruub 225.775054932s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:52 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 125 pg[9.1e( v 51'1000 (0'0,51'1000] local-lis/les=123/124 n=5 ec=54/45 lis/c=123/70 les/c/f=124/71/0 sis=125 pruub=15.074002266s) [0] r=-1 lpr=125 pi=[70,125)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 225.775054932s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:53.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:53.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 20 09:00:54 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 126 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=125/126 n=5 ec=54/45 lis/c=91/91 les/c/f=92/92/0 sis=125) [0]/[1] async=[0] r=0 lpr=125 pi=[91,125)/1 crt=51'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 20 09:00:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 20 09:00:55 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 127 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=125/126 n=5 ec=54/45 lis/c=125/91 les/c/f=126/92/0 sis=127 pruub=14.990097046s) [0] async=[0] r=-1 lpr=127 pi=[91,127)/1 crt=51'1000 mlcod 51'1000 active pruub 227.743103027s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 20 09:00:55 np0005588919 ceph-osd[79119]: osd.1 pg_epoch: 127 pg[9.1f( v 51'1000 (0'0,51'1000] local-lis/les=125/126 n=5 ec=54/45 lis/c=125/91 les/c/f=126/92/0 sis=127 pruub=14.989865303s) [0] r=-1 lpr=127 pi=[91,127)/1 crt=51'1000 mlcod 0'0 unknown NOTIFY pruub 227.743103027s@ mbc={}] state<Start>: transitioning to Stray
Jan 20 09:00:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:00:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:55.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:55.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 20 09:00:56 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.c scrub starts
Jan 20 09:00:56 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.c scrub ok
Jan 20 09:00:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:57.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:00:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:57.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:00:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:00:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:00:59.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:00:59 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Jan 20 09:00:59 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Jan 20 09:00:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:00:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:00:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:00:59.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:00 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.a scrub starts
Jan 20 09:01:00 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.a scrub ok
Jan 20 09:01:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:01.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:01.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:03.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:03.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:05.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:05.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:06 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Jan 20 09:01:06 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Jan 20 09:01:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:07.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:07 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Jan 20 09:01:07 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Jan 20 09:01:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:07.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:09.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:09.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:11.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:11.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:12 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Jan 20 09:01:12 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Jan 20 09:01:13 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Jan 20 09:01:13 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Jan 20 09:01:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:13.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:13.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:15.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:15.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:17.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:01:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:17.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:01:18 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Jan 20 09:01:18 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Jan 20 09:01:19 np0005588919 python3.9[89205]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:01:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:19.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:19.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:21 np0005588919 python3.9[89492]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 20 09:01:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:21.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:21.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:22 np0005588919 python3.9[89644]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 20 09:01:22 np0005588919 python3.9[89796]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:01:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:23.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:23.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:23 np0005588919 python3.9[89948]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 20 09:01:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:25.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:25 np0005588919 python3.9[90100]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:01:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:25.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:26 np0005588919 python3.9[90252]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:01:26 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Jan 20 09:01:26 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Jan 20 09:01:26 np0005588919 python3.9[90330]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:01:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:27.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:27.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:28 np0005588919 python3.9[90482]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:01:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:29.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:29 np0005588919 python3.9[90636]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 20 09:01:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:29.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:30 np0005588919 python3.9[90789]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 20 09:01:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:01:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:31.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:01:31 np0005588919 python3.9[90942]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 09:01:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:31.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:32 np0005588919 python3.9[91095]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 20 09:01:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:33.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:33 np0005588919 python3.9[91248]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:01:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:33.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:35.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:35 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Jan 20 09:01:35 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Jan 20 09:01:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:35.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:36 np0005588919 python3.9[91401]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:01:36 np0005588919 python3.9[91553]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:01:37 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Jan 20 09:01:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:37.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:37 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Jan 20 09:01:37 np0005588919 python3.9[91631]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:01:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:37.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:38 np0005588919 python3.9[91783]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:01:38 np0005588919 python3.9[91861]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:01:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:39.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:39.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:40 np0005588919 python3.9[92013]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:01:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:40 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Jan 20 09:01:40 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Jan 20 09:01:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:41.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:41.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:42 np0005588919 python3.9[92164]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:01:42 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Jan 20 09:01:42 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Jan 20 09:01:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:43.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:43 np0005588919 python3.9[92316]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 20 09:01:43 np0005588919 systemd[72729]: Created slice User Background Tasks Slice.
Jan 20 09:01:43 np0005588919 systemd[72729]: Starting Cleanup of User's Temporary Files and Directories...
Jan 20 09:01:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:43.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:43 np0005588919 systemd[72729]: Finished Cleanup of User's Temporary Files and Directories.
Jan 20 09:01:44 np0005588919 python3.9[92466]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:01:44 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Jan 20 09:01:44 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Jan 20 09:01:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:45.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:45 np0005588919 python3.9[92619]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:01:45 np0005588919 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 20 09:01:45 np0005588919 systemd[1]: tuned.service: Deactivated successfully.
Jan 20 09:01:45 np0005588919 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 20 09:01:45 np0005588919 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 20 09:01:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:45.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:46 np0005588919 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 20 09:01:46 np0005588919 python3.9[92781]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 20 09:01:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:47.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:47.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:48 np0005588919 podman[92978]: 2026-01-20 14:01:48.875367221 +0000 UTC m=+0.063430002 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 09:01:49 np0005588919 podman[92978]: 2026-01-20 14:01:49.008259778 +0000 UTC m=+0.196322499 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Jan 20 09:01:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:49.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:49.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:50 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.d scrub starts
Jan 20 09:01:50 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 3.d scrub ok
Jan 20 09:01:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:51 np0005588919 python3.9[93364]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:01:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:51.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:51 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.6 scrub starts
Jan 20 09:01:51 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.6 scrub ok
Jan 20 09:01:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:01:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:01:51 np0005588919 python3.9[93518]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:01:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:51.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:52 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Jan 20 09:01:52 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Jan 20 09:01:52 np0005588919 systemd[1]: session-34.scope: Deactivated successfully.
Jan 20 09:01:52 np0005588919 systemd[1]: session-34.scope: Consumed 1min 6.332s CPU time.
Jan 20 09:01:52 np0005588919 systemd-logind[783]: Session 34 logged out. Waiting for processes to exit.
Jan 20 09:01:52 np0005588919 systemd-logind[783]: Removed session 34.
Jan 20 09:01:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:53.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:01:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:53.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:01:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:01:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:55.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:55.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:57.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:01:57 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.9 deep-scrub starts
Jan 20 09:01:57 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.9 deep-scrub ok
Jan 20 09:01:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:01:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:57.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:01:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:01:58 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.a scrub starts
Jan 20 09:01:58 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.a scrub ok
Jan 20 09:01:58 np0005588919 systemd-logind[783]: New session 35 of user zuul.
Jan 20 09:01:58 np0005588919 systemd[1]: Started Session 35 of User zuul.
Jan 20 09:01:59 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.b scrub starts
Jan 20 09:01:59 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.b scrub ok
Jan 20 09:01:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:01:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:01:59.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:01:59 np0005588919 python3.9[93749]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:01:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:01:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:01:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:01:59.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:00 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.c scrub starts
Jan 20 09:02:00 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.c scrub ok
Jan 20 09:02:01 np0005588919 python3.9[93905]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 20 09:02:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:01.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:01 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.d scrub starts
Jan 20 09:02:01 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.d scrub ok
Jan 20 09:02:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:01.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:02 np0005588919 python3.9[94058]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:02:03 np0005588919 python3.9[94142]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 20 09:02:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:02:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:03.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:02:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:03.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:04 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.e deep-scrub starts
Jan 20 09:02:04 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.e deep-scrub ok
Jan 20 09:02:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:05 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.16 scrub starts
Jan 20 09:02:05 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.16 scrub ok
Jan 20 09:02:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:05.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:05 np0005588919 python3.9[94295]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:02:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:05.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:07.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:07 np0005588919 python3.9[94448]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:02:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:02:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:07.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:02:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:09.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:09 np0005588919 python3.9[94601]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:02:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:09.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:10 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.17 deep-scrub starts
Jan 20 09:02:10 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.17 deep-scrub ok
Jan 20 09:02:11 np0005588919 python3.9[94753]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 20 09:02:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:11.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:11 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.1a scrub starts
Jan 20 09:02:11 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.1a scrub ok
Jan 20 09:02:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:11.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:12 np0005588919 python3.9[94905]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:02:12 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.1c scrub starts
Jan 20 09:02:12 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.1c scrub ok
Jan 20 09:02:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:13.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:13.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:14 np0005588919 python3.9[95063]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:02:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:15.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:16.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:16 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.1d deep-scrub starts
Jan 20 09:02:16 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.1d deep-scrub ok
Jan 20 09:02:16 np0005588919 python3.9[95216]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:02:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:17.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:18.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:18 np0005588919 python3.9[95503]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 20 09:02:19 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.1f scrub starts
Jan 20 09:02:19 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 10.1f scrub ok
Jan 20 09:02:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:19.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:19 np0005588919 python3.9[95653]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:02:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:20.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:20 np0005588919 python3.9[95807]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:02:21 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.14 scrub starts
Jan 20 09:02:21 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.14 scrub ok
Jan 20 09:02:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:21.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:22.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:22 np0005588919 python3.9[95960]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:02:23 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.14 scrub starts
Jan 20 09:02:23 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.14 scrub ok
Jan 20 09:02:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:23.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:24.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:24 np0005588919 python3.9[96113]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:02:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:25.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:25 np0005588919 python3.9[96267]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 20 09:02:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:26.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:27 np0005588919 systemd[1]: session-35.scope: Deactivated successfully.
Jan 20 09:02:27 np0005588919 systemd[1]: session-35.scope: Consumed 18.780s CPU time.
Jan 20 09:02:27 np0005588919 systemd-logind[783]: Session 35 logged out. Waiting for processes to exit.
Jan 20 09:02:27 np0005588919 systemd-logind[783]: Removed session 35.
Jan 20 09:02:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:02:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:27.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:02:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:28.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:29 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Jan 20 09:02:29 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Jan 20 09:02:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:29.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:30.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:31 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.10 deep-scrub starts
Jan 20 09:02:31 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.10 deep-scrub ok
Jan 20 09:02:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:31.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:32.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:33.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:33 np0005588919 systemd-logind[783]: New session 36 of user zuul.
Jan 20 09:02:33 np0005588919 systemd[1]: Started Session 36 of User zuul.
Jan 20 09:02:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:34.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:35 np0005588919 python3.9[96447]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:02:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:35.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:36.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:36 np0005588919 python3.9[96603]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:02:37 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.12 scrub starts
Jan 20 09:02:37 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.12 scrub ok
Jan 20 09:02:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:37.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:38 np0005588919 python3.9[96796]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:02:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:38.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:38 np0005588919 systemd[1]: session-36.scope: Deactivated successfully.
Jan 20 09:02:38 np0005588919 systemd[1]: session-36.scope: Consumed 2.741s CPU time.
Jan 20 09:02:38 np0005588919 systemd-logind[783]: Session 36 logged out. Waiting for processes to exit.
Jan 20 09:02:38 np0005588919 systemd-logind[783]: Removed session 36.
Jan 20 09:02:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:39.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 20 09:02:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:40.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 20 09:02:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:41.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:42.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:42 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.5 scrub starts
Jan 20 09:02:42 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.5 scrub ok
Jan 20 09:02:43 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.f scrub starts
Jan 20 09:02:43 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.f scrub ok
Jan 20 09:02:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:43.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:44 np0005588919 systemd-logind[783]: New session 37 of user zuul.
Jan 20 09:02:44 np0005588919 systemd[1]: Started Session 37 of User zuul.
Jan 20 09:02:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:44.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:44 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.4 scrub starts
Jan 20 09:02:44 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.4 scrub ok
Jan 20 09:02:45 np0005588919 python3.9[96975]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:02:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:45.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:46.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:46 np0005588919 python3.9[97129]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:02:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:47.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:47 np0005588919 python3.9[97285]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:02:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:48.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:48 np0005588919 python3.9[97369]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:02:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:49.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:50.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:50 np0005588919 python3.9[97522]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:02:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:51.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:52.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:52 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1 scrub starts
Jan 20 09:02:52 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1 scrub ok
Jan 20 09:02:52 np0005588919 python3.9[97717]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:02:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:53.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:54.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:54 np0005588919 python3.9[97869]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:02:54 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.7 scrub starts
Jan 20 09:02:54 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.7 scrub ok
Jan 20 09:02:55 np0005588919 python3.9[98032]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:02:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:02:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:55.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:55 np0005588919 python3.9[98110]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:02:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:56.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:56 np0005588919 python3.9[98262]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:02:57 np0005588919 python3.9[98340]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:02:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:02:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:57.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:02:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:02:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:02:58.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:02:58 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.1b scrub starts
Jan 20 09:02:58 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.1b scrub ok
Jan 20 09:02:58 np0005588919 python3.9[98611]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:02:59 np0005588919 python3.9[98775]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:02:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:02:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:02:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:02:59.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:02:59 np0005588919 python3.9[98927]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:03:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:00.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:03:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:03:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:03:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:03:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:03:00 np0005588919 python3.9[99081]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:03:01 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.8 scrub starts
Jan 20 09:03:01 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.8 scrub ok
Jan 20 09:03:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:01.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:01 np0005588919 python3.9[99233]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:03:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:02.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:03.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:04.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:04 np0005588919 python3.9[99388]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:03:04 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.19 scrub starts
Jan 20 09:03:04 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.19 scrub ok
Jan 20 09:03:05 np0005588919 python3.9[99542]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:03:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:05 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1c scrub starts
Jan 20 09:03:05 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1c scrub ok
Jan 20 09:03:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:05.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:05 np0005588919 python3.9[99694]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:03:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:06.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:06 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:03:06 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:03:06 np0005588919 python3.9[99896]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:03:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:07.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:07 np0005588919 python3.9[100049]: ansible-service_facts Invoked
Jan 20 09:03:08 np0005588919 network[100066]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:03:08 np0005588919 network[100067]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:03:08 np0005588919 network[100068]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:03:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:08.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:08 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1a scrub starts
Jan 20 09:03:08 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1a scrub ok
Jan 20 09:03:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:09.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:10.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:10 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1e scrub starts
Jan 20 09:03:10 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1e scrub ok
Jan 20 09:03:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:11.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:11 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1d scrub starts
Jan 20 09:03:11 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1d scrub ok
Jan 20 09:03:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:12.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:12 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.12 deep-scrub starts
Jan 20 09:03:12 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.12 deep-scrub ok
Jan 20 09:03:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:13.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:14.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:15.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:16.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:16 np0005588919 python3.9[100520]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:03:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:17.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:17 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.4 scrub starts
Jan 20 09:03:17 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.4 scrub ok
Jan 20 09:03:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:18.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:19.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:19 np0005588919 python3.9[100673]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 20 09:03:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:20.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:21 np0005588919 python3.9[100825]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:21.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:22 np0005588919 python3.9[100903]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:22.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:23 np0005588919 python3.9[101055]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:23.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:23 np0005588919 python3.9[101133]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:23 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1b deep-scrub starts
Jan 20 09:03:23 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 11.1b deep-scrub ok
Jan 20 09:03:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:24.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:25 np0005588919 python3.9[101285]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:25.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:26.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:27 np0005588919 python3.9[101437]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:03:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:27.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:27 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.18 scrub starts
Jan 20 09:03:27 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 8.18 scrub ok
Jan 20 09:03:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:28.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:28 np0005588919 python3.9[101521]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:03:29 np0005588919 systemd[1]: session-37.scope: Deactivated successfully.
Jan 20 09:03:29 np0005588919 systemd[1]: session-37.scope: Consumed 26.984s CPU time.
Jan 20 09:03:29 np0005588919 systemd-logind[783]: Session 37 logged out. Waiting for processes to exit.
Jan 20 09:03:29 np0005588919 systemd-logind[783]: Removed session 37.
Jan 20 09:03:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:29.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:30.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:31.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:32.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:33.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:33 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.e scrub starts
Jan 20 09:03:33 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.e scrub ok
Jan 20 09:03:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:34.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:34 np0005588919 systemd-logind[783]: New session 38 of user zuul.
Jan 20 09:03:34 np0005588919 systemd[1]: Started Session 38 of User zuul.
Jan 20 09:03:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:35 np0005588919 python3.9[101703]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:35.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:36.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:36 np0005588919 python3.9[101855]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:36 np0005588919 python3.9[101933]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:37 np0005588919 systemd[1]: session-38.scope: Deactivated successfully.
Jan 20 09:03:37 np0005588919 systemd[1]: session-38.scope: Consumed 1.801s CPU time.
Jan 20 09:03:37 np0005588919 systemd-logind[783]: Session 38 logged out. Waiting for processes to exit.
Jan 20 09:03:37 np0005588919 systemd-logind[783]: Removed session 38.
Jan 20 09:03:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:37.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:38.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:39.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:40.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:41.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:41 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.6 scrub starts
Jan 20 09:03:41 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.6 scrub ok
Jan 20 09:03:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:42.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:42 np0005588919 systemd-logind[783]: New session 39 of user zuul.
Jan 20 09:03:42 np0005588919 systemd[1]: Started Session 39 of User zuul.
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.813500) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822813653, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2592, "num_deletes": 251, "total_data_size": 5168477, "memory_usage": 5242656, "flush_reason": "Manual Compaction"}
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822851262, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3372780, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7000, "largest_seqno": 9587, "table_properties": {"data_size": 3362979, "index_size": 5655, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 25988, "raw_average_key_size": 21, "raw_value_size": 3340744, "raw_average_value_size": 2765, "num_data_blocks": 251, "num_entries": 1208, "num_filter_entries": 1208, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917647, "oldest_key_time": 1768917647, "file_creation_time": 1768917822, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 37797 microseconds, and 11542 cpu microseconds.
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.851297) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3372780 bytes OK
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.851313) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.853077) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.853089) EVENT_LOG_v1 {"time_micros": 1768917822853085, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.853105) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5156418, prev total WAL file size 5156418, number of live WAL files 2.
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.854183) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3293KB)], [15(7541KB)]
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822854212, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 11095206, "oldest_snapshot_seqno": -1}
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3799 keys, 9488793 bytes, temperature: kUnknown
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822922969, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9488793, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9457817, "index_size": 20370, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9541, "raw_key_size": 91677, "raw_average_key_size": 24, "raw_value_size": 9383694, "raw_average_value_size": 2470, "num_data_blocks": 890, "num_entries": 3799, "num_filter_entries": 3799, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768917822, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.923176) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9488793 bytes
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.939092) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.2 rd, 137.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 7.4 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(6.1) write-amplify(2.8) OK, records in: 4322, records dropped: 523 output_compression: NoCompression
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.939148) EVENT_LOG_v1 {"time_micros": 1768917822939127, "job": 6, "event": "compaction_finished", "compaction_time_micros": 68827, "compaction_time_cpu_micros": 20181, "output_level": 6, "num_output_files": 1, "total_output_size": 9488793, "num_input_records": 4322, "num_output_records": 3799, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822940609, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917822943646, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.854135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.943720) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.943726) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.943728) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.943730) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:03:42.943732) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:03:43 np0005588919 python3.9[102112]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:03:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:43.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:43 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.a scrub starts
Jan 20 09:03:43 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.a scrub ok
Jan 20 09:03:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:44.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:44 np0005588919 python3.9[102268]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:45 np0005588919 python3.9[102443]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:45.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:46 np0005588919 python3.9[102521]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.77r2o6ds recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:46.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:47 np0005588919 python3.9[102673]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:47.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:47 np0005588919 python3.9[102751]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.87u40_y4 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:48.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:48 np0005588919 python3.9[102903]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:03:49 np0005588919 python3.9[103055]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:49.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:49 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.d scrub starts
Jan 20 09:03:49 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.d scrub ok
Jan 20 09:03:50 np0005588919 python3.9[103133]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:03:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:50.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:50 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.f deep-scrub starts
Jan 20 09:03:50 np0005588919 python3.9[103285]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:50 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.f deep-scrub ok
Jan 20 09:03:51 np0005588919 python3.9[103363]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:03:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:51.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:51 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Jan 20 09:03:51 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Jan 20 09:03:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:52.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:52 np0005588919 python3.9[103515]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:52 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.11 scrub starts
Jan 20 09:03:52 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.11 scrub ok
Jan 20 09:03:53 np0005588919 python3.9[103668]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:53.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:53 np0005588919 python3.9[103746]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:54.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:54 np0005588919 python3.9[103898]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:55 np0005588919 python3.9[103976]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:03:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:55.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:03:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:56.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:03:56 np0005588919 python3.9[104128]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:03:56 np0005588919 systemd[1]: Reloading.
Jan 20 09:03:56 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:03:56 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:03:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:03:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:57.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:03:57 np0005588919 python3.9[104317]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:58 np0005588919 python3.9[104395]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:03:58.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:03:59 np0005588919 python3.9[104547]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:03:59 np0005588919 python3.9[104625]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:03:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:03:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:03:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:03:59.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:00.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:00 np0005588919 python3.9[104777]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:04:00 np0005588919 systemd[1]: Reloading.
Jan 20 09:04:00 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:04:00 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:04:01 np0005588919 systemd[1]: Starting Create netns directory...
Jan 20 09:04:01 np0005588919 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 20 09:04:01 np0005588919 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 20 09:04:01 np0005588919 systemd[1]: Finished Create netns directory.
Jan 20 09:04:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:01.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:01 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.12 scrub starts
Jan 20 09:04:01 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.12 scrub ok
Jan 20 09:04:01 np0005588919 python3.9[104968]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:04:02 np0005588919 network[104985]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:04:02 np0005588919 network[104986]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:04:02 np0005588919 network[104987]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:04:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:02.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:03.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:03 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Jan 20 09:04:03 np0005588919 ceph-osd[79119]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Jan 20 09:04:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:04.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:05.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:06.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:07.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:08.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:09 np0005588919 python3.9[105382]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:09.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:09 np0005588919 python3.9[105460]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:04:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:04:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:04:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:04:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:04:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:10.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:11 np0005588919 python3.9[105612]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:11.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:12 np0005588919 python3.9[105764]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:12.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:12 np0005588919 python3.9[105842]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:13.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:14.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:14 np0005588919 python3.9[105996]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 20 09:04:14 np0005588919 systemd[1]: Starting Time & Date Service...
Jan 20 09:04:14 np0005588919 systemd[1]: Started Time & Date Service.
Jan 20 09:04:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:04:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:15.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:04:15 np0005588919 python3.9[106152]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:16.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:04:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:04:16 np0005588919 python3.9[106304]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:17 np0005588919 python3.9[106432]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:17.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:18 np0005588919 python3.9[106584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:18.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:18 np0005588919 python3.9[106663]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.o89gzocf recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:19 np0005588919 python3.9[106815]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:19.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:19 np0005588919 python3.9[106893]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:20.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:20 np0005588919 python3.9[107045]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:04:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:21.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:21 np0005588919 python3[107198]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 20 09:04:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:22.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:22 np0005588919 python3.9[107350]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:23 np0005588919 python3.9[107428]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:23.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:04:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:24.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:04:24 np0005588919 python3.9[107580]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:25 np0005588919 python3.9[107705]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917863.78342-900-128043648860379/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:25.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:26 np0005588919 python3.9[107857]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:26.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:26 np0005588919 python3.9[107935]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:27 np0005588919 python3.9[108087]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:27.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:27 np0005588919 python3.9[108167]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:28.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:28 np0005588919 python3.9[108319]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:29 np0005588919 python3.9[108397]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:29.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:30.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:30 np0005588919 python3.9[108549]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:04:31 np0005588919 python3.9[108704]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:31.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:32 np0005588919 python3.9[108856]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:32.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:32 np0005588919 python3.9[109008]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:33.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:33 np0005588919 python3.9[109160]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 20 09:04:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:34.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:34 np0005588919 python3.9[109312]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 20 09:04:35 np0005588919 systemd[1]: session-39.scope: Deactivated successfully.
Jan 20 09:04:35 np0005588919 systemd[1]: session-39.scope: Consumed 34.593s CPU time.
Jan 20 09:04:35 np0005588919 systemd-logind[783]: Session 39 logged out. Waiting for processes to exit.
Jan 20 09:04:35 np0005588919 systemd-logind[783]: Removed session 39.
Jan 20 09:04:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:35.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:04:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:36.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:04:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:04:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:37.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:04:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:38.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:39.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:40.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:40 np0005588919 systemd-logind[783]: New session 40 of user zuul.
Jan 20 09:04:40 np0005588919 systemd[1]: Started Session 40 of User zuul.
Jan 20 09:04:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:41.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:41 np0005588919 python3.9[109492]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 20 09:04:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:42.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:43 np0005588919 python3.9[109644]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:04:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:43.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:44.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:44 np0005588919 python3.9[109798]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 20 09:04:44 np0005588919 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 20 09:04:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:45.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:45 np0005588919 python3.9[109952]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.hvi5p92_ follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:04:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:46.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:46 np0005588919 python3.9[110077]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.hvi5p92_ mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917884.6058173-108-241460058828216/.source.hvi5p92_ _original_basename=.v17qguwk follow=False checksum=309fed797bdebad351617b1a1ea9eb224966ee92 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:47.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:47 np0005588919 python3.9[110229]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.060283) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888060350, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 870, "num_deletes": 250, "total_data_size": 1771781, "memory_usage": 1798392, "flush_reason": "Manual Compaction"}
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888185120, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 757928, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9593, "largest_seqno": 10457, "table_properties": {"data_size": 754473, "index_size": 1235, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8879, "raw_average_key_size": 20, "raw_value_size": 747181, "raw_average_value_size": 1690, "num_data_blocks": 54, "num_entries": 442, "num_filter_entries": 442, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917823, "oldest_key_time": 1768917823, "file_creation_time": 1768917888, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 124887 microseconds, and 3095 cpu microseconds.
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:04:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:48.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.185168) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 757928 bytes OK
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.185188) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.310387) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.310453) EVENT_LOG_v1 {"time_micros": 1768917888310423, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.310473) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1767301, prev total WAL file size 1767301, number of live WAL files 2.
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.311252) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(740KB)], [18(9266KB)]
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888311321, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 10246721, "oldest_snapshot_seqno": -1}
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 3753 keys, 7633247 bytes, temperature: kUnknown
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888500982, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 7633247, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7605484, "index_size": 17285, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9413, "raw_key_size": 91249, "raw_average_key_size": 24, "raw_value_size": 7534954, "raw_average_value_size": 2007, "num_data_blocks": 754, "num_entries": 3753, "num_filter_entries": 3753, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768917888, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.501331) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 7633247 bytes
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.503530) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 54.0 rd, 40.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 9.0 +0.0 blob) out(7.3 +0.0 blob), read-write-amplify(23.6) write-amplify(10.1) OK, records in: 4241, records dropped: 488 output_compression: NoCompression
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.503565) EVENT_LOG_v1 {"time_micros": 1768917888503549, "job": 8, "event": "compaction_finished", "compaction_time_micros": 189763, "compaction_time_cpu_micros": 34924, "output_level": 6, "num_output_files": 1, "total_output_size": 7633247, "num_input_records": 4241, "num_output_records": 3753, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888504032, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768917888507624, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.311160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.507716) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.507726) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.507730) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.507734) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:04:48.507739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:04:48 np0005588919 python3.9[110381]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDrCUasX8PhlctvvIb2eE6+Z0hELmfczQ6UoBD+mPtCobptr/s786JmwJ3D8nIoKhlCLVSmhRfbqf1Pm45RUPTEtSuaa6HBDy40dZhTXU34X4KbGfKmur2bp9S/1w83ArKvI8inSqqk2qoMx1l7ECkEgeT+GbFwKfYLnbq5OV4Ms3tzl/uFUC/Xzxs2dbXlhozQiSamcO/a6EObErTvR8PrtaOoLFtTiD/I+oN+rkdBPkBc6r0qT4jS7nU1FOlT96meSZHE7Q1n8pxcy9PEc8w9hFdd1Zj8/WcGIdeEJsekuouK1Lut/sofQLZHyUMWJTcnBjx8BsjGx9NjUHPYUWIw+DZo7lT2QurAPNnaX4rp9ciGV2Bdm3ylNoOu3izNvM1JGTw3xRyYrmyxyWv3Euc35JXa0w07Xrqr+6Ckih0WTLU6q3Rlnrc/grpDC821sHrsljerHipJVOCbZB39LvV6wDDBlqfYZzfqID3dIqlVli4eL12J0K7jr7QAlPRhNf0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOG07miJwhzuA/nm0wvGIorydl2xbBiiDhE7PypnJ/jC#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKiJpWtps/bRsuEHfak4zDuqPHKOWFLaEA2h86H7tPlrZHR8okAVZWCmY7keO3Ad1DFyffUtJPKv5OvTK91xGO8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/dEYtIJ/delwiq9xMMctU8myoGU/TKMiFUM+i3BSaGKrC0rujad6qo1LAtjth5aYbBcgBhxy0UEX0oCruQQgc5qDpPmWHmJiAwdQJaDu6GxTRl3PlXF2u4rd0Rz72DAMuCxPSYedeHU91uL4vlrcD95xONWew2wa9lUuqQWdgj8DtqnB9T895BihDk9vFLXAaoGJcYZVGKJmXR8sOzNTFQxefqstVO0/dfbRUyFd0Ukp5v7rTmLxw0Np5WcGMOg9l/iRzWTopxnTRvXpBoGlFCmzNvTG2uH08dJ4FU5Wk9/iSxonuiVJu9DKs8Tp4EajaA4Y6cEuZiMhhqi7vw6zVCQuCmRBpny6Ub1Ag2CesMYgxwOVJO5cHsKh3BzuPFsh1gMgrrZK7v+qfm2r1rhHlPsCWrcnrtUIZa7gyzdFvHytTh/4uyGMgNpbwxkyCxgSN4PleQy2wvxy/DFW+JxCDzI4jK9LFH5aojzEhUtj+P3E7CXL/wRPxDJdfEU6PhTk=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEa1zL0TUD00vr72wZq3y4rgtSnctWBvs+gME/0/EAsV#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI2WwWe4rQW0CaFwcmci1J5n144T87fcxCH+Y2CVZd5XQ7Cvzlhh1cGNDX81Tng3KgxvKOuz3mdiSCLqx8noiD0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/3N9PJZXpat0uFh2x2RoV9B0Ih74HU9CPf+g/5HncM7gCVvCpW3CBde1qNDRU2iY9rzpOVPwzi4YzoAUcxB5KAiqZOI9ylmzfiD8JXQ+myLmIRLxHOdXFaEQ4mMp4W+X37hCZ6sdfm6Yqd6eqBuZrM/72ltYoewWBNCG/Hgqzu30L9WC4+BF+iADHT7Qnmvh/cc9U71WxB4h2ikBo1SdGoFCqoez7ajitqx+dw7VWaOtEPliS0LZuDtN3Zt/cBBgxhb/FaAEI3jRP2ej9X0NJW91YxzBygyxiVasslx92g/GmnDFOWVZb5ai/JJsNH6pLTjs25IzvnuWIf8/ZLgZ03zziR4mBLP12CIVF8g1CzaqK1IILDKkjS/dzDiTBefmiQ2+N0i5EEXOgmxchqOqTkFPQg/ar0+0uBPkwzAI0HDk99czhyYHFlO+PhnULVkL1z+XLwHBgOrbNNVQQcJCvady4Gadh66mu1UrLpryNYOgZiugZi67Biha4ZPzPHok=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF41dx3BXAuEvQwQNtbUM7rIrbaOLr5CRvYNdDD+UMr9#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBENFrTpm22/xEaEJMzd7C5WyJttJdK+HK5kxP8/NuvvAQSlLtEulBZnvD/OX5hk3/sDYhPQelj3YsNX1Plw5PJQ=#012 create=True mode=0644 path=/tmp/ansible.hvi5p92_ state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:49.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:49 np0005588919 python3.9[110533]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.hvi5p92_' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:04:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 09:04:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:50.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 09:04:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:50 np0005588919 python3.9[110687]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.hvi5p92_ state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:04:51 np0005588919 systemd[1]: session-40.scope: Deactivated successfully.
Jan 20 09:04:51 np0005588919 systemd[1]: session-40.scope: Consumed 6.281s CPU time.
Jan 20 09:04:51 np0005588919 systemd-logind[783]: Session 40 logged out. Waiting for processes to exit.
Jan 20 09:04:51 np0005588919 systemd-logind[783]: Removed session 40.
Jan 20 09:04:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:51.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:52.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:04:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:53.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:04:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:54.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:04:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:55.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:56 np0005588919 systemd-logind[783]: New session 41 of user zuul.
Jan 20 09:04:56 np0005588919 systemd[1]: Started Session 41 of User zuul.
Jan 20 09:04:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:56.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:57 np0005588919 python3.9[110868]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:04:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:57.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:04:58.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:58 np0005588919 python3.9[111024]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 20 09:04:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:04:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:04:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:04:59.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:04:59 np0005588919 python3.9[111178]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:05:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:00.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:00 np0005588919 python3.9[111331]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:05:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:01.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:01 np0005588919 python3.9[111485]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:05:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:02.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:02 np0005588919 python3.9[111637]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:03 np0005588919 systemd[1]: session-41.scope: Deactivated successfully.
Jan 20 09:05:03 np0005588919 systemd[1]: session-41.scope: Consumed 4.663s CPU time.
Jan 20 09:05:03 np0005588919 systemd-logind[783]: Session 41 logged out. Waiting for processes to exit.
Jan 20 09:05:03 np0005588919 systemd-logind[783]: Removed session 41.
Jan 20 09:05:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:03.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:04.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:05.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:06.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:07.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:08.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:08 np0005588919 systemd-logind[783]: New session 42 of user zuul.
Jan 20 09:05:08 np0005588919 systemd[1]: Started Session 42 of User zuul.
Jan 20 09:05:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:09.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:09 np0005588919 python3.9[111815]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:05:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:10.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:10 np0005588919 python3.9[111971]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:05:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:11.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:11 np0005588919 python3.9[112055]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 20 09:05:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:12.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:13.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:13 np0005588919 python3.9[112206]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:05:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:14.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:15 np0005588919 python3.9[112357]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 09:05:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:15.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:16 np0005588919 python3.9[112507]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:05:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:16.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:16 np0005588919 python3.9[112657]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:05:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:17.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:17 np0005588919 systemd[1]: session-42.scope: Deactivated successfully.
Jan 20 09:05:17 np0005588919 systemd[1]: session-42.scope: Consumed 6.210s CPU time.
Jan 20 09:05:17 np0005588919 systemd-logind[783]: Session 42 logged out. Waiting for processes to exit.
Jan 20 09:05:17 np0005588919 systemd-logind[783]: Removed session 42.
Jan 20 09:05:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:18.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:19.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:20.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:05:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:05:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:21.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:21 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:05:21 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:05:21 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:05:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:22.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:23 np0005588919 systemd-logind[783]: New session 43 of user zuul.
Jan 20 09:05:23 np0005588919 systemd[1]: Started Session 43 of User zuul.
Jan 20 09:05:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:23.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:24.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:24 np0005588919 python3.9[112966]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:05:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:25.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:26 np0005588919 python3.9[113122]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:26.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:26 np0005588919 python3.9[113274]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:27 np0005588919 python3.9[113476]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:27.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:05:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:05:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:28.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:28 np0005588919 python3.9[113599]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917926.879435-154-126318282434116/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=cdf5e5ce161bbd2dd6884aca648bc9e5c8959a7f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:29 np0005588919 python3.9[113751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:29.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:29 np0005588919 python3.9[113874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917928.7250347-154-209960238308873/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=3f5ad343c2ed5cd826e6179427db625573e3eee3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:30.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:30 np0005588919 python3.9[114026]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:31 np0005588919 python3.9[114149]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917930.0562701-154-171706720860005/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=17aceafee0190f04852bee20eb6589c1f68490ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:31.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:32 np0005588919 python3.9[114301]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:32.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:32 np0005588919 python3.9[114453]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:33 np0005588919 python3.9[114605]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:33.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:34.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:34 np0005588919 python3.9[114728]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917933.0500395-338-6886169923923/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=46a102fd528a2f124a62aab2927298bdecb62ab2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:35 np0005588919 python3.9[114880]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:35 np0005588919 python3.9[115003]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917934.563001-338-45907579591933/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=dbd41a175def1218d1038733ac1d1fb38abc7be7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:35.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:36.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:36 np0005588919 python3.9[115155]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:37 np0005588919 python3.9[115278]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917935.8038375-338-267169653375519/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=fad6f80cf7b7bf75325b82443f1844e21070ed95 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:37.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:37 np0005588919 python3.9[115430]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:38.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:38 np0005588919 python3.9[115582]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:39 np0005588919 python3.9[115734]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:39.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:40 np0005588919 python3.9[115857]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917938.8323934-524-199955351963314/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=61b1b75568b1bbf2c537624a8c374194b06efdf8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:40.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:40 np0005588919 python3.9[116009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:41 np0005588919 python3.9[116132]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917940.2346165-524-133499690949212/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=dbd41a175def1218d1038733ac1d1fb38abc7be7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:41.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:42 np0005588919 python3.9[116284]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:42.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:42 np0005588919 python3.9[116407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917941.633193-524-228759885909605/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=1f5c11b1c9c0e7d13b1d17638556baf1ac5ee1b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:43.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:44 np0005588919 python3.9[116559]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:44.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:44 np0005588919 python3.9[116711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:45 np0005588919 python3.9[116834]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917944.40919-738-260631789454484/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:45.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:46.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:46 np0005588919 python3.9[116986]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:47 np0005588919 python3.9[117138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:47.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:47 np0005588919 python3.9[117261]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917946.7773268-813-261929788429138/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:48.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:48 np0005588919 python3.9[117413]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:49 np0005588919 python3.9[117565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:49.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:50.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:50 np0005588919 python3.9[117688]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917949.1093905-888-26214285001874/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:51 np0005588919 python3.9[117840]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:51.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:51 np0005588919 python3.9[117992]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:05:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:52.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:05:52 np0005588919 python3.9[118115]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917951.448817-960-143006293386587/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:53 np0005588919 python3.9[118267]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:53.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:54 np0005588919 python3.9[118419]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:05:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:54.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:05:54 np0005588919 python3.9[118542]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917953.6655827-1033-2638052903708/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:05:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:55.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:55 np0005588919 python3.9[118694]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:05:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:56.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:56 np0005588919 python3.9[118846]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:05:57 np0005588919 python3.9[118969]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917956.1103356-1093-147307258798727/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a9ac548cf1fa241f1d1335913ca73d2a10501b24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:05:57 np0005588919 systemd[1]: session-43.scope: Deactivated successfully.
Jan 20 09:05:57 np0005588919 systemd[1]: session-43.scope: Consumed 27.058s CPU time.
Jan 20 09:05:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:57.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:57 np0005588919 systemd-logind[783]: Session 43 logged out. Waiting for processes to exit.
Jan 20 09:05:57 np0005588919 systemd-logind[783]: Removed session 43.
Jan 20 09:05:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:05:58.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:05:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:05:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:05:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:05:59.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:00.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:01.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:02.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:02 np0005588919 systemd-logind[783]: New session 44 of user zuul.
Jan 20 09:06:02 np0005588919 systemd[1]: Started Session 44 of User zuul.
Jan 20 09:06:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:03.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:03 np0005588919 python3.9[119149]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:04.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:04 np0005588919 python3.9[119301]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:05 np0005588919 python3.9[119424]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917964.0934212-63-218274905383054/.source.conf _original_basename=ceph.conf follow=False checksum=906e2ddae7738a5e2d5bcdd5b659f6884e758b17 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:05.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:06.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:06 np0005588919 python3.9[119576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:07 np0005588919 python3.9[119699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768917965.7453165-63-108960511882782/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=ddae6cb53c02baaa87ed0e28941db377a2638775 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:07 np0005588919 systemd[1]: session-44.scope: Deactivated successfully.
Jan 20 09:06:07 np0005588919 systemd[1]: session-44.scope: Consumed 3.525s CPU time.
Jan 20 09:06:07 np0005588919 systemd-logind[783]: Session 44 logged out. Waiting for processes to exit.
Jan 20 09:06:07 np0005588919 systemd-logind[783]: Removed session 44.
Jan 20 09:06:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:07.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:08.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:09.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:10.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:11.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:12.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:13 np0005588919 systemd-logind[783]: New session 45 of user zuul.
Jan 20 09:06:13 np0005588919 systemd[1]: Started Session 45 of User zuul.
Jan 20 09:06:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:13.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:14.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:14 np0005588919 python3.9[119877]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:06:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:15.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:16.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:16 np0005588919 python3.9[120033]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:06:17 np0005588919 python3.9[120185]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:06:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:17.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:18.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:18 np0005588919 python3.9[120335]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:06:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:06:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:19.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:06:19 np0005588919 python3.9[120487]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 20 09:06:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:20.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:21.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:22 np0005588919 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 20 09:06:22 np0005588919 python3.9[120643]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:06:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:22.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:23 np0005588919 python3.9[120727]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:06:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:23.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:24.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:25 np0005588919 python3.9[120880]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:06:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:25.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:26.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:26 np0005588919 python3[121035]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 20 09:06:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:27.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:27 np0005588919 python3.9[121287]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:28.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:28 np0005588919 python3.9[121470]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:06:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:06:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:06:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:06:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:06:29 np0005588919 python3.9[121548]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:29.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:30 np0005588919 python3.9[121700]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:30.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:30 np0005588919 python3.9[121778]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xptdz28b recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:31 np0005588919 python3.9[121930]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:31.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:32 np0005588919 python3.9[122008]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:32.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:32 np0005588919 python3.9[122160]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:33.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:34.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:34 np0005588919 python3[122313]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 20 09:06:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:35 np0005588919 python3.9[122465]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:35.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:36 np0005588919 python3.9[122640]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917994.8578079-432-24955069834980/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:06:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:06:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:36.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:37 np0005588919 python3.9[122792]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:37 np0005588919 python3.9[122917]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917996.5200815-477-135677632120386/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:06:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:37.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:06:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:38.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:38 np0005588919 python3.9[123069]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:39 np0005588919 python3.9[123194]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917998.2223763-522-92709814147886/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:39.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:40 np0005588919 python3.9[123346]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:40.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:41 np0005588919 python3.9[123471]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768917999.7761247-567-160315442882850/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:41.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:42 np0005588919 python3.9[123623]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:42.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:42 np0005588919 python3.9[123748]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918001.4105432-612-49106105279695/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:43 np0005588919 python3.9[123900]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:43.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:44.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:44 np0005588919 python3.9[124052]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:45 np0005588919 python3.9[124207]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:45.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:46.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:46 np0005588919 python3.9[124359]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:47 np0005588919 python3.9[124512]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:06:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:47.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:48 np0005588919 python3.9[124666]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:48.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:49 np0005588919 python3.9[124821]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:49.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:50.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:50 np0005588919 python3.9[124971]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:06:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:51.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:51 np0005588919 python3.9[125124]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:51 np0005588919 ovs-vsctl[125125]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 20 09:06:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:52.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:52 np0005588919 python3.9[125277]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:53 np0005588919 python3.9[125432]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:06:53 np0005588919 ovs-vsctl[125434]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 20 09:06:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:53.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:06:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:54.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:06:54 np0005588919 python3.9[125585]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:06:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:06:55 np0005588919 python3.9[125739]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:06:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:55.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:56.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:56 np0005588919 python3.9[125891]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:57 np0005588919 python3.9[125969]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:06:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:57.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:58 np0005588919 python3.9[126121]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:06:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:06:58.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:06:58 np0005588919 python3.9[126199]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:06:59 np0005588919 python3.9[126351]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:06:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:06:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:06:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:06:59.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:00 np0005588919 python3.9[126503]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:00.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:00 np0005588919 python3.9[126581]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:01 np0005588919 python3.9[126733]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:01.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:02 np0005588919 python3.9[126811]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:02.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:02 np0005588919 python3.9[126963]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:07:02 np0005588919 systemd[1]: Reloading.
Jan 20 09:07:02 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:07:03 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:07:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:03.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:04 np0005588919 python3.9[127153]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:04.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:04 np0005588919 python3.9[127231]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:05 np0005588919 python3.9[127383]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:05.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:05 np0005588919 python3.9[127463]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:06.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:06 np0005588919 python3.9[127615]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:07:06 np0005588919 systemd[1]: Reloading.
Jan 20 09:07:07 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:07:07 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:07:07 np0005588919 systemd[1]: Starting Create netns directory...
Jan 20 09:07:07 np0005588919 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 20 09:07:07 np0005588919 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 20 09:07:07 np0005588919 systemd[1]: Finished Create netns directory.
Jan 20 09:07:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:07.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:08.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:08 np0005588919 python3.9[127808]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:09 np0005588919 python3.9[127960]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:09.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:09 np0005588919 python3.9[128083]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918028.713924-1365-273573758594560/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:07:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:10.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:07:11 np0005588919 python3.9[128235]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:11.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:12 np0005588919 python3.9[128387]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:12.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:12 np0005588919 python3.9[128539]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:13 np0005588919 python3.9[128662]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918032.2806437-1464-99481396404226/.source.json _original_basename=.9g9nlfds follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:13.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:14.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:14 np0005588919 python3.9[128812]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:07:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:15.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:07:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:16.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:17 np0005588919 python3.9[129235]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 20 09:07:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:17.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:18.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:18 np0005588919 python3.9[129387]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 20 09:07:19 np0005588919 python3[129539]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 20 09:07:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:19.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:20.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:21.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:07:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5984 writes, 25K keys, 5984 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5984 writes, 915 syncs, 6.54 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5984 writes, 25K keys, 5984 commit groups, 1.0 writes per commit group, ingest: 19.05 MB, 0.03 MB/s#012Interval WAL: 5984 writes, 915 syncs, 6.54 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtab
Jan 20 09:07:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:22.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:23.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:24.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:25 np0005588919 podman[129553]: 2026-01-20 14:07:25.367090338 +0000 UTC m=+5.514869966 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 20 09:07:25 np0005588919 podman[129674]: 2026-01-20 14:07:25.539406305 +0000 UTC m=+0.063909797 container create 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 20 09:07:25 np0005588919 podman[129674]: 2026-01-20 14:07:25.50864365 +0000 UTC m=+0.033147142 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 20 09:07:25 np0005588919 python3[129539]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 20 09:07:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:25.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:26.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:26 np0005588919 python3.9[129864]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:07:27 np0005588919 python3.9[130018]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:27.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:28 np0005588919 python3.9[130094]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:07:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:28.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:28 np0005588919 python3.9[130245]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768918048.2091832-1698-10924979346349/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:29 np0005588919 python3.9[130321]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:07:29 np0005588919 systemd[1]: Reloading.
Jan 20 09:07:29 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:07:29 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:07:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:29.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:30.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:30 np0005588919 python3.9[130433]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:07:30 np0005588919 systemd[1]: Reloading.
Jan 20 09:07:30 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:07:30 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:07:30 np0005588919 systemd[1]: Starting ovn_controller container...
Jan 20 09:07:31 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.053064) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 20 09:07:31 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b3fd958b4440e28ef79052e97d3fa58723aa03456d5a15e11186b1551eb9205/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051053186, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1702, "num_deletes": 251, "total_data_size": 4215090, "memory_usage": 4271720, "flush_reason": "Manual Compaction"}
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051075137, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2763657, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10462, "largest_seqno": 12159, "table_properties": {"data_size": 2756564, "index_size": 4164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 13986, "raw_average_key_size": 19, "raw_value_size": 2742491, "raw_average_value_size": 3793, "num_data_blocks": 188, "num_entries": 723, "num_filter_entries": 723, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917888, "oldest_key_time": 1768917888, "file_creation_time": 1768918051, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 22130 microseconds, and 7980 cpu microseconds.
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.075203) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2763657 bytes OK
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.075228) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.078048) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.078073) EVENT_LOG_v1 {"time_micros": 1768918051078066, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.078099) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4207365, prev total WAL file size 4207365, number of live WAL files 2.
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.080122) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2698KB)], [21(7454KB)]
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051080194, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 10396904, "oldest_snapshot_seqno": -1}
Jan 20 09:07:31 np0005588919 systemd[1]: Started /usr/bin/podman healthcheck run 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f.
Jan 20 09:07:31 np0005588919 podman[130475]: 2026-01-20 14:07:31.114614773 +0000 UTC m=+0.193940202 container init 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: + sudo -E kolla_set_configs
Jan 20 09:07:31 np0005588919 podman[130475]: 2026-01-20 14:07:31.147621901 +0000 UTC m=+0.226947300 container start 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 3959 keys, 8245471 bytes, temperature: kUnknown
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051176799, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 8245471, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8216579, "index_size": 17902, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9925, "raw_key_size": 96167, "raw_average_key_size": 24, "raw_value_size": 8142578, "raw_average_value_size": 2056, "num_data_blocks": 773, "num_entries": 3959, "num_filter_entries": 3959, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768918051, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.178066) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 8245471 bytes
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.179919) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 107.2 rd, 85.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 7.3 +0.0 blob) out(7.9 +0.0 blob), read-write-amplify(6.7) write-amplify(3.0) OK, records in: 4476, records dropped: 517 output_compression: NoCompression
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.179952) EVENT_LOG_v1 {"time_micros": 1768918051179941, "job": 10, "event": "compaction_finished", "compaction_time_micros": 97003, "compaction_time_cpu_micros": 37849, "output_level": 6, "num_output_files": 1, "total_output_size": 8245471, "num_input_records": 4476, "num_output_records": 3959, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:07:31 np0005588919 edpm-start-podman-container[130475]: ovn_controller
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051180596, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918051182024, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.079994) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.182262) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.182270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.182273) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.182275) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:07:31.182281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:07:31 np0005588919 systemd[1]: Created slice User Slice of UID 0.
Jan 20 09:07:31 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 20 09:07:31 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 20 09:07:31 np0005588919 systemd[1]: Starting User Manager for UID 0...
Jan 20 09:07:31 np0005588919 edpm-start-podman-container[130474]: Creating additional drop-in dependency for "ovn_controller" (72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f)
Jan 20 09:07:31 np0005588919 podman[130497]: 2026-01-20 14:07:31.263450833 +0000 UTC m=+0.095065123 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:07:31 np0005588919 systemd[1]: 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f-18ad1359c7a88a0c.service: Main process exited, code=exited, status=1/FAILURE
Jan 20 09:07:31 np0005588919 systemd[1]: 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f-18ad1359c7a88a0c.service: Failed with result 'exit-code'.
Jan 20 09:07:31 np0005588919 systemd[1]: Reloading.
Jan 20 09:07:31 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:07:31 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:07:31 np0005588919 systemd[130528]: Queued start job for default target Main User Target.
Jan 20 09:07:31 np0005588919 systemd[130528]: Created slice User Application Slice.
Jan 20 09:07:31 np0005588919 systemd[130528]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 20 09:07:31 np0005588919 systemd[130528]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 09:07:31 np0005588919 systemd[130528]: Reached target Paths.
Jan 20 09:07:31 np0005588919 systemd[130528]: Reached target Timers.
Jan 20 09:07:31 np0005588919 systemd[130528]: Starting D-Bus User Message Bus Socket...
Jan 20 09:07:31 np0005588919 systemd[130528]: Starting Create User's Volatile Files and Directories...
Jan 20 09:07:31 np0005588919 systemd[130528]: Listening on D-Bus User Message Bus Socket.
Jan 20 09:07:31 np0005588919 systemd[130528]: Reached target Sockets.
Jan 20 09:07:31 np0005588919 systemd[130528]: Finished Create User's Volatile Files and Directories.
Jan 20 09:07:31 np0005588919 systemd[130528]: Reached target Basic System.
Jan 20 09:07:31 np0005588919 systemd[130528]: Reached target Main User Target.
Jan 20 09:07:31 np0005588919 systemd[130528]: Startup finished in 144ms.
Jan 20 09:07:31 np0005588919 systemd[1]: Started User Manager for UID 0.
Jan 20 09:07:31 np0005588919 systemd[1]: Started ovn_controller container.
Jan 20 09:07:31 np0005588919 systemd[1]: Started Session c1 of User root.
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: INFO:__main__:Validating config file
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: INFO:__main__:Writing out command to execute
Jan 20 09:07:31 np0005588919 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: ++ cat /run_command
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: + ARGS=
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: + sudo kolla_copy_cacerts
Jan 20 09:07:31 np0005588919 systemd[1]: Started Session c2 of User root.
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: + [[ ! -n '' ]]
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: + . kolla_extend_start
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: + umask 0022
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 20 09:07:31 np0005588919 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <info>  [1768918051.8241] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <info>  [1768918051.8250] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 20 09:07:31 np0005588919 kernel: br-int: entered promiscuous mode
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <warn>  [1768918051.8253] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 20 09:07:31 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <info>  [1768918051.8262] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <info>  [1768918051.8268] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <info>  [1768918051.8272] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 20 09:07:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:07:31Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <info>  [1768918051.8471] manager: (ovn-920572-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <info>  [1768918051.8485] manager: (ovn-367c1a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 20 09:07:31 np0005588919 systemd-udevd[130630]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:07:31 np0005588919 kernel: genev_sys_6081: entered promiscuous mode
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <info>  [1768918051.8820] device (genev_sys_6081): carrier: link connected
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <info>  [1768918051.8824] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Jan 20 09:07:31 np0005588919 systemd-udevd[130633]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:07:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:31.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:31 np0005588919 NetworkManager[49104]: <info>  [1768918051.9277] manager: (ovn-7c9bfe-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 20 09:07:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:32.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:32 np0005588919 python3.9[130760]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 20 09:07:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:33.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:34 np0005588919 python3.9[130912]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:34.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:34 np0005588919 python3.9[131035]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918053.5880013-1833-214174910689818/.source.yaml _original_basename=.agtvf657 follow=False checksum=aedaf657c77fb1feab67c7335f83a0d24eed0971 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:07:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:35 np0005588919 python3.9[131187]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:07:35 np0005588919 ovs-vsctl[131188]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 20 09:07:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:07:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:35.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:07:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:36.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:36 np0005588919 python3.9[131461]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:07:36 np0005588919 ovs-vsctl[131513]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 20 09:07:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:37 np0005588919 python3.9[131750]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:07:37 np0005588919 ovs-vsctl[131751]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 20 09:07:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:07:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:07:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:37.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:38 np0005588919 systemd[1]: session-45.scope: Deactivated successfully.
Jan 20 09:07:38 np0005588919 systemd[1]: session-45.scope: Consumed 1min 5.539s CPU time.
Jan 20 09:07:38 np0005588919 systemd-logind[783]: Session 45 logged out. Waiting for processes to exit.
Jan 20 09:07:38 np0005588919 systemd-logind[783]: Removed session 45.
Jan 20 09:07:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:38.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:38 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 20 09:07:38 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:07:38 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:38 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:07:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:39.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:40.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:41.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:41 np0005588919 systemd[1]: Stopping User Manager for UID 0...
Jan 20 09:07:41 np0005588919 systemd[130528]: Activating special unit Exit the Session...
Jan 20 09:07:41 np0005588919 systemd[130528]: Stopped target Main User Target.
Jan 20 09:07:41 np0005588919 systemd[130528]: Stopped target Basic System.
Jan 20 09:07:41 np0005588919 systemd[130528]: Stopped target Paths.
Jan 20 09:07:41 np0005588919 systemd[130528]: Stopped target Sockets.
Jan 20 09:07:41 np0005588919 systemd[130528]: Stopped target Timers.
Jan 20 09:07:41 np0005588919 systemd[130528]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 09:07:41 np0005588919 systemd[130528]: Closed D-Bus User Message Bus Socket.
Jan 20 09:07:41 np0005588919 systemd[130528]: Stopped Create User's Volatile Files and Directories.
Jan 20 09:07:41 np0005588919 systemd[130528]: Removed slice User Application Slice.
Jan 20 09:07:41 np0005588919 systemd[130528]: Reached target Shutdown.
Jan 20 09:07:41 np0005588919 systemd[130528]: Finished Exit the Session.
Jan 20 09:07:41 np0005588919 systemd[130528]: Reached target Exit the Session.
Jan 20 09:07:41 np0005588919 systemd[1]: user@0.service: Deactivated successfully.
Jan 20 09:07:41 np0005588919 systemd[1]: Stopped User Manager for UID 0.
Jan 20 09:07:41 np0005588919 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 20 09:07:41 np0005588919 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 20 09:07:41 np0005588919 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 20 09:07:41 np0005588919 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 20 09:07:41 np0005588919 systemd[1]: Removed slice User Slice of UID 0.
Jan 20 09:07:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:07:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:42.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:07:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 09:07:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:43.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 09:07:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:07:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:44.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:07:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:07:44 np0005588919 systemd-logind[783]: New session 47 of user zuul.
Jan 20 09:07:44 np0005588919 systemd[1]: Started Session 47 of User zuul.
Jan 20 09:07:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:45.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:46 np0005588919 python3.9[131981]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:07:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:46.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:47 np0005588919 python3.9[132137]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:07:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:47.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:07:48 np0005588919 python3.9[132289]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:48.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:49 np0005588919 python3.9[132441]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:49 np0005588919 python3.9[132593]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:49.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:50.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:50 np0005588919 python3.9[132746]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:51 np0005588919 python3.9[132896]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:07:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:51.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:52.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:52 np0005588919 python3.9[133048]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 20 09:07:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:53.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:54 np0005588919 python3.9[133198]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:54.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:07:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.0 total, 600.0 interval
Cumulative writes: 2125 writes, 12K keys, 2125 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s
Cumulative WAL: 2125 writes, 2125 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2125 writes, 12K keys, 2125 commit groups, 1.0 writes per commit group, ingest: 23.62 MB, 0.04 MB/s
Interval WAL: 2125 writes, 2125 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     47.2      0.30              0.05         5    0.059       0      0       0.0       0.0
  L6      1/0    7.86 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.3     80.0     67.1      0.47              0.13         4    0.118     16K   1784       0.0       0.0
 Sum      1/0    7.86 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.3     49.1     59.4      0.77              0.18         9    0.085     16K   1784       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.3     49.3     59.6      0.76              0.18         8    0.095     16K   1784       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     80.0     67.1      0.47              0.13         4    0.118     16K   1784       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     47.5      0.29              0.05         4    0.073       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.014, interval 0.014
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.8 seconds
Interval compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.8 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 308.00 MB usage: 1.29 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 5.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(62,1.13 MB,0.365681%) FilterBlock(9,53.86 KB,0.017077%) IndexBlock(9,112.48 KB,0.0356649%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 20 09:07:54 np0005588919 python3.9[133319]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918073.6171181-219-24433097268695/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:07:55 np0005588919 python3.9[133469]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:07:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:07:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:55.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:07:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:56.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:56 np0005588919 python3.9[133590]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918075.2710576-264-91847368609394/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:07:57 np0005588919 python3.9[133742]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:07:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:57.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:58 np0005588919 python3.9[133826]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:07:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:07:58.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:07:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:07:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:07:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:07:59.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:00.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:01 np0005588919 python3.9[133979]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:08:01 np0005588919 ovn_controller[130490]: 2026-01-20T14:08:01Z|00025|memory|INFO|16256 kB peak resident set size after 29.7 seconds
Jan 20 09:08:01 np0005588919 ovn_controller[130490]: 2026-01-20T14:08:01Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 20 09:08:01 np0005588919 podman[133981]: 2026-01-20 14:08:01.48534145 +0000 UTC m=+0.133152505 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:08:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:08:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:01.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:08:02 np0005588919 python3.9[134158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:02.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:02 np0005588919 python3.9[134279]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918081.6440496-375-153670055084101/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:03 np0005588919 python3.9[134431]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:03.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:04 np0005588919 python3.9[134552]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918083.1653614-375-173155184778417/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:04.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:05 np0005588919 python3.9[134702]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:05.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:06 np0005588919 python3.9[134823]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918085.2148197-507-106543190817247/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:06.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:07 np0005588919 python3.9[134973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:07 np0005588919 python3.9[135094]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918086.617277-507-186687113227177/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:07.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:08.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:08 np0005588919 python3.9[135244]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:08:09 np0005588919 python3.9[135398]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:09.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:10 np0005588919 python3.9[135550]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:10.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:10 np0005588919 python3.9[135628]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:11 np0005588919 python3.9[135780]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:11 np0005588919 python3.9[135858]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:11.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:12.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:12 np0005588919 python3.9[136010]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:13 np0005588919 python3.9[136162]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:13.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:14 np0005588919 python3.9[136240]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:08:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:14.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:08:14 np0005588919 python3.9[136392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:15 np0005588919 python3.9[136470]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:15.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:16 np0005588919 python3.9[136622]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:08:16 np0005588919 systemd[1]: Reloading.
Jan 20 09:08:16 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:08:16 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:08:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:16.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:17 np0005588919 python3.9[136812]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:17 np0005588919 python3.9[136890]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:08:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:17.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:08:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:18.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:18 np0005588919 python3.9[137042]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:19 np0005588919 python3.9[137120]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:19.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:20.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:20 np0005588919 python3.9[137272]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:08:20 np0005588919 systemd[1]: Reloading.
Jan 20 09:08:20 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:08:20 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:08:20 np0005588919 systemd[1]: Starting Create netns directory...
Jan 20 09:08:20 np0005588919 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 20 09:08:20 np0005588919 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 20 09:08:20 np0005588919 systemd[1]: Finished Create netns directory.
Jan 20 09:08:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:21.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:22 np0005588919 python3.9[137467]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:22.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:22 np0005588919 python3.9[137619]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:23 np0005588919 python3.9[137744]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918102.2309034-960-242942297915627/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:23.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:24.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:24 np0005588919 python3.9[137896]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:25 np0005588919 python3.9[138048]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:08:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:25.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:26 np0005588919 python3.9[138200]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:08:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:26.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:27 np0005588919 python3.9[138323]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918105.743533-1059-40802652787678/.source.json _original_basename=.8tixbcl9 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:27 np0005588919 python3.9[138473]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:08:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:27.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:28.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:29.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:30 np0005588919 python3.9[138896]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 20 09:08:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:30.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:31 np0005588919 python3.9[139048]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 20 09:08:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:31.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:32 np0005588919 podman[139096]: 2026-01-20 14:08:32.117718775 +0000 UTC m=+0.153262074 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 20 09:08:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:32.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:32 np0005588919 python3[139226]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 20 09:08:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:34.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:34.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:36.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:08:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:36.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:08:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:38.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:38.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:08:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:40.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:08:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:40.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:40 np0005588919 podman[139239]: 2026-01-20 14:08:40.529577605 +0000 UTC m=+7.734853241 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:08:40 np0005588919 podman[139370]: 2026-01-20 14:08:40.623429461 +0000 UTC m=+0.017585661 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:08:40 np0005588919 podman[139370]: 2026-01-20 14:08:40.733570649 +0000 UTC m=+0.127726859 container create 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:08:40 np0005588919 python3[139226]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:08:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:42.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:42.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:44.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:44.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:46.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:46.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:48.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:48.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:48 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 20 09:08:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:50.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 20 09:08:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:08:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:50.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:08:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 20 09:08:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:08:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:52.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:08:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:52.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:54.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:54.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 09:08:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:08:56 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 20 09:08:56 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 20 09:08:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:56 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 20 09:08:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:56.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:08:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:56.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:08:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:08:58.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:58 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 20 09:08:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:08:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:08:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:08:58.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:08:58 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 20 09:08:59 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:09:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:00.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:09:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:00.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:09:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:02.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:09:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:02.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:03 np0005588919 podman[139571]: 2026-01-20 14:09:03.076683939 +0000 UTC m=+0.113756302 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:09:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:04.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:04.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:06.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:06.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:07 np0005588919 python3.9[139723]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:09:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:08.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:08.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:09 np0005588919 python3.9[139877]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:10.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:10 np0005588919 python3.9[139953]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:09:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:10.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:11 np0005588919 python3.9[140105]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768918150.3079674-1293-134556555735601/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:09:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:12.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:09:12 np0005588919 python3.9[140183]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:09:12 np0005588919 systemd[1]: Reloading.
Jan 20 09:09:12 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:12 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:12.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:13 np0005588919 python3.9[140293]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:13 np0005588919 systemd[1]: Reloading.
Jan 20 09:09:14 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:14.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:14 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:14 np0005588919 systemd[1]: Starting ovn_metadata_agent container...
Jan 20 09:09:14 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:09:14 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fd2634cd3aeaaed9ce404ebca3597e4aad63fcf2325112a68effa8e01d4ac52/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 20 09:09:14 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fd2634cd3aeaaed9ce404ebca3597e4aad63fcf2325112a68effa8e01d4ac52/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:09:14 np0005588919 systemd[1]: Started /usr/bin/podman healthcheck run 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500.
Jan 20 09:09:14 np0005588919 podman[140333]: 2026-01-20 14:09:14.477703334 +0000 UTC m=+0.184932984 container init 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: + sudo -E kolla_set_configs
Jan 20 09:09:14 np0005588919 podman[140333]: 2026-01-20 14:09:14.513395978 +0000 UTC m=+0.220625548 container start 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 20 09:09:14 np0005588919 edpm-start-podman-container[140333]: ovn_metadata_agent
Jan 20 09:09:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:14.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:14 np0005588919 edpm-start-podman-container[140332]: Creating additional drop-in dependency for "ovn_metadata_agent" (533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500)
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Validating config file
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Copying service configuration files
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Writing out command to execute
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 20 09:09:14 np0005588919 systemd[1]: Reloading.
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: ++ cat /run_command
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: + CMD=neutron-ovn-metadata-agent
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: + ARGS=
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: + sudo kolla_copy_cacerts
Jan 20 09:09:14 np0005588919 podman[140356]: 2026-01-20 14:09:14.639637913 +0000 UTC m=+0.108046319 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: + [[ ! -n '' ]]
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: + . kolla_extend_start
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: Running command: 'neutron-ovn-metadata-agent'
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: + umask 0022
Jan 20 09:09:14 np0005588919 ovn_metadata_agent[140349]: + exec neutron-ovn-metadata-agent
Jan 20 09:09:14 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:14 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:14 np0005588919 systemd[1]: Started ovn_metadata_agent container.
Jan 20 09:09:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:16.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.332 140354 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.333 140354 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.333 140354 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.333 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.333 140354 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.334 140354 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.334 140354 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.334 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.334 140354 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.334 140354 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.334 140354 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.334 140354 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.334 140354 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.335 140354 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.335 140354 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.335 140354 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.335 140354 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.335 140354 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.335 140354 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.335 140354 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.335 140354 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.335 140354 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.336 140354 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.336 140354 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.336 140354 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.336 140354 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.336 140354 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.336 140354 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.336 140354 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.336 140354 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.336 140354 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.337 140354 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.337 140354 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.337 140354 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.337 140354 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.337 140354 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.337 140354 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.337 140354 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.337 140354 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.338 140354 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.338 140354 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.338 140354 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.338 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.338 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.338 140354 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.338 140354 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.338 140354 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.338 140354 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.338 140354 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.339 140354 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.339 140354 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.339 140354 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.339 140354 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.339 140354 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.339 140354 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.339 140354 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.339 140354 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.339 140354 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.339 140354 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.340 140354 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.340 140354 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.340 140354 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.340 140354 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.340 140354 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.340 140354 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.340 140354 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.340 140354 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.341 140354 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.341 140354 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.341 140354 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.341 140354 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.341 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.341 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.341 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.341 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.341 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.342 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.342 140354 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.342 140354 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.342 140354 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.342 140354 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.342 140354 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.342 140354 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.342 140354 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.342 140354 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.343 140354 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.343 140354 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.343 140354 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.343 140354 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.343 140354 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.343 140354 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.343 140354 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.343 140354 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.343 140354 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.344 140354 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.344 140354 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.344 140354 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.344 140354 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.344 140354 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.344 140354 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.344 140354 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.344 140354 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.344 140354 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.344 140354 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.345 140354 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.345 140354 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.345 140354 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.345 140354 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.345 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.345 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.345 140354 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.345 140354 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.345 140354 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.346 140354 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.346 140354 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.346 140354 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.346 140354 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.346 140354 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.346 140354 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.346 140354 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.346 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.346 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.347 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.347 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.347 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.347 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.347 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.347 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.347 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.347 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.347 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.348 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.348 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.348 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.348 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.348 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.348 140354 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.348 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.348 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.349 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.349 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.349 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.349 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.349 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.349 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.349 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.349 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.349 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.350 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.350 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.350 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.350 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.350 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.350 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.350 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.350 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.350 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.351 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.351 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.351 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.351 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.351 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.351 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.351 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.351 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.351 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.352 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.352 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.352 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.352 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.352 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.352 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.352 140354 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.352 140354 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.352 140354 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.353 140354 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.353 140354 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.353 140354 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.353 140354 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.353 140354 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.353 140354 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.353 140354 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.353 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.354 140354 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.354 140354 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.354 140354 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.354 140354 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.354 140354 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.354 140354 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.354 140354 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.354 140354 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.354 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.355 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.355 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.355 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.355 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.355 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.355 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.355 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.355 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.355 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.355 140354 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.356 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.356 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.356 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.356 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.356 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.356 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.356 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.356 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.357 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.357 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.357 140354 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.357 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.357 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.357 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.357 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.357 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.357 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.358 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.358 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.358 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.358 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.358 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.358 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.358 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.358 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.358 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.359 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.359 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.359 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.359 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.359 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.359 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.359 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.359 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.359 140354 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.360 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.360 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.360 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.360 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.360 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.360 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.360 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.360 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.361 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.361 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.361 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.361 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.361 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.361 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.361 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.361 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.361 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.362 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.362 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.362 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.362 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.362 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.362 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.362 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.362 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.363 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.363 140354 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.363 140354 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.363 140354 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.363 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.363 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.363 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.363 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.363 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.364 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.364 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.364 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.364 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.364 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.364 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.364 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.364 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.364 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.365 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.365 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.365 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.365 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.365 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.365 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.365 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.365 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.365 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.366 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.366 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.366 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.366 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.366 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.366 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.366 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.366 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.366 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.367 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.367 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.367 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.367 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.367 140354 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.367 140354 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.375 140354 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.376 140354 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.376 140354 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.376 140354 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.376 140354 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.389 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 5ffd4ac3-9266-4927-98ad-20a17782c725 (UUID: 5ffd4ac3-9266-4927-98ad-20a17782c725) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.417 140354 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.417 140354 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.417 140354 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.417 140354 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.420 140354 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.425 140354 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.431 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '5ffd4ac3-9266-4927-98ad-20a17782c725'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], external_ids={}, name=5ffd4ac3-9266-4927-98ad-20a17782c725, nb_cfg_timestamp=1768918059851, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.432 140354 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fb671571f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.432 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.433 140354 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.433 140354 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.433 140354 INFO oslo_service.service [-] Starting 1 workers
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.437 140354 DEBUG oslo_service.service [-] Started child 140461 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.440 140354 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpx9hh2_y5/privsep.sock']
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.443 140461 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-164319'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.476 140461 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.477 140461 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.477 140461 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.481 140461 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.489 140461 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 20 09:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:16.498 140461 INFO eventlet.wsgi.server [-] (140461) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 20 09:09:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:16.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:17 np0005588919 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 20 09:09:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:17.202 140354 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 20 09:09:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:17.203 140354 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpx9hh2_y5/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 20 09:09:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:17.064 140466 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 20 09:09:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:17.071 140466 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 20 09:09:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:17.074 140466 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 20 09:09:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:17.075 140466 INFO oslo.privsep.daemon [-] privsep daemon running as pid 140466
Jan 20 09:09:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:17.208 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[0577f207-341a-4fd9-91db-e5923117984a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:09:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:17.733 140466 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:09:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:17.734 140466 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:09:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:17.734 140466 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:09:17 np0005588919 ceph-mds[84722]: mds.beacon.cephfs.compute-1.rtofcx missed beacon ack from the monitors
Jan 20 09:09:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:09:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:18.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:09:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:18.256 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[74b2ea12-9a7b-49ee-9deb-c56cf30bea6b]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:09:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:18.259 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, column=external_ids, values=({'neutron:ovn-metadata-id': '6a850319-0563-5f8e-b562-8d29f1112d59'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:09:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:18.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:20.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:20.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.269 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.357 140354 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.358 140354 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.358 140354 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.358 140354 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.358 140354 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.358 140354 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.358 140354 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.359 140354 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.359 140354 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.359 140354 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.359 140354 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.359 140354 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.359 140354 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.360 140354 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.360 140354 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.360 140354 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.360 140354 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.361 140354 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.361 140354 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.361 140354 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.361 140354 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.361 140354 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.362 140354 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.362 140354 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.362 140354 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.362 140354 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.362 140354 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.363 140354 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.363 140354 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.363 140354 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.363 140354 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.363 140354 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.363 140354 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.364 140354 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.364 140354 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.364 140354 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.364 140354 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.365 140354 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.365 140354 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.365 140354 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.365 140354 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.365 140354 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.366 140354 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.366 140354 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.366 140354 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.366 140354 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.366 140354 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.367 140354 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.367 140354 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.367 140354 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.367 140354 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.367 140354 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.367 140354 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.368 140354 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.368 140354 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.368 140354 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.368 140354 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.368 140354 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.368 140354 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.369 140354 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.369 140354 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.369 140354 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.369 140354 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.369 140354 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.370 140354 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.370 140354 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.370 140354 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.370 140354 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.370 140354 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.370 140354 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.371 140354 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.371 140354 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.371 140354 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.371 140354 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.371 140354 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.371 140354 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.372 140354 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.372 140354 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.372 140354 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.372 140354 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.372 140354 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.372 140354 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.372 140354 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.373 140354 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.373 140354 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.373 140354 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.373 140354 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.373 140354 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.373 140354 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.374 140354 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.374 140354 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.374 140354 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.374 140354 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.374 140354 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.374 140354 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.374 140354 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.375 140354 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.375 140354 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.375 140354 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.375 140354 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.375 140354 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.375 140354 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.375 140354 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.376 140354 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.376 140354 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.376 140354 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.376 140354 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.376 140354 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.376 140354 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.377 140354 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.377 140354 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.377 140354 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.377 140354 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.377 140354 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.377 140354 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.378 140354 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.378 140354 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.378 140354 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.378 140354 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.378 140354 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.378 140354 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.379 140354 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.379 140354 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.379 140354 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.379 140354 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.379 140354 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.380 140354 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.380 140354 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.380 140354 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.380 140354 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.381 140354 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.381 140354 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.381 140354 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.381 140354 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.381 140354 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.381 140354 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.382 140354 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.382 140354 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.382 140354 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.382 140354 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.382 140354 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.383 140354 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.383 140354 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.383 140354 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.383 140354 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.383 140354 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.383 140354 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.383 140354 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.384 140354 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.384 140354 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.384 140354 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.384 140354 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.384 140354 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.384 140354 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.384 140354 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.385 140354 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.385 140354 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.385 140354 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.385 140354 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.385 140354 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.385 140354 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.385 140354 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.386 140354 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.386 140354 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.386 140354 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.386 140354 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.386 140354 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.386 140354 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.386 140354 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.387 140354 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.387 140354 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.387 140354 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.387 140354 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.387 140354 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.387 140354 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.388 140354 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.388 140354 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.388 140354 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.388 140354 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.388 140354 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.388 140354 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.389 140354 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.389 140354 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.389 140354 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.389 140354 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.389 140354 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.390 140354 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.390 140354 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.390 140354 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.390 140354 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.390 140354 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.390 140354 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.391 140354 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.391 140354 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.391 140354 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.391 140354 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.391 140354 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.391 140354 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.392 140354 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.392 140354 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.392 140354 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.392 140354 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.392 140354 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.393 140354 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.393 140354 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.393 140354 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.393 140354 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.393 140354 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.393 140354 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.394 140354 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.394 140354 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.394 140354 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.394 140354 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.394 140354 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.394 140354 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.394 140354 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.395 140354 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.395 140354 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.395 140354 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.395 140354 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.395 140354 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.395 140354 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.396 140354 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.396 140354 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.396 140354 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.396 140354 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.396 140354 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.396 140354 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.396 140354 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.397 140354 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.397 140354 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.397 140354 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.397 140354 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.397 140354 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.397 140354 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.398 140354 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.398 140354 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.398 140354 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.398 140354 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.398 140354 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.399 140354 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.399 140354 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.399 140354 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.399 140354 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.399 140354 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.399 140354 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.399 140354 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.400 140354 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.400 140354 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.400 140354 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.400 140354 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.400 140354 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.400 140354 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.400 140354 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.401 140354 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.401 140354 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.401 140354 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.401 140354 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.401 140354 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.401 140354 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.401 140354 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.402 140354 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.402 140354 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.402 140354 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.402 140354 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.402 140354 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.402 140354 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.403 140354 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.403 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.403 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.403 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.403 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.404 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.404 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.404 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.404 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.404 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.405 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.405 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.405 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.405 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.405 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.405 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.406 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.406 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.406 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.406 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.406 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.406 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.406 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.407 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.407 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.407 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.407 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.407 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.408 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.408 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.408 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.408 140354 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.408 140354 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.408 140354 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.409 140354 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.409 140354 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:09:21.409 140354 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 20 09:09:21 np0005588919 ceph-mds[84722]: mds.beacon.cephfs.compute-1.rtofcx missed beacon ack from the monitors
Jan 20 09:09:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:22.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:22.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).paxos(paxos active c 754..1350) lease_timeout -- calling new election
Jan 20 09:09:23 np0005588919 ceph-mon[81775]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 20 09:09:23 np0005588919 ceph-mon[81775]: paxos.2).electionLogic(14) init, last seen epoch 14
Jan 20 09:09:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 09:09:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 09:09:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:24.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:24.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:25 np0005588919 podman[140614]: 2026-01-20 14:09:25.17671085 +0000 UTC m=+4.487387413 container create 490278afcdd206df9fe453a8f387de564518087f326bd74add4bc55b313e910f (image=quay.io/ceph/haproxy:2.3, name=dazzling_cori)
Jan 20 09:09:25 np0005588919 systemd[1]: Started libpod-conmon-490278afcdd206df9fe453a8f387de564518087f326bd74add4bc55b313e910f.scope.
Jan 20 09:09:25 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:09:25 np0005588919 podman[140614]: 2026-01-20 14:09:25.156394693 +0000 UTC m=+4.467071306 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 20 09:09:25 np0005588919 podman[140614]: 2026-01-20 14:09:25.244811332 +0000 UTC m=+4.555487905 container init 490278afcdd206df9fe453a8f387de564518087f326bd74add4bc55b313e910f (image=quay.io/ceph/haproxy:2.3, name=dazzling_cori)
Jan 20 09:09:25 np0005588919 podman[140614]: 2026-01-20 14:09:25.251904424 +0000 UTC m=+4.562580997 container start 490278afcdd206df9fe453a8f387de564518087f326bd74add4bc55b313e910f (image=quay.io/ceph/haproxy:2.3, name=dazzling_cori)
Jan 20 09:09:25 np0005588919 dazzling_cori[140727]: 0 0
Jan 20 09:09:25 np0005588919 systemd[1]: libpod-490278afcdd206df9fe453a8f387de564518087f326bd74add4bc55b313e910f.scope: Deactivated successfully.
Jan 20 09:09:25 np0005588919 conmon[140727]: conmon 490278afcdd206df9fe4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-490278afcdd206df9fe453a8f387de564518087f326bd74add4bc55b313e910f.scope/container/memory.events
Jan 20 09:09:25 np0005588919 podman[140614]: 2026-01-20 14:09:25.257953235 +0000 UTC m=+4.568629828 container attach 490278afcdd206df9fe453a8f387de564518087f326bd74add4bc55b313e910f (image=quay.io/ceph/haproxy:2.3, name=dazzling_cori)
Jan 20 09:09:25 np0005588919 podman[140614]: 2026-01-20 14:09:25.258328156 +0000 UTC m=+4.569004719 container died 490278afcdd206df9fe453a8f387de564518087f326bd74add4bc55b313e910f (image=quay.io/ceph/haproxy:2.3, name=dazzling_cori)
Jan 20 09:09:25 np0005588919 systemd[1]: var-lib-containers-storage-overlay-ddb69c95c9aea067452f694859e1bc66a18e102056e54319235feae920b3fa3b-merged.mount: Deactivated successfully.
Jan 20 09:09:25 np0005588919 podman[140614]: 2026-01-20 14:09:25.295310696 +0000 UTC m=+4.605987259 container remove 490278afcdd206df9fe453a8f387de564518087f326bd74add4bc55b313e910f (image=quay.io/ceph/haproxy:2.3, name=dazzling_cori)
Jan 20 09:09:25 np0005588919 systemd[1]: libpod-conmon-490278afcdd206df9fe453a8f387de564518087f326bd74add4bc55b313e910f.scope: Deactivated successfully.
Jan 20 09:09:25 np0005588919 systemd[1]: Reloading.
Jan 20 09:09:25 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:25 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:25 np0005588919 systemd[1]: Reloading.
Jan 20 09:09:25 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:25 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:25 np0005588919 ceph-mds[84722]: mds.beacon.cephfs.compute-1.rtofcx missed beacon ack from the monitors
Jan 20 09:09:25 np0005588919 systemd[1]: Starting Ceph haproxy.rgw.default.compute-1.uyeocq for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 09:09:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:26.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:26 np0005588919 podman[140873]: 2026-01-20 14:09:26.23626825 +0000 UTC m=+0.045839572 container create 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:09:26 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be5bb3ed10a159988414e2bbbf41e50e39bab60a2eb0bd9879d92b48cb45352f/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 20 09:09:26 np0005588919 podman[140873]: 2026-01-20 14:09:26.302449928 +0000 UTC m=+0.112021320 container init 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:09:26 np0005588919 podman[140873]: 2026-01-20 14:09:26.211659872 +0000 UTC m=+0.021231164 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 20 09:09:26 np0005588919 podman[140873]: 2026-01-20 14:09:26.307019648 +0000 UTC m=+0.116590950 container start 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:09:26 np0005588919 bash[140873]: 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788
Jan 20 09:09:26 np0005588919 systemd[1]: Started Ceph haproxy.rgw.default.compute-1.uyeocq for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 09:09:26 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq[140888]: [NOTICE] 019/140926 (2) : New worker #1 (4) forked
Jan 20 09:09:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 09:09:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:26.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:26.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:28.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 20 09:09:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:28.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:28.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: host compute-0 `cephadm ceph-volume` failed: Cannot decode JSON: #012Traceback (most recent call last):#012  File "/usr/share/ceph/mgr/cephadm/serve.py", line 1514, in _run_cephadm_json#012    return json.loads(''.join(out))#012  File "/lib64/python3.9/json/__init__.py", line 346, in loads#012    return _default_decoder.decode(s)#012  File "/lib64/python3.9/json/decoder.py", line 337, in decode#012    obj, end = self.raw_decode(s, idx=_w(s, 0).end())#012  File "/lib64/python3.9/json/decoder.py", line 355, in raw_decode#012    raise JSONDecodeError("Expecting value", s, err.value) from None#012json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: Failed to apply osd.default_drive_group spec DriveGroupSpec.from_json(yaml.safe_load('''service_type: osd#012service_id: default_drive_group#012service_name: osd.default_drive_group#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012spec:#012  data_devices:#012    paths:#012    - /dev/ceph_vg0/ceph_lv0#012  filter_logic: AND#012  objectstore: bluestore#012''')): host compute-0 `cephadm ceph-volume` failed: Cannot decode JSON#012Traceback (most recent call last):#012  File "/usr/share/ceph/mgr/cephadm/serve.py", line 1514, in _run_cephadm_json#012    return json.loads(''.join(out))#012  File "/lib64/python3.9/json/__init__.py", line 346, in loads#012    return _default_decoder.decode(s)#012  File "/lib64/python3.9/json/decoder.py", line 337, in decode#012    obj, end = self.raw_decode(s, idx=_w(s, 0).end())#012  File "/lib64/python3.9/json/decoder.py", line 355, in raw_decode#012    raise JSONDecodeError("Expecting value", s, err.value) from None#012json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)#012#012During handling of the above exception, another exception occurred:#012#012Traceback (most recent call last):#012  File "/usr/share/ceph/mgr/cephadm/serve.py", line 577, in _apply_all_services#012    if self._apply_service(spec):#012  File "/usr/share/ceph/mgr/cephadm/serve.py", line 696, in _apply_service#012    self.mgr.osd_service.create_from_spec(cast(DriveGroupSpec, spec))#012  File "/usr/share/ceph/mgr/cephadm/services/osd.py", line 79, in create_from_spec#012    ret = self.mgr.wait_async(all_hosts())#012  File "/usr/share/ceph/mgr/cephadm/module.py", line 735, in wait_async#012    return self.event_loop.get_result(coro, timeout)#012  File "/usr/share/ceph/mgr/cephadm/ssh.py", line 64, in get_result#012    return future.result(timeout)#012  File "/lib64/python3.9/concurrent/futures/_base.py", line 446, in result#012    return self.__get_result()#012  File "/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result#012    raise self._exception#012  File "/usr/share/ceph/mgr/cephadm/services/osd.py", line 76, in all_hosts#012    return await gather(*futures)#012  File "/usr/share/ceph/mgr/cephadm/services/osd.py", line 63, in create_from_spec_one#012    ret_msg = await self.create_single_host(#012  File "/usr/share/ceph/mgr/cephadm/services/osd.py", line 98, in create_single_host#012    return await self.deploy_osd_daemons_for_existing_osds(host, drive_group,#012  File "/usr/share/ceph/mgr/cephadm/services/osd.py", line 158, in deploy_osd_daemons_for_existing_osds#012    raw_elems: dict = await CephadmServe(self.mgr)._run_cephadm_json(#012  File "/usr/share/ceph/mgr/cephadm/serve.py", line 1518, in _run_cephadm_json#012    raise OrchestratorError(msg)#012orchestrator._interface.OrchestratorError: host compute-0 `cephadm ceph-volume` failed: Cannot decode JSON
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: Deploying daemon haproxy.rgw.default.compute-1.uyeocq on compute-1
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: mon.compute-1 calling monitor election
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: mon.compute-2 calling monitor election
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: mon.compute-0 calling monitor election
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 20 09:09:29 np0005588919 ceph-mon[81775]: Deploying daemon keepalived.rgw.default.compute-1.cevitz on compute-1
Jan 20 09:09:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:30.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:30 np0005588919 python3.9[141191]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 20 09:09:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:30.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:30 np0005588919 ceph-mon[81775]: Health check failed: Failed to apply 1 service(s): osd.default_drive_group (CEPHADM_APPLY_SPEC_FAIL)
Jan 20 09:09:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:30.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:31 np0005588919 python3.9[141369]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:09:31 np0005588919 python3.9[141511]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918170.7733998-1428-32947228289228/.source.yaml _original_basename=.xw15sl9t follow=False checksum=cdeb45300f793bd9e5b2caee7d44d83f067a1a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:32.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:32 np0005588919 podman[141043]: 2026-01-20 14:09:32.19332944 +0000 UTC m=+3.006207587 container create 1851a3237434bd1b3bf4c26b538cef939eb774c8b11847a2ed0ec26bfe7317aa (image=quay.io/ceph/keepalived:2.2.4, name=affectionate_robinson, release=1793, distribution-scope=public, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.buildah.version=1.28.2, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git)
Jan 20 09:09:32 np0005588919 podman[141043]: 2026-01-20 14:09:32.177630674 +0000 UTC m=+2.990508851 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 20 09:09:32 np0005588919 systemd[1]: Started libpod-conmon-1851a3237434bd1b3bf4c26b538cef939eb774c8b11847a2ed0ec26bfe7317aa.scope.
Jan 20 09:09:32 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:09:32 np0005588919 podman[141043]: 2026-01-20 14:09:32.288026227 +0000 UTC m=+3.100904394 container init 1851a3237434bd1b3bf4c26b538cef939eb774c8b11847a2ed0ec26bfe7317aa (image=quay.io/ceph/keepalived:2.2.4, name=affectionate_robinson, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1793, name=keepalived, version=2.2.4, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 20 09:09:32 np0005588919 podman[141043]: 2026-01-20 14:09:32.297957449 +0000 UTC m=+3.110835626 container start 1851a3237434bd1b3bf4c26b538cef939eb774c8b11847a2ed0ec26bfe7317aa (image=quay.io/ceph/keepalived:2.2.4, name=affectionate_robinson, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1793, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, architecture=x86_64, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., name=keepalived, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 20 09:09:32 np0005588919 podman[141043]: 2026-01-20 14:09:32.302104907 +0000 UTC m=+3.114983094 container attach 1851a3237434bd1b3bf4c26b538cef939eb774c8b11847a2ed0ec26bfe7317aa (image=quay.io/ceph/keepalived:2.2.4, name=affectionate_robinson, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, description=keepalived for Ceph, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=keepalived-container, release=1793, version=2.2.4, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20)
Jan 20 09:09:32 np0005588919 affectionate_robinson[141566]: 0 0
Jan 20 09:09:32 np0005588919 systemd[1]: libpod-1851a3237434bd1b3bf4c26b538cef939eb774c8b11847a2ed0ec26bfe7317aa.scope: Deactivated successfully.
Jan 20 09:09:32 np0005588919 podman[141043]: 2026-01-20 14:09:32.30608421 +0000 UTC m=+3.118962357 container died 1851a3237434bd1b3bf4c26b538cef939eb774c8b11847a2ed0ec26bfe7317aa (image=quay.io/ceph/keepalived:2.2.4, name=affectionate_robinson, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, version=2.2.4, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, name=keepalived, vendor=Red Hat, Inc.)
Jan 20 09:09:32 np0005588919 systemd[1]: var-lib-containers-storage-overlay-ae9ff7518ea4e04fbc4a988b3d9514cd490fbac9c477eeec28a2a6bc0aeb8d40-merged.mount: Deactivated successfully.
Jan 20 09:09:32 np0005588919 podman[141043]: 2026-01-20 14:09:32.340251339 +0000 UTC m=+3.153129486 container remove 1851a3237434bd1b3bf4c26b538cef939eb774c8b11847a2ed0ec26bfe7317aa (image=quay.io/ceph/keepalived:2.2.4, name=affectionate_robinson, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, io.openshift.tags=Ceph keepalived, release=1793, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 20 09:09:32 np0005588919 systemd[1]: libpod-conmon-1851a3237434bd1b3bf4c26b538cef939eb774c8b11847a2ed0ec26bfe7317aa.scope: Deactivated successfully.
Jan 20 09:09:32 np0005588919 systemd[1]: Reloading.
Jan 20 09:09:32 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:32 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:32.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:32 np0005588919 systemd[1]: session-47.scope: Deactivated successfully.
Jan 20 09:09:32 np0005588919 systemd[1]: session-47.scope: Consumed 1min 1.537s CPU time.
Jan 20 09:09:32 np0005588919 systemd-logind[783]: Session 47 logged out. Waiting for processes to exit.
Jan 20 09:09:32 np0005588919 systemd-logind[783]: Removed session 47.
Jan 20 09:09:32 np0005588919 systemd[1]: Reloading.
Jan 20 09:09:32 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:32 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:32.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:33 np0005588919 systemd[1]: Starting Ceph keepalived.rgw.default.compute-1.cevitz for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 09:09:33 np0005588919 podman[141664]: 2026-01-20 14:09:33.283662873 +0000 UTC m=+0.111583757 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:09:33 np0005588919 podman[141735]: 2026-01-20 14:09:33.34025596 +0000 UTC m=+0.036217059 container create 5aefe91b7ec82d14c11ddef250a4a56008ca1c7923fa55da8e9a43986378df32 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.28.2, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, architecture=x86_64, version=2.2.4, vendor=Red Hat, Inc., vcs-type=git)
Jan 20 09:09:33 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1c25161a5231e80d5ac8ddda72b615c5387a44fbf34d7d20f0c56077ae85edb/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 09:09:33 np0005588919 podman[141735]: 2026-01-20 14:09:33.395113886 +0000 UTC m=+0.091074985 container init 5aefe91b7ec82d14c11ddef250a4a56008ca1c7923fa55da8e9a43986378df32 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, io.openshift.tags=Ceph keepalived, release=1793, vcs-type=git, architecture=x86_64, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.buildah.version=1.28.2, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived)
Jan 20 09:09:33 np0005588919 podman[141735]: 2026-01-20 14:09:33.404554624 +0000 UTC m=+0.100515723 container start 5aefe91b7ec82d14c11ddef250a4a56008ca1c7923fa55da8e9a43986378df32 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, io.openshift.expose-services=, name=keepalived, vendor=Red Hat, Inc., release=1793, vcs-type=git, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 20 09:09:33 np0005588919 bash[141735]: 5aefe91b7ec82d14c11ddef250a4a56008ca1c7923fa55da8e9a43986378df32
Jan 20 09:09:33 np0005588919 podman[141735]: 2026-01-20 14:09:33.324730999 +0000 UTC m=+0.020692118 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 20 09:09:33 np0005588919 systemd[1]: Started Ceph keepalived.rgw.default.compute-1.cevitz for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 09:09:33 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:09:33 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 20 09:09:33 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:09:33 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 20 09:09:33 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:09:33 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 20 09:09:33 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:09:33 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 20 09:09:33 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:09:33 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 20 09:09:33 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:09:33 2026: Starting VRRP child process, pid=4
Jan 20 09:09:33 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:09:33 2026: Startup complete
Jan 20 09:09:33 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:09:33 2026: (VI_0) Entering BACKUP STATE (init)
Jan 20 09:09:33 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:09:33 2026: VRRP_Script(check_backend) succeeded
Jan 20 09:09:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:09:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:34.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:09:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:09:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:34.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:09:34 np0005588919 podman[142031]: 2026-01-20 14:09:34.646179892 +0000 UTC m=+0.080048442 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Jan 20 09:09:34 np0005588919 podman[142031]: 2026-01-20 14:09:34.779560857 +0000 UTC m=+0.213429407 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:09:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:09:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:34.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:09:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:35 np0005588919 podman[142188]: 2026-01-20 14:09:35.751452449 +0000 UTC m=+0.082498072 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:09:35 np0005588919 podman[142188]: 2026-01-20 14:09:35.791486026 +0000 UTC m=+0.122531609 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:09:36 np0005588919 podman[142254]: 2026-01-20 14:09:36.085648314 +0000 UTC m=+0.069232866 container exec 5aefe91b7ec82d14c11ddef250a4a56008ca1c7923fa55da8e9a43986378df32 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, vendor=Red Hat, Inc.)
Jan 20 09:09:36 np0005588919 podman[142254]: 2026-01-20 14:09:36.10029619 +0000 UTC m=+0.083880682 container exec_died 5aefe91b7ec82d14c11ddef250a4a56008ca1c7923fa55da8e9a43986378df32 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, io.buildah.version=1.28.2)
Jan 20 09:09:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:36.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:36.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:36.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:37 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:09:37 2026: (VI_0) Entering MASTER STATE
Jan 20 09:09:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:09:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:09:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:38.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:38 np0005588919 systemd-logind[783]: New session 48 of user zuul.
Jan 20 09:09:38 np0005588919 systemd[1]: Started Session 48 of User zuul.
Jan 20 09:09:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:38.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:38 np0005588919 ceph-mon[81775]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 1 service(s): osd.default_drive_group)
Jan 20 09:09:38 np0005588919 ceph-mon[81775]: Cluster is now healthy
Jan 20 09:09:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:38.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:39 np0005588919 python3.9[142441]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:09:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:40.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:40.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:40 np0005588919 python3.9[142598]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:09:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:40.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:42.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:42 np0005588919 python3.9[142763]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:09:42 np0005588919 systemd[1]: Reloading.
Jan 20 09:09:42 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:09:42 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:09:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:42.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:43.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:43 np0005588919 python3.9[142948]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:09:43 np0005588919 network[142966]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:09:43 np0005588919 network[142967]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:09:43 np0005588919 network[142968]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:09:43 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:43 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:44.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.102 - anonymous [20/Jan/2026:14:09:44.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:44 np0005588919 ceph-mon[81775]: Removing daemon haproxy.rgw.default.compute-2.cuokcs from compute-2 -- ports [8080, 8999]
Jan 20 09:09:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:45.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:45 np0005588919 podman[142997]: 2026-01-20 14:09:45.051557487 +0000 UTC m=+0.125339088 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:09:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth rm", "entity": "client.ingress.rgw.default.compute-2.cuokcs"}]: dispatch
Jan 20 09:09:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:46.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:46 np0005588919 ceph-mon[81775]: Removing key for client.ingress.rgw.default.compute-2.cuokcs
Jan 20 09:09:46 np0005588919 ceph-mon[81775]: Removing daemon keepalived.rgw.default.compute-2.dleeql from compute-2 -- ports []
Jan 20 09:09:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:09:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:47.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:09:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:48.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:48 np0005588919 python3.9[143252]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth rm", "entity": "client.ingress.rgw.default.compute-2.dleeql"}]: dispatch
Jan 20 09:09:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:09:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:49.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:09:49 np0005588919 python3.9[143605]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:49 np0005588919 ceph-mon[81775]: Removing key for client.ingress.rgw.default.compute-2.dleeql
Jan 20 09:09:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:49 np0005588919 podman[143726]: 2026-01-20 14:09:49.912030126 +0000 UTC m=+0.089856181 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 20 09:09:50 np0005588919 podman[143726]: 2026-01-20 14:09:50.012387875 +0000 UTC m=+0.190213910 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 09:09:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:50.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:50 np0005588919 python3.9[143884]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:50 np0005588919 podman[144055]: 2026-01-20 14:09:50.770398967 +0000 UTC m=+0.080088324 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:09:50 np0005588919 podman[144055]: 2026-01-20 14:09:50.782208082 +0000 UTC m=+0.091897349 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:09:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:51.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:51 np0005588919 podman[144206]: 2026-01-20 14:09:51.072737668 +0000 UTC m=+0.072101708 container exec 5aefe91b7ec82d14c11ddef250a4a56008ca1c7923fa55da8e9a43986378df32 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, vcs-type=git, io.openshift.tags=Ceph keepalived, version=2.2.4, com.redhat.component=keepalived-container, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.buildah.version=1.28.2)
Jan 20 09:09:51 np0005588919 podman[144206]: 2026-01-20 14:09:51.094229328 +0000 UTC m=+0.093593358 container exec_died 5aefe91b7ec82d14c11ddef250a4a56008ca1c7923fa55da8e9a43986378df32 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, name=keepalived, vcs-type=git, version=2.2.4, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc.)
Jan 20 09:09:51 np0005588919 python3.9[144203]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.361834) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191361985, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1530, "num_deletes": 257, "total_data_size": 3512167, "memory_usage": 3559760, "flush_reason": "Manual Compaction"}
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191387745, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 2252287, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12164, "largest_seqno": 13689, "table_properties": {"data_size": 2245739, "index_size": 3683, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13582, "raw_average_key_size": 18, "raw_value_size": 2232168, "raw_average_value_size": 3121, "num_data_blocks": 166, "num_entries": 715, "num_filter_entries": 715, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768918052, "oldest_key_time": 1768918052, "file_creation_time": 1768918191, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 25968 microseconds, and 10105 cpu microseconds.
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.387809) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 2252287 bytes OK
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.387833) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.389697) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.389722) EVENT_LOG_v1 {"time_micros": 1768918191389715, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.389744) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3504908, prev total WAL file size 3504908, number of live WAL files 2.
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.391162) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323537' seq:0, type:0; will stop at (end)
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(2199KB)], [24(8052KB)]
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191391207, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 10497758, "oldest_snapshot_seqno": -1}
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4137 keys, 9910000 bytes, temperature: kUnknown
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191487259, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 9910000, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9878300, "index_size": 20262, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10373, "raw_key_size": 101803, "raw_average_key_size": 24, "raw_value_size": 9799484, "raw_average_value_size": 2368, "num_data_blocks": 859, "num_entries": 4137, "num_filter_entries": 4137, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768918191, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.487502) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 9910000 bytes
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.489277) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.2 rd, 103.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 7.9 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(9.1) write-amplify(4.4) OK, records in: 4674, records dropped: 537 output_compression: NoCompression
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.489294) EVENT_LOG_v1 {"time_micros": 1768918191489284, "job": 12, "event": "compaction_finished", "compaction_time_micros": 96142, "compaction_time_cpu_micros": 33932, "output_level": 6, "num_output_files": 1, "total_output_size": 9910000, "num_input_records": 4674, "num_output_records": 4137, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191489936, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918191491159, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.391062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.491236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.491241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.491243) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.491246) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:09:51.491248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:52 np0005588919 python3.9[144507]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:52.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:52 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:09:52 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:09:52 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:09:52 np0005588919 python3.9[144677]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:53.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:53 np0005588919 python3.9[144831]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:09:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:54.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:09:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:55.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:09:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:09:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:09:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:56.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:09:56 np0005588919 python3.9[144985]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:57.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:57 np0005588919 python3.9[145137]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:57 np0005588919 python3.9[145290]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:09:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:09:58.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:09:58 np0005588919 python3.9[145442]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:09:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:09:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:09:59.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:09:59 np0005588919 python3.9[145594]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:09:59 np0005588919 python3.9[145797]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 09:10:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:00.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:00 np0005588919 python3.9[145949]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:01.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:01 np0005588919 ceph-mon[81775]: Reconfiguring keepalived.rgw.default.compute-0.gcjsxe (dependencies changed)...
Jan 20 09:10:01 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 20 09:10:01 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 20 09:10:01 np0005588919 ceph-mon[81775]: Reconfiguring daemon keepalived.rgw.default.compute-0.gcjsxe on compute-0
Jan 20 09:10:01 np0005588919 python3.9[146102]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:02.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:02 np0005588919 python3.9[146254]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:03.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:03 np0005588919 python3.9[146489]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:03 np0005588919 podman[146531]: 2026-01-20 14:10:03.281493292 +0000 UTC m=+0.071174131 container create 05e74022568c9941c63722eb3c5cf98e4e561a45e5da76bee1e1e08f5188d90c (image=quay.io/ceph/keepalived:2.2.4, name=quirky_cannon, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 20 09:10:03 np0005588919 systemd[1]: Started libpod-conmon-05e74022568c9941c63722eb3c5cf98e4e561a45e5da76bee1e1e08f5188d90c.scope.
Jan 20 09:10:03 np0005588919 podman[146531]: 2026-01-20 14:10:03.251361787 +0000 UTC m=+0.041042716 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 20 09:10:03 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:10:03 np0005588919 podman[146531]: 2026-01-20 14:10:03.395546818 +0000 UTC m=+0.185227757 container init 05e74022568c9941c63722eb3c5cf98e4e561a45e5da76bee1e1e08f5188d90c (image=quay.io/ceph/keepalived:2.2.4, name=quirky_cannon, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, version=2.2.4, name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, description=keepalived for Ceph, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 20 09:10:03 np0005588919 podman[146531]: 2026-01-20 14:10:03.41219112 +0000 UTC m=+0.201871999 container start 05e74022568c9941c63722eb3c5cf98e4e561a45e5da76bee1e1e08f5188d90c (image=quay.io/ceph/keepalived:2.2.4, name=quirky_cannon, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., architecture=x86_64, description=keepalived for Ceph, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, io.openshift.expose-services=, build-date=2023-02-22T09:23:20)
Jan 20 09:10:03 np0005588919 podman[146531]: 2026-01-20 14:10:03.418035876 +0000 UTC m=+0.207716745 container attach 05e74022568c9941c63722eb3c5cf98e4e561a45e5da76bee1e1e08f5188d90c (image=quay.io/ceph/keepalived:2.2.4, name=quirky_cannon, vcs-type=git, name=keepalived, com.redhat.component=keepalived-container, version=2.2.4, distribution-scope=public, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 20 09:10:03 np0005588919 quirky_cannon[146587]: 0 0
Jan 20 09:10:03 np0005588919 systemd[1]: libpod-05e74022568c9941c63722eb3c5cf98e4e561a45e5da76bee1e1e08f5188d90c.scope: Deactivated successfully.
Jan 20 09:10:03 np0005588919 podman[146531]: 2026-01-20 14:10:03.419632041 +0000 UTC m=+0.209312890 container died 05e74022568c9941c63722eb3c5cf98e4e561a45e5da76bee1e1e08f5188d90c (image=quay.io/ceph/keepalived:2.2.4, name=quirky_cannon, distribution-scope=public, name=keepalived, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, vendor=Red Hat, Inc., version=2.2.4, vcs-type=git, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 20 09:10:03 np0005588919 systemd[1]: var-lib-containers-storage-overlay-bd96cd56c0156c2fc010f920ec08ada05d304117ea15340123bd4fe5fddf7f64-merged.mount: Deactivated successfully.
Jan 20 09:10:03 np0005588919 podman[146531]: 2026-01-20 14:10:03.497856021 +0000 UTC m=+0.287536890 container remove 05e74022568c9941c63722eb3c5cf98e4e561a45e5da76bee1e1e08f5188d90c (image=quay.io/ceph/keepalived:2.2.4, name=quirky_cannon, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, description=keepalived for Ceph, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.buildah.version=1.28.2)
Jan 20 09:10:03 np0005588919 systemd[1]: libpod-conmon-05e74022568c9941c63722eb3c5cf98e4e561a45e5da76bee1e1e08f5188d90c.scope: Deactivated successfully.
Jan 20 09:10:03 np0005588919 podman[146588]: 2026-01-20 14:10:03.532784093 +0000 UTC m=+0.167888945 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 20 09:10:03 np0005588919 systemd[1]: Stopping Ceph keepalived.rgw.default.compute-1.cevitz for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 09:10:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:03 np0005588919 ceph-mon[81775]: Reconfiguring keepalived.rgw.default.compute-1.cevitz (dependencies changed)...
Jan 20 09:10:03 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 20 09:10:03 np0005588919 ceph-mon[81775]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 20 09:10:03 np0005588919 ceph-mon[81775]: Reconfiguring daemon keepalived.rgw.default.compute-1.cevitz on compute-1
Jan 20 09:10:03 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:10:03 2026: Stopping
Jan 20 09:10:03 np0005588919 python3.9[146765]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:04.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:04 np0005588919 python3.9[146932]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:04 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:10:04 2026: Stopped
Jan 20 09:10:04 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[141751]: Tue Jan 20 14:10:04 2026: Stopped Keepalived v2.2.4 (08/21,2021)
Jan 20 09:10:04 np0005588919 podman[146767]: 2026-01-20 14:10:04.827589389 +0000 UTC m=+1.076964375 container died 5aefe91b7ec82d14c11ddef250a4a56008ca1c7923fa55da8e9a43986378df32 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., release=1793, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.expose-services=, name=keepalived, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 20 09:10:04 np0005588919 systemd[1]: var-lib-containers-storage-overlay-e1c25161a5231e80d5ac8ddda72b615c5387a44fbf34d7d20f0c56077ae85edb-merged.mount: Deactivated successfully.
Jan 20 09:10:04 np0005588919 podman[146767]: 2026-01-20 14:10:04.895832816 +0000 UTC m=+1.145207832 container remove 5aefe91b7ec82d14c11ddef250a4a56008ca1c7923fa55da8e9a43986378df32 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., release=1793, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=keepalived, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph)
Jan 20 09:10:04 np0005588919 bash[146767]: ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz
Jan 20 09:10:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:05.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:05 np0005588919 systemd[1]: ceph-e399cf45-e6b6-5393-99f1-75c601d3f188@keepalived.rgw.default.compute-1.cevitz.service: Deactivated successfully.
Jan 20 09:10:05 np0005588919 systemd[1]: Stopped Ceph keepalived.rgw.default.compute-1.cevitz for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 09:10:05 np0005588919 systemd[1]: ceph-e399cf45-e6b6-5393-99f1-75c601d3f188@keepalived.rgw.default.compute-1.cevitz.service: Consumed 1.249s CPU time.
Jan 20 09:10:05 np0005588919 systemd[1]: Starting Ceph keepalived.rgw.default.compute-1.cevitz for e399cf45-e6b6-5393-99f1-75c601d3f188...
Jan 20 09:10:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:05 np0005588919 podman[147170]: 2026-01-20 14:10:05.363937331 +0000 UTC m=+0.056498224 container create e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, description=keepalived for Ceph, distribution-scope=public, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, name=keepalived, io.buildah.version=1.28.2, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, architecture=x86_64)
Jan 20 09:10:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0b704a066dc7e5a0e85f636af2537598365003570ac0fe0f0eed25f7c708f9f/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 09:10:05 np0005588919 podman[147170]: 2026-01-20 14:10:05.420842886 +0000 UTC m=+0.113403819 container init e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, architecture=x86_64, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, name=keepalived, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.openshift.expose-services=, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc.)
Jan 20 09:10:05 np0005588919 podman[147170]: 2026-01-20 14:10:05.42660094 +0000 UTC m=+0.119161843 container start e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, name=keepalived, com.redhat.component=keepalived-container, version=2.2.4, release=1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, distribution-scope=public, io.openshift.tags=Ceph keepalived)
Jan 20 09:10:05 np0005588919 bash[147170]: e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962
Jan 20 09:10:05 np0005588919 podman[147170]: 2026-01-20 14:10:05.339546969 +0000 UTC m=+0.032107902 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 20 09:10:05 np0005588919 systemd[1]: Started Ceph keepalived.rgw.default.compute-1.cevitz for e399cf45-e6b6-5393-99f1-75c601d3f188.
Jan 20 09:10:05 np0005588919 python3.9[147153]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:05 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[147186]: Tue Jan 20 14:10:05 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 20 09:10:05 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[147186]: Tue Jan 20 14:10:05 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 20 09:10:05 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[147186]: Tue Jan 20 14:10:05 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 20 09:10:05 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[147186]: Tue Jan 20 14:10:05 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 20 09:10:05 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[147186]: Tue Jan 20 14:10:05 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 20 09:10:05 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[147186]: Tue Jan 20 14:10:05 2026: Starting VRRP child process, pid=4
Jan 20 09:10:05 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[147186]: Tue Jan 20 14:10:05 2026: Startup complete
Jan 20 09:10:05 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[147186]: Tue Jan 20 14:10:05 2026: (VI_0) Entering BACKUP STATE (init)
Jan 20 09:10:05 np0005588919 ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz[147186]: Tue Jan 20 14:10:05 2026: VRRP_Script(check_backend) succeeded
Jan 20 09:10:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:06.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:06 np0005588919 python3.9[147471]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:10:06 np0005588919 podman[147543]: 2026-01-20 14:10:06.818722918 +0000 UTC m=+0.470687259 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 09:10:06 np0005588919 podman[147543]: 2026-01-20 14:10:06.923489592 +0000 UTC m=+0.575453963 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 09:10:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:07.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:07 np0005588919 python3.9[147710]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:07 np0005588919 podman[147910]: 2026-01-20 14:10:07.860531974 +0000 UTC m=+0.100056440 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:10:07 np0005588919 podman[147910]: 2026-01-20 14:10:07.87343526 +0000 UTC m=+0.112959756 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:10:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:08 np0005588919 python3.9[148020]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 09:10:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:08.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:08 np0005588919 podman[148041]: 2026-01-20 14:10:08.190720395 +0000 UTC m=+0.092691142 container exec e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, distribution-scope=public)
Jan 20 09:10:08 np0005588919 podman[148041]: 2026-01-20 14:10:08.236507535 +0000 UTC m=+0.138478382 container exec_died e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, name=keepalived, architecture=x86_64, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 20 09:10:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:09.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:09 np0005588919 python3.9[148276]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:10:09 np0005588919 systemd[1]: Reloading.
Jan 20 09:10:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:10:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:10:09 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:10:09 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:10:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:10.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:10 np0005588919 python3.9[148465]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:11.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:11 np0005588919 python3.9[148618]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:11 np0005588919 python3.9[148772]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:12.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:12 np0005588919 python3.9[148925]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:13.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:13 np0005588919 python3.9[149078]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:14.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:14 np0005588919 python3.9[149232]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:15.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:15 np0005588919 python3.9[149385]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:10:15 np0005588919 podman[149387]: 2026-01-20 14:10:15.279346531 +0000 UTC m=+0.092421914 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 09:10:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:16.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:10:16.370 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:10:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:10:16.371 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:10:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:10:16.371 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:10:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:17.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:17 np0005588919 python3.9[149609]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 20 09:10:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:10:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:18.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:18 np0005588919 python3.9[149762]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 09:10:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:19.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:20.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:21 np0005588919 python3.9[149921]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 20 09:10:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:21.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:22.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:22 np0005588919 python3.9[150082]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:10:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:23.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:23 np0005588919 python3.9[150166]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:10:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:24.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:25.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:26.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:27.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:28.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:29.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:30.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:31.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:32.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:33.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:34 np0005588919 podman[150326]: 2026-01-20 14:10:34.147598482 +0000 UTC m=+0.163647282 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:10:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:34.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:35.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:36.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:37.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:38.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:39.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:40.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:41.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:42.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:43.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:44.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:45.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:46 np0005588919 podman[150444]: 2026-01-20 14:10:46.063994147 +0000 UTC m=+0.094263889 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:10:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:46.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:47.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:48.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:49.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:50.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:51.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:52.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:52 np0005588919 kernel: SELinux:  Converting 2776 SID table entries...
Jan 20 09:10:52 np0005588919 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 09:10:52 np0005588919 kernel: SELinux:  policy capability open_perms=1
Jan 20 09:10:52 np0005588919 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 09:10:52 np0005588919 kernel: SELinux:  policy capability always_check_network=0
Jan 20 09:10:52 np0005588919 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 09:10:52 np0005588919 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 09:10:52 np0005588919 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 09:10:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:10:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:53.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:10:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:10:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:54.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:10:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:55.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:10:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:56.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:57.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:10:58.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:10:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:10:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:10:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:10:59.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:00.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:01.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:11:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:02.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:11:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:03.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:03 np0005588919 kernel: SELinux:  Converting 2776 SID table entries...
Jan 20 09:11:03 np0005588919 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 09:11:03 np0005588919 kernel: SELinux:  policy capability open_perms=1
Jan 20 09:11:03 np0005588919 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 09:11:03 np0005588919 kernel: SELinux:  policy capability always_check_network=0
Jan 20 09:11:03 np0005588919 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 09:11:03 np0005588919 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 09:11:03 np0005588919 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 09:11:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:04.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:04 np0005588919 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 20 09:11:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:11:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:05.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:11:05 np0005588919 podman[150537]: 2026-01-20 14:11:05.170153745 +0000 UTC m=+0.170370693 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:11:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:06.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:11:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:07.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:11:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:11:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:08.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:11:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:09.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:10.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:11.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:11:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:12.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:11:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:13.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:14.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:15.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:16.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:11:16.372 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:11:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:11:16.372 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:11:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:11:16.373 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:11:17 np0005588919 podman[151361]: 2026-01-20 14:11:17.03606589 +0000 UTC m=+0.075590890 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 20 09:11:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:17.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:18.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:19.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:11:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:20.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:11:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:11:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:21.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:11:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:22.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:11:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:23.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:24.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:25.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:26.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:27.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:28.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:29.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:30.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:31.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:32.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:11:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:11:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:33.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:34.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:35.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:36 np0005588919 podman[160625]: 2026-01-20 14:11:36.157911151 +0000 UTC m=+0.169320004 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:11:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:36.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:37.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:38.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:39.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:40.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:41.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:42.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:43.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:44.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:45.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:46.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:47.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:48 np0005588919 podman[166335]: 2026-01-20 14:11:48.053962957 +0000 UTC m=+0.089217647 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 20 09:11:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:48.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:49.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:50.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:51.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:52.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:53.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:11:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:54.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:11:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:55.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:11:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:11:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:56.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:11:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:11:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:57.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:11:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:11:58.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:11:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:11:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:11:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:11:59.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:12:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:00.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:12:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:01.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:02.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:03.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:04 np0005588919 kernel: SELinux:  Converting 2777 SID table entries...
Jan 20 09:12:04 np0005588919 kernel: SELinux:  policy capability network_peer_controls=1
Jan 20 09:12:04 np0005588919 kernel: SELinux:  policy capability open_perms=1
Jan 20 09:12:04 np0005588919 kernel: SELinux:  policy capability extended_socket_class=1
Jan 20 09:12:04 np0005588919 kernel: SELinux:  policy capability always_check_network=0
Jan 20 09:12:04 np0005588919 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 20 09:12:04 np0005588919 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 20 09:12:04 np0005588919 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 20 09:12:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:04.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:05.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:05 np0005588919 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 20 09:12:05 np0005588919 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 20 09:12:05 np0005588919 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 20 09:12:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:06 np0005588919 podman[167892]: 2026-01-20 14:12:06.380218276 +0000 UTC m=+0.136227253 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 20 09:12:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:06.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:07.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:08.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:12:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:09.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:12:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:10.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:11.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:12:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:12.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:12:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:13.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:14 np0005588919 systemd[1]: Stopping OpenSSH server daemon...
Jan 20 09:12:14 np0005588919 systemd[1]: sshd.service: Deactivated successfully.
Jan 20 09:12:14 np0005588919 systemd[1]: Stopped OpenSSH server daemon.
Jan 20 09:12:14 np0005588919 systemd[1]: sshd.service: Consumed 5.778s CPU time, read 32.0K from disk, written 176.0K to disk.
Jan 20 09:12:14 np0005588919 systemd[1]: Stopped target sshd-keygen.target.
Jan 20 09:12:14 np0005588919 systemd[1]: Stopping sshd-keygen.target...
Jan 20 09:12:14 np0005588919 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 09:12:14 np0005588919 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 09:12:14 np0005588919 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 20 09:12:14 np0005588919 systemd[1]: Reached target sshd-keygen.target.
Jan 20 09:12:14 np0005588919 systemd[1]: Starting OpenSSH server daemon...
Jan 20 09:12:14 np0005588919 systemd[1]: Started OpenSSH server daemon.
Jan 20 09:12:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:14.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:15.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:16 np0005588919 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 09:12:16 np0005588919 systemd[1]: Starting man-db-cache-update.service...
Jan 20 09:12:16 np0005588919 systemd[1]: Reloading.
Jan 20 09:12:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:12:16.373 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:12:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:12:16.374 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:12:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:12:16.374 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:12:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:16.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:16 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:16 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:16 np0005588919 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 09:12:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:17.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:18.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:19 np0005588919 podman[171625]: 2026-01-20 14:12:19.064755009 +0000 UTC m=+0.102024122 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:12:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:19.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:20.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:21.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:22.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:23.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:24.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:25.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:26.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:26 np0005588919 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 09:12:26 np0005588919 systemd[1]: Finished man-db-cache-update.service.
Jan 20 09:12:26 np0005588919 systemd[1]: man-db-cache-update.service: Consumed 13.456s CPU time.
Jan 20 09:12:26 np0005588919 systemd[1]: run-r63e1f19053ee4943b3154f925b07c783.service: Deactivated successfully.
Jan 20 09:12:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:27.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:27 np0005588919 python3.9[177621]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:12:27 np0005588919 systemd[1]: Reloading.
Jan 20 09:12:27 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:27 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:28.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:28 np0005588919 python3.9[177812]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:12:28 np0005588919 systemd[1]: Reloading.
Jan 20 09:12:29 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:29 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:29.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:30 np0005588919 python3.9[178003]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:12:30 np0005588919 systemd[1]: Reloading.
Jan 20 09:12:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:30 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:30 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:30.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:31.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:31 np0005588919 python3.9[178243]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:12:31 np0005588919 systemd[1]: Reloading.
Jan 20 09:12:31 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:31 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:32.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:32 np0005588919 python3.9[178529]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:32 np0005588919 systemd[1]: Reloading.
Jan 20 09:12:33 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:33 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:33.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:34.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:12:34 np0005588919 python3.9[178757]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:34 np0005588919 systemd[1]: Reloading.
Jan 20 09:12:34 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:34 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:35.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:12:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:12:36 np0005588919 python3.9[178948]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:36 np0005588919 systemd[1]: Reloading.
Jan 20 09:12:36 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:36 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:12:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:36.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:12:36 np0005588919 podman[178987]: 2026-01-20 14:12:36.656548188 +0000 UTC m=+0.126069615 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:12:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:37.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:37 np0005588919 python3.9[179163]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:38.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:38 np0005588919 python3.9[179319]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:38 np0005588919 systemd[1]: Reloading.
Jan 20 09:12:38 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:38 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:39.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:39 np0005588919 python3.9[179510]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 20 09:12:39 np0005588919 systemd[1]: Reloading.
Jan 20 09:12:39 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:12:39 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:12:40 np0005588919 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 20 09:12:40 np0005588919 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 20 09:12:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:40.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:41 np0005588919 python3.9[179703]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:41.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:42 np0005588919 python3.9[179859]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:42.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:43 np0005588919 python3.9[180014]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:43.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:43 np0005588919 python3.9[180170]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:44.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:44 np0005588919 python3.9[180325]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:45.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:45 np0005588919 python3.9[180480]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:46.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:46 np0005588919 python3.9[180686]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:12:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:12:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:47.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:12:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:48.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:12:48 np0005588919 python3.9[180842]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:49.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:49 np0005588919 podman[180969]: 2026-01-20 14:12:49.3333935 +0000 UTC m=+0.118311234 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 20 09:12:49 np0005588919 python3.9[181012]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:12:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:50.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:12:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:12:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:51.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:12:51 np0005588919 python3.9[181222]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:12:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:52.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:12:52 np0005588919 python3.9[181378]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:53.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:53 np0005588919 python3.9[181533]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:54.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:54 np0005588919 python3.9[181689]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:55.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:12:55 np0005588919 python3.9[181844]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 20 09:12:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:56.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:57.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:12:58.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:58 np0005588919 python3.9[182001]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:12:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:12:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:12:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:12:59.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:12:59 np0005588919 python3.9[182153]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:13:00 np0005588919 python3.9[182306]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:13:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:00.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:00 np0005588919 python3.9[182458]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:13:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:01.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:01 np0005588919 python3.9[182610]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:13:02 np0005588919 python3.9[182763]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:13:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:02.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:03 np0005588919 python3.9[182913]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:13:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:13:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:03.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:13:04 np0005588919 python3.9[183066]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:04.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:04 np0005588919 python3.9[183191]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918383.4060566-1647-21936290292157/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:05.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:05 np0005588919 python3.9[183343]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:06 np0005588919 python3.9[183469]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918385.106534-1647-227078017862475/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:06.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:07 np0005588919 podman[183593]: 2026-01-20 14:13:07.176219466 +0000 UTC m=+0.186266966 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 20 09:13:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:07.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:07 np0005588919 python3.9[183637]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:08 np0005588919 python3.9[183772]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918386.652252-1647-229829906464272/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:08.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:08 np0005588919 python3.9[183924]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:09.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:09 np0005588919 python3.9[184049]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918388.2250416-1647-73206054279324/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:10 np0005588919 python3.9[184202]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:13:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:10.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:13:11 np0005588919 python3.9[184377]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918389.6745286-1647-215140573588030/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:11.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:11 np0005588919 python3.9[184530]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:12.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:12 np0005588919 python3.9[184655]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918391.219992-1647-105273642242027/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:13.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:13 np0005588919 python3.9[184807]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:13 np0005588919 python3.9[184931]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918392.7503943-1647-253771963565580/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:14.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:14 np0005588919 python3.9[185083]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:15.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:15 np0005588919 python3.9[185208]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768918394.084506-1647-77345410761574/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:16 np0005588919 python3.9[185361]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 20 09:13:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:13:16.373 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:13:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:13:16.374 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:13:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:13:16.374 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:13:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:16.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:17 np0005588919 python3.9[185514]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:17.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:17 np0005588919 python3.9[185667]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:18.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:18 np0005588919 python3.9[185819]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:19 np0005588919 python3.9[185971]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:19.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:19 np0005588919 podman[186096]: 2026-01-20 14:13:19.858065944 +0000 UTC m=+0.128284119 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 20 09:13:20 np0005588919 python3.9[186142]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:13:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:20.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:13:20 np0005588919 python3.9[186294]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:21.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:21 np0005588919 python3.9[186446]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:22 np0005588919 python3.9[186599]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:22.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:23 np0005588919 python3.9[186751]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:23.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:23 np0005588919 python3.9[186904]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:24.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:24 np0005588919 python3.9[187056]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:25.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:25 np0005588919 python3.9[187208]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:26 np0005588919 python3.9[187361]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:26.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:27 np0005588919 python3.9[187513]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:27.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:27 np0005588919 ceph-mgr[82135]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 09:13:28 np0005588919 python3.9[187666]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:28.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:28 np0005588919 python3.9[187789]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918407.536898-2310-275333947936575/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:29.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:29 np0005588919 python3.9[187941]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:30 np0005588919 python3.9[188065]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918408.9991007-2310-198369940559772/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:30.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:30 np0005588919 python3.9[188267]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:31.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:31 np0005588919 python3.9[188390]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918410.4076924-2310-231478974680917/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:32 np0005588919 python3.9[188543]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:32.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:33 np0005588919 python3.9[188666]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918411.7933903-2310-278971580310133/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:33.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:33 np0005588919 python3.9[188819]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:13:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:34.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:13:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:35.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:13:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:37.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:13:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:37.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:37 np0005588919 python3.9[188942]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918413.1734455-2310-26613866388463/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:38 np0005588919 podman[188945]: 2026-01-20 14:13:38.182130602 +0000 UTC m=+0.199846597 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 20 09:13:38 np0005588919 python3.9[189123]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:39.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:39.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:39 np0005588919 python3.9[189246]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918418.092242-2310-252288190296608/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:40 np0005588919 python3.9[189399]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:40 np0005588919 python3.9[189522]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918419.523305-2310-257019743708740/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:41.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:41.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:41 np0005588919 python3.9[189674]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:41 np0005588919 python3.9[189798]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918420.8661747-2310-130548258020543/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:42 np0005588919 python3.9[189950]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:43.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:43.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:43 np0005588919 python3.9[190073]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918422.125068-2310-226069141755676/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:44 np0005588919 python3.9[190226]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.606991) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424607115, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 2647, "num_deletes": 501, "total_data_size": 6153070, "memory_usage": 6229720, "flush_reason": "Manual Compaction"}
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424631284, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 2361262, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13694, "largest_seqno": 16336, "table_properties": {"data_size": 2353696, "index_size": 3740, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2757, "raw_key_size": 21565, "raw_average_key_size": 19, "raw_value_size": 2335088, "raw_average_value_size": 2130, "num_data_blocks": 169, "num_entries": 1096, "num_filter_entries": 1096, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768918191, "oldest_key_time": 1768918191, "file_creation_time": 1768918424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 24371 microseconds, and 11176 cpu microseconds.
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.631363) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 2361262 bytes OK
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.631386) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.633998) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.634028) EVENT_LOG_v1 {"time_micros": 1768918424634019, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.634054) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 6140736, prev total WAL file size 6140736, number of live WAL files 2.
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.636626) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(2305KB)], [27(9677KB)]
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424636682, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 12271262, "oldest_snapshot_seqno": -1}
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4320 keys, 8321044 bytes, temperature: kUnknown
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424696419, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 8321044, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8290807, "index_size": 18351, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10821, "raw_key_size": 106684, "raw_average_key_size": 24, "raw_value_size": 8211287, "raw_average_value_size": 1900, "num_data_blocks": 773, "num_entries": 4320, "num_filter_entries": 4320, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768918424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.696683) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 8321044 bytes
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.710830) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 205.2 rd, 139.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 9.5 +0.0 blob) out(7.9 +0.0 blob), read-write-amplify(8.7) write-amplify(3.5) OK, records in: 5233, records dropped: 913 output_compression: NoCompression
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.710899) EVENT_LOG_v1 {"time_micros": 1768918424710856, "job": 14, "event": "compaction_finished", "compaction_time_micros": 59811, "compaction_time_cpu_micros": 35803, "output_level": 6, "num_output_files": 1, "total_output_size": 8321044, "num_input_records": 5233, "num_output_records": 4320, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424711699, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918424714994, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.636516) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.715071) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.715079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.715083) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.715087) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:13:44.715091) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:13:44 np0005588919 python3.9[190349]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918423.5945578-2310-149168487224057/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:45.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:45.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:45 np0005588919 python3.9[190501]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:46 np0005588919 python3.9[190625]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918424.877487-2310-233701375186516/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:46 np0005588919 python3.9[190850]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:47.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:47.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:47 np0005588919 python3.9[191032]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918426.2347817-2310-76284868860818/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:13:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:13:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:13:48 np0005588919 python3.9[191185]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:48 np0005588919 auditd[701]: Audit daemon rotating log files
Jan 20 09:13:48 np0005588919 python3.9[191308]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918427.7473671-2310-52907660477716/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:49.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:49.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:49 np0005588919 python3.9[191460]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:13:50 np0005588919 podman[191514]: 2026-01-20 14:13:50.048432762 +0000 UTC m=+0.086723614 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:13:50 np0005588919 python3.9[191603]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918429.0384455-2310-126275348430243/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:51.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:51 np0005588919 python3.9[191803]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:13:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:13:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:51.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:13:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:52 np0005588919 python3.9[191959]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 20 09:13:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:53.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:53.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:54 np0005588919 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 20 09:13:54 np0005588919 python3.9[192166]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.003000085s ======
Jan 20 09:13:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:55.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000085s
Jan 20 09:13:55 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:13:55 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:13:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:55.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:55 np0005588919 python3.9[192318]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:56 np0005588919 python3.9[192471]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:56 np0005588919 python3.9[192623]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:13:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:57.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:13:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:57.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:13:57 np0005588919 python3.9[192776]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:58 np0005588919 python3.9[192928]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:13:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:13:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:13:59.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:13:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:13:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:13:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:13:59.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:13:59 np0005588919 python3.9[193080]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:00 np0005588919 python3.9[193233]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:01 np0005588919 python3.9[193385]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:01.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:01.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:01 np0005588919 python3.9[193538]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:03 np0005588919 python3.9[193690]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:14:03 np0005588919 systemd[1]: Reloading.
Jan 20 09:14:03 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:03 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:03.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:03.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:03 np0005588919 systemd[1]: Starting libvirt logging daemon socket...
Jan 20 09:14:03 np0005588919 systemd[1]: Listening on libvirt logging daemon socket.
Jan 20 09:14:03 np0005588919 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 20 09:14:03 np0005588919 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 20 09:14:03 np0005588919 systemd[1]: Starting libvirt logging daemon...
Jan 20 09:14:03 np0005588919 systemd[1]: Started libvirt logging daemon.
Jan 20 09:14:04 np0005588919 python3.9[193883]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:14:04 np0005588919 systemd[1]: Reloading.
Jan 20 09:14:04 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:04 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:04 np0005588919 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 20 09:14:04 np0005588919 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 20 09:14:04 np0005588919 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 20 09:14:04 np0005588919 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 20 09:14:04 np0005588919 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 20 09:14:04 np0005588919 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 20 09:14:04 np0005588919 systemd[1]: Starting libvirt nodedev daemon...
Jan 20 09:14:04 np0005588919 systemd[1]: Started libvirt nodedev daemon.
Jan 20 09:14:05 np0005588919 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 20 09:14:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:14:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:05.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:14:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:05.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:05 np0005588919 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 20 09:14:05 np0005588919 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 20 09:14:05 np0005588919 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 20 09:14:05 np0005588919 python3.9[194100]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:14:05 np0005588919 systemd[1]: Reloading.
Jan 20 09:14:05 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:05 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:05 np0005588919 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 20 09:14:05 np0005588919 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 20 09:14:05 np0005588919 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 20 09:14:05 np0005588919 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 20 09:14:05 np0005588919 systemd[1]: Starting libvirt proxy daemon...
Jan 20 09:14:05 np0005588919 systemd[1]: Started libvirt proxy daemon.
Jan 20 09:14:06 np0005588919 setroubleshoot[194024]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a9a0f37e-5c50-4812-95fd-ebd8f6c1134a
Jan 20 09:14:06 np0005588919 setroubleshoot[194024]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 20 09:14:06 np0005588919 setroubleshoot[194024]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a9a0f37e-5c50-4812-95fd-ebd8f6c1134a
Jan 20 09:14:06 np0005588919 setroubleshoot[194024]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 20 09:14:06 np0005588919 python3.9[194321]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:14:06 np0005588919 systemd[1]: Reloading.
Jan 20 09:14:06 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:06 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:07.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:07 np0005588919 systemd[1]: Listening on libvirt locking daemon socket.
Jan 20 09:14:07 np0005588919 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 20 09:14:07 np0005588919 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 20 09:14:07 np0005588919 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 20 09:14:07 np0005588919 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 20 09:14:07 np0005588919 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 20 09:14:07 np0005588919 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 20 09:14:07 np0005588919 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 20 09:14:07 np0005588919 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 20 09:14:07 np0005588919 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 20 09:14:07 np0005588919 systemd[1]: Starting libvirt QEMU daemon...
Jan 20 09:14:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:14:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:07.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:14:07 np0005588919 systemd[1]: Started libvirt QEMU daemon.
Jan 20 09:14:08 np0005588919 python3.9[194537]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:14:08 np0005588919 systemd[1]: Reloading.
Jan 20 09:14:08 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:08 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:08 np0005588919 podman[194539]: 2026-01-20 14:14:08.377145995 +0000 UTC m=+0.156854266 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:14:08 np0005588919 systemd[1]: Starting libvirt secret daemon socket...
Jan 20 09:14:08 np0005588919 systemd[1]: Listening on libvirt secret daemon socket.
Jan 20 09:14:08 np0005588919 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 20 09:14:08 np0005588919 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 20 09:14:08 np0005588919 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 20 09:14:08 np0005588919 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 20 09:14:08 np0005588919 systemd[1]: Starting libvirt secret daemon...
Jan 20 09:14:08 np0005588919 systemd[1]: Started libvirt secret daemon.
Jan 20 09:14:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:09.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:09.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:09 np0005588919 python3.9[194776]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:10 np0005588919 python3.9[194928]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 09:14:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:14:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:11.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:14:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:11.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:11 np0005588919 python3.9[195130]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:12 np0005588919 python3.9[195285]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 09:14:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:13.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:13.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:13 np0005588919 python3.9[195435]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:13 np0005588919 python3.9[195557]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918452.8055089-3384-70850878791469/.source.xml follow=False _original_basename=secret.xml.j2 checksum=35bbbade4f0995b3fba698d107c82491080dc0dd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:14 np0005588919 python3.9[195709]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine e399cf45-e6b6-5393-99f1-75c601d3f188#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:15.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:15.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:15 np0005588919 python3.9[195872]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:14:16.374 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:14:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:14:16.375 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:14:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:14:16.375 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:14:16 np0005588919 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 20 09:14:16 np0005588919 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.006s CPU time.
Jan 20 09:14:16 np0005588919 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 20 09:14:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:17.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:17.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:18 np0005588919 python3.9[196336]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:19.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:19.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:19 np0005588919 python3.9[196489]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:20 np0005588919 podman[196584]: 2026-01-20 14:14:20.244242057 +0000 UTC m=+0.089431921 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:14:20 np0005588919 python3.9[196631]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918459.1811197-3549-19146597404779/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:21.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:21.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:21 np0005588919 python3.9[196783]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:22 np0005588919 python3.9[196936]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:22 np0005588919 python3.9[197014]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:23.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:23.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:23 np0005588919 python3.9[197166]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:24 np0005588919 python3.9[197245]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.a4sskzt7 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:25 np0005588919 python3.9[197397]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:25.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:14:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:25.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:14:25 np0005588919 python3.9[197475]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:26 np0005588919 python3.9[197628]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:27.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:27.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:27 np0005588919 python3[197781]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 20 09:14:28 np0005588919 python3.9[197934]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:29 np0005588919 python3.9[198012]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:29.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:29.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:30 np0005588919 python3.9[198165]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:30 np0005588919 python3.9[198290]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918469.4103603-3816-66047356827061/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:31.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:31.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:31 np0005588919 python3.9[198492]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:32 np0005588919 python3.9[198571]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:32 np0005588919 python3.9[198723]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:33.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:33.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:33 np0005588919 python3.9[198801]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:34 np0005588919 python3.9[198954]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:34 np0005588919 python3.9[199079]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768918473.6725929-3933-125345158119954/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:35.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:35.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:35 np0005588919 python3.9[199232]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:36 np0005588919 python3.9[199384]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:37.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:14:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:37.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:14:37 np0005588919 python3.9[199540]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:38 np0005588919 podman[199664]: 2026-01-20 14:14:38.772193579 +0000 UTC m=+0.193731143 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 20 09:14:38 np0005588919 python3.9[199705]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:39.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:39.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:39 np0005588919 python3.9[199872]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:14:40 np0005588919 python3.9[200026]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:14:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:41.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:41.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:41 np0005588919 python3.9[200181]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:42 np0005588919 python3.9[200334]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:43 np0005588919 python3.9[200457]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918481.9260406-4149-141649947844015/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:43.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:43.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:43 np0005588919 python3.9[200610]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:44 np0005588919 python3.9[200733]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918483.4803226-4194-253603264495037/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:45.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:45.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:45 np0005588919 python3.9[200885]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:14:46 np0005588919 python3.9[201009]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918484.9260616-4239-101700953845709/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:14:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:47.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:47.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:47 np0005588919 python3.9[201161]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:14:47 np0005588919 systemd[1]: Reloading.
Jan 20 09:14:47 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:47 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:47 np0005588919 systemd[1]: Reached target edpm_libvirt.target.
Jan 20 09:14:48 np0005588919 python3.9[201353]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 20 09:14:48 np0005588919 systemd[1]: Reloading.
Jan 20 09:14:48 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:48 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:49 np0005588919 systemd[1]: Reloading.
Jan 20 09:14:49 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:14:49 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:14:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:49.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:49.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:50 np0005588919 systemd[1]: session-48.scope: Deactivated successfully.
Jan 20 09:14:50 np0005588919 systemd[1]: session-48.scope: Consumed 4min 768ms CPU time.
Jan 20 09:14:50 np0005588919 systemd-logind[783]: Session 48 logged out. Waiting for processes to exit.
Jan 20 09:14:50 np0005588919 systemd-logind[783]: Removed session 48.
Jan 20 09:14:51 np0005588919 podman[201450]: 2026-01-20 14:14:51.05431591 +0000 UTC m=+0.089471283 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:14:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:14:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:51.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:14:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:51.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:53.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:53.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:55.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:55.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:55 np0005588919 systemd-logind[783]: New session 49 of user zuul.
Jan 20 09:14:55 np0005588919 systemd[1]: Started Session 49 of User zuul.
Jan 20 09:14:56 np0005588919 python3.9[201808]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:14:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:14:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:14:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:14:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:14:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:57.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:57.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:14:58 np0005588919 python3.9[201963]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:14:58 np0005588919 network[201980]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:14:58 np0005588919 network[201981]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:14:58 np0005588919 network[201982]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:14:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:14:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:14:59.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:14:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:14:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:14:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:14:59.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:15:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:01.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:15:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:01.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:15:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:03.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:15:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:15:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:03.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:15:04 np0005588919 python3.9[202257]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 20 09:15:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:05.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:05.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:05 np0005588919 python3.9[202392]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:15:06 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:15:06 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:15:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:07.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:07.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:09 np0005588919 podman[202395]: 2026-01-20 14:15:09.112800887 +0000 UTC m=+0.147316295 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 20 09:15:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:09.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:09.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:11.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:11.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:12 np0005588919 python3.9[202625]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:15:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:13 np0005588919 python3.9[202777]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:15:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:13.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:13.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:14 np0005588919 python3.9[202931]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:15:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:15.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:15.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:15 np0005588919 python3.9[203083]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:15:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:15:16.375 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:15:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:15:16.376 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:15:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:15:16.376 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:15:16 np0005588919 python3.9[203237]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:15:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:17 np0005588919 python3.9[203360]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918515.8805401-246-74879687429696/.source.iscsi _original_basename=.8m6cec_c follow=False checksum=bbab9a6763471a42af22f3fd2e64e0a859c979e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:15:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:17.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:15:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:15:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:17.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:15:18 np0005588919 python3.9[203513]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.005283) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519005414, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1103, "num_deletes": 251, "total_data_size": 2529106, "memory_usage": 2561896, "flush_reason": "Manual Compaction"}
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519022133, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 1658900, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16341, "largest_seqno": 17439, "table_properties": {"data_size": 1653996, "index_size": 2492, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10360, "raw_average_key_size": 19, "raw_value_size": 1644225, "raw_average_value_size": 3090, "num_data_blocks": 113, "num_entries": 532, "num_filter_entries": 532, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768918425, "oldest_key_time": 1768918425, "file_creation_time": 1768918519, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 17014 microseconds, and 7913 cpu microseconds.
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.022309) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 1658900 bytes OK
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.022388) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.024082) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.024105) EVENT_LOG_v1 {"time_micros": 1768918519024098, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.024128) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 2523798, prev total WAL file size 2523798, number of live WAL files 2.
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.025746) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(1620KB)], [30(8126KB)]
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519025823, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 9979944, "oldest_snapshot_seqno": -1}
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4337 keys, 7965577 bytes, temperature: kUnknown
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519100578, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 7965577, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7935520, "index_size": 18107, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10885, "raw_key_size": 107582, "raw_average_key_size": 24, "raw_value_size": 7855950, "raw_average_value_size": 1811, "num_data_blocks": 759, "num_entries": 4337, "num_filter_entries": 4337, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768918519, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.100901) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 7965577 bytes
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.103246) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 133.3 rd, 106.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 7.9 +0.0 blob) out(7.6 +0.0 blob), read-write-amplify(10.8) write-amplify(4.8) OK, records in: 4852, records dropped: 515 output_compression: NoCompression
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.103276) EVENT_LOG_v1 {"time_micros": 1768918519103263, "job": 16, "event": "compaction_finished", "compaction_time_micros": 74849, "compaction_time_cpu_micros": 34902, "output_level": 6, "num_output_files": 1, "total_output_size": 7965577, "num_input_records": 4852, "num_output_records": 4337, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519105379, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918519108899, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.025637) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.109042) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.109051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.109054) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.109057) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:19.109060) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:19.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:19.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:19 np0005588919 python3.9[203665]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:19 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:15:19 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:15:20 np0005588919 python3.9[203819]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:15:21 np0005588919 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 20 09:15:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:21.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:21.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:21 np0005588919 podman[203948]: 2026-01-20 14:15:21.834821723 +0000 UTC m=+0.078595303 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 20 09:15:22 np0005588919 python3.9[203996]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:15:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:22 np0005588919 systemd[1]: Reloading.
Jan 20 09:15:22 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:15:22 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:15:22 np0005588919 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 20 09:15:22 np0005588919 systemd[1]: Starting Open-iSCSI...
Jan 20 09:15:22 np0005588919 kernel: Loading iSCSI transport class v2.0-870.
Jan 20 09:15:22 np0005588919 systemd[1]: Started Open-iSCSI.
Jan 20 09:15:22 np0005588919 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 20 09:15:22 np0005588919 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 20 09:15:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:23.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:23.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.665666) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524665725, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 304, "num_deletes": 256, "total_data_size": 118358, "memory_usage": 124792, "flush_reason": "Manual Compaction"}
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524668993, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 77787, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17445, "largest_seqno": 17743, "table_properties": {"data_size": 75866, "index_size": 149, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4505, "raw_average_key_size": 16, "raw_value_size": 72071, "raw_average_value_size": 259, "num_data_blocks": 7, "num_entries": 278, "num_filter_entries": 278, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768918519, "oldest_key_time": 1768918519, "file_creation_time": 1768918524, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 3356 microseconds, and 1161 cpu microseconds.
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.669032) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 77787 bytes OK
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.669046) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.670463) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.670476) EVENT_LOG_v1 {"time_micros": 1768918524670472, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.670493) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 116142, prev total WAL file size 116142, number of live WAL files 2.
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.670856) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323533' seq:0, type:0; will stop at (end)
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(75KB)], [33(7778KB)]
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524671024, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 8043364, "oldest_snapshot_seqno": -1}
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4095 keys, 7701612 bytes, temperature: kUnknown
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524733038, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 7701612, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7673374, "index_size": 16928, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10245, "raw_key_size": 103856, "raw_average_key_size": 25, "raw_value_size": 7598151, "raw_average_value_size": 1855, "num_data_blocks": 696, "num_entries": 4095, "num_filter_entries": 4095, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768918524, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.733429) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 7701612 bytes
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.735660) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.4 rd, 123.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 7.6 +0.0 blob) out(7.3 +0.0 blob), read-write-amplify(202.4) write-amplify(99.0) OK, records in: 4615, records dropped: 520 output_compression: NoCompression
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.735697) EVENT_LOG_v1 {"time_micros": 1768918524735681, "job": 18, "event": "compaction_finished", "compaction_time_micros": 62155, "compaction_time_cpu_micros": 20664, "output_level": 6, "num_output_files": 1, "total_output_size": 7701612, "num_input_records": 4615, "num_output_records": 4095, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524736248, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918524739318, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.670778) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.739425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.739430) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.739433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.739436) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:15:24.739439) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:15:24 np0005588919 python3.9[204197]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:15:24 np0005588919 network[204214]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:15:24 np0005588919 network[204215]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:15:24 np0005588919 network[204216]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:15:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:25.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:25.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:27.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:27.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:15:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:29.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:15:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:29.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:31.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:31.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:31 np0005588919 python3.9[204542]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:15:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:33.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:33.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:34 np0005588919 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 09:15:34 np0005588919 systemd[1]: Starting man-db-cache-update.service...
Jan 20 09:15:34 np0005588919 systemd[1]: Reloading.
Jan 20 09:15:34 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:15:34 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:15:34 np0005588919 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 09:15:35 np0005588919 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 09:15:35 np0005588919 systemd[1]: Finished man-db-cache-update.service.
Jan 20 09:15:35 np0005588919 systemd[1]: run-r1b0671403e194c929df19f8d19fb484c.service: Deactivated successfully.
Jan 20 09:15:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:35.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:35.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:36 np0005588919 python3.9[204861]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 20 09:15:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:15:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:37.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:15:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:37.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:37 np0005588919 python3.9[205013]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 20 09:15:38 np0005588919 python3.9[205170]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:15:39 np0005588919 python3.9[205293]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918537.9494734-510-106715687623771/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:39.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:39.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:40 np0005588919 podman[205394]: 2026-01-20 14:15:40.139713618 +0000 UTC m=+0.162359533 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:15:40 np0005588919 python3.9[205473]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:41.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:41.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:41 np0005588919 python3.9[205626]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:15:41 np0005588919 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 20 09:15:41 np0005588919 systemd[1]: Stopped Load Kernel Modules.
Jan 20 09:15:41 np0005588919 systemd[1]: Stopping Load Kernel Modules...
Jan 20 09:15:41 np0005588919 systemd[1]: Starting Load Kernel Modules...
Jan 20 09:15:41 np0005588919 systemd[1]: Finished Load Kernel Modules.
Jan 20 09:15:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:42 np0005588919 python3.9[205783]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:15:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:43.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:43.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:43 np0005588919 python3.9[205936]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:15:44 np0005588919 python3.9[206089]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:15:45 np0005588919 python3.9[206212]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918544.0613-663-93457358283600/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:45.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:45.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:45 np0005588919 python3.9[206365]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:15:46 np0005588919 python3.9[206518]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:47.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:47.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:47 np0005588919 python3.9[206670]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:48 np0005588919 python3.9[206823]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:49.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:49.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:49 np0005588919 python3.9[206975]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:50 np0005588919 python3.9[207128]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:50 np0005588919 python3.9[207280]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:51.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:51.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:51 np0005588919 python3.9[207432]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:15:52 np0005588919 podman[207531]: 2026-01-20 14:15:52.027966239 +0000 UTC m=+0.059912892 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:15:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:52 np0005588919 python3.9[207654]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:15:53 np0005588919 python3.9[207808]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:15:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:15:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:53.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:15:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:53.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:54 np0005588919 python3.9[207962]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:15:54 np0005588919 systemd[1]: Listening on multipathd control socket.
Jan 20 09:15:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:15:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:55.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:15:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:55.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:55 np0005588919 python3.9[208118]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:15:56 np0005588919 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 20 09:15:56 np0005588919 udevadm[208124]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 20 09:15:56 np0005588919 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 20 09:15:56 np0005588919 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 20 09:15:56 np0005588919 multipathd[208128]: --------start up--------
Jan 20 09:15:56 np0005588919 multipathd[208128]: read /etc/multipath.conf
Jan 20 09:15:56 np0005588919 multipathd[208128]: path checkers start up
Jan 20 09:15:56 np0005588919 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 20 09:15:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:15:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:57.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:57.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:58 np0005588919 python3.9[208288]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 20 09:15:58 np0005588919 python3.9[208440]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 20 09:15:58 np0005588919 kernel: Key type psk registered
Jan 20 09:15:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:15:59.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:15:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:15:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:15:59.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:15:59 np0005588919 python3.9[208604]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:16:00 np0005588919 python3.9[208727]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768918559.2134857-1053-76592514658852/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:01 np0005588919 python3.9[208879]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:01.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:16:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:01.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:16:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:02 np0005588919 python3.9[209032]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:16:02 np0005588919 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 20 09:16:02 np0005588919 systemd[1]: Stopped Load Kernel Modules.
Jan 20 09:16:02 np0005588919 systemd[1]: Stopping Load Kernel Modules...
Jan 20 09:16:02 np0005588919 systemd[1]: Starting Load Kernel Modules...
Jan 20 09:16:02 np0005588919 systemd[1]: Finished Load Kernel Modules.
Jan 20 09:16:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:03.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:03.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:03 np0005588919 python3.9[209188]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 20 09:16:04 np0005588919 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 20 09:16:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:05.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:05.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:05 np0005588919 systemd[1]: Reloading.
Jan 20 09:16:05 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:16:05 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:16:06 np0005588919 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 20 09:16:06 np0005588919 systemd[1]: Reloading.
Jan 20 09:16:06 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:16:06 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:16:06 np0005588919 systemd-logind[783]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 20 09:16:06 np0005588919 systemd-logind[783]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 20 09:16:06 np0005588919 lvm[209422]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 09:16:06 np0005588919 lvm[209422]: VG ceph_vg0 finished
Jan 20 09:16:06 np0005588919 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 20 09:16:06 np0005588919 systemd[1]: Starting man-db-cache-update.service...
Jan 20 09:16:06 np0005588919 systemd[1]: Reloading.
Jan 20 09:16:06 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:16:06 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:16:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:16:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:16:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:16:07 np0005588919 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 20 09:16:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:07.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:07.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:08 np0005588919 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 20 09:16:08 np0005588919 systemd[1]: Finished man-db-cache-update.service.
Jan 20 09:16:08 np0005588919 systemd[1]: man-db-cache-update.service: Consumed 1.433s CPU time.
Jan 20 09:16:08 np0005588919 systemd[1]: run-rc335477f764b47d2be14d45147f30965.service: Deactivated successfully.
Jan 20 09:16:08 np0005588919 python3.9[210791]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:16:08 np0005588919 systemd[1]: Stopping Open-iSCSI...
Jan 20 09:16:08 np0005588919 iscsid[204038]: iscsid shutting down.
Jan 20 09:16:08 np0005588919 systemd[1]: iscsid.service: Deactivated successfully.
Jan 20 09:16:08 np0005588919 systemd[1]: Stopped Open-iSCSI.
Jan 20 09:16:08 np0005588919 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 20 09:16:08 np0005588919 systemd[1]: Starting Open-iSCSI...
Jan 20 09:16:08 np0005588919 systemd[1]: Started Open-iSCSI.
Jan 20 09:16:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:09.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:09.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:09 np0005588919 python3.9[210947]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:16:09 np0005588919 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 20 09:16:09 np0005588919 multipathd[208128]: exit (signal)
Jan 20 09:16:09 np0005588919 multipathd[208128]: --------shut down-------
Jan 20 09:16:09 np0005588919 systemd[1]: multipathd.service: Deactivated successfully.
Jan 20 09:16:09 np0005588919 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 20 09:16:09 np0005588919 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 20 09:16:09 np0005588919 multipathd[210954]: --------start up--------
Jan 20 09:16:09 np0005588919 multipathd[210954]: read /etc/multipath.conf
Jan 20 09:16:09 np0005588919 multipathd[210954]: path checkers start up
Jan 20 09:16:09 np0005588919 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 20 09:16:10 np0005588919 podman[211085]: 2026-01-20 14:16:10.34794573 +0000 UTC m=+0.080251918 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:16:10 np0005588919 python3.9[211126]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 20 09:16:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:11.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:11.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:11 np0005588919 python3.9[211294]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:13 np0005588919 python3.9[211496]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:16:13 np0005588919 systemd[1]: Reloading.
Jan 20 09:16:13 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:16:13 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:16:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:13.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:13.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:14 np0005588919 python3.9[211682]: ansible-ansible.builtin.service_facts Invoked
Jan 20 09:16:14 np0005588919 network[211699]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 20 09:16:14 np0005588919 network[211700]: 'network-scripts' will be removed from distribution in near future.
Jan 20 09:16:14 np0005588919 network[211701]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 20 09:16:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:15.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:15.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:16:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:16:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:16:16.376 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:16:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:16:16.377 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:16:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:16:16.377 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:16:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:17.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:17.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:17 np0005588919 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 20 09:16:17 np0005588919 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 20 09:16:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:19.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:19.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:21 np0005588919 python3.9[212029]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:21.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:21.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:21 np0005588919 python3.9[212183]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:22 np0005588919 podman[212308]: 2026-01-20 14:16:22.580141987 +0000 UTC m=+0.076937406 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:16:22 np0005588919 python3.9[212350]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:23.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:23.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:23 np0005588919 python3.9[212510]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:24 np0005588919 python3.9[212664]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 20 09:16:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:25.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 20 09:16:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:25.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:26 np0005588919 python3.9[212818]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:27 np0005588919 python3.9[212971]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:16:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:27.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:16:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:27.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:28 np0005588919 python3.9[213125]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:16:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:29.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:29 np0005588919 python3.9[213279]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:30 np0005588919 python3.9[213431]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:31 np0005588919 python3.9[213583]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:31.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:31.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:32 np0005588919 python3.9[213784]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:32 np0005588919 python3.9[213938]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:33.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:33.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:33 np0005588919 python3.9[214091]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:34 np0005588919 python3.9[214243]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:35 np0005588919 python3.9[214395]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:35.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:35.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:36 np0005588919 python3.9[214548]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:36 np0005588919 python3.9[214700]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:37.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:37.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:37 np0005588919 python3.9[214852]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:38 np0005588919 python3.9[215005]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:39 np0005588919 python3.9[215157]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:39.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:39.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:39 np0005588919 python3.9[215310]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:40 np0005588919 python3.9[215462]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:41 np0005588919 podman[215586]: 2026-01-20 14:16:41.088512766 +0000 UTC m=+0.122778274 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 20 09:16:41 np0005588919 python3.9[215633]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:16:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:41.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:41.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:42 np0005588919 python3.9[215793]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:43 np0005588919 python3.9[215945]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 20 09:16:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:43.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:16:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:43.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:16:44 np0005588919 python3.9[216098]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:16:44 np0005588919 systemd[1]: Reloading.
Jan 20 09:16:44 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:16:44 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:16:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:45.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:45.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:45 np0005588919 python3.9[216285]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:46 np0005588919 python3.9[216439]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:46 np0005588919 python3.9[216592]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:47 np0005588919 python3.9[216745]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:47.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:47.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:49 np0005588919 python3.9[216899]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:16:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:49.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:16:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:49.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:49 np0005588919 python3.9[217053]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:50 np0005588919 python3.9[217206]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:51 np0005588919 python3.9[217359]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 20 09:16:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:51.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:51.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:53 np0005588919 podman[217535]: 2026-01-20 14:16:53.055816967 +0000 UTC m=+0.062213425 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 20 09:16:53 np0005588919 python3.9[217582]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:53.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:53.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:54 np0005588919 python3.9[217735]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:54 np0005588919 python3.9[217887]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:55.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:55.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:55 np0005588919 python3.9[218039]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:56 np0005588919 python3.9[218192]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:56 np0005588919 python3.9[218344]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:16:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:57.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:57.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:57 np0005588919 python3.9[218496]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:58 np0005588919 python3.9[218649]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:59 np0005588919 python3.9[218801]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:16:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:16:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:16:59.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:16:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:16:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:16:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:16:59.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:16:59 np0005588919 python3.9[218954]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:01.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:01.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:03.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:03.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:05.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:05.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:05 np0005588919 python3.9[219109]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 20 09:17:06 np0005588919 python3.9[219262]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 20 09:17:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:07.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:07.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:08 np0005588919 python3.9[219421]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 20 09:17:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:09.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:09.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:09 np0005588919 systemd-logind[783]: New session 50 of user zuul.
Jan 20 09:17:09 np0005588919 systemd[1]: Started Session 50 of User zuul.
Jan 20 09:17:09 np0005588919 systemd[1]: session-50.scope: Deactivated successfully.
Jan 20 09:17:09 np0005588919 systemd-logind[783]: Session 50 logged out. Waiting for processes to exit.
Jan 20 09:17:09 np0005588919 systemd-logind[783]: Removed session 50.
Jan 20 09:17:10 np0005588919 python3.9[219608]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:11 np0005588919 python3.9[219729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918630.1009061-2661-124490268263576/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:11.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:11.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:11 np0005588919 podman[219853]: 2026-01-20 14:17:11.614014449 +0000 UTC m=+0.107378375 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:17:11 np0005588919 python3.9[219890]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:12 np0005588919 python3.9[219979]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:12 np0005588919 python3.9[220179]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:13 np0005588919 python3.9[220300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918632.3173077-2661-156486173364267/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:13.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:13.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:14 np0005588919 python3.9[220451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:14 np0005588919 python3.9[220572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918633.5302818-2661-72270075553730/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:15 np0005588919 python3.9[220722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:15.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:15.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:16 np0005588919 python3.9[220869]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918634.947498-2661-44937292821040/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:17:16.377 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:17:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:17:16.378 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:17:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:17:16.379 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:17:16 np0005588919 python3.9[221125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:17:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:17:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:17:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:17:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:17:17 np0005588919 python3.9[221246]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918636.3454983-2661-175762974799744/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:17.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:17.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:18 np0005588919 python3.9[221399]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:17:19 np0005588919 python3.9[221551]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:17:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:19.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:19.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:20 np0005588919 python3.9[221704]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:21 np0005588919 python3.9[221856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:21.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:21.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:21 np0005588919 python3.9[221979]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1768918640.594747-2981-138417503381827/.source _original_basename=.xz2y6jow follow=False checksum=159442c4cde0bdcbf09a3d9dce1a41964352533d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 20 09:17:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:17:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 6467 writes, 26K keys, 6467 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 6467 writes, 1151 syncs, 5.62 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 483 writes, 730 keys, 483 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s#012Interval WAL: 483 writes, 236 syncs, 2.05 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Jan 20 09:17:22 np0005588919 python3.9[222132]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:23 np0005588919 podman[222258]: 2026-01-20 14:17:23.272926872 +0000 UTC m=+0.088115317 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:17:23 np0005588919 python3.9[222297]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:23.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:23.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:23 np0005588919 python3.9[222426]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918642.9142601-3060-207180423606463/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:24 np0005588919 python3.9[222600]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 20 09:17:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:17:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:17:25 np0005588919 python3.9[222747]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768918644.2289884-3104-262110577035734/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 20 09:17:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:25.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:25.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:26 np0005588919 python3.9[222900]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 20 09:17:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:27.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:27.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:27 np0005588919 python3.9[223053]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 20 09:17:29 np0005588919 python3[223205]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 20 09:17:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:29.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:29.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:31.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:31.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:33.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:33.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:35.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:35.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:37.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:37.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:37 np0005588919 ceph-mds[84722]: mds.beacon.cephfs.compute-1.rtofcx missed beacon ack from the monitors
Jan 20 09:17:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:39.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:39.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:39 np0005588919 podman[223220]: 2026-01-20 14:17:39.793797037 +0000 UTC m=+10.429437170 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 20 09:17:40 np0005588919 podman[223357]: 2026-01-20 14:17:40.0152152 +0000 UTC m=+0.082595884 container create 2f02162efb4537956962187671675fe84df08f5b348c51f1cd15ed9e22a6e5c3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 20 09:17:40 np0005588919 podman[223357]: 2026-01-20 14:17:39.975751209 +0000 UTC m=+0.043131903 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 20 09:17:40 np0005588919 python3[223205]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 20 09:17:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:41.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:41.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:41 np0005588919 ceph-mds[84722]: mds.beacon.cephfs.compute-1.rtofcx missed beacon ack from the monitors
Jan 20 09:17:42 np0005588919 podman[223421]: 2026-01-20 14:17:42.079152626 +0000 UTC m=+0.114682668 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 20 09:17:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:43.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:43.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:45.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:45.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:46 np0005588919 python3.9[223575]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:17:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:47.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:17:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:47.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:47 np0005588919 python3.9[223730]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 20 09:17:48 np0005588919 python3.9[223882]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 20 09:17:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:49.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:49.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:50 np0005588919 python3[224035]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 20 09:17:50 np0005588919 podman[224072]: 2026-01-20 14:17:50.344266384 +0000 UTC m=+0.075946368 container create 9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:17:50 np0005588919 podman[224072]: 2026-01-20 14:17:50.312353408 +0000 UTC m=+0.044033412 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 20 09:17:50 np0005588919 python3[224035]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 20 09:17:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:51.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:51.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:51 np0005588919 python3.9[224261]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:53 np0005588919 python3.9[224466]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:17:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:53.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:53.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:53 np0005588919 podman[224590]: 2026-01-20 14:17:53.824811345 +0000 UTC m=+0.087714753 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:17:54 np0005588919 python3.9[224636]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768918673.3099973-3392-113404318191523/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 20 09:17:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:17:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3495 writes, 19K keys, 3495 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.03 MB/s#012Cumulative WAL: 3495 writes, 3495 syncs, 1.00 writes per sync, written: 0.04 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1370 writes, 6580 keys, 1370 commit groups, 1.0 writes per commit group, ingest: 14.40 MB, 0.02 MB/s#012Interval WAL: 1370 writes, 1370 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     54.6      0.37              0.08         9    0.041       0      0       0.0       0.0#012  L6      1/0    7.34 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.2    100.2     83.7      0.76              0.25         8    0.095     35K   4269       0.0       0.0#012 Sum      1/0    7.34 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.2     67.7     74.2      1.13              0.33        17    0.066     35K   4269       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.3    107.0    105.5      0.36              0.16         8    0.045     19K   2485       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    100.2     83.7      0.76              0.25         8    0.095     35K   4269       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     54.9      0.36              0.08         8    0.046       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.020, interval 0.006#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.07 MB/s write, 0.07 GB read, 0.06 MB/s read, 1.1 seconds#012Interval compaction: 0.04 GB write, 0.06 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 308.00 MB usage: 4.67 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 7.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(253,4.37 MB,1.4184%) FilterBlock(17,108.98 KB,0.0345552%) IndexBlock(17,203.77 KB,0.0646071%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:17:54 np0005588919 python3.9[224714]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 20 09:17:54 np0005588919 systemd[1]: Reloading.
Jan 20 09:17:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:54 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:17:54 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:17:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:55.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:55.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:55 np0005588919 python3.9[224825]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 20 09:17:55 np0005588919 systemd[1]: Reloading.
Jan 20 09:17:55 np0005588919 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 20 09:17:55 np0005588919 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 20 09:17:55 np0005588919 systemd[1]: Starting nova_compute container...
Jan 20 09:17:56 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:17:56 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 20 09:17:56 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 20 09:17:56 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 20 09:17:56 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 20 09:17:56 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 20 09:17:56 np0005588919 podman[224866]: 2026-01-20 14:17:56.100391075 +0000 UTC m=+0.115127541 container init 9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:17:56 np0005588919 podman[224866]: 2026-01-20 14:17:56.108329171 +0000 UTC m=+0.123065617 container start 9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2)
Jan 20 09:17:56 np0005588919 podman[224866]: nova_compute
Jan 20 09:17:56 np0005588919 nova_compute[224882]: + sudo -E kolla_set_configs
Jan 20 09:17:56 np0005588919 systemd[1]: Started nova_compute container.
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Validating config file
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying service configuration files
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Deleting /etc/ceph
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Creating directory /etc/ceph
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/ceph
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Writing out command to execute
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:17:56 np0005588919 nova_compute[224882]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 20 09:17:56 np0005588919 nova_compute[224882]: ++ cat /run_command
Jan 20 09:17:56 np0005588919 nova_compute[224882]: + CMD=nova-compute
Jan 20 09:17:56 np0005588919 nova_compute[224882]: + ARGS=
Jan 20 09:17:56 np0005588919 nova_compute[224882]: + sudo kolla_copy_cacerts
Jan 20 09:17:56 np0005588919 nova_compute[224882]: + [[ ! -n '' ]]
Jan 20 09:17:56 np0005588919 nova_compute[224882]: + . kolla_extend_start
Jan 20 09:17:56 np0005588919 nova_compute[224882]: + echo 'Running command: '\''nova-compute'\'''
Jan 20 09:17:56 np0005588919 nova_compute[224882]: Running command: 'nova-compute'
Jan 20 09:17:56 np0005588919 nova_compute[224882]: + umask 0022
Jan 20 09:17:56 np0005588919 nova_compute[224882]: + exec nova-compute
Jan 20 09:17:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:17:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:57.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:17:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:57.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:57 np0005588919 python3.9[225044]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:58 np0005588919 nova_compute[224882]: 2026-01-20 14:17:58.238 224886 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:17:58 np0005588919 nova_compute[224882]: 2026-01-20 14:17:58.238 224886 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:17:58 np0005588919 nova_compute[224882]: 2026-01-20 14:17:58.238 224886 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:17:58 np0005588919 nova_compute[224882]: 2026-01-20 14:17:58.239 224886 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 20 09:17:58 np0005588919 nova_compute[224882]: 2026-01-20 14:17:58.386 224886 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:17:58 np0005588919 nova_compute[224882]: 2026-01-20 14:17:58.414 224886 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:17:58 np0005588919 nova_compute[224882]: 2026-01-20 14:17:58.415 224886 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 20 09:17:58 np0005588919 python3.9[225197]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.418 224886 INFO nova.virt.driver [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 20 09:17:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:17:59.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:59 np0005588919 python3.9[225349]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 20 09:17:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:17:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:17:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:17:59.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.581 224886 INFO nova.compute.provider_config [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.608 224886 DEBUG oslo_concurrency.lockutils [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.609 224886 DEBUG oslo_concurrency.lockutils [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.609 224886 DEBUG oslo_concurrency.lockutils [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.609 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.609 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.610 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.610 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.610 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.610 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.610 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.611 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.611 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.611 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.611 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.611 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.612 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.612 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.612 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.612 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.613 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.613 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.613 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.613 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.613 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.614 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.614 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.614 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.614 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.615 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.615 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.615 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.615 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.616 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.616 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.616 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.616 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.617 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.617 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.617 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.617 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.617 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.618 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.618 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.618 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.618 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.619 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.619 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.619 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.619 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.620 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.620 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.620 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.620 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.620 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.620 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.621 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.621 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.621 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.621 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.621 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.621 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.621 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.622 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.622 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.622 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.622 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.622 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.622 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.622 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.623 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.623 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.623 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.623 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.623 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.623 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.623 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.623 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.624 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.624 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.624 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.624 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.624 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.624 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.624 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.625 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.625 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.625 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.625 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.625 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.625 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.625 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.626 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.626 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.626 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.626 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.626 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.626 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.626 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.627 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.627 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.627 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.627 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.627 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.627 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.627 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.628 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.628 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.628 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.628 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.628 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.628 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.628 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.629 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.629 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.629 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.629 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.629 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.629 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.629 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.630 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.630 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.630 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.630 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.630 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.630 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.630 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.631 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.631 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.631 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.631 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.631 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.631 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.631 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.631 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.632 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.632 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.632 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.632 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.632 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.632 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.633 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.633 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.633 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.633 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.633 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.633 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.633 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.634 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.634 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.634 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.634 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.634 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.634 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.634 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.635 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.635 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.635 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.635 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.635 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.635 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.635 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.636 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.636 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.636 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.636 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.636 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.636 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.636 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.637 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.637 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.637 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.637 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.638 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.638 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.638 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.638 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.638 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.638 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.638 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.639 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.639 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.639 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.639 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.639 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.639 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.640 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.640 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.640 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.640 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.640 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.640 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.640 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.641 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.641 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.641 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.641 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.641 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.641 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.641 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.642 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.642 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.642 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.643 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.643 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.643 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.643 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.644 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.644 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.644 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.644 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.645 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.645 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.645 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.645 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.646 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.646 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.646 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.646 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.647 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.647 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.647 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.647 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.648 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.648 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.648 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.648 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.648 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.649 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.649 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.649 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.649 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.650 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.650 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.650 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.650 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.651 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.651 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.651 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.651 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.651 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.652 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.652 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.652 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.652 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.653 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.653 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.653 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.653 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.654 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.654 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.654 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.654 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.655 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.655 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.655 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.655 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.655 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.656 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.656 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.656 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.656 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.657 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.657 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.657 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.657 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.658 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.658 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.658 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.658 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.659 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.659 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.659 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.659 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.659 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.660 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.660 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.660 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.660 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.661 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.661 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.661 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.661 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.662 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.662 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.662 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.662 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.663 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.663 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.663 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.663 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.664 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.664 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.664 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.664 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.664 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.665 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.665 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.665 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.665 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.666 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.666 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.666 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.666 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.667 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.667 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.667 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.667 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.668 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.668 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.668 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.668 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.668 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.669 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.669 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.669 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.669 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.670 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.670 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.670 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.670 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.671 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.671 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.671 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.671 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.672 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.672 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.672 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.672 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.672 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.673 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.673 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.673 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.673 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.674 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.674 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.674 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.674 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.675 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.675 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.676 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.676 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.676 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.677 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.677 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.677 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.678 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.678 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.678 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.679 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.679 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.679 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.679 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.679 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.679 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.679 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.680 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.680 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.680 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.680 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.680 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.681 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.681 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.681 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.681 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.681 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.681 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.681 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.682 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.682 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.682 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.682 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.682 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.682 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.682 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.683 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.683 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.683 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.683 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.683 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.683 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.683 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.684 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.684 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.684 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.684 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.684 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.684 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.684 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.684 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.685 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.685 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.685 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.685 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.685 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.685 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.685 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.686 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.686 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.686 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.686 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.686 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.686 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.686 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.687 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.687 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.687 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.687 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.687 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.687 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.687 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.688 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.688 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.688 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.688 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.688 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.688 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.688 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.688 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.689 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.689 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.689 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.689 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.689 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.689 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.689 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.690 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.690 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.690 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.690 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.690 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.690 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.690 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.691 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.691 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.691 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.691 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.691 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.691 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.691 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.692 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.692 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.692 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.692 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.692 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.692 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.692 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.692 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.693 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.693 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.693 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.693 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.693 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.693 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.693 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.694 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.694 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.694 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.694 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.694 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.694 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.694 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.695 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.695 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.695 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.695 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.695 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.695 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.696 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.696 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.696 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.696 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.696 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.696 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.696 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.697 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.697 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.697 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.697 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.697 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.697 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.697 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.697 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.698 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.698 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.698 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.698 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.698 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.698 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.698 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.699 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.699 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.699 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.699 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.699 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.699 224886 WARNING oslo_config.cfg [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 20 09:17:59 np0005588919 nova_compute[224882]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 20 09:17:59 np0005588919 nova_compute[224882]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 20 09:17:59 np0005588919 nova_compute[224882]: and ``live_migration_inbound_addr`` respectively.
Jan 20 09:17:59 np0005588919 nova_compute[224882]: ).  Its value may be silently ignored in the future.#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.700 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.700 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.700 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.700 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.700 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.700 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.701 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.701 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.701 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.701 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.701 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.701 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.701 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.702 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.702 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.702 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.702 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.702 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.702 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.rbd_secret_uuid        = e399cf45-e6b6-5393-99f1-75c601d3f188 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.702 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.703 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.703 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.703 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.703 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.703 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.703 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.703 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.704 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.704 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.704 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.704 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.704 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.704 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.705 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.705 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.705 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.705 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.705 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.705 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.705 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.706 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.706 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.706 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.706 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.706 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.706 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.706 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.707 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.707 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.707 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.707 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.707 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.707 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.707 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.708 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.708 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.708 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.708 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.708 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.708 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.708 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.709 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.709 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.709 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.709 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.709 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.709 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.709 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.709 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.710 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.710 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.710 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.710 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.710 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.710 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.710 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.711 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.711 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.711 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.711 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.711 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.711 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.711 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.712 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.712 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.712 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.712 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.712 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.712 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.712 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.713 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.713 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.713 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.713 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.713 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.713 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.713 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.713 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.714 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.714 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.714 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.714 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.714 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.714 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.714 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.715 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.715 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.715 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.715 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.715 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.715 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.715 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.716 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.716 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.716 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.716 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.716 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.716 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.716 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.716 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.717 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.717 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.717 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.717 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.717 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.717 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.717 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.718 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.718 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.718 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.718 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.718 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.718 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.718 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.719 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.719 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.719 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.719 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.719 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.719 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.720 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.720 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.720 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.720 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.720 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.720 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.720 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.721 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.721 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.721 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.721 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.721 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.721 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.721 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.722 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.722 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.722 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.722 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.722 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.722 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.723 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.723 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.723 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.723 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.723 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.723 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.723 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.724 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.724 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.724 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.724 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.724 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.724 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.724 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.725 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.725 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.725 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.725 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.725 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.725 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.725 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.726 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.726 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.726 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.726 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.726 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.726 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.726 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.727 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.727 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.727 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.727 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.727 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.727 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.727 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.728 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.728 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.728 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.728 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.728 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.728 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.728 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.729 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.729 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.729 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.729 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.729 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.729 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.729 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.730 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.730 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.730 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.730 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.730 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.730 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.730 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.731 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.731 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.731 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.731 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.731 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.731 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.731 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.731 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.732 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.732 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.732 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.732 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.732 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.732 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.732 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.733 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.733 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.733 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.733 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.733 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.733 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.733 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.733 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.734 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.734 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.734 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.734 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.734 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.734 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.734 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.735 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.735 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.735 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.735 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.735 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.735 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.736 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.736 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.736 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.736 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.736 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.736 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.736 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.737 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.737 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.737 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.737 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.737 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.737 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.737 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.738 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.738 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.738 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.738 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.738 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.738 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.738 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.739 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.739 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.739 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.739 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.739 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.739 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.739 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.740 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.740 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.740 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.740 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.740 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.740 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.740 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.741 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.741 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.741 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.741 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.741 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.741 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.741 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.742 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.742 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.742 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.742 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.742 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.742 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.742 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.743 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.743 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.743 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.743 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.743 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.743 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.743 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.744 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.744 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.744 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.744 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.744 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.744 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.744 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.745 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.745 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.745 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.745 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.745 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.745 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.745 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.745 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.746 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.746 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.746 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.746 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.746 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.746 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.746 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.747 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.747 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.747 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.747 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.747 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.747 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.747 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.748 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.748 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.748 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.748 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.748 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.748 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.748 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.749 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.749 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.749 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.749 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.749 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.749 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.749 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.749 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.750 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.750 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.750 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.750 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.750 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.750 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.750 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.751 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.751 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.751 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.751 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.751 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.751 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.751 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.751 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.752 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.752 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.752 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.752 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.752 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.752 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.752 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.753 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.753 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.753 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.753 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.753 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.753 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.753 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.754 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.754 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.754 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.754 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.754 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.754 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.754 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.754 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.755 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.755 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.755 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.755 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.755 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.755 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.755 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.756 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.756 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.756 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.756 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.756 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.756 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.757 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.757 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.757 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.757 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.757 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.757 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.757 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.758 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.758 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.758 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.758 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.758 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.758 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.758 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.759 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.759 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.759 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.759 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.759 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.759 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.759 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.759 224886 DEBUG oslo_service.service [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.760 224886 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.814 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.815 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.815 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.815 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 20 09:17:59 np0005588919 systemd[1]: Starting libvirt QEMU daemon...
Jan 20 09:17:59 np0005588919 systemd[1]: Started libvirt QEMU daemon.
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.911 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fbf4ad328b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.914 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fbf4ad328b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.915 224886 INFO nova.virt.libvirt.driver [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.945 224886 WARNING nova.virt.libvirt.driver [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 20 09:17:59 np0005588919 nova_compute[224882]: 2026-01-20 14:17:59.946 224886 DEBUG nova.virt.libvirt.volume.mount [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 20 09:18:00 np0005588919 nova_compute[224882]: 2026-01-20 14:18:00.869 224886 INFO nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Libvirt host capabilities <capabilities>
Jan 20 09:18:00 np0005588919 nova_compute[224882]: 
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <host>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <uuid>870b1f1c-f19c-477b-b282-ee6eeba50974</uuid>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <cpu>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <arch>x86_64</arch>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model>EPYC-Rome-v4</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <vendor>AMD</vendor>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <microcode version='16777317'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <signature family='23' model='49' stepping='0'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='x2apic'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='tsc-deadline'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='osxsave'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='hypervisor'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='tsc_adjust'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='spec-ctrl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='stibp'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='arch-capabilities'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='ssbd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='cmp_legacy'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='topoext'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='virt-ssbd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='lbrv'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='tsc-scale'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='vmcb-clean'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='pause-filter'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='pfthreshold'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='svme-addr-chk'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='rdctl-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='skip-l1dfl-vmentry'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='mds-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature name='pschange-mc-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <pages unit='KiB' size='4'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <pages unit='KiB' size='2048'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <pages unit='KiB' size='1048576'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </cpu>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <power_management>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <suspend_mem/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </power_management>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <iommu support='no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <migration_features>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <live/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <uri_transports>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <uri_transport>tcp</uri_transport>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <uri_transport>rdma</uri_transport>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </uri_transports>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </migration_features>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <topology>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <cells num='1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <cell id='0'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:          <memory unit='KiB'>7864312</memory>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:          <pages unit='KiB' size='4'>1966078</pages>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:          <pages unit='KiB' size='2048'>0</pages>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:          <distances>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:            <sibling id='0' value='10'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:          </distances>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:          <cpus num='8'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:          </cpus>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        </cell>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </cells>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </topology>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <cache>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </cache>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <secmodel>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model>selinux</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <doi>0</doi>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </secmodel>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <secmodel>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model>dac</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <doi>0</doi>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </secmodel>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  </host>
Jan 20 09:18:00 np0005588919 nova_compute[224882]: 
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <guest>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <os_type>hvm</os_type>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <arch name='i686'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <wordsize>32</wordsize>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <domain type='qemu'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <domain type='kvm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </arch>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <features>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <pae/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <nonpae/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <acpi default='on' toggle='yes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <apic default='on' toggle='no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <cpuselection/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <deviceboot/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <disksnapshot default='on' toggle='no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <externalSnapshot/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </features>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  </guest>
Jan 20 09:18:00 np0005588919 nova_compute[224882]: 
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <guest>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <os_type>hvm</os_type>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <arch name='x86_64'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <wordsize>64</wordsize>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <domain type='qemu'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <domain type='kvm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </arch>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <features>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <acpi default='on' toggle='yes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <apic default='on' toggle='no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <cpuselection/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <deviceboot/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <disksnapshot default='on' toggle='no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <externalSnapshot/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </features>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  </guest>
Jan 20 09:18:00 np0005588919 nova_compute[224882]: 
Jan 20 09:18:00 np0005588919 nova_compute[224882]: </capabilities>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:
Jan 20 09:18:00 np0005588919 nova_compute[224882]: 2026-01-20 14:18:00.878 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 20 09:18:00 np0005588919 nova_compute[224882]: 2026-01-20 14:18:00.904 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 20 09:18:00 np0005588919 nova_compute[224882]: <domainCapabilities>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <domain>kvm</domain>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <arch>i686</arch>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <vcpu max='240'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <iothreads supported='yes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <os supported='yes'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <enum name='firmware'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <loader supported='yes'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <value>rom</value>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <value>pflash</value>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <enum name='readonly'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <value>yes</value>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <value>no</value>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <enum name='secure'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <value>no</value>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </loader>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  </os>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:  <cpu>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <value>on</value>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <value>off</value>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <enum name='maximumMigratable'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <value>on</value>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <value>off</value>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <vendor>AMD</vendor>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='succor'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:    <mode name='custom' supported='yes'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Broadwell'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bhi-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ddpd-u'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sha512'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sm3'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sm4'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bhi-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ddpd-u'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sha512'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sm3'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sm4'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Denverton'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v3'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='prefetchi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sbpb'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='prefetchi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sbpb'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v3'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v4'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v5'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx10'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx10-128'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx10-256'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx10-512'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx10'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx10-128'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx10-256'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx10-512'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Haswell'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v3'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v4'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='KnightsMill'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512er'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512pf'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512er'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512pf'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G4'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G5'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='tbm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='tbm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:00 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='athlon'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='athlon-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='core2duo'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='core2duo-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='coreduo'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='coreduo-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='n270'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='n270-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='phenom'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='phenom-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </cpu>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <memoryBacking supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <enum name='sourceType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>file</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>anonymous</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>memfd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </memoryBacking>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <devices>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <disk supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='diskDevice'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>disk</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>cdrom</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>floppy</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>lun</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='bus'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>ide</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>fdc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>scsi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>sata</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </disk>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <graphics supported='yes'>
Jan 20 09:18:01 np0005588919 python3.9[225562]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vnc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>egl-headless</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dbus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </graphics>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <video supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='modelType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vga</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>cirrus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>none</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>bochs</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>ramfb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </video>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <hostdev supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='mode'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>subsystem</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='startupPolicy'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>default</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>mandatory</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>requisite</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>optional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='subsysType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pci</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>scsi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='capsType'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='pciBackend'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </hostdev>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <rng supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>random</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>egd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>builtin</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </rng>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <filesystem supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='driverType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>path</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>handle</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtiofs</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </filesystem>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <tpm supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tpm-tis</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tpm-crb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>emulator</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>external</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendVersion'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>2.0</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </tpm>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <redirdev supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='bus'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </redirdev>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <channel supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pty</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>unix</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </channel>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <crypto supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>qemu</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>builtin</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </crypto>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <interface supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>default</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>passt</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </interface>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <panic supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>isa</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>hyperv</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </panic>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <console supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>null</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pty</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dev</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>file</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pipe</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>stdio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>udp</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tcp</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>unix</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>qemu-vdagent</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dbus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </console>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </devices>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <features>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <gic supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <genid supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <backup supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <async-teardown supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <s390-pv supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <ps2 supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <tdx supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <sev supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <sgx supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <hyperv supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='features'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>relaxed</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vapic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>spinlocks</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vpindex</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>runtime</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>synic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>stimer</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>reset</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vendor_id</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>frequencies</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>reenlightenment</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tlbflush</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>ipi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>avic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>emsr_bitmap</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>xmm_input</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <defaults>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </defaults>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </hyperv>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <launchSecurity supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </features>
Jan 20 09:18:01 np0005588919 nova_compute[224882]: </domainCapabilities>
Jan 20 09:18:01 np0005588919 nova_compute[224882]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:00.910 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 20 09:18:01 np0005588919 nova_compute[224882]: <domainCapabilities>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <domain>kvm</domain>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <arch>i686</arch>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <vcpu max='4096'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <iothreads supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <os supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <enum name='firmware'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <loader supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>rom</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pflash</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='readonly'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>yes</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>no</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='secure'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>no</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </loader>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </os>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <cpu>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>on</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>off</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='maximumMigratable'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>on</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>off</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <vendor>AMD</vendor>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='succor'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='custom' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='KnightsMill'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='athlon'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='athlon-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='core2duo'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='core2duo-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='coreduo'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='coreduo-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='n270'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='n270-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='phenom'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='phenom-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </cpu>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <memoryBacking supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <enum name='sourceType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>file</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>anonymous</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>memfd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </memoryBacking>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <devices>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <disk supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='diskDevice'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>disk</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>cdrom</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>floppy</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>lun</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='bus'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>fdc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>scsi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>sata</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </disk>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <graphics supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vnc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>egl-headless</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dbus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </graphics>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <video supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='modelType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vga</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>cirrus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>none</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>bochs</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>ramfb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </video>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <hostdev supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='mode'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>subsystem</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='startupPolicy'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>default</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>mandatory</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>requisite</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>optional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='subsysType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pci</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>scsi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='capsType'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='pciBackend'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </hostdev>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <rng supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>random</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>egd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>builtin</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </rng>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <filesystem supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='driverType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>path</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>handle</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtiofs</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </filesystem>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <tpm supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tpm-tis</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tpm-crb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>emulator</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>external</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendVersion'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>2.0</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </tpm>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <redirdev supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='bus'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </redirdev>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <channel supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pty</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>unix</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </channel>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <crypto supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>qemu</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>builtin</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </crypto>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <interface supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>default</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>passt</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </interface>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <panic supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>isa</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>hyperv</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </panic>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <console supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>null</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pty</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dev</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>file</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pipe</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>stdio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>udp</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tcp</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>unix</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>qemu-vdagent</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dbus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </console>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </devices>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <features>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <gic supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <genid supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <backup supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <async-teardown supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <s390-pv supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <ps2 supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <tdx supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <sev supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <sgx supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <hyperv supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='features'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>relaxed</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vapic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>spinlocks</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vpindex</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>runtime</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>synic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>stimer</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>reset</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vendor_id</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>frequencies</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>reenlightenment</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tlbflush</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>ipi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>avic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>emsr_bitmap</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>xmm_input</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <defaults>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </defaults>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </hyperv>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <launchSecurity supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </features>
Jan 20 09:18:01 np0005588919 nova_compute[224882]: </domainCapabilities>
Jan 20 09:18:01 np0005588919 nova_compute[224882]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.011 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.019 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 20 09:18:01 np0005588919 nova_compute[224882]: <domainCapabilities>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <domain>kvm</domain>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <arch>x86_64</arch>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <vcpu max='240'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <iothreads supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <os supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <enum name='firmware'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <loader supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>rom</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pflash</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='readonly'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>yes</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>no</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='secure'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>no</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </loader>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </os>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <cpu>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>on</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>off</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='maximumMigratable'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>on</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>off</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <vendor>AMD</vendor>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='succor'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='custom' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='KnightsMill'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='athlon'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='athlon-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='core2duo'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='core2duo-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='coreduo'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='coreduo-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='n270'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='n270-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='phenom'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='phenom-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </cpu>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <memoryBacking supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <enum name='sourceType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>file</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>anonymous</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>memfd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </memoryBacking>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <devices>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <disk supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='diskDevice'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>disk</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>cdrom</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>floppy</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>lun</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='bus'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>ide</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>fdc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>scsi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>sata</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </disk>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <graphics supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vnc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>egl-headless</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dbus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </graphics>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <video supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='modelType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vga</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>cirrus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>none</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>bochs</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>ramfb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </video>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <hostdev supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='mode'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>subsystem</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='startupPolicy'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>default</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>mandatory</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>requisite</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>optional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='subsysType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pci</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>scsi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='capsType'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='pciBackend'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </hostdev>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <rng supported='yes'>
Jan 20 09:18:01 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>random</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>egd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>builtin</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </rng>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <filesystem supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='driverType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>path</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>handle</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtiofs</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </filesystem>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <tpm supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tpm-tis</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tpm-crb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>emulator</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>external</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendVersion'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>2.0</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </tpm>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <redirdev supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='bus'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </redirdev>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <channel supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pty</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>unix</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </channel>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <crypto supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>qemu</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>builtin</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </crypto>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <interface supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>default</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>passt</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </interface>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <panic supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>isa</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>hyperv</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </panic>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <console supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>null</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pty</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dev</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>file</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pipe</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>stdio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>udp</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tcp</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>unix</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>qemu-vdagent</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dbus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </console>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </devices>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <features>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <gic supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <genid supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <backup supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <async-teardown supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <s390-pv supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <ps2 supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <tdx supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <sev supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <sgx supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <hyperv supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='features'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>relaxed</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vapic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>spinlocks</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vpindex</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>runtime</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>synic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>stimer</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>reset</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vendor_id</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>frequencies</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>reenlightenment</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tlbflush</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>ipi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>avic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>emsr_bitmap</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>xmm_input</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <defaults>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </defaults>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </hyperv>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <launchSecurity supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </features>
Jan 20 09:18:01 np0005588919 nova_compute[224882]: </domainCapabilities>
Jan 20 09:18:01 np0005588919 nova_compute[224882]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.099 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 20 09:18:01 np0005588919 nova_compute[224882]: <domainCapabilities>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <domain>kvm</domain>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <arch>x86_64</arch>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <vcpu max='4096'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <iothreads supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <os supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <enum name='firmware'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>efi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <loader supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>rom</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pflash</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='readonly'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>yes</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>no</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='secure'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>yes</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>no</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </loader>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </os>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <cpu>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>on</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>off</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='maximumMigratable'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>on</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>off</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <vendor>AMD</vendor>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='succor'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <mode name='custom' supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ddpd-u'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sha512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm3'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sm4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Denverton-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amd-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='auto-ibrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='perfmon-v2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbpb'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='stibp-always-on'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='EPYC-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-128'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-256'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx10-512'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='prefetchiti'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Haswell-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='KnightsMill'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512er'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512pf'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fma4'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tbm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xop'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='amx-tile'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-bf16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-fp16'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bitalg'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrc'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fzrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='la57'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='taa-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ifma'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cmpccxadd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fbsdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='fsrs'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ibrs-all'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='intel-psfd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='lam'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mcdt-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pbrsb-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='psdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='serialize'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vaes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='hle'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='rtm'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512bw'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512cd'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512dq'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512f'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='avx512vl'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='invpcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pcid'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='pku'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='mpx'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='core-capability'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='split-lock-detect'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='cldemote'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='erms'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='gfni'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdir64b'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='movdiri'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='xsaves'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='athlon'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='athlon-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='core2duo'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='core2duo-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='coreduo'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='coreduo-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='n270'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='n270-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='ss'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='phenom'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <blockers model='phenom-v1'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnow'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <feature name='3dnowext'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </blockers>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </mode>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </cpu>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <memoryBacking supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <enum name='sourceType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>file</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>anonymous</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <value>memfd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </memoryBacking>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <devices>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <disk supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='diskDevice'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>disk</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>cdrom</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>floppy</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>lun</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='bus'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>fdc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>scsi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>sata</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </disk>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <graphics supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vnc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>egl-headless</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dbus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </graphics>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <video supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='modelType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vga</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>cirrus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>none</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>bochs</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>ramfb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </video>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <hostdev supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='mode'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>subsystem</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='startupPolicy'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>default</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>mandatory</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>requisite</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>optional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='subsysType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pci</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>scsi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='capsType'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='pciBackend'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </hostdev>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <rng supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtio-non-transitional</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>random</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>egd</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>builtin</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </rng>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <filesystem supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='driverType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>path</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>handle</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>virtiofs</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </filesystem>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <tpm supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tpm-tis</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tpm-crb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>emulator</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>external</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendVersion'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>2.0</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </tpm>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <redirdev supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='bus'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>usb</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </redirdev>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <channel supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pty</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>unix</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </channel>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <crypto supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>qemu</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendModel'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>builtin</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </crypto>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <interface supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='backendType'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>default</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>passt</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </interface>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <panic supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='model'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>isa</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>hyperv</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </panic>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <console supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='type'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>null</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vc</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pty</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dev</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>file</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>pipe</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>stdio</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>udp</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tcp</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>unix</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>qemu-vdagent</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>dbus</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </console>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </devices>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <features>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <gic supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <genid supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <backup supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <async-teardown supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <s390-pv supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <ps2 supported='yes'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <tdx supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <sev supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <sgx supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <hyperv supported='yes'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <enum name='features'>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>relaxed</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vapic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>spinlocks</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vpindex</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>runtime</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>synic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>stimer</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>reset</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>vendor_id</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>frequencies</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>reenlightenment</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>tlbflush</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>ipi</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>avic</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>emsr_bitmap</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <value>xmm_input</value>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </enum>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      <defaults>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:      </defaults>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    </hyperv>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:    <launchSecurity supported='no'/>
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  </features>
Jan 20 09:18:01 np0005588919 nova_compute[224882]: </domainCapabilities>
Jan 20 09:18:01 np0005588919 nova_compute[224882]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.160 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.160 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.161 224886 DEBUG nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.166 224886 INFO nova.virt.libvirt.host [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Secure Boot support detected#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.167 224886 INFO nova.virt.libvirt.driver [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.168 224886 INFO nova.virt.libvirt.driver [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.179 224886 DEBUG nova.virt.libvirt.driver [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 20 09:18:01 np0005588919 nova_compute[224882]:  <model>Nehalem</model>
Jan 20 09:18:01 np0005588919 nova_compute[224882]: </cpu>
Jan 20 09:18:01 np0005588919 nova_compute[224882]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.182 224886 DEBUG nova.virt.libvirt.driver [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.276 224886 INFO nova.virt.node [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Determined node identity bbb02880-a710-4ac1-8b2c-5c09765848d1 from /var/lib/nova/compute_id#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.296 224886 WARNING nova.compute.manager [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Compute nodes ['bbb02880-a710-4ac1-8b2c-5c09765848d1'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.363 224886 INFO nova.compute.manager [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.453 224886 WARNING nova.compute.manager [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.453 224886 DEBUG oslo_concurrency.lockutils [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.453 224886 DEBUG oslo_concurrency.lockutils [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.454 224886 DEBUG oslo_concurrency.lockutils [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.454 224886 DEBUG nova.compute.resource_tracker [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.454 224886 DEBUG oslo_concurrency.processutils [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:18:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:01.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:01.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:18:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/996481584' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:18:01 np0005588919 nova_compute[224882]: 2026-01-20 14:18:01.991 224886 DEBUG oslo_concurrency.processutils [None req-5b5754ad-a2eb-46b1-a3d6-44d73c82b4c2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:18:02 np0005588919 systemd[1]: Starting libvirt nodedev daemon...
Jan 20 09:18:02 np0005588919 systemd[1]: Started libvirt nodedev daemon.
Jan 20 09:18:02 np0005588919 python3.9[225764]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 20 09:18:02 np0005588919 systemd[1]: Stopping nova_compute container...
Jan 20 09:18:02 np0005588919 nova_compute[224882]: 2026-01-20 14:18:02.243 224886 DEBUG oslo_concurrency.lockutils [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:18:02 np0005588919 nova_compute[224882]: 2026-01-20 14:18:02.244 224886 DEBUG oslo_concurrency.lockutils [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:18:02 np0005588919 nova_compute[224882]: 2026-01-20 14:18:02.244 224886 DEBUG oslo_concurrency.lockutils [None req-52a0eeda-649c-48fd-9b77-6bd39b39a074 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:18:02 np0005588919 virtqemud[225396]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 20 09:18:02 np0005588919 virtqemud[225396]: hostname: compute-1
Jan 20 09:18:02 np0005588919 virtqemud[225396]: End of file while reading data: Input/output error
Jan 20 09:18:02 np0005588919 systemd[1]: libpod-9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301.scope: Deactivated successfully.
Jan 20 09:18:02 np0005588919 systemd[1]: libpod-9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301.scope: Consumed 3.678s CPU time.
Jan 20 09:18:02 np0005588919 conmon[224882]: conmon 9b6a80ba477be5ac2929 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301.scope/container/memory.events
Jan 20 09:18:02 np0005588919 podman[225791]: 2026-01-20 14:18:02.671493888 +0000 UTC m=+0.474344146 container died 9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:18:02 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301-userdata-shm.mount: Deactivated successfully.
Jan 20 09:18:02 np0005588919 systemd[1]: var-lib-containers-storage-overlay-09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd-merged.mount: Deactivated successfully.
Jan 20 09:18:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:03.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:03.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:04 np0005588919 podman[225791]: 2026-01-20 14:18:04.985721827 +0000 UTC m=+2.788572035 container cleanup 9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 09:18:04 np0005588919 podman[225791]: nova_compute
Jan 20 09:18:05 np0005588919 podman[225825]: nova_compute
Jan 20 09:18:05 np0005588919 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 20 09:18:05 np0005588919 systemd[1]: Stopped nova_compute container.
Jan 20 09:18:05 np0005588919 systemd[1]: Starting nova_compute container...
Jan 20 09:18:05 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:18:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09bfc735065820b3b853aee6741a29dbfb1f97c4dd98fe504baf6c65a72090cd/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:05 np0005588919 podman[225839]: 2026-01-20 14:18:05.226396483 +0000 UTC m=+0.124769765 container init 9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 09:18:05 np0005588919 podman[225839]: 2026-01-20 14:18:05.231784946 +0000 UTC m=+0.130158228 container start 9b6a80ba477be5ac2929fd65dfa7fd443cd6e560d8aa8cb2bdfccdc381ce7301 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible)
Jan 20 09:18:05 np0005588919 nova_compute[225855]: + sudo -E kolla_set_configs
Jan 20 09:18:05 np0005588919 podman[225839]: nova_compute
Jan 20 09:18:05 np0005588919 systemd[1]: Started nova_compute container.
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Validating config file
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying service configuration files
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Deleting /etc/ceph
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Creating directory /etc/ceph
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/ceph
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Writing out command to execute
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:18:05 np0005588919 nova_compute[225855]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 20 09:18:05 np0005588919 nova_compute[225855]: ++ cat /run_command
Jan 20 09:18:05 np0005588919 nova_compute[225855]: + CMD=nova-compute
Jan 20 09:18:05 np0005588919 nova_compute[225855]: + ARGS=
Jan 20 09:18:05 np0005588919 nova_compute[225855]: + sudo kolla_copy_cacerts
Jan 20 09:18:05 np0005588919 nova_compute[225855]: + [[ ! -n '' ]]
Jan 20 09:18:05 np0005588919 nova_compute[225855]: + . kolla_extend_start
Jan 20 09:18:05 np0005588919 nova_compute[225855]: + echo 'Running command: '\''nova-compute'\'''
Jan 20 09:18:05 np0005588919 nova_compute[225855]: Running command: 'nova-compute'
Jan 20 09:18:05 np0005588919 nova_compute[225855]: + umask 0022
Jan 20 09:18:05 np0005588919 nova_compute[225855]: + exec nova-compute
Jan 20 09:18:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:05.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:05.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:06 np0005588919 python3.9[226020]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 20 09:18:07 np0005588919 systemd[1]: Started libpod-conmon-2f02162efb4537956962187671675fe84df08f5b348c51f1cd15ed9e22a6e5c3.scope.
Jan 20 09:18:07 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:18:07 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0d02bc159c28585f78064f19fbd91a200478408363f8183b598eed3d83849b0/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:07 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0d02bc159c28585f78064f19fbd91a200478408363f8183b598eed3d83849b0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:07 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0d02bc159c28585f78064f19fbd91a200478408363f8183b598eed3d83849b0/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.172 225859 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.173 225859 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.173 225859 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.173 225859 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 20 09:18:07 np0005588919 podman[226045]: 2026-01-20 14:18:07.174670656 +0000 UTC m=+0.212338352 container init 2f02162efb4537956962187671675fe84df08f5b348c51f1cd15ed9e22a6e5c3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:18:07 np0005588919 podman[226045]: 2026-01-20 14:18:07.183200029 +0000 UTC m=+0.220867715 container start 2f02162efb4537956962187671675fe84df08f5b348c51f1cd15ed9e22a6e5c3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:18:07 np0005588919 python3.9[226020]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Applying nova statedir ownership
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 20 09:18:07 np0005588919 nova_compute_init[226068]: INFO:nova_statedir:Nova statedir ownership complete
Jan 20 09:18:07 np0005588919 systemd[1]: libpod-2f02162efb4537956962187671675fe84df08f5b348c51f1cd15ed9e22a6e5c3.scope: Deactivated successfully.
Jan 20 09:18:07 np0005588919 podman[226069]: 2026-01-20 14:18:07.25997253 +0000 UTC m=+0.045478913 container died 2f02162efb4537956962187671675fe84df08f5b348c51f1cd15ed9e22a6e5c3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.304 225859 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.327 225859 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.328 225859 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 20 09:18:07 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2f02162efb4537956962187671675fe84df08f5b348c51f1cd15ed9e22a6e5c3-userdata-shm.mount: Deactivated successfully.
Jan 20 09:18:07 np0005588919 systemd[1]: var-lib-containers-storage-overlay-c0d02bc159c28585f78064f19fbd91a200478408363f8183b598eed3d83849b0-merged.mount: Deactivated successfully.
Jan 20 09:18:07 np0005588919 podman[226082]: 2026-01-20 14:18:07.362125721 +0000 UTC m=+0.091141450 container cleanup 2f02162efb4537956962187671675fe84df08f5b348c51f1cd15ed9e22a6e5c3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm)
Jan 20 09:18:07 np0005588919 systemd[1]: libpod-conmon-2f02162efb4537956962187671675fe84df08f5b348c51f1cd15ed9e22a6e5c3.scope: Deactivated successfully.
Jan 20 09:18:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:07.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:07.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.790 225859 INFO nova.virt.driver [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.913 225859 INFO nova.compute.provider_config [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.924 225859 DEBUG oslo_concurrency.lockutils [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.924 225859 DEBUG oslo_concurrency.lockutils [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.924 225859 DEBUG oslo_concurrency.lockutils [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.924 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.924 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.925 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.925 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.925 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.925 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.925 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.925 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.925 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.926 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.926 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.926 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.926 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.926 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.926 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.926 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.926 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.927 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.927 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.927 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.927 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.927 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.927 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.927 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.928 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.928 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.928 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.928 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.928 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.928 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.928 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.929 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.929 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.929 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.929 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.929 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.929 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.929 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.930 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.930 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.930 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.930 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.930 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.931 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.931 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.931 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.931 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.931 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.931 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.931 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.932 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.932 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.932 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.932 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.932 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.932 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.932 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.933 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.933 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.933 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.933 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.933 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.933 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.933 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.934 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.934 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.934 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.934 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.934 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.934 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.934 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.935 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.935 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.935 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.935 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.935 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.935 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.935 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.936 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.936 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.936 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.936 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.936 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.936 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.936 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.937 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.937 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.937 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.937 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.937 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.937 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.937 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.938 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.938 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.938 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.938 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.938 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.938 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.938 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.939 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.939 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.939 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.939 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.939 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.939 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.939 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.940 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.940 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.940 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.940 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.940 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.940 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.940 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.941 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.941 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.941 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.941 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.941 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.941 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.941 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.942 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.942 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.942 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.942 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.942 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.942 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.942 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.943 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.943 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.943 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.943 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.943 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.943 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.944 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.944 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.944 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.944 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.944 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.944 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.944 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.945 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.945 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.945 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.945 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.945 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.945 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.945 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.946 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.946 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.946 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.946 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.946 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.947 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.947 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.947 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.947 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.947 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.948 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.948 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.948 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.948 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.948 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.948 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.949 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.949 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.949 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.949 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.949 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.949 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.949 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.950 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.950 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.950 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.950 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.950 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.950 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.951 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.951 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.951 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.951 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.951 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.951 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.952 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.952 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.952 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.952 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.952 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.952 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.952 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.953 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.953 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.953 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.953 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.953 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.953 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.954 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.954 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.954 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.954 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.954 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.954 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.954 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.955 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.955 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.955 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.955 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.955 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.955 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.955 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.956 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.956 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.956 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.956 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.956 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.956 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.957 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.957 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.957 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.957 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.957 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.957 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.957 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.957 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.958 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.958 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.958 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.958 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.958 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.958 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.958 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.959 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.959 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.959 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.959 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.959 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.959 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.959 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.960 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.960 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.960 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.960 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.960 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.960 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.960 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.961 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.961 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.961 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.961 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.961 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.961 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.961 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.961 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.962 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.962 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.962 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.962 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.962 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.962 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.962 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.963 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.963 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.963 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.963 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.963 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.963 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.963 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.964 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.964 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.964 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.964 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.964 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.964 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.964 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.964 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.965 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.965 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.965 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.965 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.965 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.965 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.965 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.966 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.966 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.966 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.966 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.966 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.966 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.966 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.967 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.967 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.967 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.967 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.967 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.967 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.967 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.968 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.968 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.968 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.968 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.968 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.968 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.968 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.969 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.969 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.969 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.969 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.969 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.969 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.969 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.970 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.970 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.970 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.970 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.970 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.970 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.970 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.970 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.971 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.971 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.971 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.971 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.971 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.971 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.971 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.972 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.972 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.972 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.972 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.972 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.972 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.972 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.972 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.973 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.973 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.973 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.973 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.973 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.973 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.973 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.974 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.974 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.974 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.974 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.974 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.974 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.974 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.975 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.975 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.975 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.975 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.975 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.975 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.975 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.975 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.976 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.976 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.976 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.976 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.976 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.976 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.977 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.977 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.977 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.977 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.977 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.978 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.978 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.978 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.978 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.978 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.978 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.978 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.978 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.979 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.979 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.979 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.979 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.979 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.979 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.979 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.980 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.980 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.980 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.980 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.980 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.980 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.980 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.981 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.981 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.981 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.981 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.981 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.981 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.982 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.982 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.982 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.982 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.982 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.983 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.983 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.983 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.983 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.983 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.983 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.985 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.986 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.986 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.986 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.986 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.986 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.986 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.987 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.987 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.987 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.987 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.987 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.987 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.988 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 systemd[1]: session-49.scope: Deactivated successfully.
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.988 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.988 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 systemd[1]: session-49.scope: Consumed 2min 10.933s CPU time.
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.988 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.988 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.988 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.988 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.989 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.989 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.989 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.989 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.989 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 systemd-logind[783]: Session 49 logged out. Waiting for processes to exit.
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.989 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.990 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.990 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.990 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.990 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.990 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.990 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.990 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.991 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.991 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.991 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 systemd-logind[783]: Removed session 49.
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.991 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.991 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.991 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.991 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.992 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.992 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.992 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.992 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.992 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.992 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.992 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.993 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.993 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.993 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.993 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.993 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.993 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.994 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.994 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.994 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.994 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.994 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.994 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.994 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.995 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.995 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.995 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.995 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.995 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.995 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.995 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.996 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.996 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.996 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.996 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.996 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.996 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.996 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.997 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.997 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.997 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.997 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.997 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.997 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.998 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.998 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.998 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.999 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.999 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:07 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.999 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:07.999 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.000 225859 WARNING oslo_config.cfg [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 20 09:18:08 np0005588919 nova_compute[225855]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 20 09:18:08 np0005588919 nova_compute[225855]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 20 09:18:08 np0005588919 nova_compute[225855]: and ``live_migration_inbound_addr`` respectively.
Jan 20 09:18:08 np0005588919 nova_compute[225855]: ).  Its value may be silently ignored in the future.#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.000 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.000 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.001 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.001 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.001 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.001 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.002 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.002 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.002 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.002 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.003 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.003 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.003 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.003 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.004 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.004 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.004 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.004 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.004 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.rbd_secret_uuid        = e399cf45-e6b6-5393-99f1-75c601d3f188 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.005 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.005 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.005 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.005 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.006 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.006 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.006 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.006 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.007 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.007 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.007 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.007 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.008 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.008 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.008 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.008 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.009 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.009 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.009 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.010 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.010 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.010 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.011 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.011 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.011 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.011 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.012 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.012 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.012 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.012 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.013 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.013 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.013 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.013 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.014 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.014 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.014 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.014 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.015 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.015 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.015 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.015 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.015 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.016 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.016 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.016 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.016 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.016 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.017 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.017 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.017 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.017 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.018 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.018 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.018 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.018 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.019 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.019 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.019 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.019 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.019 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.020 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.020 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.020 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.020 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.021 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.021 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.021 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.021 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.022 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.022 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.022 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.022 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.023 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.023 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.023 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.023 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.024 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.024 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.024 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.024 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.025 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.025 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.025 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.025 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.025 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.026 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.026 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.026 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.026 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.026 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.027 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.027 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.027 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.027 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.027 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.028 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.028 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.028 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.028 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.028 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.029 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.029 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.029 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.029 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.030 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.030 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.030 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.030 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.031 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.031 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.031 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.031 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.031 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.032 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.032 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.032 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.032 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.032 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.033 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.033 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.034 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.034 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.034 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.034 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.035 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.035 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.035 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.035 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.036 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.036 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.036 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.036 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.036 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.037 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.037 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.037 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.037 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.038 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.038 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.038 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.038 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.038 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.039 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.039 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.039 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.039 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.039 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.040 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.040 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.040 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.040 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.041 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.041 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.041 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.041 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.041 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.042 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.042 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.042 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.042 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.043 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.043 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.043 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.043 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.044 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.044 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.044 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.044 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.045 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.045 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.045 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.045 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.045 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.046 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.046 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.046 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.046 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.047 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.047 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.047 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.047 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.047 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.048 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.048 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.048 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.048 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.049 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.049 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.049 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.049 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.049 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.050 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.050 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.050 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.050 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.050 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.050 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.051 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.051 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.051 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.051 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.051 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.052 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.052 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.052 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.052 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.053 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.053 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.053 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.053 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.053 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.054 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.054 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.054 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.054 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.054 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.055 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.055 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.055 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.055 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.055 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.056 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.056 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.056 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.056 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.057 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.057 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.057 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.058 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.058 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.058 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.058 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.059 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.059 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.059 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.059 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.060 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.060 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.060 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.060 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.060 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.061 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.061 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.061 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.061 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.061 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.062 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.062 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.062 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.062 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.062 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.063 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.063 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.063 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.063 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.064 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.064 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.064 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.064 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.065 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.065 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.065 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.065 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.065 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.066 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.066 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.066 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.066 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.066 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.066 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.067 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.067 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.067 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.067 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.068 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.068 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.068 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.068 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.068 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.068 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.068 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.069 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.069 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.069 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.069 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.069 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.069 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.070 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.070 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.070 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.070 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.070 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.070 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.071 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.071 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.071 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.071 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.071 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.071 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.072 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.072 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.072 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.072 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.072 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.072 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.072 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.073 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.073 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.073 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.073 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.073 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.073 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.074 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.074 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.074 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.074 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.074 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.074 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.074 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.075 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.075 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.075 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.075 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.075 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.076 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.076 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.076 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.076 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.076 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.076 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.076 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.077 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.077 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.077 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.077 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.077 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.077 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.078 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.078 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.078 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.078 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.078 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.078 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.078 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.079 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.079 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.079 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.079 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.079 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.079 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.079 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.080 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.080 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.080 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.080 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.080 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.080 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.081 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.081 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.081 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.081 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.081 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.082 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.082 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.082 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.082 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.082 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.082 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.082 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.083 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.083 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.083 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.083 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.083 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.083 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.083 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.084 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.084 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.084 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.084 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.084 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.084 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.085 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.085 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.085 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.085 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.085 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.085 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.085 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.086 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.086 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.086 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.086 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.086 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.086 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.087 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.087 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.087 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.087 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.087 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.087 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.087 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.088 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.088 225859 DEBUG oslo_service.service [None req-e24e79cd-c034-4520-8502-25699a19da22 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.088 225859 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.106 225859 INFO nova.virt.node [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Determined node identity bbb02880-a710-4ac1-8b2c-5c09765848d1 from /var/lib/nova/compute_id#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.107 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.107 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.108 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.108 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.123 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fdd113dbd00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.126 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fdd113dbd00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.127 225859 INFO nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.142 225859 INFO nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Libvirt host capabilities <capabilities>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <host>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <uuid>870b1f1c-f19c-477b-b282-ee6eeba50974</uuid>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <arch>x86_64</arch>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model>EPYC-Rome-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <vendor>AMD</vendor>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <microcode version='16777317'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <signature family='23' model='49' stepping='0'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='x2apic'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='tsc-deadline'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='osxsave'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='hypervisor'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='tsc_adjust'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='spec-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='stibp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='arch-capabilities'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='cmp_legacy'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='topoext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='virt-ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='lbrv'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='tsc-scale'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='vmcb-clean'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='pause-filter'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='pfthreshold'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='svme-addr-chk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='rdctl-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='skip-l1dfl-vmentry'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='mds-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature name='pschange-mc-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <pages unit='KiB' size='4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <pages unit='KiB' size='2048'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <pages unit='KiB' size='1048576'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <power_management>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <suspend_mem/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </power_management>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <iommu support='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <migration_features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <live/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <uri_transports>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <uri_transport>tcp</uri_transport>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <uri_transport>rdma</uri_transport>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </uri_transports>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </migration_features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <topology>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <cells num='1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <cell id='0'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:          <memory unit='KiB'>7864312</memory>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:          <pages unit='KiB' size='4'>1966078</pages>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:          <pages unit='KiB' size='2048'>0</pages>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:          <distances>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:            <sibling id='0' value='10'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:          </distances>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:          <cpus num='8'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:          </cpus>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        </cell>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </cells>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </topology>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <cache>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </cache>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <secmodel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model>selinux</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <doi>0</doi>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </secmodel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <secmodel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model>dac</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <doi>0</doi>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </secmodel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </host>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <guest>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <os_type>hvm</os_type>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <arch name='i686'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <wordsize>32</wordsize>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <domain type='qemu'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <domain type='kvm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </arch>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <pae/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <nonpae/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <acpi default='on' toggle='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <apic default='on' toggle='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <cpuselection/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <deviceboot/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <disksnapshot default='on' toggle='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <externalSnapshot/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </guest>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <guest>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <os_type>hvm</os_type>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <arch name='x86_64'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <wordsize>64</wordsize>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <domain type='qemu'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <domain type='kvm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </arch>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <acpi default='on' toggle='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <apic default='on' toggle='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <cpuselection/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <deviceboot/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <disksnapshot default='on' toggle='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <externalSnapshot/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </guest>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 
Jan 20 09:18:08 np0005588919 nova_compute[225855]: </capabilities>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.152 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.156 225859 DEBUG nova.virt.libvirt.volume.mount [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.159 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 20 09:18:08 np0005588919 nova_compute[225855]: <domainCapabilities>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <domain>kvm</domain>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <arch>i686</arch>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <vcpu max='4096'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <iothreads supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <os supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <enum name='firmware'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <loader supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>rom</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pflash</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='readonly'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>yes</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>no</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='secure'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>no</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </loader>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>on</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>off</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='maximumMigratable'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>on</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>off</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <vendor>AMD</vendor>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='succor'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='custom' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ddpd-u'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sha512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ddpd-u'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sha512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbpb'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbpb'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-128'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-256'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-128'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-256'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='KnightsMill'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512er'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512pf'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512er'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512pf'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tbm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tbm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='athlon'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='athlon-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='core2duo'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='core2duo-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='coreduo'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='coreduo-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='n270'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='n270-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='phenom'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='phenom-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <memoryBacking supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <enum name='sourceType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>file</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>anonymous</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>memfd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </memoryBacking>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <disk supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='diskDevice'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>disk</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>cdrom</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>floppy</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>lun</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='bus'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>fdc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>scsi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>sata</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-non-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <graphics supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vnc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>egl-headless</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dbus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <video supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='modelType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vga</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>cirrus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>none</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>bochs</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>ramfb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <hostdev supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='mode'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>subsystem</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='startupPolicy'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>default</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>mandatory</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>requisite</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>optional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='subsysType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pci</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>scsi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='capsType'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='pciBackend'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </hostdev>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <rng supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-non-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>random</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>egd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>builtin</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <filesystem supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='driverType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>path</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>handle</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtiofs</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </filesystem>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <tpm supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tpm-tis</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tpm-crb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>emulator</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>external</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendVersion'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>2.0</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </tpm>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <redirdev supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='bus'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </redirdev>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <channel supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pty</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>unix</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </channel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <crypto supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>qemu</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>builtin</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </crypto>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <interface supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>default</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>passt</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <panic supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>isa</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>hyperv</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </panic>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <console supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>null</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pty</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dev</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>file</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pipe</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>stdio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>udp</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tcp</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>unix</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>qemu-vdagent</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dbus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </console>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <gic supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <genid supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <backup supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <async-teardown supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <s390-pv supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <ps2 supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <tdx supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <sev supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <sgx supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <hyperv supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='features'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>relaxed</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vapic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>spinlocks</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vpindex</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>runtime</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>synic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>stimer</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>reset</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vendor_id</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>frequencies</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>reenlightenment</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tlbflush</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>ipi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>avic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>emsr_bitmap</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>xmm_input</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <defaults>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </defaults>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </hyperv>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <launchSecurity supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: </domainCapabilities>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.170 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 20 09:18:08 np0005588919 nova_compute[225855]: <domainCapabilities>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <domain>kvm</domain>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <arch>i686</arch>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <vcpu max='240'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <iothreads supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <os supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <enum name='firmware'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <loader supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>rom</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pflash</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='readonly'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>yes</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>no</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='secure'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>no</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </loader>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>on</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>off</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='maximumMigratable'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>on</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>off</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <vendor>AMD</vendor>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='succor'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='custom' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ddpd-u'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sha512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ddpd-u'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sha512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbpb'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbpb'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-128'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-256'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-128'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-256'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='KnightsMill'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512er'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512pf'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512er'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512pf'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tbm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tbm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='athlon'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='athlon-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='core2duo'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='core2duo-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='coreduo'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='coreduo-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='n270'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='n270-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='phenom'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='phenom-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <memoryBacking supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <enum name='sourceType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>file</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>anonymous</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>memfd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </memoryBacking>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <disk supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='diskDevice'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>disk</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>cdrom</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>floppy</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>lun</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='bus'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>ide</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>fdc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>scsi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>sata</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-non-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <graphics supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vnc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>egl-headless</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dbus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <video supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='modelType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vga</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>cirrus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>none</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>bochs</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>ramfb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <hostdev supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='mode'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>subsystem</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='startupPolicy'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>default</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>mandatory</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>requisite</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>optional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='subsysType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pci</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>scsi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='capsType'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='pciBackend'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </hostdev>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <rng supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-non-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>random</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>egd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>builtin</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <filesystem supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='driverType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>path</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>handle</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtiofs</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </filesystem>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <tpm supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tpm-tis</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tpm-crb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>emulator</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>external</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendVersion'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>2.0</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </tpm>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <redirdev supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='bus'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </redirdev>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <channel supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pty</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>unix</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </channel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <crypto supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>qemu</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>builtin</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </crypto>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <interface supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>default</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>passt</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <panic supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>isa</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>hyperv</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </panic>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <console supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>null</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pty</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dev</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>file</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pipe</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>stdio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>udp</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tcp</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>unix</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>qemu-vdagent</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dbus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </console>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <gic supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <genid supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <backup supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <async-teardown supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <s390-pv supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <ps2 supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <tdx supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <sev supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <sgx supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <hyperv supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='features'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>relaxed</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vapic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>spinlocks</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vpindex</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>runtime</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>synic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>stimer</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>reset</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vendor_id</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>frequencies</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>reenlightenment</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tlbflush</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>ipi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>avic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>emsr_bitmap</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>xmm_input</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <defaults>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </defaults>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </hyperv>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <launchSecurity supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: </domainCapabilities>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.230 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.235 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 20 09:18:08 np0005588919 nova_compute[225855]: <domainCapabilities>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <domain>kvm</domain>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <arch>x86_64</arch>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <vcpu max='4096'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <iothreads supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <os supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <enum name='firmware'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>efi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <loader supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>rom</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pflash</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='readonly'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>yes</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>no</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='secure'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>yes</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>no</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </loader>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>on</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>off</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='maximumMigratable'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>on</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>off</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <vendor>AMD</vendor>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='succor'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='custom' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ddpd-u'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sha512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ddpd-u'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sha512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbpb'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbpb'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-128'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-256'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-128'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-256'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='KnightsMill'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512er'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512pf'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512er'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512pf'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tbm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tbm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='athlon'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='athlon-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='core2duo'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='core2duo-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='coreduo'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='coreduo-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='n270'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='n270-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='phenom'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='phenom-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <memoryBacking supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <enum name='sourceType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>file</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>anonymous</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>memfd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </memoryBacking>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <disk supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='diskDevice'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>disk</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>cdrom</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>floppy</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>lun</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='bus'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>fdc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>scsi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>sata</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-non-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <graphics supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vnc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>egl-headless</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dbus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <video supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='modelType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vga</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>cirrus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>none</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>bochs</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>ramfb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <hostdev supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='mode'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>subsystem</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='startupPolicy'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>default</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>mandatory</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>requisite</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>optional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='subsysType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pci</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>scsi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='capsType'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='pciBackend'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </hostdev>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <rng supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-non-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>random</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>egd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>builtin</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <filesystem supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='driverType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>path</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>handle</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtiofs</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </filesystem>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <tpm supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tpm-tis</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tpm-crb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>emulator</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>external</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendVersion'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>2.0</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </tpm>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <redirdev supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='bus'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </redirdev>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <channel supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pty</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>unix</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </channel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <crypto supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>qemu</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>builtin</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </crypto>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <interface supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>default</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>passt</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <panic supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>isa</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>hyperv</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </panic>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <console supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>null</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pty</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dev</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>file</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pipe</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>stdio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>udp</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tcp</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>unix</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>qemu-vdagent</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dbus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </console>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <gic supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <genid supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <backup supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <async-teardown supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <s390-pv supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <ps2 supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <tdx supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <sev supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <sgx supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <hyperv supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='features'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>relaxed</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vapic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>spinlocks</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vpindex</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>runtime</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>synic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>stimer</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>reset</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vendor_id</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>frequencies</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>reenlightenment</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tlbflush</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>ipi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>avic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>emsr_bitmap</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>xmm_input</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <defaults>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </defaults>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </hyperv>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <launchSecurity supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: </domainCapabilities>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.314 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 20 09:18:08 np0005588919 nova_compute[225855]: <domainCapabilities>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <path>/usr/libexec/qemu-kvm</path>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <domain>kvm</domain>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <arch>x86_64</arch>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <vcpu max='240'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <iothreads supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <os supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <enum name='firmware'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <loader supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>rom</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pflash</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='readonly'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>yes</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>no</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='secure'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>no</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </loader>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='host-passthrough' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='hostPassthroughMigratable'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>on</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>off</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='maximum' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='maximumMigratable'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>on</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>off</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='host-model' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <vendor>AMD</vendor>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='x2apic'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc-deadline'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='hypervisor'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc_adjust'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='spec-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='stibp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='cmp_legacy'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='overflow-recov'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='succor'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='amd-ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='virt-ssbd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='lbrv'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='tsc-scale'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='vmcb-clean'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='flushbyasid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='pause-filter'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='pfthreshold'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='svme-addr-chk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <feature policy='disable' name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <mode name='custom' supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Broadwell-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cascadelake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='ClearwaterForest'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ddpd-u'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sha512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='ClearwaterForest-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ddpd-u'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sha512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm3'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sm4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Cooperlake-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Denverton-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Dhyana-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Genoa-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Milan-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Rome-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Turin'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/366490253' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbpb'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-Turin-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amd-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='auto-ibrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vp2intersect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fs-gs-base-ns'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibpb-brtype'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='no-nested-data-bp'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='null-sel-clr-base'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='perfmon-v2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbpb'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='srso-user-kernel-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='stibp-always-on'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='EPYC-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-128'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-256'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='GraniteRapids-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-128'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-256'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx10-512'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='prefetchiti'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Haswell-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-noTSX'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v6'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Icelake-Server-v7'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='IvyBridge-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='KnightsMill'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512er'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512pf'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='KnightsMill-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4fmaps'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-4vnniw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512er'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512pf'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G4-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tbm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Opteron_G5-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fma4'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tbm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xop'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SapphireRapids-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='amx-tile'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-bf16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-fp16'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512-vpopcntdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bitalg'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vbmi2'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrc'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fzrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='la57'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='taa-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='tsx-ldtrk'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='SierraForest-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ifma'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-ne-convert'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx-vnni-int8'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bhi-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='bus-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cmpccxadd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fbsdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='fsrs'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ibrs-all'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='intel-psfd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ipred-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='lam'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mcdt-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pbrsb-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='psdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rrsba-ctrl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='sbdr-ssdp-no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='serialize'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vaes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='vpclmulqdq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Client-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='hle'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='rtm'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Skylake-Server-v5'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512bw'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512cd'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512dq'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512f'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='avx512vl'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='invpcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pcid'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='pku'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='mpx'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v2'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v3'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='core-capability'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='split-lock-detect'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='Snowridge-v4'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='cldemote'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='erms'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='gfni'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdir64b'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='movdiri'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='xsaves'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='athlon'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='athlon-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='core2duo'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='core2duo-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='coreduo'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='coreduo-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='n270'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='n270-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='ss'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='phenom'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <blockers model='phenom-v1'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnow'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <feature name='3dnowext'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </blockers>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </mode>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <memoryBacking supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <enum name='sourceType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>file</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>anonymous</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <value>memfd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </memoryBacking>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <disk supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='diskDevice'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>disk</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>cdrom</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>floppy</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>lun</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='bus'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>ide</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>fdc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>scsi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>sata</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-non-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <graphics supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vnc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>egl-headless</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dbus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <video supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='modelType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vga</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>cirrus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>none</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>bochs</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>ramfb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <hostdev supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='mode'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>subsystem</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='startupPolicy'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>default</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>mandatory</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>requisite</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>optional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='subsysType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pci</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>scsi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='capsType'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='pciBackend'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </hostdev>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <rng supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtio-non-transitional</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>random</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>egd</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>builtin</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <filesystem supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='driverType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>path</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>handle</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>virtiofs</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </filesystem>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <tpm supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tpm-tis</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tpm-crb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>emulator</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>external</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendVersion'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>2.0</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </tpm>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <redirdev supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='bus'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>usb</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </redirdev>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <channel supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pty</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>unix</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </channel>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <crypto supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>qemu</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendModel'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>builtin</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </crypto>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <interface supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='backendType'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>default</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>passt</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <panic supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='model'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>isa</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>hyperv</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </panic>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <console supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='type'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>null</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vc</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pty</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dev</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>file</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>pipe</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>stdio</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>udp</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tcp</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>unix</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>qemu-vdagent</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>dbus</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </console>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <gic supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <vmcoreinfo supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <genid supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <backingStoreInput supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <backup supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <async-teardown supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <s390-pv supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <ps2 supported='yes'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <tdx supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <sev supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <sgx supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <hyperv supported='yes'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <enum name='features'>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>relaxed</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vapic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>spinlocks</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vpindex</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>runtime</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>synic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>stimer</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>reset</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>vendor_id</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>frequencies</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>reenlightenment</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>tlbflush</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>ipi</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>avic</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>emsr_bitmap</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <value>xmm_input</value>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </enum>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      <defaults>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <spinlocks>4095</spinlocks>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <stimer_direct>on</stimer_direct>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <tlbflush_direct>on</tlbflush_direct>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <tlbflush_extended>on</tlbflush_extended>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:      </defaults>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    </hyperv>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:    <launchSecurity supported='no'/>
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: </domainCapabilities>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.391 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.392 225859 INFO nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Secure Boot support detected#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.393 225859 INFO nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.394 225859 INFO nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.403 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] cpu compare xml: <cpu match="exact">
Jan 20 09:18:08 np0005588919 nova_compute[225855]:  <model>Nehalem</model>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: </cpu>
Jan 20 09:18:08 np0005588919 nova_compute[225855]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.405 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.442 225859 INFO nova.virt.node [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Determined node identity bbb02880-a710-4ac1-8b2c-5c09765848d1 from /var/lib/nova/compute_id#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.458 225859 WARNING nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Compute nodes ['bbb02880-a710-4ac1-8b2c-5c09765848d1'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.487 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.547 225859 WARNING nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.547 225859 DEBUG oslo_concurrency.lockutils [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.548 225859 DEBUG oslo_concurrency.lockutils [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.548 225859 DEBUG oslo_concurrency.lockutils [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.548 225859 DEBUG nova.compute.resource_tracker [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:18:08 np0005588919 nova_compute[225855]: 2026-01-20 14:18:08.549 225859 DEBUG oslo_concurrency.processutils [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:18:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:18:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1912408237' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:18:09 np0005588919 nova_compute[225855]: 2026-01-20 14:18:09.009 225859 DEBUG oslo_concurrency.processutils [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:18:09 np0005588919 nova_compute[225855]: 2026-01-20 14:18:09.147 225859 WARNING nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:18:09 np0005588919 nova_compute[225855]: 2026-01-20 14:18:09.148 225859 DEBUG nova.compute.resource_tracker [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5250MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:18:09 np0005588919 nova_compute[225855]: 2026-01-20 14:18:09.148 225859 DEBUG oslo_concurrency.lockutils [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:18:09 np0005588919 nova_compute[225855]: 2026-01-20 14:18:09.149 225859 DEBUG oslo_concurrency.lockutils [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:18:09 np0005588919 nova_compute[225855]: 2026-01-20 14:18:09.162 225859 WARNING nova.compute.resource_tracker [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] No compute node record for compute-1.ctlplane.example.com:bbb02880-a710-4ac1-8b2c-5c09765848d1: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host bbb02880-a710-4ac1-8b2c-5c09765848d1 could not be found.#033[00m
Jan 20 09:18:09 np0005588919 nova_compute[225855]: 2026-01-20 14:18:09.185 225859 INFO nova.compute.resource_tracker [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: bbb02880-a710-4ac1-8b2c-5c09765848d1#033[00m
Jan 20 09:18:09 np0005588919 nova_compute[225855]: 2026-01-20 14:18:09.270 225859 DEBUG nova.compute.resource_tracker [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:18:09 np0005588919 nova_compute[225855]: 2026-01-20 14:18:09.270 225859 DEBUG nova.compute.resource_tracker [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:18:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:09.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:09.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.027 225859 INFO nova.scheduler.client.report [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [req-cb7924e1-7b16-434f-9bf5-c88347a74e1d] Created resource provider record via placement API for resource provider with UUID bbb02880-a710-4ac1-8b2c-5c09765848d1 and name compute-1.ctlplane.example.com.#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.058 225859 DEBUG oslo_concurrency.processutils [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:18:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:18:10 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2481846038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.548 225859 DEBUG oslo_concurrency.processutils [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.556 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 20 09:18:10 np0005588919 nova_compute[225855]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.557 225859 INFO nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] kernel doesn't support AMD SEV#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.558 225859 DEBUG nova.compute.provider_tree [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.559 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.563 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Libvirt baseline CPU <cpu>
Jan 20 09:18:10 np0005588919 nova_compute[225855]:  <arch>x86_64</arch>
Jan 20 09:18:10 np0005588919 nova_compute[225855]:  <model>Nehalem</model>
Jan 20 09:18:10 np0005588919 nova_compute[225855]:  <vendor>AMD</vendor>
Jan 20 09:18:10 np0005588919 nova_compute[225855]:  <topology sockets="8" cores="1" threads="1"/>
Jan 20 09:18:10 np0005588919 nova_compute[225855]: </cpu>
Jan 20 09:18:10 np0005588919 nova_compute[225855]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.625 225859 DEBUG nova.scheduler.client.report [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Updated inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.625 225859 DEBUG nova.compute.provider_tree [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Updating resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.626 225859 DEBUG nova.compute.provider_tree [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.774 225859 DEBUG nova.compute.provider_tree [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Updating resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.803 225859 DEBUG nova.compute.resource_tracker [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.803 225859 DEBUG oslo_concurrency.lockutils [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.803 225859 DEBUG nova.service [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.883 225859 DEBUG nova.service [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Jan 20 09:18:10 np0005588919 nova_compute[225855]: 2026-01-20 14:18:10.883 225859 DEBUG nova.servicegroup.drivers.db [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Jan 20 09:18:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:11.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:11.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:13 np0005588919 podman[226228]: 2026-01-20 14:18:13.079344638 +0000 UTC m=+0.140153793 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:18:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:13.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:13.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:15.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:15.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:18:16.378 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:18:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:18:16.379 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:18:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:18:16.379 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:18:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:17.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:17.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:19.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:19.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:21.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:21.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:23.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:23.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:24 np0005588919 podman[226287]: 2026-01-20 14:18:24.013997933 +0000 UTC m=+0.062884537 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 20 09:18:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:25.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:25.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:18:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:18:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:18:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:18:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:27.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:27.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:29.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:29.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:31.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:31.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:18:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:33.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:33.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:35.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:35.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:37.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:37.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:39.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:39.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:41.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:18:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:41.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:18:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:18:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/513778578' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:18:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:18:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/513778578' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:18:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:43.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:43.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:44 np0005588919 podman[226670]: 2026-01-20 14:18:44.086042056 +0000 UTC m=+0.128500341 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 20 09:18:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:45.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:45.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:47.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:47.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:48 np0005588919 nova_compute[225855]: 2026-01-20 14:18:48.886 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:18:48 np0005588919 nova_compute[225855]: 2026-01-20 14:18:48.945 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:18:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:49.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:49.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:51.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:51.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:53.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:18:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:53.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:18:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:18:55 np0005588919 podman[226751]: 2026-01-20 14:18:55.069106575 +0000 UTC m=+0.087882368 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Jan 20 09:18:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:55.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:55.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:57.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:57.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:18:59.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:18:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:18:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:18:59.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:18:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:01.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:01.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:03.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:03.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:05.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:05.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.342 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.342 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:19:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:07.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:07.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.814 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.815 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.816 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.816 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.817 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.817 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.818 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.818 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:19:07 np0005588919 nova_compute[225855]: 2026-01-20 14:19:07.819 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:19:09 np0005588919 nova_compute[225855]: 2026-01-20 14:19:09.049 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:19:09 np0005588919 nova_compute[225855]: 2026-01-20 14:19:09.050 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:19:09 np0005588919 nova_compute[225855]: 2026-01-20 14:19:09.051 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:19:09 np0005588919 nova_compute[225855]: 2026-01-20 14:19:09.051 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:19:09 np0005588919 nova_compute[225855]: 2026-01-20 14:19:09.052 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:19:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:19:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/777619478' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:19:09 np0005588919 nova_compute[225855]: 2026-01-20 14:19:09.507 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:19:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:09.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:09.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:09 np0005588919 nova_compute[225855]: 2026-01-20 14:19:09.761 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:19:09 np0005588919 nova_compute[225855]: 2026-01-20 14:19:09.764 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5272MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:19:09 np0005588919 nova_compute[225855]: 2026-01-20 14:19:09.764 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:19:09 np0005588919 nova_compute[225855]: 2026-01-20 14:19:09.765 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:19:10 np0005588919 nova_compute[225855]: 2026-01-20 14:19:10.284 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:19:10 np0005588919 nova_compute[225855]: 2026-01-20 14:19:10.285 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:19:10 np0005588919 nova_compute[225855]: 2026-01-20 14:19:10.322 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:19:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:19:10 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4174787175' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:19:10 np0005588919 nova_compute[225855]: 2026-01-20 14:19:10.801 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:19:10 np0005588919 nova_compute[225855]: 2026-01-20 14:19:10.808 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:19:10 np0005588919 nova_compute[225855]: 2026-01-20 14:19:10.841 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:19:10 np0005588919 nova_compute[225855]: 2026-01-20 14:19:10.842 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:19:10 np0005588919 nova_compute[225855]: 2026-01-20 14:19:10.842 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:19:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:11.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:11.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:13.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:13.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.105157) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754105253, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2304, "num_deletes": 251, "total_data_size": 5757540, "memory_usage": 5836592, "flush_reason": "Manual Compaction"}
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754145794, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 3767487, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17748, "largest_seqno": 20047, "table_properties": {"data_size": 3758175, "index_size": 5870, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18600, "raw_average_key_size": 20, "raw_value_size": 3739661, "raw_average_value_size": 4029, "num_data_blocks": 262, "num_entries": 928, "num_filter_entries": 928, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768918525, "oldest_key_time": 1768918525, "file_creation_time": 1768918754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 40786 microseconds, and 17600 cpu microseconds.
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.145939) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 3767487 bytes OK
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.145969) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.148164) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.148191) EVENT_LOG_v1 {"time_micros": 1768918754148182, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.148218) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 5747533, prev total WAL file size 5747533, number of live WAL files 2.
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.150476) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(3679KB)], [36(7521KB)]
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754150530, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 11469099, "oldest_snapshot_seqno": -1}
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4504 keys, 9395843 bytes, temperature: kUnknown
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754274582, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 9395843, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9363675, "index_size": 19834, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11269, "raw_key_size": 112778, "raw_average_key_size": 25, "raw_value_size": 9279954, "raw_average_value_size": 2060, "num_data_blocks": 822, "num_entries": 4504, "num_filter_entries": 4504, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768918754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.275157) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 9395843 bytes
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.277227) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 92.2 rd, 75.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 7.3 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(5.5) write-amplify(2.5) OK, records in: 5023, records dropped: 519 output_compression: NoCompression
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.277258) EVENT_LOG_v1 {"time_micros": 1768918754277244, "job": 20, "event": "compaction_finished", "compaction_time_micros": 124412, "compaction_time_cpu_micros": 39957, "output_level": 6, "num_output_files": 1, "total_output_size": 9395843, "num_input_records": 5023, "num_output_records": 4504, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754279351, "job": 20, "event": "table_file_deletion", "file_number": 38}
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768918754282375, "job": 20, "event": "table_file_deletion", "file_number": 36}
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.150407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.282660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.282668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.282671) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.282674) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:19:14.282677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:19:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:15 np0005588919 podman[226875]: 2026-01-20 14:19:15.050705032 +0000 UTC m=+0.102532824 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:19:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:15.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:15.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:19:16.379 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:19:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:19:16.379 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:19:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:19:16.379 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:19:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:17.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:17.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:19.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:19.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:21.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:21.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:23.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:23.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:25.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:25.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:26 np0005588919 podman[226908]: 2026-01-20 14:19:26.046493573 +0000 UTC m=+0.079000955 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 09:19:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:27.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:27.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:29.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:29.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:31.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:31.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:33.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:33.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 09:19:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:19:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:19:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:19:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:35.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:35.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:37.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:37.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:39.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:39.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:19:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:19:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:41.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:41.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:43.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:43.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:45.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:45.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:46 np0005588919 podman[227168]: 2026-01-20 14:19:46.074242091 +0000 UTC m=+0.122744237 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:19:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:47.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:47.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:49.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:19:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:49.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:19:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:51.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:51.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:53.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:53.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:55.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:19:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:55.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:19:57 np0005588919 podman[227249]: 2026-01-20 14:19:57.045148926 +0000 UTC m=+0.081819895 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:19:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:57.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:57.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:19:59.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:19:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:19:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:19:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:19:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:19:59.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 09:20:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:20:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:20:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:01.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:20:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:20:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:01.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:20:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:03.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:20:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:03.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:20:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:20:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:05.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:20:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:20:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:05.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:20:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:20:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:07.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:20:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:20:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:07.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:20:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:20:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:20:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:20:09.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:20:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 20 09:20:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:20:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:20:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:20:09.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.836 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.836 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.874 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.875 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.875 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.893 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.893 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.893 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.894 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.895 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.895 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.895 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.896 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.896 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.929 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.930 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.930 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.930 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:20:10 np0005588919 nova_compute[225855]: 2026-01-20 14:20:10.931 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:20:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:20:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2351083615' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:20:11 np0005588919 nova_compute[225855]: 2026-01-20 14:20:11.406 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:20:11 np0005588919 nova_compute[225855]: 2026-01-20 14:20:11.558 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:20:11 np0005588919 nova_compute[225855]: 2026-01-20 14:20:11.559 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5258MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:26:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:26:31 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1116669953' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:26:31 np0005588919 nova_compute[225855]: 2026-01-20 14:26:31.041 225859 DEBUG oslo_concurrency.processutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:26:31 np0005588919 nova_compute[225855]: 2026-01-20 14:26:31.046 225859 DEBUG nova.compute.provider_tree [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:26:31 np0005588919 nova_compute[225855]: 2026-01-20 14:26:31.060 225859 DEBUG nova.scheduler.client.report [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:26:31 np0005588919 nova_compute[225855]: 2026-01-20 14:26:31.079 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:31 np0005588919 rsyslogd[1002]: imjournal: 4943 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 20 09:26:31 np0005588919 nova_compute[225855]: 2026-01-20 14:26:31.429 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Acquiring lock "refresh_cache-6091ab6e-2530-4b48-b482-00867d3c66c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:26:31 np0005588919 nova_compute[225855]: 2026-01-20 14:26:31.429 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Acquired lock "refresh_cache-6091ab6e-2530-4b48-b482-00867d3c66c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:26:31 np0005588919 nova_compute[225855]: 2026-01-20 14:26:31.429 225859 DEBUG nova.network.neutron [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.071 225859 DEBUG nova.network.neutron [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:26:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:32.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:32.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.448 225859 DEBUG nova.network.neutron [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.468 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Releasing lock "refresh_cache-6091ab6e-2530-4b48-b482-00867d3c66c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.471 225859 DEBUG nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.472 225859 INFO nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Creating image(s)#033[00m
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.503 225859 DEBUG nova.storage.rbd_utils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] rbd image 6091ab6e-2530-4b48-b482-00867d3c66c5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.507 225859 DEBUG nova.objects.instance [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6091ab6e-2530-4b48-b482-00867d3c66c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.606 225859 DEBUG nova.storage.rbd_utils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] rbd image 6091ab6e-2530-4b48-b482-00867d3c66c5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.636 225859 DEBUG nova.storage.rbd_utils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] rbd image 6091ab6e-2530-4b48-b482-00867d3c66c5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.638 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Acquiring lock "c76a5946aff378ee70c25c8996110c54c3c4f8a1" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:26:32 np0005588919 nova_compute[225855]: 2026-01-20 14:26:32.639 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lock "c76a5946aff378ee70c25c8996110c54c3c4f8a1" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:26:33 np0005588919 nova_compute[225855]: 2026-01-20 14:26:33.016 225859 DEBUG nova.virt.libvirt.imagebackend [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/0db5d54c-c1b5-4100-80fe-c616a5483520/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/0db5d54c-c1b5-4100-80fe-c616a5483520/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 09:26:33 np0005588919 nova_compute[225855]: 2026-01-20 14:26:33.079 225859 DEBUG nova.virt.libvirt.imagebackend [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/0db5d54c-c1b5-4100-80fe-c616a5483520/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 09:26:33 np0005588919 nova_compute[225855]: 2026-01-20 14:26:33.080 225859 DEBUG nova.storage.rbd_utils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] cloning images/0db5d54c-c1b5-4100-80fe-c616a5483520@snap to None/6091ab6e-2530-4b48-b482-00867d3c66c5_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:26:33 np0005588919 nova_compute[225855]: 2026-01-20 14:26:33.213 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lock "c76a5946aff378ee70c25c8996110c54c3c4f8a1" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:33 np0005588919 nova_compute[225855]: 2026-01-20 14:26:33.259 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:33 np0005588919 nova_compute[225855]: 2026-01-20 14:26:33.375 225859 DEBUG nova.objects.instance [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lazy-loading 'migration_context' on Instance uuid 6091ab6e-2530-4b48-b482-00867d3c66c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:26:33 np0005588919 nova_compute[225855]: 2026-01-20 14:26:33.453 225859 DEBUG nova.storage.rbd_utils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] flattening vms/6091ab6e-2530-4b48-b482-00867d3c66c5_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:26:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:26:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:26:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:26:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:26:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:26:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:26:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:26:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:34.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:34.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.092 225859 DEBUG nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Image rbd:vms/6091ab6e-2530-4b48-b482-00867d3c66c5_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.093 225859 DEBUG nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.093 225859 DEBUG nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Ensure instance console log exists: /var/lib/nova/instances/6091ab6e-2530-4b48-b482-00867d3c66c5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.093 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.093 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.094 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.095 225859 DEBUG nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T14:26:03Z,direct_url=<?>,disk_format='raw',id=0db5d54c-c1b5-4100-80fe-c616a5483520,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-767584007-shelved',owner='14ebcff06a484899a9725832f1eddfdf',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T14:26:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.099 225859 WARNING nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.102 225859 DEBUG nova.virt.libvirt.host [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.103 225859 DEBUG nova.virt.libvirt.host [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.105 225859 DEBUG nova.virt.libvirt.host [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.105 225859 DEBUG nova.virt.libvirt.host [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.106 225859 DEBUG nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.106 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T14:26:03Z,direct_url=<?>,disk_format='raw',id=0db5d54c-c1b5-4100-80fe-c616a5483520,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-767584007-shelved',owner='14ebcff06a484899a9725832f1eddfdf',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T14:26:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.106 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.107 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.107 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.107 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.107 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.107 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.107 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.108 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.108 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.108 225859 DEBUG nova.virt.hardware [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.108 225859 DEBUG nova.objects.instance [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 6091ab6e-2530-4b48-b482-00867d3c66c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.334 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.390 225859 DEBUG oslo_concurrency.processutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.725 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:26:35.726 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:26:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:26:35.727 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:26:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:26:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3726797702' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.854 225859 DEBUG oslo_concurrency.processutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.890 225859 DEBUG nova.storage.rbd_utils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] rbd image 6091ab6e-2530-4b48-b482-00867d3c66c5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:26:35 np0005588919 nova_compute[225855]: 2026-01-20 14:26:35.896 225859 DEBUG oslo_concurrency.processutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:26:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:36.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:36.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:26:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2320921482' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:26:36 np0005588919 nova_compute[225855]: 2026-01-20 14:26:36.905 225859 DEBUG oslo_concurrency.processutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:26:36 np0005588919 nova_compute[225855]: 2026-01-20 14:26:36.907 225859 DEBUG nova.objects.instance [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6091ab6e-2530-4b48-b482-00867d3c66c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:26:36 np0005588919 nova_compute[225855]: 2026-01-20 14:26:36.950 225859 DEBUG nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <uuid>6091ab6e-2530-4b48-b482-00867d3c66c5</uuid>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <name>instance-0000000e</name>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <nova:name>tempest-UnshelveToHostMultiNodesTest-server-767584007</nova:name>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:26:35</nova:creationTime>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <nova:user uuid="8ea9f3cd2cbb462a8ecbb488e6a1a25d">tempest-UnshelveToHostMultiNodesTest-997401309-project-member</nova:user>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <nova:project uuid="14ebcff06a484899a9725832f1eddfdf">tempest-UnshelveToHostMultiNodesTest-997401309</nova:project>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="0db5d54c-c1b5-4100-80fe-c616a5483520"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <entry name="serial">6091ab6e-2530-4b48-b482-00867d3c66c5</entry>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <entry name="uuid">6091ab6e-2530-4b48-b482-00867d3c66c5</entry>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/6091ab6e-2530-4b48-b482-00867d3c66c5_disk">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/6091ab6e-2530-4b48-b482-00867d3c66c5_disk.config">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/6091ab6e-2530-4b48-b482-00867d3c66c5/console.log" append="off"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <input type="keyboard" bus="usb"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:26:36 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:26:36 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:26:36 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:26:36 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.121 225859 DEBUG nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.122 225859 DEBUG nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.123 225859 INFO nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Using config drive#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.152 225859 DEBUG nova.storage.rbd_utils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] rbd image 6091ab6e-2530-4b48-b482-00867d3c66c5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.175 225859 DEBUG nova.objects.instance [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 6091ab6e-2530-4b48-b482-00867d3c66c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.252 225859 DEBUG nova.objects.instance [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lazy-loading 'keypairs' on Instance uuid 6091ab6e-2530-4b48-b482-00867d3c66c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.687 225859 INFO nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Creating config drive at /var/lib/nova/instances/6091ab6e-2530-4b48-b482-00867d3c66c5/disk.config#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.691 225859 DEBUG oslo_concurrency.processutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6091ab6e-2530-4b48-b482-00867d3c66c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp68y71qz1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.819 225859 DEBUG oslo_concurrency.processutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6091ab6e-2530-4b48-b482-00867d3c66c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp68y71qz1" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.853 225859 DEBUG nova.storage.rbd_utils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] rbd image 6091ab6e-2530-4b48-b482-00867d3c66c5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:26:37 np0005588919 nova_compute[225855]: 2026-01-20 14:26:37.858 225859 DEBUG oslo_concurrency.processutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6091ab6e-2530-4b48-b482-00867d3c66c5/disk.config 6091ab6e-2530-4b48-b482-00867d3c66c5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.047 225859 DEBUG oslo_concurrency.processutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6091ab6e-2530-4b48-b482-00867d3c66c5/disk.config 6091ab6e-2530-4b48-b482-00867d3c66c5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.189s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.049 225859 INFO nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Deleting local config drive /var/lib/nova/instances/6091ab6e-2530-4b48-b482-00867d3c66c5/disk.config because it was imported into RBD.#033[00m
Jan 20 09:26:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:38.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:38 np0005588919 systemd-machined[194361]: New machine qemu-7-instance-0000000e.
Jan 20 09:26:38 np0005588919 systemd[1]: Started Virtual Machine qemu-7-instance-0000000e.
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.211 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:38 np0005588919 podman[233976]: 2026-01-20 14:26:38.224310947 +0000 UTC m=+0.083540960 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:26:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:38.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.877 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919198.8766315, 6091ab6e-2530-4b48-b482-00867d3c66c5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.877 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.880 225859 DEBUG nova.compute.manager [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.881 225859 DEBUG nova.virt.libvirt.driver [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.885 225859 INFO nova.virt.libvirt.driver [-] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Instance spawned successfully.#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.912 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.916 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.950 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.950 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919198.8804593, 6091ab6e-2530-4b48-b482-00867d3c66c5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.950 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] VM Started (Lifecycle Event)#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.973 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:26:38 np0005588919 nova_compute[225855]: 2026-01-20 14:26:38.978 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:26:39 np0005588919 nova_compute[225855]: 2026-01-20 14:26:39.004 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:26:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:26:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:26:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:40.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:26:40 np0005588919 nova_compute[225855]: 2026-01-20 14:26:40.337 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:40.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:42.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:42.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:26:42.730 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:26:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e150 e150: 3 total, 3 up, 3 in
Jan 20 09:26:43 np0005588919 nova_compute[225855]: 2026-01-20 14:26:43.252 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:44.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:44.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:26:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:26:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:26:45 np0005588919 nova_compute[225855]: 2026-01-20 14:26:45.341 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:46.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:46.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:47 np0005588919 nova_compute[225855]: 2026-01-20 14:26:47.711 225859 DEBUG nova.compute.manager [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:26:47 np0005588919 nova_compute[225855]: 2026-01-20 14:26:47.909 225859 DEBUG oslo_concurrency.lockutils [None req-1f59619e-257a-4cab-b645-d76ff4f626f0 c85759c031f744d2b9774757c7eb3cc2 95f7d246c566473eb07dba860a310578 - - default default] Lock "6091ab6e-2530-4b48-b482-00867d3c66c5" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 17.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:48.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:48 np0005588919 nova_compute[225855]: 2026-01-20 14:26:48.256 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:48.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.000 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Acquiring lock "6091ab6e-2530-4b48-b482-00867d3c66c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.001 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Lock "6091ab6e-2530-4b48-b482-00867d3c66c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.001 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Acquiring lock "6091ab6e-2530-4b48-b482-00867d3c66c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.002 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Lock "6091ab6e-2530-4b48-b482-00867d3c66c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.002 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Lock "6091ab6e-2530-4b48-b482-00867d3c66c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.004 225859 INFO nova.compute.manager [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Terminating instance#033[00m
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.006 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Acquiring lock "refresh_cache-6091ab6e-2530-4b48-b482-00867d3c66c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.006 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Acquired lock "refresh_cache-6091ab6e-2530-4b48-b482-00867d3c66c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.007 225859 DEBUG nova.network.neutron [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:26:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:50.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.345 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:50 np0005588919 nova_compute[225855]: 2026-01-20 14:26:50.375 225859 DEBUG nova.network.neutron [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:26:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:50.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e151 e151: 3 total, 3 up, 3 in
Jan 20 09:26:51 np0005588919 nova_compute[225855]: 2026-01-20 14:26:51.789 225859 DEBUG nova.network.neutron [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:26:51 np0005588919 nova_compute[225855]: 2026-01-20 14:26:51.819 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Releasing lock "refresh_cache-6091ab6e-2530-4b48-b482-00867d3c66c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:26:51 np0005588919 nova_compute[225855]: 2026-01-20 14:26:51.819 225859 DEBUG nova.compute.manager [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:26:51 np0005588919 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 20 09:26:51 np0005588919 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Consumed 12.313s CPU time.
Jan 20 09:26:51 np0005588919 systemd-machined[194361]: Machine qemu-7-instance-0000000e terminated.
Jan 20 09:26:52 np0005588919 nova_compute[225855]: 2026-01-20 14:26:52.045 225859 INFO nova.virt.libvirt.driver [-] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Instance destroyed successfully.#033[00m
Jan 20 09:26:52 np0005588919 nova_compute[225855]: 2026-01-20 14:26:52.045 225859 DEBUG nova.objects.instance [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Lazy-loading 'resources' on Instance uuid 6091ab6e-2530-4b48-b482-00867d3c66c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:26:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:52.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:52.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:53 np0005588919 nova_compute[225855]: 2026-01-20 14:26:53.286 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:53 np0005588919 nova_compute[225855]: 2026-01-20 14:26:53.822 225859 INFO nova.virt.libvirt.driver [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Deleting instance files /var/lib/nova/instances/6091ab6e-2530-4b48-b482-00867d3c66c5_del#033[00m
Jan 20 09:26:53 np0005588919 nova_compute[225855]: 2026-01-20 14:26:53.823 225859 INFO nova.virt.libvirt.driver [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Deletion of /var/lib/nova/instances/6091ab6e-2530-4b48-b482-00867d3c66c5_del complete#033[00m
Jan 20 09:26:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:54.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:26:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:54.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:26:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:26:55 np0005588919 nova_compute[225855]: 2026-01-20 14:26:55.350 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:55 np0005588919 nova_compute[225855]: 2026-01-20 14:26:55.783 225859 INFO nova.compute.manager [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Took 3.96 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:26:55 np0005588919 nova_compute[225855]: 2026-01-20 14:26:55.784 225859 DEBUG oslo.service.loopingcall [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:26:55 np0005588919 nova_compute[225855]: 2026-01-20 14:26:55.784 225859 DEBUG nova.compute.manager [-] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:26:55 np0005588919 nova_compute[225855]: 2026-01-20 14:26:55.784 225859 DEBUG nova.network.neutron [-] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.036 225859 DEBUG nova.network.neutron [-] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.050 225859 DEBUG nova.network.neutron [-] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.065 225859 INFO nova.compute.manager [-] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Took 0.28 seconds to deallocate network for instance.#033[00m
Jan 20 09:26:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:56.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.190 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.191 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.273 225859 DEBUG oslo_concurrency.processutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:26:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:56.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:26:56 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3603524033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.763 225859 DEBUG oslo_concurrency.processutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.772 225859 DEBUG nova.compute.provider_tree [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.800 225859 DEBUG nova.scheduler.client.report [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.831 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.873 225859 INFO nova.scheduler.client.report [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Deleted allocations for instance 6091ab6e-2530-4b48-b482-00867d3c66c5#033[00m
Jan 20 09:26:56 np0005588919 nova_compute[225855]: 2026-01-20 14:26:56.943 225859 DEBUG oslo_concurrency.lockutils [None req-820fdf5b-ca9d-4022-90bb-c341c8e998c7 8ea9f3cd2cbb462a8ecbb488e6a1a25d 14ebcff06a484899a9725832f1eddfdf - - default default] Lock "6091ab6e-2530-4b48-b482-00867d3c66c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.942s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:26:58 np0005588919 podman[234155]: 2026-01-20 14:26:58.121439915 +0000 UTC m=+0.144404211 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:26:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:26:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:26:58.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:26:58 np0005588919 nova_compute[225855]: 2026-01-20 14:26:58.287 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:26:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:26:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:26:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:26:58.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:26:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:00.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:00 np0005588919 nova_compute[225855]: 2026-01-20 14:27:00.354 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:00.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:02.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:02.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:03 np0005588919 nova_compute[225855]: 2026-01-20 14:27:03.289 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:04.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:04.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:05 np0005588919 nova_compute[225855]: 2026-01-20 14:27:05.356 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:06.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:06.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:07 np0005588919 nova_compute[225855]: 2026-01-20 14:27:07.042 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919212.0404408, 6091ab6e-2530-4b48-b482-00867d3c66c5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:07 np0005588919 nova_compute[225855]: 2026-01-20 14:27:07.042 225859 INFO nova.compute.manager [-] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:27:07 np0005588919 nova_compute[225855]: 2026-01-20 14:27:07.068 225859 DEBUG nova.compute.manager [None req-5d44fdcd-e341-4f55-8c93-deb2404ba457 - - - - - -] [instance: 6091ab6e-2530-4b48-b482-00867d3c66c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:08.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:08 np0005588919 nova_compute[225855]: 2026-01-20 14:27:08.291 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:08.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:09 np0005588919 podman[234232]: 2026-01-20 14:27:09.017687352 +0000 UTC m=+0.064089943 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 20 09:27:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:10.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:10 np0005588919 nova_compute[225855]: 2026-01-20 14:27:10.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:10 np0005588919 nova_compute[225855]: 2026-01-20 14:27:10.360 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:10.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:12.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:12.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:13 np0005588919 nova_compute[225855]: 2026-01-20 14:27:13.293 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:13 np0005588919 nova_compute[225855]: 2026-01-20 14:27:13.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:13 np0005588919 nova_compute[225855]: 2026-01-20 14:27:13.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:27:13 np0005588919 nova_compute[225855]: 2026-01-20 14:27:13.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:27:13 np0005588919 nova_compute[225855]: 2026-01-20 14:27:13.362 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:27:13 np0005588919 nova_compute[225855]: 2026-01-20 14:27:13.362 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:27:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3830522030' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:27:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:27:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3830522030' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:27:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:14.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:14.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:15 np0005588919 nova_compute[225855]: 2026-01-20 14:27:15.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:15 np0005588919 nova_compute[225855]: 2026-01-20 14:27:15.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:15 np0005588919 nova_compute[225855]: 2026-01-20 14:27:15.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:15 np0005588919 nova_compute[225855]: 2026-01-20 14:27:15.362 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:15 np0005588919 nova_compute[225855]: 2026-01-20 14:27:15.380 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:15 np0005588919 nova_compute[225855]: 2026-01-20 14:27:15.380 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:15 np0005588919 nova_compute[225855]: 2026-01-20 14:27:15.380 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:15 np0005588919 nova_compute[225855]: 2026-01-20 14:27:15.380 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:27:15 np0005588919 nova_compute[225855]: 2026-01-20 14:27:15.381 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1596450767' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:15 np0005588919 nova_compute[225855]: 2026-01-20 14:27:15.803 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.017 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.018 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4804MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.019 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.019 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.081 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.082 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.097 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:16.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:16.387 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:16.387 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:16.387 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:16 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3796316152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.561 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.567 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.580 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.601 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:27:16 np0005588919 nova_compute[225855]: 2026-01-20 14:27:16.601 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:16.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:17 np0005588919 nova_compute[225855]: 2026-01-20 14:27:17.601 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:17 np0005588919 nova_compute[225855]: 2026-01-20 14:27:17.602 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:27:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:18.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:18 np0005588919 nova_compute[225855]: 2026-01-20 14:27:18.295 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:18 np0005588919 nova_compute[225855]: 2026-01-20 14:27:18.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:18 np0005588919 nova_compute[225855]: 2026-01-20 14:27:18.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:27:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:18.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:20.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:20 np0005588919 nova_compute[225855]: 2026-01-20 14:27:20.366 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:20.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:22.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:27:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 11K writes, 43K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 11K writes, 3116 syncs, 3.59 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4724 writes, 17K keys, 4724 commit groups, 1.0 writes per commit group, ingest: 18.16 MB, 0.03 MB/s#012Interval WAL: 4724 writes, 1965 syncs, 2.40 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 09:27:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:22.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:23 np0005588919 nova_compute[225855]: 2026-01-20 14:27:23.297 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:24.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:24.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:25 np0005588919 nova_compute[225855]: 2026-01-20 14:27:25.369 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:26.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:27:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:26.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:27:27 np0005588919 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 20 09:27:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:28.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:28 np0005588919 nova_compute[225855]: 2026-01-20 14:27:28.300 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:28 np0005588919 nova_compute[225855]: 2026-01-20 14:27:28.414 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:28 np0005588919 nova_compute[225855]: 2026-01-20 14:27:28.415 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:28 np0005588919 nova_compute[225855]: 2026-01-20 14:27:28.436 225859 DEBUG nova.compute.manager [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:27:28 np0005588919 nova_compute[225855]: 2026-01-20 14:27:28.522 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:28 np0005588919 nova_compute[225855]: 2026-01-20 14:27:28.523 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:28 np0005588919 nova_compute[225855]: 2026-01-20 14:27:28.536 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:27:28 np0005588919 nova_compute[225855]: 2026-01-20 14:27:28.537 225859 INFO nova.compute.claims [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:27:28 np0005588919 nova_compute[225855]: 2026-01-20 14:27:28.651 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:28.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:29 np0005588919 podman[234379]: 2026-01-20 14:27:29.052013194 +0000 UTC m=+0.094199279 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:27:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:29 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/819079954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.091 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.099 225859 DEBUG nova.compute.provider_tree [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.198 225859 DEBUG nova.scheduler.client.report [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.273 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.274 225859 DEBUG nova.compute.manager [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.398 225859 DEBUG nova.compute.manager [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.398 225859 DEBUG nova.network.neutron [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.419 225859 INFO nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.443 225859 DEBUG nova.compute.manager [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.531 225859 DEBUG nova.compute.manager [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.533 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.534 225859 INFO nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Creating image(s)#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.569 225859 DEBUG nova.storage.rbd_utils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.596 225859 DEBUG nova.storage.rbd_utils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.627 225859 DEBUG nova.storage.rbd_utils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.630 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.686 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.687 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.688 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.688 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.717 225859 DEBUG nova.storage.rbd_utils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.720 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.850 225859 DEBUG nova.network.neutron [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:27:29 np0005588919 nova_compute[225855]: 2026-01-20 14:27:29.851 225859 DEBUG nova.compute.manager [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:27:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:30.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:30 np0005588919 nova_compute[225855]: 2026-01-20 14:27:30.372 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:30.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:32 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:32Z|00078|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 20 09:27:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:32.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:32 np0005588919 nova_compute[225855]: 2026-01-20 14:27:32.686 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.966s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:32 np0005588919 nova_compute[225855]: 2026-01-20 14:27:32.757 225859 DEBUG nova.storage.rbd_utils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] resizing rbd image 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:27:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:32.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:32 np0005588919 nova_compute[225855]: 2026-01-20 14:27:32.870 225859 DEBUG nova.objects.instance [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lazy-loading 'migration_context' on Instance uuid 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.115 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.116 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Ensure instance console log exists: /var/lib/nova/instances/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.116 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.117 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.117 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.119 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.123 225859 WARNING nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.128 225859 DEBUG nova.virt.libvirt.host [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.128 225859 DEBUG nova.virt.libvirt.host [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.132 225859 DEBUG nova.virt.libvirt.host [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.132 225859 DEBUG nova.virt.libvirt.host [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.134 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.134 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.135 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.135 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.136 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.136 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.137 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.137 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.137 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.138 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.138 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.139 225859 DEBUG nova.virt.hardware [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.143 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.302 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.659 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.688 225859 DEBUG nova.storage.rbd_utils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:33 np0005588919 nova_compute[225855]: 2026-01-20 14:27:33.692 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:27:34 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1353817912' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.121 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.125 225859 DEBUG nova.objects.instance [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lazy-loading 'pci_devices' on Instance uuid 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.140 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <uuid>11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef</uuid>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <name>instance-00000013</name>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <nova:name>tempest-LiveMigrationNegativeTest-server-156396385</nova:name>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:27:33</nova:creationTime>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <nova:user uuid="399cc9abe2cd4ab196a4e5789992ae51">tempest-LiveMigrationNegativeTest-1807701797-project-member</nova:user>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <nova:project uuid="1759b9d61ad946b6afa3e8448ce02190">tempest-LiveMigrationNegativeTest-1807701797</nova:project>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <entry name="serial">11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef</entry>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <entry name="uuid">11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef</entry>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk.config">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef/console.log" append="off"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:27:34 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:27:34 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:27:34 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:27:34 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:27:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:34.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.204 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.204 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.204 225859 INFO nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Using config drive#033[00m
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.225 225859 DEBUG nova.storage.rbd_utils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.622 225859 INFO nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Creating config drive at /var/lib/nova/instances/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef/disk.config#033[00m
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.632 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplblpws97 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.777 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplblpws97" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:34.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.824 225859 DEBUG nova.storage.rbd_utils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] rbd image 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:27:34 np0005588919 nova_compute[225855]: 2026-01-20 14:27:34.830 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef/disk.config 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:35 np0005588919 nova_compute[225855]: 2026-01-20 14:27:35.375 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:36.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.304 225859 DEBUG oslo_concurrency.processutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef/disk.config 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.305 225859 INFO nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Deleting local config drive /var/lib/nova/instances/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef/disk.config because it was imported into RBD.#033[00m
Jan 20 09:27:36 np0005588919 systemd-machined[194361]: New machine qemu-8-instance-00000013.
Jan 20 09:27:36 np0005588919 systemd[1]: Started Virtual Machine qemu-8-instance-00000013.
Jan 20 09:27:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:36.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.850 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919256.8501842, 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.852 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.855 225859 DEBUG nova.compute.manager [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.855 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.859 225859 INFO nova.virt.libvirt.driver [-] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Instance spawned successfully.#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.860 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.885 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.894 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.899 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.900 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.900 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.901 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.902 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.903 225859 DEBUG nova.virt.libvirt.driver [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.950 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.951 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919256.8517892, 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.951 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] VM Started (Lifecycle Event)#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.996 225859 INFO nova.compute.manager [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Took 7.46 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:27:36 np0005588919 nova_compute[225855]: 2026-01-20 14:27:36.997 225859 DEBUG nova.compute.manager [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:37 np0005588919 nova_compute[225855]: 2026-01-20 14:27:37.007 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:37 np0005588919 nova_compute[225855]: 2026-01-20 14:27:37.010 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:27:37 np0005588919 nova_compute[225855]: 2026-01-20 14:27:37.052 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:27:37 np0005588919 nova_compute[225855]: 2026-01-20 14:27:37.074 225859 INFO nova.compute.manager [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Took 8.58 seconds to build instance.#033[00m
Jan 20 09:27:37 np0005588919 nova_compute[225855]: 2026-01-20 14:27:37.096 225859 DEBUG oslo_concurrency.lockutils [None req-76b51570-913e-4765-a697-65fe21bb2b2d 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 e152: 3 total, 3 up, 3 in
Jan 20 09:27:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:38.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.247 225859 DEBUG nova.virt.libvirt.driver [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Creating tmpfile /var/lib/nova/instances/tmpuyxcxf2_ to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.248 225859 DEBUG nova.compute.manager [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpuyxcxf2_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.304 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.365 225859 DEBUG nova.objects.instance [None req-8a9be0b3-2f06-48f2-873b-2d853aae1721 8bb376f888a54e9d8ed785e1b46d4fe5 7e22d2ad7d84451d89e5288d170589b5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.403 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919258.4035904, 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.404 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.443 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.450 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.478 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 20 09:27:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:38.572 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:27:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:38.572 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:27:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:38.573 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.619 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:38 np0005588919 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 20 09:27:38 np0005588919 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000013.scope: Consumed 1.933s CPU time.
Jan 20 09:27:38 np0005588919 systemd-machined[194361]: Machine qemu-8-instance-00000013 terminated.
Jan 20 09:27:38 np0005588919 nova_compute[225855]: 2026-01-20 14:27:38.752 225859 DEBUG nova.compute.manager [None req-8a9be0b3-2f06-48f2-873b-2d853aae1721 8bb376f888a54e9d8ed785e1b46d4fe5 7e22d2ad7d84451d89e5288d170589b5 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:38.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:39 np0005588919 nova_compute[225855]: 2026-01-20 14:27:39.604 225859 DEBUG nova.compute.manager [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpuyxcxf2_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d726266f-b9a6-406b-ad13-f9db3e0dc6aa',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 20 09:27:39 np0005588919 nova_compute[225855]: 2026-01-20 14:27:39.638 225859 DEBUG oslo_concurrency.lockutils [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquiring lock "refresh_cache-d726266f-b9a6-406b-ad13-f9db3e0dc6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:27:39 np0005588919 nova_compute[225855]: 2026-01-20 14:27:39.638 225859 DEBUG oslo_concurrency.lockutils [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquired lock "refresh_cache-d726266f-b9a6-406b-ad13-f9db3e0dc6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:27:39 np0005588919 nova_compute[225855]: 2026-01-20 14:27:39.639 225859 DEBUG nova.network.neutron [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:27:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:40 np0005588919 podman[234811]: 2026-01-20 14:27:40.010346977 +0000 UTC m=+0.061015107 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:27:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:40.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.380 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.717 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.718 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.718 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.718 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.718 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.719 225859 INFO nova.compute.manager [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Terminating instance#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.720 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "refresh_cache-11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.721 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquired lock "refresh_cache-11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.721 225859 DEBUG nova.network.neutron [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:27:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:40.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.833 225859 DEBUG nova.network.neutron [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Updating instance_info_cache with network_info: [{"id": "e6067076-0f97-4e9c-9355-353277570e11", "address": "fa:16:3e:db:cf:b7", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6067076-0f", "ovs_interfaceid": "e6067076-0f97-4e9c-9355-353277570e11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.855 225859 DEBUG oslo_concurrency.lockutils [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Releasing lock "refresh_cache-d726266f-b9a6-406b-ad13-f9db3e0dc6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.856 225859 DEBUG nova.virt.libvirt.driver [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpuyxcxf2_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d726266f-b9a6-406b-ad13-f9db3e0dc6aa',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.857 225859 DEBUG nova.virt.libvirt.driver [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Creating instance directory: /var/lib/nova/instances/d726266f-b9a6-406b-ad13-f9db3e0dc6aa pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.857 225859 DEBUG nova.virt.libvirt.driver [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Ensure instance console log exists: /var/lib/nova/instances/d726266f-b9a6-406b-ad13-f9db3e0dc6aa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.857 225859 DEBUG nova.virt.libvirt.driver [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.858 225859 DEBUG nova.virt.libvirt.vif [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:27:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1394818615',display_name='tempest-LiveMigrationTest-server-1394818615',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1394818615',id=16,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:27:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d15f60b9e48e4175b5520d1e57ed2d3a',ramdisk_id='',reservation_id='r-pti072hl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-864280704',owner_user_name='tempest-LiveMigrationTest-864280704-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:27:33Z,user_data=None,user_id='bce7fcbd19554e29bb80c5b93b7dd3c9',uuid=d726266f-b9a6-406b-ad13-f9db3e0dc6aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6067076-0f97-4e9c-9355-353277570e11", "address": "fa:16:3e:db:cf:b7", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape6067076-0f", "ovs_interfaceid": "e6067076-0f97-4e9c-9355-353277570e11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.858 225859 DEBUG nova.network.os_vif_util [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Converting VIF {"id": "e6067076-0f97-4e9c-9355-353277570e11", "address": "fa:16:3e:db:cf:b7", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape6067076-0f", "ovs_interfaceid": "e6067076-0f97-4e9c-9355-353277570e11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.859 225859 DEBUG nova.network.os_vif_util [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:cf:b7,bridge_name='br-int',has_traffic_filtering=True,id=e6067076-0f97-4e9c-9355-353277570e11,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape6067076-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.859 225859 DEBUG os_vif [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:cf:b7,bridge_name='br-int',has_traffic_filtering=True,id=e6067076-0f97-4e9c-9355-353277570e11,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape6067076-0f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.860 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.860 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.861 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.864 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.864 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6067076-0f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.864 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape6067076-0f, col_values=(('external_ids', {'iface-id': 'e6067076-0f97-4e9c-9355-353277570e11', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:cf:b7', 'vm-uuid': 'd726266f-b9a6-406b-ad13-f9db3e0dc6aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.866 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:40 np0005588919 NetworkManager[49104]: <info>  [1768919260.8676] manager: (tape6067076-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.869 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.873 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.874 225859 INFO os_vif [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:cf:b7,bridge_name='br-int',has_traffic_filtering=True,id=e6067076-0f97-4e9c-9355-353277570e11,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape6067076-0f')#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.874 225859 DEBUG nova.virt.libvirt.driver [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.875 225859 DEBUG nova.compute.manager [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpuyxcxf2_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d726266f-b9a6-406b-ad13-f9db3e0dc6aa',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 20 09:27:40 np0005588919 nova_compute[225855]: 2026-01-20 14:27:40.928 225859 DEBUG nova.network.neutron [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:27:41 np0005588919 nova_compute[225855]: 2026-01-20 14:27:41.477 225859 DEBUG nova.network.neutron [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:27:41 np0005588919 nova_compute[225855]: 2026-01-20 14:27:41.497 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Releasing lock "refresh_cache-11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:27:41 np0005588919 nova_compute[225855]: 2026-01-20 14:27:41.497 225859 DEBUG nova.compute.manager [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:27:41 np0005588919 nova_compute[225855]: 2026-01-20 14:27:41.505 225859 INFO nova.virt.libvirt.driver [-] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Instance destroyed successfully.#033[00m
Jan 20 09:27:41 np0005588919 nova_compute[225855]: 2026-01-20 14:27:41.505 225859 DEBUG nova.objects.instance [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lazy-loading 'resources' on Instance uuid 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:27:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:42.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.489 225859 DEBUG nova.network.neutron [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Port e6067076-0f97-4e9c-9355-353277570e11 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.491 225859 DEBUG nova.compute.manager [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpuyxcxf2_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d726266f-b9a6-406b-ad13-f9db3e0dc6aa',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.519 225859 INFO nova.virt.libvirt.driver [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Deleting instance files /var/lib/nova/instances/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_del#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.520 225859 INFO nova.virt.libvirt.driver [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Deletion of /var/lib/nova/instances/11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef_del complete#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.616 225859 INFO nova.compute.manager [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Took 1.12 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.616 225859 DEBUG oslo.service.loopingcall [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.617 225859 DEBUG nova.compute.manager [-] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.617 225859 DEBUG nova.network.neutron [-] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:27:42 np0005588919 systemd[1]: Starting libvirt proxy daemon...
Jan 20 09:27:42 np0005588919 systemd[1]: Started libvirt proxy daemon.
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.758 225859 DEBUG nova.network.neutron [-] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.770 225859 DEBUG nova.network.neutron [-] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.800 225859 INFO nova.compute.manager [-] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Took 0.18 seconds to deallocate network for instance.#033[00m
Jan 20 09:27:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:42.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:42 np0005588919 NetworkManager[49104]: <info>  [1768919262.8301] manager: (tape6067076-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Jan 20 09:27:42 np0005588919 kernel: tape6067076-0f: entered promiscuous mode
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.832 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:42Z|00079|binding|INFO|Claiming lport e6067076-0f97-4e9c-9355-353277570e11 for this additional chassis.
Jan 20 09:27:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:42Z|00080|binding|INFO|e6067076-0f97-4e9c-9355-353277570e11: Claiming fa:16:3e:db:cf:b7 10.100.0.12
Jan 20 09:27:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:42Z|00081|binding|INFO|Claiming lport 9013ed66-b0f2-4a83-b7d4-572f1324f582 for this additional chassis.
Jan 20 09:27:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:42Z|00082|binding|INFO|9013ed66-b0f2-4a83-b7d4-572f1324f582: Claiming fa:16:3e:51:74:79 19.80.0.125
Jan 20 09:27:42 np0005588919 systemd-machined[194361]: New machine qemu-9-instance-00000010.
Jan 20 09:27:42 np0005588919 systemd[1]: Started Virtual Machine qemu-9-instance-00000010.
Jan 20 09:27:42 np0005588919 systemd-udevd[234886]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:27:42 np0005588919 NetworkManager[49104]: <info>  [1768919262.9043] device (tape6067076-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:27:42 np0005588919 NetworkManager[49104]: <info>  [1768919262.9052] device (tape6067076-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.910 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.911 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.970 225859 DEBUG oslo_concurrency.processutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:27:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:42Z|00083|binding|INFO|Setting lport e6067076-0f97-4e9c-9355-353277570e11 ovn-installed in OVS
Jan 20 09:27:42 np0005588919 nova_compute[225855]: 2026-01-20 14:27:42.990 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:43 np0005588919 nova_compute[225855]: 2026-01-20 14:27:43.306 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:27:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1307297049' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:27:43 np0005588919 nova_compute[225855]: 2026-01-20 14:27:43.386 225859 DEBUG oslo_concurrency.processutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:27:43 np0005588919 nova_compute[225855]: 2026-01-20 14:27:43.391 225859 DEBUG nova.compute.provider_tree [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:27:43 np0005588919 nova_compute[225855]: 2026-01-20 14:27:43.485 225859 DEBUG nova.scheduler.client.report [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:27:43 np0005588919 nova_compute[225855]: 2026-01-20 14:27:43.504 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:43 np0005588919 nova_compute[225855]: 2026-01-20 14:27:43.600 225859 INFO nova.scheduler.client.report [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Deleted allocations for instance 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef#033[00m
Jan 20 09:27:43 np0005588919 nova_compute[225855]: 2026-01-20 14:27:43.681 225859 DEBUG oslo_concurrency.lockutils [None req-f50a58a7-ab3e-4402-8521-8f3921fdc37c 399cc9abe2cd4ab196a4e5789992ae51 1759b9d61ad946b6afa3e8448ce02190 - - default default] Lock "11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:43 np0005588919 nova_compute[225855]: 2026-01-20 14:27:43.953 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919263.9528074, d726266f-b9a6-406b-ad13-f9db3e0dc6aa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:43 np0005588919 nova_compute[225855]: 2026-01-20 14:27:43.954 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] VM Started (Lifecycle Event)#033[00m
Jan 20 09:27:43 np0005588919 nova_compute[225855]: 2026-01-20 14:27:43.993 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:44.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:44 np0005588919 nova_compute[225855]: 2026-01-20 14:27:44.473 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919264.4732022, d726266f-b9a6-406b-ad13-f9db3e0dc6aa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:44 np0005588919 nova_compute[225855]: 2026-01-20 14:27:44.473 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:27:44 np0005588919 nova_compute[225855]: 2026-01-20 14:27:44.505 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:44 np0005588919 nova_compute[225855]: 2026-01-20 14:27:44.509 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:27:44 np0005588919 nova_compute[225855]: 2026-01-20 14:27:44.535 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 20 09:27:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:44.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:45 np0005588919 podman[235232]: 2026-01-20 14:27:45.729777804 +0000 UTC m=+0.042157946 container create d249f2af06070d59eb73cb3ade189c82d70acd6cfd3cafd5b501e45ff0069162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_zhukovsky, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 09:27:45 np0005588919 systemd[1]: Started libpod-conmon-d249f2af06070d59eb73cb3ade189c82d70acd6cfd3cafd5b501e45ff0069162.scope.
Jan 20 09:27:45 np0005588919 podman[235232]: 2026-01-20 14:27:45.707532209 +0000 UTC m=+0.019912361 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 09:27:45 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:27:45 np0005588919 podman[235232]: 2026-01-20 14:27:45.850984872 +0000 UTC m=+0.163365034 container init d249f2af06070d59eb73cb3ade189c82d70acd6cfd3cafd5b501e45ff0069162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_zhukovsky, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 09:27:45 np0005588919 podman[235232]: 2026-01-20 14:27:45.860815628 +0000 UTC m=+0.173195790 container start d249f2af06070d59eb73cb3ade189c82d70acd6cfd3cafd5b501e45ff0069162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_zhukovsky, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 20 09:27:45 np0005588919 podman[235232]: 2026-01-20 14:27:45.865275253 +0000 UTC m=+0.177655385 container attach d249f2af06070d59eb73cb3ade189c82d70acd6cfd3cafd5b501e45ff0069162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_zhukovsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 09:27:45 np0005588919 stupefied_zhukovsky[235248]: 167 167
Jan 20 09:27:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:45 np0005588919 systemd[1]: libpod-d249f2af06070d59eb73cb3ade189c82d70acd6cfd3cafd5b501e45ff0069162.scope: Deactivated successfully.
Jan 20 09:27:45 np0005588919 conmon[235248]: conmon d249f2af06070d59eb73 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d249f2af06070d59eb73cb3ade189c82d70acd6cfd3cafd5b501e45ff0069162.scope/container/memory.events
Jan 20 09:27:45 np0005588919 podman[235232]: 2026-01-20 14:27:45.89645316 +0000 UTC m=+0.208833302 container died d249f2af06070d59eb73cb3ade189c82d70acd6cfd3cafd5b501e45ff0069162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_zhukovsky, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 20 09:27:45 np0005588919 nova_compute[225855]: 2026-01-20 14:27:45.894 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:45 np0005588919 systemd[1]: var-lib-containers-storage-overlay-a0c0d2345ad9421be60a9a53d44f9eb6df75f1b1da909f519c3d49fe170898ac-merged.mount: Deactivated successfully.
Jan 20 09:27:45 np0005588919 podman[235232]: 2026-01-20 14:27:45.935033305 +0000 UTC m=+0.247413417 container remove d249f2af06070d59eb73cb3ade189c82d70acd6cfd3cafd5b501e45ff0069162 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_zhukovsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:27:45 np0005588919 systemd[1]: libpod-conmon-d249f2af06070d59eb73cb3ade189c82d70acd6cfd3cafd5b501e45ff0069162.scope: Deactivated successfully.
Jan 20 09:27:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:45Z|00084|binding|INFO|Claiming lport e6067076-0f97-4e9c-9355-353277570e11 for this chassis.
Jan 20 09:27:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:45Z|00085|binding|INFO|e6067076-0f97-4e9c-9355-353277570e11: Claiming fa:16:3e:db:cf:b7 10.100.0.12
Jan 20 09:27:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:45Z|00086|binding|INFO|Claiming lport 9013ed66-b0f2-4a83-b7d4-572f1324f582 for this chassis.
Jan 20 09:27:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:45Z|00087|binding|INFO|9013ed66-b0f2-4a83-b7d4-572f1324f582: Claiming fa:16:3e:51:74:79 19.80.0.125
Jan 20 09:27:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:45Z|00088|binding|INFO|Setting lport e6067076-0f97-4e9c-9355-353277570e11 up in Southbound
Jan 20 09:27:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:45Z|00089|binding|INFO|Setting lport 9013ed66-b0f2-4a83-b7d4-572f1324f582 up in Southbound
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.004 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:74:79 19.80.0.125'], port_security=['fa:16:3e:51:74:79 19.80.0.125'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['e6067076-0f97-4e9c-9355-353277570e11'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1871336558', 'neutron:cidrs': '19.80.0.125/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08e625c5-899c-442a-8ef4-9a3c96892de4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1871336558', 'neutron:project_id': 'd15f60b9e48e4175b5520d1e57ed2d3a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d729cfd-2f98-4ca5-a524-e543b12b3766', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=62d5dc3b-a6a9-4e55-8632-5a7fe1112862, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9013ed66-b0f2-4a83-b7d4-572f1324f582) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.006 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:cf:b7 10.100.0.12'], port_security=['fa:16:3e:db:cf:b7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-395006048', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd726266f-b9a6-406b-ad13-f9db3e0dc6aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14f18b27-1594-48d8-a08b-a930f7adbc08', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-395006048', 'neutron:project_id': 'd15f60b9e48e4175b5520d1e57ed2d3a', 'neutron:revision_number': '11', 'neutron:security_group_ids': '6d729cfd-2f98-4ca5-a524-e543b12b3766', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02983c41-bbec-48cf-910a-84fed1be783f, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=e6067076-0f97-4e9c-9355-353277570e11) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.007 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 9013ed66-b0f2-4a83-b7d4-572f1324f582 in datapath 08e625c5-899c-442a-8ef4-9a3c96892de4 bound to our chassis#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.008 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08e625c5-899c-442a-8ef4-9a3c96892de4#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.021 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[17752f3f-9d97-444b-841c-c5b42ad46cdd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.022 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08e625c5-81 in ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.023 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08e625c5-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.023 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b74a80c1-ff6f-4aeb-94f4-b1779fa03743]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.024 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c9934431-9d10-4c3a-87d7-aec4ca04bc86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.039 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9c8bfe-895e-4d2b-9a1f-9283e9b9b066]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.063 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc626cf-f716-4935-92f9-98191a0cd26b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.114 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc97b7e-8704-4bef-91c6-e05202d02d43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.127 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[11ac09c5-bf54-4b86-9c05-f3fbb7eb03db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 podman[235273]: 2026-01-20 14:27:46.128682869 +0000 UTC m=+0.066379187 container create 723b5cbabfbdb2954ad39dd5ca33954a38d3882430cd9f04424d216072c7ef7c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_jemison, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3)
Jan 20 09:27:46 np0005588919 NetworkManager[49104]: <info>  [1768919266.1290] manager: (tap08e625c5-80): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.156 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d6df16a5-4681-48e9-87fd-eedaf84d584f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.160 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c0bdf150-acd5-41c8-8cd0-b09947d1f8da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 systemd-udevd[235291]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:27:46 np0005588919 systemd[1]: Started libpod-conmon-723b5cbabfbdb2954ad39dd5ca33954a38d3882430cd9f04424d216072c7ef7c.scope.
Jan 20 09:27:46 np0005588919 NetworkManager[49104]: <info>  [1768919266.1904] device (tap08e625c5-80): carrier: link connected
Jan 20 09:27:46 np0005588919 podman[235273]: 2026-01-20 14:27:46.100446535 +0000 UTC m=+0.038142943 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.196 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b97c2a46-3a82-4955-b039-34f727dc9287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:46.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:46 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:27:46 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4b00fa219391a75310795fc48774642fdbf2769a450a9b244613588abfb8d5e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 09:27:46 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4b00fa219391a75310795fc48774642fdbf2769a450a9b244613588abfb8d5e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 09:27:46 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4b00fa219391a75310795fc48774642fdbf2769a450a9b244613588abfb8d5e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.215 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8c9c11-cc1b-4a6b-b5a1-d481091358fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08e625c5-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:55:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430511, 'reachable_time': 44959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235313, 'error': None, 'target': 'ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4b00fa219391a75310795fc48774642fdbf2769a450a9b244613588abfb8d5e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.233 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4c0caada-ebbc-4a21-8a29-e892f0715b47]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:5580'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430511, 'tstamp': 430511}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235314, 'error': None, 'target': 'ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 podman[235273]: 2026-01-20 14:27:46.235697888 +0000 UTC m=+0.173394206 container init 723b5cbabfbdb2954ad39dd5ca33954a38d3882430cd9f04424d216072c7ef7c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_jemison, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:27:46 np0005588919 podman[235273]: 2026-01-20 14:27:46.243488447 +0000 UTC m=+0.181184775 container start 723b5cbabfbdb2954ad39dd5ca33954a38d3882430cd9f04424d216072c7ef7c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_jemison, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 09:27:46 np0005588919 podman[235273]: 2026-01-20 14:27:46.246383348 +0000 UTC m=+0.184079696 container attach 723b5cbabfbdb2954ad39dd5ca33954a38d3882430cd9f04424d216072c7ef7c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_jemison, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.251 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b5df8caa-c13d-4825-94b7-27194b07a89f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08e625c5-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:55:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430511, 'reachable_time': 44959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235318, 'error': None, 'target': 'ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.282 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cffab6d3-8f39-474b-afbc-2c6bab84b34c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.337 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[21dced7d-19e7-4cc8-b699-b21db43a882c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.338 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08e625c5-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.338 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.339 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08e625c5-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:46 np0005588919 NetworkManager[49104]: <info>  [1768919266.3411] manager: (tap08e625c5-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 20 09:27:46 np0005588919 kernel: tap08e625c5-80: entered promiscuous mode
Jan 20 09:27:46 np0005588919 nova_compute[225855]: 2026-01-20 14:27:46.340 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.343 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08e625c5-80, col_values=(('external_ids', {'iface-id': 'e10f34be-dfc1-4bfe-806f-f00a84c17390'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:46 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:46Z|00090|binding|INFO|Releasing lport e10f34be-dfc1-4bfe-806f-f00a84c17390 from this chassis (sb_readonly=0)
Jan 20 09:27:46 np0005588919 nova_compute[225855]: 2026-01-20 14:27:46.344 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:46 np0005588919 nova_compute[225855]: 2026-01-20 14:27:46.358 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.359 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08e625c5-899c-442a-8ef4-9a3c96892de4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08e625c5-899c-442a-8ef4-9a3c96892de4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.360 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c1c1ac-b8e6-48ca-98a6-4ccf412371d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.361 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-08e625c5-899c-442a-8ef4-9a3c96892de4
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/08e625c5-899c-442a-8ef4-9a3c96892de4.pid.haproxy
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 08e625c5-899c-442a-8ef4-9a3c96892de4
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.361 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4', 'env', 'PROCESS_TAG=haproxy-08e625c5-899c-442a-8ef4-9a3c96892de4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08e625c5-899c-442a-8ef4-9a3c96892de4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:27:46 np0005588919 nova_compute[225855]: 2026-01-20 14:27:46.459 225859 INFO nova.compute.manager [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Post operation of migration started#033[00m
Jan 20 09:27:46 np0005588919 podman[235352]: 2026-01-20 14:27:46.776040388 +0000 UTC m=+0.052108316 container create ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:27:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:46.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:46 np0005588919 systemd[1]: Started libpod-conmon-ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395.scope.
Jan 20 09:27:46 np0005588919 podman[235352]: 2026-01-20 14:27:46.746627371 +0000 UTC m=+0.022695329 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:27:46 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:27:46 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/693c57f2e466814459bb967b5b8379f5bf0326b3ec16540677e4a65e7bbf1a2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:27:46 np0005588919 podman[235352]: 2026-01-20 14:27:46.863663231 +0000 UTC m=+0.139731179 container init ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:27:46 np0005588919 podman[235352]: 2026-01-20 14:27:46.869166226 +0000 UTC m=+0.145234144 container start ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:27:46 np0005588919 neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4[235367]: [NOTICE]   (235372) : New worker (235374) forked
Jan 20 09:27:46 np0005588919 neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4[235367]: [NOTICE]   (235372) : Loading success.
Jan 20 09:27:46 np0005588919 nova_compute[225855]: 2026-01-20 14:27:46.888 225859 DEBUG oslo_concurrency.lockutils [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquiring lock "refresh_cache-d726266f-b9a6-406b-ad13-f9db3e0dc6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:27:46 np0005588919 nova_compute[225855]: 2026-01-20 14:27:46.888 225859 DEBUG oslo_concurrency.lockutils [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquired lock "refresh_cache-d726266f-b9a6-406b-ad13-f9db3e0dc6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:27:46 np0005588919 nova_compute[225855]: 2026-01-20 14:27:46.888 225859 DEBUG nova.network.neutron [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.921 140354 INFO neutron.agent.ovn.metadata.agent [-] Port e6067076-0f97-4e9c-9355-353277570e11 in datapath 14f18b27-1594-48d8-a08b-a930f7adbc08 unbound from our chassis#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.923 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 14f18b27-1594-48d8-a08b-a930f7adbc08#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.933 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a602b218-6c2b-4dee-974b-4cbe0100eea9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.934 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap14f18b27-11 in ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.936 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap14f18b27-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.936 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[12f46e0d-43a8-4a32-9684-5e7141236665]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.937 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[73ce1273-9487-446b-82f5-06960f9b542e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.950 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c0a386-76f5-4766-8fe2-91e3dd41a930]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.971 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb1d067-2a98-4312-8838-9260d5aaa963]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:46.999 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[69f37fd0-5fd9-4b5d-890a-1f2c7f41053c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.006 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6e609cf7-4fde-4e41-bc38-4086adca8c7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 NetworkManager[49104]: <info>  [1768919267.0070] manager: (tap14f18b27-10): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Jan 20 09:27:47 np0005588919 systemd-udevd[235311]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.037 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[19be65da-fd9d-4490-a761-98a607397c7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.039 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc0f149-82c4-4a6d-b064-4445b4822f3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 NetworkManager[49104]: <info>  [1768919267.0650] device (tap14f18b27-10): carrier: link connected
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.071 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[753fdee6-83e3-42d4-b377-3d8ddd3d4bd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.087 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e0b1cb4a-e006-4b41-8f26-5994ff41fc5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14f18b27-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7d:1f:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430599, 'reachable_time': 25641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235401, 'error': None, 'target': 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.104 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ef7a9f-3c82-44b3-8a12-b69b45d9f4f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7d:1f17'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430599, 'tstamp': 430599}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235404, 'error': None, 'target': 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.120 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d28047cc-2d18-45fc-bf27-08a6d3f77323]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14f18b27-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7d:1f:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430599, 'reachable_time': 25641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235407, 'error': None, 'target': 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:47Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:db:cf:b7 10.100.0.12
Jan 20 09:27:47 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:47Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:cf:b7 10.100.0.12
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.149 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[920fe278-cf87-4353-ba1c-9b304682c841]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.204 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dfe733e1-fc7d-4a76-812a-1e519ee7966e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.205 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14f18b27-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.205 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.206 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14f18b27-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:47 np0005588919 nova_compute[225855]: 2026-01-20 14:27:47.254 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:47 np0005588919 NetworkManager[49104]: <info>  [1768919267.2545] manager: (tap14f18b27-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 20 09:27:47 np0005588919 kernel: tap14f18b27-10: entered promiscuous mode
Jan 20 09:27:47 np0005588919 nova_compute[225855]: 2026-01-20 14:27:47.257 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.259 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap14f18b27-10, col_values=(('external_ids', {'iface-id': 'aa1c73c5-9761-4457-acdc-9f93220f739f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:27:47 np0005588919 nova_compute[225855]: 2026-01-20 14:27:47.260 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:47 np0005588919 ovn_controller[130490]: 2026-01-20T14:27:47Z|00091|binding|INFO|Releasing lport aa1c73c5-9761-4457-acdc-9f93220f739f from this chassis (sb_readonly=0)
Jan 20 09:27:47 np0005588919 nova_compute[225855]: 2026-01-20 14:27:47.260 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.262 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/14f18b27-1594-48d8-a08b-a930f7adbc08.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/14f18b27-1594-48d8-a08b-a930f7adbc08.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.263 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ba307b89-be5c-4bb7-828c-ba293ead4064]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.264 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-14f18b27-1594-48d8-a08b-a930f7adbc08
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/14f18b27-1594-48d8-a08b-a930f7adbc08.pid.haproxy
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 14f18b27-1594-48d8-a08b-a930f7adbc08
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:27:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:27:47.265 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'env', 'PROCESS_TAG=haproxy-14f18b27-1594-48d8-a08b-a930f7adbc08', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/14f18b27-1594-48d8-a08b-a930f7adbc08.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:27:47 np0005588919 nova_compute[225855]: 2026-01-20 14:27:47.274 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]: [
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:    {
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:        "available": false,
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:        "ceph_device": false,
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:        "lsm_data": {},
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:        "lvs": [],
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:        "path": "/dev/sr0",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:        "rejected_reasons": [
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "Insufficient space (<5GB)",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "Has a FileSystem"
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:        ],
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:        "sys_api": {
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "actuators": null,
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "device_nodes": "sr0",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "devname": "sr0",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "human_readable_size": "482.00 KB",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "id_bus": "ata",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "model": "QEMU DVD-ROM",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "nr_requests": "2",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "parent": "/dev/sr0",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "partitions": {},
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "path": "/dev/sr0",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "removable": "1",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "rev": "2.5+",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "ro": "0",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "rotational": "1",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "sas_address": "",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "sas_device_handle": "",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "scheduler_mode": "mq-deadline",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "sectors": 0,
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "sectorsize": "2048",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "size": 493568.0,
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "support_discard": "2048",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "type": "disk",
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:            "vendor": "QEMU"
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:        }
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]:    }
Jan 20 09:27:47 np0005588919 crazy_jemison[235295]: ]
Jan 20 09:27:47 np0005588919 systemd[1]: libpod-723b5cbabfbdb2954ad39dd5ca33954a38d3882430cd9f04424d216072c7ef7c.scope: Deactivated successfully.
Jan 20 09:27:47 np0005588919 systemd[1]: libpod-723b5cbabfbdb2954ad39dd5ca33954a38d3882430cd9f04424d216072c7ef7c.scope: Consumed 1.147s CPU time.
Jan 20 09:27:47 np0005588919 podman[235273]: 2026-01-20 14:27:47.430253051 +0000 UTC m=+1.367949369 container died 723b5cbabfbdb2954ad39dd5ca33954a38d3882430cd9f04424d216072c7ef7c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_jemison, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 09:27:47 np0005588919 systemd[1]: var-lib-containers-storage-overlay-e4b00fa219391a75310795fc48774642fdbf2769a450a9b244613588abfb8d5e-merged.mount: Deactivated successfully.
Jan 20 09:27:47 np0005588919 podman[235273]: 2026-01-20 14:27:47.498989813 +0000 UTC m=+1.436686141 container remove 723b5cbabfbdb2954ad39dd5ca33954a38d3882430cd9f04424d216072c7ef7c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_jemison, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef)
Jan 20 09:27:47 np0005588919 systemd[1]: libpod-conmon-723b5cbabfbdb2954ad39dd5ca33954a38d3882430cd9f04424d216072c7ef7c.scope: Deactivated successfully.
Jan 20 09:27:47 np0005588919 podman[236624]: 2026-01-20 14:27:47.612567786 +0000 UTC m=+0.043717840 container create cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 09:27:47 np0005588919 systemd[1]: Started libpod-conmon-cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042.scope.
Jan 20 09:27:47 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:27:47 np0005588919 podman[236624]: 2026-01-20 14:27:47.591102803 +0000 UTC m=+0.022252867 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:27:47 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2676fe810d64d27697c2b84fe44f9ab65fef13401e606b1abe81b75e60236b87/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:27:47 np0005588919 podman[236624]: 2026-01-20 14:27:47.698389689 +0000 UTC m=+0.129539753 container init cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:27:47 np0005588919 podman[236624]: 2026-01-20 14:27:47.705004405 +0000 UTC m=+0.136154449 container start cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:27:47 np0005588919 neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08[236640]: [NOTICE]   (236644) : New worker (236646) forked
Jan 20 09:27:47 np0005588919 neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08[236640]: [NOTICE]   (236644) : Loading success.
Jan 20 09:27:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:48.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:48 np0005588919 nova_compute[225855]: 2026-01-20 14:27:48.308 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:27:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:27:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:48.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:49 np0005588919 nova_compute[225855]: 2026-01-20 14:27:49.914 225859 DEBUG nova.network.neutron [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Updating instance_info_cache with network_info: [{"id": "e6067076-0f97-4e9c-9355-353277570e11", "address": "fa:16:3e:db:cf:b7", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6067076-0f", "ovs_interfaceid": "e6067076-0f97-4e9c-9355-353277570e11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:27:49 np0005588919 nova_compute[225855]: 2026-01-20 14:27:49.936 225859 DEBUG oslo_concurrency.lockutils [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Releasing lock "refresh_cache-d726266f-b9a6-406b-ad13-f9db3e0dc6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:27:49 np0005588919 nova_compute[225855]: 2026-01-20 14:27:49.960 225859 DEBUG oslo_concurrency.lockutils [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:27:49 np0005588919 nova_compute[225855]: 2026-01-20 14:27:49.960 225859 DEBUG oslo_concurrency.lockutils [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:27:49 np0005588919 nova_compute[225855]: 2026-01-20 14:27:49.961 225859 DEBUG oslo_concurrency.lockutils [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:27:49 np0005588919 nova_compute[225855]: 2026-01-20 14:27:49.965 225859 INFO nova.virt.libvirt.driver [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 20 09:27:49 np0005588919 virtqemud[225396]: Domain id=9 name='instance-00000010' uuid=d726266f-b9a6-406b-ad13-f9db3e0dc6aa is tainted: custom-monitor
Jan 20 09:27:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:50.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:50.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:50 np0005588919 nova_compute[225855]: 2026-01-20 14:27:50.899 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:50 np0005588919 nova_compute[225855]: 2026-01-20 14:27:50.972 225859 INFO nova.virt.libvirt.driver [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 20 09:27:51 np0005588919 nova_compute[225855]: 2026-01-20 14:27:51.978 225859 INFO nova.virt.libvirt.driver [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 20 09:27:51 np0005588919 nova_compute[225855]: 2026-01-20 14:27:51.982 225859 DEBUG nova.compute.manager [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:52 np0005588919 nova_compute[225855]: 2026-01-20 14:27:52.004 225859 DEBUG nova.objects.instance [None req-466ac999-dbbf-45a2-9a60-0b686a463dfc f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:27:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:52.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:52.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:53 np0005588919 nova_compute[225855]: 2026-01-20 14:27:53.353 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:53 np0005588919 nova_compute[225855]: 2026-01-20 14:27:53.754 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919258.7535377, 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:27:53 np0005588919 nova_compute[225855]: 2026-01-20 14:27:53.755 225859 INFO nova.compute.manager [-] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:27:53 np0005588919 nova_compute[225855]: 2026-01-20 14:27:53.773 225859 DEBUG nova.compute.manager [None req-725642af-64a3-46f6-b583-d2ee9bc768f3 - - - - - -] [instance: 11d7ff3d-7e9c-4ecb-86b5-ffb0e3e62bef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:27:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:54.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:27:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 5036 writes, 26K keys, 5036 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 5036 writes, 5036 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1541 writes, 7455 keys, 1541 commit groups, 1.0 writes per commit group, ingest: 15.91 MB, 0.03 MB/s#012Interval WAL: 1541 writes, 1541 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     64.4      0.46              0.12        14    0.033       0      0       0.0       0.0#012  L6      1/0    8.65 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    107.7     89.6      1.17              0.39        13    0.090     61K   6800       0.0       0.0#012 Sum      1/0    8.65 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     77.2     82.5      1.63              0.51        27    0.060     61K   6800       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   5.2     98.6    101.2      0.50              0.18        10    0.050     26K   2531       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    107.7     89.6      1.17              0.39        13    0.090     61K   6800       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     64.7      0.46              0.12        13    0.035       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.029, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.13 GB write, 0.07 MB/s write, 0.12 GB read, 0.07 MB/s read, 1.6 seconds#012Interval compaction: 0.05 GB write, 0.08 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 304.00 MB usage: 12.47 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000107 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(713,11.98 MB,3.94008%) FilterBlock(27,179.42 KB,0.0576371%) IndexBlock(27,325.92 KB,0.104698%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:27:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:27:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:54.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:55 np0005588919 nova_compute[225855]: 2026-01-20 14:27:55.903 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:56.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:27:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:56.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:27:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:27:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:27:58.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:58 np0005588919 nova_compute[225855]: 2026-01-20 14:27:58.355 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:27:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:27:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:27:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:27:58.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:27:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:00 np0005588919 podman[236764]: 2026-01-20 14:28:00.063131651 +0000 UTC m=+0.108305365 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:28:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:00.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:00.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:00 np0005588919 nova_compute[225855]: 2026-01-20 14:28:00.904 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:02.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:02.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:03 np0005588919 nova_compute[225855]: 2026-01-20 14:28:03.358 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:04.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:04.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:05 np0005588919 nova_compute[225855]: 2026-01-20 14:28:05.908 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:05 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Jan 20 09:28:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:06.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:06.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:08.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.361 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.387 225859 DEBUG nova.compute.manager [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.487 225859 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.488 225859 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.512 225859 DEBUG nova.objects.instance [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lazy-loading 'pci_requests' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.529 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.529 225859 INFO nova.compute.claims [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.530 225859 DEBUG nova.objects.instance [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lazy-loading 'resources' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.545 225859 DEBUG nova.objects.instance [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lazy-loading 'numa_topology' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.562 225859 DEBUG nova.objects.instance [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lazy-loading 'pci_devices' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.622 225859 INFO nova.compute.resource_tracker [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating resource usage from migration 285e048b-eb47-4e34-b0d7-f2b9b65cbadb#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.622 225859 DEBUG nova.compute.resource_tracker [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Starting to track incoming migration 285e048b-eb47-4e34-b0d7-f2b9b65cbadb with flavor 522deaab-a741-4dbb-932d-d8b13a211c33 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 09:28:08 np0005588919 nova_compute[225855]: 2026-01-20 14:28:08.682 225859 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:08.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:28:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4136608474' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:28:09 np0005588919 nova_compute[225855]: 2026-01-20 14:28:09.108 225859 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:09 np0005588919 nova_compute[225855]: 2026-01-20 14:28:09.114 225859 DEBUG nova.compute.provider_tree [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:28:09 np0005588919 nova_compute[225855]: 2026-01-20 14:28:09.138 225859 DEBUG nova.scheduler.client.report [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:28:09 np0005588919 nova_compute[225855]: 2026-01-20 14:28:09.163 225859 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:09 np0005588919 nova_compute[225855]: 2026-01-20 14:28:09.164 225859 INFO nova.compute.manager [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Migrating#033[00m
Jan 20 09:28:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:28:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:10.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:28:10 np0005588919 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 09:28:10 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 09:28:10 np0005588919 systemd-logind[783]: New session 51 of user nova.
Jan 20 09:28:10 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 09:28:10 np0005588919 systemd[1]: Starting User Manager for UID 42436...
Jan 20 09:28:10 np0005588919 podman[236819]: 2026-01-20 14:28:10.475601425 +0000 UTC m=+0.072295613 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:28:10 np0005588919 systemd[236831]: Queued start job for default target Main User Target.
Jan 20 09:28:10 np0005588919 systemd[236831]: Created slice User Application Slice.
Jan 20 09:28:10 np0005588919 systemd[236831]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:28:10 np0005588919 systemd[236831]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 09:28:10 np0005588919 systemd[236831]: Reached target Paths.
Jan 20 09:28:10 np0005588919 systemd[236831]: Reached target Timers.
Jan 20 09:28:10 np0005588919 systemd[236831]: Starting D-Bus User Message Bus Socket...
Jan 20 09:28:10 np0005588919 systemd[236831]: Starting Create User's Volatile Files and Directories...
Jan 20 09:28:10 np0005588919 systemd[236831]: Finished Create User's Volatile Files and Directories.
Jan 20 09:28:10 np0005588919 systemd[236831]: Listening on D-Bus User Message Bus Socket.
Jan 20 09:28:10 np0005588919 systemd[236831]: Reached target Sockets.
Jan 20 09:28:10 np0005588919 systemd[236831]: Reached target Basic System.
Jan 20 09:28:10 np0005588919 systemd[236831]: Reached target Main User Target.
Jan 20 09:28:10 np0005588919 systemd[236831]: Startup finished in 156ms.
Jan 20 09:28:10 np0005588919 systemd[1]: Started User Manager for UID 42436.
Jan 20 09:28:10 np0005588919 systemd[1]: Started Session 51 of User nova.
Jan 20 09:28:10 np0005588919 systemd[1]: session-51.scope: Deactivated successfully.
Jan 20 09:28:10 np0005588919 systemd-logind[783]: Session 51 logged out. Waiting for processes to exit.
Jan 20 09:28:10 np0005588919 systemd-logind[783]: Removed session 51.
Jan 20 09:28:10 np0005588919 systemd-logind[783]: New session 53 of user nova.
Jan 20 09:28:10 np0005588919 systemd[1]: Started Session 53 of User nova.
Jan 20 09:28:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:10.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:10 np0005588919 nova_compute[225855]: 2026-01-20 14:28:10.950 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:10 np0005588919 systemd[1]: session-53.scope: Deactivated successfully.
Jan 20 09:28:10 np0005588919 systemd-logind[783]: Session 53 logged out. Waiting for processes to exit.
Jan 20 09:28:10 np0005588919 systemd-logind[783]: Removed session 53.
Jan 20 09:28:11 np0005588919 nova_compute[225855]: 2026-01-20 14:28:11.360 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:11 np0005588919 nova_compute[225855]: 2026-01-20 14:28:11.361 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:28:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:12.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:12 np0005588919 nova_compute[225855]: 2026-01-20 14:28:12.354 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:12 np0005588919 nova_compute[225855]: 2026-01-20 14:28:12.354 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:12 np0005588919 nova_compute[225855]: 2026-01-20 14:28:12.354 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:28:12 np0005588919 nova_compute[225855]: 2026-01-20 14:28:12.367 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:28:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:12.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:13 np0005588919 nova_compute[225855]: 2026-01-20 14:28:13.388 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:14.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:14 np0005588919 nova_compute[225855]: 2026-01-20 14:28:14.353 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:14 np0005588919 nova_compute[225855]: 2026-01-20 14:28:14.354 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:28:14 np0005588919 nova_compute[225855]: 2026-01-20 14:28:14.355 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:28:14 np0005588919 nova_compute[225855]: 2026-01-20 14:28:14.527 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-d726266f-b9a6-406b-ad13-f9db3e0dc6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:28:14 np0005588919 nova_compute[225855]: 2026-01-20 14:28:14.527 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-d726266f-b9a6-406b-ad13-f9db3e0dc6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:28:14 np0005588919 nova_compute[225855]: 2026-01-20 14:28:14.528 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:28:14 np0005588919 nova_compute[225855]: 2026-01-20 14:28:14.528 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d726266f-b9a6-406b-ad13-f9db3e0dc6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:14.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:15 np0005588919 nova_compute[225855]: 2026-01-20 14:28:15.978 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:16.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:16.387 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:16.388 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:16.389 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.560 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Updating instance_info_cache with network_info: [{"id": "e6067076-0f97-4e9c-9355-353277570e11", "address": "fa:16:3e:db:cf:b7", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6067076-0f", "ovs_interfaceid": "e6067076-0f97-4e9c-9355-353277570e11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.579 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-d726266f-b9a6-406b-ad13-f9db3e0dc6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.579 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.580 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.580 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.580 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.580 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.609 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.609 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.609 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.609 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:28:16 np0005588919 nova_compute[225855]: 2026-01-20 14:28:16.610 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:16.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:28:17 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2141615976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.076 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.150 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.151 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.374 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.375 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4547MB free_disk=20.87643814086914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.492 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Migration for instance 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.577 225859 INFO nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating resource usage from migration 285e048b-eb47-4e34-b0d7-f2b9b65cbadb#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.578 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Starting to track incoming migration 285e048b-eb47-4e34-b0d7-f2b9b65cbadb with flavor 522deaab-a741-4dbb-932d-d8b13a211c33 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.736 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance d726266f-b9a6-406b-ad13-f9db3e0dc6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.755 225859 WARNING nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.755 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.755 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.837 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.867 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.868 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.903 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:28:17 np0005588919 nova_compute[225855]: 2026-01-20 14:28:17.933 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:28:18 np0005588919 nova_compute[225855]: 2026-01-20 14:28:18.035 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:18.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:18 np0005588919 nova_compute[225855]: 2026-01-20 14:28:18.391 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:28:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1146682809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:28:18 np0005588919 nova_compute[225855]: 2026-01-20 14:28:18.485 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:18 np0005588919 nova_compute[225855]: 2026-01-20 14:28:18.491 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:28:18 np0005588919 nova_compute[225855]: 2026-01-20 14:28:18.519 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:28:18 np0005588919 nova_compute[225855]: 2026-01-20 14:28:18.557 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:28:18 np0005588919 nova_compute[225855]: 2026-01-20 14:28:18.557 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:18.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:19 np0005588919 nova_compute[225855]: 2026-01-20 14:28:19.317 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:19 np0005588919 nova_compute[225855]: 2026-01-20 14:28:19.353 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:19 np0005588919 nova_compute[225855]: 2026-01-20 14:28:19.354 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:19 np0005588919 nova_compute[225855]: 2026-01-20 14:28:19.354 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:28:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:20.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:20 np0005588919 nova_compute[225855]: 2026-01-20 14:28:20.373 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:20.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:20 np0005588919 nova_compute[225855]: 2026-01-20 14:28:20.983 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:21 np0005588919 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 09:28:21 np0005588919 systemd[236831]: Activating special unit Exit the Session...
Jan 20 09:28:21 np0005588919 systemd[236831]: Stopped target Main User Target.
Jan 20 09:28:21 np0005588919 systemd[236831]: Stopped target Basic System.
Jan 20 09:28:21 np0005588919 systemd[236831]: Stopped target Paths.
Jan 20 09:28:21 np0005588919 systemd[236831]: Stopped target Sockets.
Jan 20 09:28:21 np0005588919 systemd[236831]: Stopped target Timers.
Jan 20 09:28:21 np0005588919 systemd[236831]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:28:21 np0005588919 systemd[236831]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 09:28:21 np0005588919 systemd[236831]: Closed D-Bus User Message Bus Socket.
Jan 20 09:28:21 np0005588919 systemd[236831]: Stopped Create User's Volatile Files and Directories.
Jan 20 09:28:21 np0005588919 systemd[236831]: Removed slice User Application Slice.
Jan 20 09:28:21 np0005588919 systemd[236831]: Reached target Shutdown.
Jan 20 09:28:21 np0005588919 systemd[236831]: Finished Exit the Session.
Jan 20 09:28:21 np0005588919 systemd[236831]: Reached target Exit the Session.
Jan 20 09:28:21 np0005588919 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 09:28:21 np0005588919 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 09:28:21 np0005588919 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 09:28:21 np0005588919 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 09:28:21 np0005588919 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 09:28:21 np0005588919 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 09:28:21 np0005588919 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 09:28:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:22.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:22.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:23 np0005588919 nova_compute[225855]: 2026-01-20 14:28:23.427 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:24.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:24 np0005588919 nova_compute[225855]: 2026-01-20 14:28:24.652 225859 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Acquiring lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:28:24 np0005588919 nova_compute[225855]: 2026-01-20 14:28:24.653 225859 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Acquired lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:28:24 np0005588919 nova_compute[225855]: 2026-01-20 14:28:24.653 225859 DEBUG nova.network.neutron [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:28:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:24.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:24 np0005588919 nova_compute[225855]: 2026-01-20 14:28:24.967 225859 DEBUG nova.network.neutron [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:28:25 np0005588919 nova_compute[225855]: 2026-01-20 14:28:25.443 225859 DEBUG nova.network.neutron [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:28:25 np0005588919 nova_compute[225855]: 2026-01-20 14:28:25.462 225859 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Releasing lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:28:25 np0005588919 nova_compute[225855]: 2026-01-20 14:28:25.580 225859 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 09:28:25 np0005588919 nova_compute[225855]: 2026-01-20 14:28:25.581 225859 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:28:25 np0005588919 nova_compute[225855]: 2026-01-20 14:28:25.581 225859 INFO nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Creating image(s)#033[00m
Jan 20 09:28:25 np0005588919 nova_compute[225855]: 2026-01-20 14:28:25.630 225859 DEBUG nova.storage.rbd_utils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] creating snapshot(nova-resize) on rbd image(29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:28:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e153 e153: 3 total, 3 up, 3 in
Jan 20 09:28:25 np0005588919 nova_compute[225855]: 2026-01-20 14:28:25.864 225859 DEBUG nova.objects.instance [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.059 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.144 225859 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.145 225859 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Ensure instance console log exists: /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.145 225859 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.146 225859 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.146 225859 DEBUG oslo_concurrency.lockutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.148 225859 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.152 225859 WARNING nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.156 225859 DEBUG nova.virt.libvirt.host [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.157 225859 DEBUG nova.virt.libvirt.host [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.161 225859 DEBUG nova.virt.libvirt.host [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.161 225859 DEBUG nova.virt.libvirt.host [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.162 225859 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.163 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.163 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.164 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.164 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.164 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.165 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.165 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.165 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.166 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.166 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.166 225859 DEBUG nova.virt.hardware [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.166 225859 DEBUG nova.objects.instance [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.203 225859 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:26.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:28:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/868074923' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.658 225859 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:26 np0005588919 nova_compute[225855]: 2026-01-20 14:28:26.694 225859 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:26.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:28:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2394368548' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:28:27 np0005588919 nova_compute[225855]: 2026-01-20 14:28:27.116 225859 DEBUG oslo_concurrency.processutils [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:27 np0005588919 nova_compute[225855]: 2026-01-20 14:28:27.119 225859 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <uuid>29f0b4d4-abf0-46e7-bf67-38e71eb42e28</uuid>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <name>instance-00000016</name>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <nova:name>tempest-MigrationsAdminTest-server-920976466</nova:name>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:28:26</nova:creationTime>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <nova:user uuid="01a3d712f05049b19d4ecc7051720ad5">tempest-MigrationsAdminTest-1518611738-project-member</nova:user>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <nova:project uuid="f3c2e72a7148496394c8bcd618a19c80">tempest-MigrationsAdminTest-1518611738</nova:project>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <entry name="serial">29f0b4d4-abf0-46e7-bf67-38e71eb42e28</entry>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <entry name="uuid">29f0b4d4-abf0-46e7-bf67-38e71eb42e28</entry>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/29f0b4d4-abf0-46e7-bf67-38e71eb42e28_disk.config">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28/console.log" append="off"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:28:27 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:28:27 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:28:27 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:28:27 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:28:27 np0005588919 nova_compute[225855]: 2026-01-20 14:28:27.250 225859 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:28:27 np0005588919 nova_compute[225855]: 2026-01-20 14:28:27.250 225859 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:28:27 np0005588919 nova_compute[225855]: 2026-01-20 14:28:27.251 225859 INFO nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Using config drive#033[00m
Jan 20 09:28:27 np0005588919 systemd-machined[194361]: New machine qemu-10-instance-00000016.
Jan 20 09:28:27 np0005588919 ceph-mgr[82135]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 09:28:27 np0005588919 systemd[1]: Started Virtual Machine qemu-10-instance-00000016.
Jan 20 09:28:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:28.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.412 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919308.4120557, 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.413 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.416 225859 DEBUG nova.compute.manager [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.420 225859 INFO nova.virt.libvirt.driver [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance running successfully.#033[00m
Jan 20 09:28:28 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.423 225859 DEBUG nova.virt.libvirt.guest [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.423 225859 DEBUG nova.virt.libvirt.driver [None req-2a504ad3-9dad-4a29-b7b7-86dde0c37d43 d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.481 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.486 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.490 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.535 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.535 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919308.413144, 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.536 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] VM Started (Lifecycle Event)#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.559 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:28:28 np0005588919 nova_compute[225855]: 2026-01-20 14:28:28.564 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:28:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:28.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:30.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:30 np0005588919 nova_compute[225855]: 2026-01-20 14:28:30.498 225859 DEBUG nova.virt.libvirt.driver [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Creating tmpfile /var/lib/nova/instances/tmpt3smbf1a to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 20 09:28:30 np0005588919 nova_compute[225855]: 2026-01-20 14:28:30.499 225859 DEBUG nova.compute.manager [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpt3smbf1a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 20 09:28:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e154 e154: 3 total, 3 up, 3 in
Jan 20 09:28:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:30.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:31 np0005588919 nova_compute[225855]: 2026-01-20 14:28:31.088 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:31 np0005588919 podman[237179]: 2026-01-20 14:28:31.106423539 +0000 UTC m=+0.134284096 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:28:32 np0005588919 nova_compute[225855]: 2026-01-20 14:28:32.004 225859 DEBUG nova.compute.manager [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpt3smbf1a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79b5596e-43c9-4085-9829-454fecf59490',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 20 09:28:32 np0005588919 nova_compute[225855]: 2026-01-20 14:28:32.031 225859 DEBUG oslo_concurrency.lockutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquiring lock "refresh_cache-79b5596e-43c9-4085-9829-454fecf59490" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:28:32 np0005588919 nova_compute[225855]: 2026-01-20 14:28:32.032 225859 DEBUG oslo_concurrency.lockutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquired lock "refresh_cache-79b5596e-43c9-4085-9829-454fecf59490" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:28:32 np0005588919 nova_compute[225855]: 2026-01-20 14:28:32.032 225859 DEBUG nova.network.neutron [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:28:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:32.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:32.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e155 e155: 3 total, 3 up, 3 in
Jan 20 09:28:33 np0005588919 nova_compute[225855]: 2026-01-20 14:28:33.485 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:33 np0005588919 nova_compute[225855]: 2026-01-20 14:28:33.962 225859 DEBUG nova.network.neutron [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Updating instance_info_cache with network_info: [{"id": "bd002580-dd95-49e1-bc34-e85f86272a05", "address": "fa:16:3e:24:ce:0d", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd002580-dd", "ovs_interfaceid": "bd002580-dd95-49e1-bc34-e85f86272a05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.010 225859 DEBUG oslo_concurrency.lockutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Releasing lock "refresh_cache-79b5596e-43c9-4085-9829-454fecf59490" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.013 225859 DEBUG os_brick.utils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.014 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.032 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.033 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe20fb8-3ba3-4214-b0dd-0aa3d6eb0cde]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.035 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.047 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.047 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[2d90f998-6a7d-40bb-aec6-1d8d05dd9b97]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.050 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.065 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.065 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd1411a-a593-4dce-a931-ff32f81453e0]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.067 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[1941289a-db24-48c4-8a46-04ef1da0976b]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.068 225859 DEBUG oslo_concurrency.processutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.108 225859 DEBUG oslo_concurrency.processutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] CMD "nvme version" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.111 225859 DEBUG os_brick.initiator.connectors.lightos [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.112 225859 DEBUG os_brick.initiator.connectors.lightos [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.112 225859 DEBUG os_brick.initiator.connectors.lightos [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:28:34 np0005588919 nova_compute[225855]: 2026-01-20 14:28:34.113 225859 DEBUG os_brick.utils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] <== get_connector_properties: return (99ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:28:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:34.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:34.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.392 225859 DEBUG nova.virt.libvirt.driver [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpt3smbf1a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79b5596e-43c9-4085-9829-454fecf59490',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={47e883f3-6efe-40b3-be28-6c01525dfc0c='9eb63166-9838-4b2e-9a3b-635bb42864f1'},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.394 225859 DEBUG nova.virt.libvirt.driver [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Creating instance directory: /var/lib/nova/instances/79b5596e-43c9-4085-9829-454fecf59490 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.395 225859 DEBUG nova.virt.libvirt.driver [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Ensure instance console log exists: /var/lib/nova/instances/79b5596e-43c9-4085-9829-454fecf59490/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.395 225859 DEBUG nova.virt.libvirt.driver [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Connecting volumes before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10901#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.399 225859 DEBUG nova.virt.libvirt.driver [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.401 225859 DEBUG nova.virt.libvirt.vif [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:28:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1483268234',display_name='tempest-LiveMigrationTest-server-1483268234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1483268234',id=23,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:28:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d15f60b9e48e4175b5520d1e57ed2d3a',ramdisk_id='',reservation_id='r-jglb1q09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-864280704',owner_user_name='tempest-LiveMigrationTest-864280704-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:28:24Z,user_data=None,user_id='bce7fcbd19554e29bb80c5b93b7dd3c9',uuid=79b5596e-43c9-4085-9829-454fecf59490,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bd002580-dd95-49e1-bc34-e85f86272a05", "address": "fa:16:3e:24:ce:0d", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbd002580-dd", "ovs_interfaceid": "bd002580-dd95-49e1-bc34-e85f86272a05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.401 225859 DEBUG nova.network.os_vif_util [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Converting VIF {"id": "bd002580-dd95-49e1-bc34-e85f86272a05", "address": "fa:16:3e:24:ce:0d", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbd002580-dd", "ovs_interfaceid": "bd002580-dd95-49e1-bc34-e85f86272a05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.403 225859 DEBUG nova.network.os_vif_util [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:ce:0d,bridge_name='br-int',has_traffic_filtering=True,id=bd002580-dd95-49e1-bc34-e85f86272a05,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd002580-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.404 225859 DEBUG os_vif [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:ce:0d,bridge_name='br-int',has_traffic_filtering=True,id=bd002580-dd95-49e1-bc34-e85f86272a05,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd002580-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.405 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.405 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.407 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.411 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd002580-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.412 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbd002580-dd, col_values=(('external_ids', {'iface-id': 'bd002580-dd95-49e1-bc34-e85f86272a05', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:ce:0d', 'vm-uuid': '79b5596e-43c9-4085-9829-454fecf59490'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.414 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:35 np0005588919 NetworkManager[49104]: <info>  [1768919315.4152] manager: (tapbd002580-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.417 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.420 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.421 225859 INFO os_vif [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:ce:0d,bridge_name='br-int',has_traffic_filtering=True,id=bd002580-dd95-49e1-bc34-e85f86272a05,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd002580-dd')#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.426 225859 DEBUG nova.virt.libvirt.driver [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 20 09:28:35 np0005588919 nova_compute[225855]: 2026-01-20 14:28:35.427 225859 DEBUG nova.compute.manager [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpt3smbf1a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79b5596e-43c9-4085-9829-454fecf59490',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={47e883f3-6efe-40b3-be28-6c01525dfc0c='9eb63166-9838-4b2e-9a3b-635bb42864f1'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 20 09:28:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:36.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:36.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:37 np0005588919 nova_compute[225855]: 2026-01-20 14:28:37.452 225859 DEBUG nova.network.neutron [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Port bd002580-dd95-49e1-bc34-e85f86272a05 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 20 09:28:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:38.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:38 np0005588919 nova_compute[225855]: 2026-01-20 14:28:38.520 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:38 np0005588919 nova_compute[225855]: 2026-01-20 14:28:38.604 225859 DEBUG nova.compute.manager [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpt3smbf1a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79b5596e-43c9-4085-9829-454fecf59490',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={47e883f3-6efe-40b3-be28-6c01525dfc0c='9eb63166-9838-4b2e-9a3b-635bb42864f1'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 20 09:28:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:38.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:39 np0005588919 kernel: tapbd002580-dd: entered promiscuous mode
Jan 20 09:28:39 np0005588919 NetworkManager[49104]: <info>  [1768919319.1852] manager: (tapbd002580-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.187 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:39 np0005588919 ovn_controller[130490]: 2026-01-20T14:28:39Z|00092|binding|INFO|Claiming lport bd002580-dd95-49e1-bc34-e85f86272a05 for this additional chassis.
Jan 20 09:28:39 np0005588919 ovn_controller[130490]: 2026-01-20T14:28:39Z|00093|binding|INFO|bd002580-dd95-49e1-bc34-e85f86272a05: Claiming fa:16:3e:24:ce:0d 10.100.0.10
Jan 20 09:28:39 np0005588919 ovn_controller[130490]: 2026-01-20T14:28:39Z|00094|binding|INFO|Setting lport bd002580-dd95-49e1-bc34-e85f86272a05 ovn-installed in OVS
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.209 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.213 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:39 np0005588919 systemd-udevd[237280]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:28:39 np0005588919 NetworkManager[49104]: <info>  [1768919319.2322] device (tapbd002580-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:28:39 np0005588919 NetworkManager[49104]: <info>  [1768919319.2330] device (tapbd002580-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:28:39 np0005588919 systemd-machined[194361]: New machine qemu-11-instance-00000017.
Jan 20 09:28:39 np0005588919 systemd[1]: Started Virtual Machine qemu-11-instance-00000017.
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.749 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "87fe16d6-774e-4002-8df4-9eb202621ab9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.751 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.774 225859 DEBUG nova.compute.manager [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.876 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.877 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.885 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.885 225859 INFO nova.compute.claims [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:28:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:39.892 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:28:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:39.892 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:28:39 np0005588919 nova_compute[225855]: 2026-01-20 14:28:39.922 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.076 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.173 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919320.1727831, 79b5596e-43c9-4085-9829-454fecf59490 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.174 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] VM Started (Lifecycle Event)#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.195 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:28:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:40.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.414 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:28:40 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2708764190' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.523 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.527 225859 DEBUG nova.compute.provider_tree [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.550 225859 DEBUG nova.scheduler.client.report [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.584 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.585 225859 DEBUG nova.compute.manager [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.646 225859 DEBUG nova.compute.manager [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.648 225859 DEBUG nova.network.neutron [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.670 225859 INFO nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.687 225859 DEBUG nova.compute.manager [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.711 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919320.711011, 79b5596e-43c9-4085-9829-454fecf59490 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.711 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.749 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.752 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.786 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.805 225859 DEBUG nova.compute.manager [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.806 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.807 225859 INFO nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Creating image(s)#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.830 225859 DEBUG nova.storage.rbd_utils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 87fe16d6-774e-4002-8df4-9eb202621ab9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.860 225859 DEBUG nova.storage.rbd_utils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 87fe16d6-774e-4002-8df4-9eb202621ab9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.889 225859 DEBUG nova.storage.rbd_utils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 87fe16d6-774e-4002-8df4-9eb202621ab9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e156 e156: 3 total, 3 up, 3 in
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.893 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:40.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.970 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.972 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.973 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:40 np0005588919 nova_compute[225855]: 2026-01-20 14:28:40.973 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.006 225859 DEBUG nova.storage.rbd_utils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 87fe16d6-774e-4002-8df4-9eb202621ab9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.010 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 87fe16d6-774e-4002-8df4-9eb202621ab9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.041 225859 DEBUG nova.network.neutron [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.042 225859 DEBUG nova.compute.manager [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:28:41 np0005588919 podman[237412]: 2026-01-20 14:28:41.094566716 +0000 UTC m=+0.129374718 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.293 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 87fe16d6-774e-4002-8df4-9eb202621ab9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.283s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.394 225859 DEBUG nova.storage.rbd_utils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] resizing rbd image 87fe16d6-774e-4002-8df4-9eb202621ab9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.547 225859 DEBUG nova.objects.instance [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'migration_context' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.565 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.565 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Ensure instance console log exists: /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.566 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.567 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.567 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.571 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.579 225859 WARNING nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.585 225859 DEBUG nova.virt.libvirt.host [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.587 225859 DEBUG nova.virt.libvirt.host [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.590 225859 DEBUG nova.virt.libvirt.host [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.590 225859 DEBUG nova.virt.libvirt.host [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.591 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.592 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.592 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.592 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.593 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.593 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.593 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.593 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.593 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.594 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.594 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.594 225859 DEBUG nova.virt.hardware [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:28:41 np0005588919 nova_compute[225855]: 2026-01-20 14:28:41.597 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:28:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2946817491' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.093 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.129 225859 DEBUG nova.storage.rbd_utils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 87fe16d6-774e-4002-8df4-9eb202621ab9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.134 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:42.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:28:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3456057515' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.595 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.597 225859 DEBUG nova.objects.instance [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.611 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <uuid>87fe16d6-774e-4002-8df4-9eb202621ab9</uuid>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <name>instance-00000018</name>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <nova:name>tempest-MigrationsAdminTest-server-724945079</nova:name>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:28:41</nova:creationTime>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <nova:user uuid="01a3d712f05049b19d4ecc7051720ad5">tempest-MigrationsAdminTest-1518611738-project-member</nova:user>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <nova:project uuid="f3c2e72a7148496394c8bcd618a19c80">tempest-MigrationsAdminTest-1518611738</nova:project>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <entry name="serial">87fe16d6-774e-4002-8df4-9eb202621ab9</entry>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <entry name="uuid">87fe16d6-774e-4002-8df4-9eb202621ab9</entry>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/87fe16d6-774e-4002-8df4-9eb202621ab9_disk">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/87fe16d6-774e-4002-8df4-9eb202621ab9_disk.config">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/console.log" append="off"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:28:42 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:28:42 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:28:42 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:28:42 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.660 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.660 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.661 225859 INFO nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Using config drive#033[00m
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.687 225859 DEBUG nova.storage.rbd_utils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 87fe16d6-774e-4002-8df4-9eb202621ab9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:42.894 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.899 225859 INFO nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Creating config drive at /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/disk.config#033[00m
Jan 20 09:28:42 np0005588919 nova_compute[225855]: 2026-01-20 14:28:42.904 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7imlze8h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:42.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.049 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7imlze8h" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:43 np0005588919 ovn_controller[130490]: 2026-01-20T14:28:43Z|00095|binding|INFO|Claiming lport bd002580-dd95-49e1-bc34-e85f86272a05 for this chassis.
Jan 20 09:28:43 np0005588919 ovn_controller[130490]: 2026-01-20T14:28:43Z|00096|binding|INFO|bd002580-dd95-49e1-bc34-e85f86272a05: Claiming fa:16:3e:24:ce:0d 10.100.0.10
Jan 20 09:28:43 np0005588919 ovn_controller[130490]: 2026-01-20T14:28:43Z|00097|binding|INFO|Setting lport bd002580-dd95-49e1-bc34-e85f86272a05 up in Southbound
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.092 225859 DEBUG nova.storage.rbd_utils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image 87fe16d6-774e-4002-8df4-9eb202621ab9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.095 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:ce:0d 10.100.0.10'], port_security=['fa:16:3e:24:ce:0d 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '79b5596e-43c9-4085-9829-454fecf59490', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14f18b27-1594-48d8-a08b-a930f7adbc08', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd15f60b9e48e4175b5520d1e57ed2d3a', 'neutron:revision_number': '11', 'neutron:security_group_ids': '6d729cfd-2f98-4ca5-a524-e543b12b3766', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02983c41-bbec-48cf-910a-84fed1be783f, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=bd002580-dd95-49e1-bc34-e85f86272a05) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.097 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/disk.config 87fe16d6-774e-4002-8df4-9eb202621ab9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.097 140354 INFO neutron.agent.ovn.metadata.agent [-] Port bd002580-dd95-49e1-bc34-e85f86272a05 in datapath 14f18b27-1594-48d8-a08b-a930f7adbc08 bound to our chassis#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.100 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 14f18b27-1594-48d8-a08b-a930f7adbc08#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.118 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2f87bc06-571b-4637-9ceb-cf247adee100]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.152 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb0a8f7-2e90-484d-93a9-fafc78090b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.158 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[35ee62dd-b38f-4eda-a656-416ad73998e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.198 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[028c4a95-7124-44ee-8f53-608a20fa34cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.220 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[59b7fdee-13ae-43fa-88ed-17d771ba4721]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14f18b27-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7d:1f:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 25, 'tx_packets': 5, 'rx_bytes': 1752, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 3, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 25, 'tx_packets': 5, 'rx_bytes': 1752, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 3, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430599, 'reachable_time': 25641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 14, 'inoctets': 1040, 'indelivers': 5, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 14, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1040, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 14, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 5, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237666, 'error': None, 'target': 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.243 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a14e43a0-b8a6-4309-9791-7d0d7babffc0]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap14f18b27-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430610, 'tstamp': 430610}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237670, 'error': None, 'target': 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap14f18b27-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430612, 'tstamp': 430612}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237670, 'error': None, 'target': 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.244 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14f18b27-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.246 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.247 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14f18b27-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.248 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.248 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap14f18b27-10, col_values=(('external_ids', {'iface-id': 'aa1c73c5-9761-4457-acdc-9f93220f739f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:28:43.249 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.273 225859 DEBUG oslo_concurrency.processutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/disk.config 87fe16d6-774e-4002-8df4-9eb202621ab9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.274 225859 INFO nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Deleting local config drive /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/disk.config because it was imported into RBD.#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.328 225859 INFO nova.compute.manager [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Post operation of migration started#033[00m
Jan 20 09:28:43 np0005588919 systemd-machined[194361]: New machine qemu-12-instance-00000018.
Jan 20 09:28:43 np0005588919 systemd[1]: Started Virtual Machine qemu-12-instance-00000018.
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.561 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.802 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919323.8021781, 87fe16d6-774e-4002-8df4-9eb202621ab9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.803 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.806 225859 DEBUG nova.compute.manager [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.807 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.811 225859 INFO nova.virt.libvirt.driver [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance spawned successfully.#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.811 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.837 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.846 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.851 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.852 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.853 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.853 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.854 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.855 225859 DEBUG nova.virt.libvirt.driver [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.897 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.897 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919323.8032815, 87fe16d6-774e-4002-8df4-9eb202621ab9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.898 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] VM Started (Lifecycle Event)#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.924 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.928 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.932 225859 INFO nova.compute.manager [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Took 3.13 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.932 225859 DEBUG nova.compute.manager [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.934 225859 DEBUG oslo_concurrency.lockutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquiring lock "refresh_cache-79b5596e-43c9-4085-9829-454fecf59490" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.934 225859 DEBUG oslo_concurrency.lockutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquired lock "refresh_cache-79b5596e-43c9-4085-9829-454fecf59490" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.934 225859 DEBUG nova.network.neutron [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:28:43 np0005588919 nova_compute[225855]: 2026-01-20 14:28:43.949 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:28:44 np0005588919 nova_compute[225855]: 2026-01-20 14:28:44.007 225859 INFO nova.compute.manager [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Took 4.17 seconds to build instance.#033[00m
Jan 20 09:28:44 np0005588919 nova_compute[225855]: 2026-01-20 14:28:44.027 225859 DEBUG oslo_concurrency.lockutils [None req-f779cab0-8c82-4d01-896e-0ee4904b5f0b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:44.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:44.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:45 np0005588919 nova_compute[225855]: 2026-01-20 14:28:45.418 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:45 np0005588919 nova_compute[225855]: 2026-01-20 14:28:45.818 225859 DEBUG nova.network.neutron [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Updating instance_info_cache with network_info: [{"id": "bd002580-dd95-49e1-bc34-e85f86272a05", "address": "fa:16:3e:24:ce:0d", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd002580-dd", "ovs_interfaceid": "bd002580-dd95-49e1-bc34-e85f86272a05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:28:45 np0005588919 nova_compute[225855]: 2026-01-20 14:28:45.840 225859 DEBUG oslo_concurrency.lockutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Releasing lock "refresh_cache-79b5596e-43c9-4085-9829-454fecf59490" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:28:45 np0005588919 nova_compute[225855]: 2026-01-20 14:28:45.860 225859 DEBUG oslo_concurrency.lockutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:45 np0005588919 nova_compute[225855]: 2026-01-20 14:28:45.860 225859 DEBUG oslo_concurrency.lockutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:45 np0005588919 nova_compute[225855]: 2026-01-20 14:28:45.861 225859 DEBUG oslo_concurrency.lockutils [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:45 np0005588919 nova_compute[225855]: 2026-01-20 14:28:45.870 225859 INFO nova.virt.libvirt.driver [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 20 09:28:45 np0005588919 virtqemud[225396]: Domain id=11 name='instance-00000017' uuid=79b5596e-43c9-4085-9829-454fecf59490 is tainted: custom-monitor
Jan 20 09:28:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:46.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:46 np0005588919 nova_compute[225855]: 2026-01-20 14:28:46.882 225859 INFO nova.virt.libvirt.driver [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 20 09:28:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:46.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:47 np0005588919 nova_compute[225855]: 2026-01-20 14:28:47.888 225859 INFO nova.virt.libvirt.driver [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 20 09:28:47 np0005588919 nova_compute[225855]: 2026-01-20 14:28:47.894 225859 DEBUG nova.compute.manager [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:28:47 np0005588919 nova_compute[225855]: 2026-01-20 14:28:47.923 225859 DEBUG nova.objects.instance [None req-316b633b-1626-487f-b686-649c0e42886e f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:28:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:48.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.392 225859 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.393 225859 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquired lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.393 225859 DEBUG nova.network.neutron [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.601 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.618 225859 DEBUG nova.network.neutron [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.888 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:28:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:48.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.932 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Triggering sync for uuid d726266f-b9a6-406b-ad13-f9db3e0dc6aa _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.933 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Triggering sync for uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.934 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Triggering sync for uuid 79b5596e-43c9-4085-9829-454fecf59490 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.934 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Triggering sync for uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.935 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.935 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.936 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.937 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.938 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "79b5596e-43c9-4085-9829-454fecf59490" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.938 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "79b5596e-43c9-4085-9829-454fecf59490" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.939 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "87fe16d6-774e-4002-8df4-9eb202621ab9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.940 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.941 225859 INFO nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] During sync_power_state the instance has a pending task (resize_prep). Skip.#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.942 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.994 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:48 np0005588919 nova_compute[225855]: 2026-01-20 14:28:48.995 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:49 np0005588919 nova_compute[225855]: 2026-01-20 14:28:49.039 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "79b5596e-43c9-4085-9829-454fecf59490" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:49 np0005588919 nova_compute[225855]: 2026-01-20 14:28:49.305 225859 DEBUG nova.network.neutron [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:28:49 np0005588919 nova_compute[225855]: 2026-01-20 14:28:49.340 225859 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Releasing lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:28:49 np0005588919 nova_compute[225855]: 2026-01-20 14:28:49.463 225859 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 20 09:28:49 np0005588919 nova_compute[225855]: 2026-01-20 14:28:49.464 225859 DEBUG nova.virt.libvirt.volume.remotefs [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Creating file /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/4d3b1e03b92e457094cc45cb59666465.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 20 09:28:49 np0005588919 nova_compute[225855]: 2026-01-20 14:28:49.465 225859 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/4d3b1e03b92e457094cc45cb59666465.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:50 np0005588919 nova_compute[225855]: 2026-01-20 14:28:50.008 225859 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/4d3b1e03b92e457094cc45cb59666465.tmp" returned: 1 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:50 np0005588919 nova_compute[225855]: 2026-01-20 14:28:50.010 225859 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9/4d3b1e03b92e457094cc45cb59666465.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 20 09:28:50 np0005588919 nova_compute[225855]: 2026-01-20 14:28:50.010 225859 DEBUG nova.virt.libvirt.volume.remotefs [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Creating directory /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 20 09:28:50 np0005588919 nova_compute[225855]: 2026-01-20 14:28:50.011 225859 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:28:50 np0005588919 nova_compute[225855]: 2026-01-20 14:28:50.260 225859 DEBUG oslo_concurrency.processutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/87fe16d6-774e-4002-8df4-9eb202621ab9" returned: 0 in 0.248s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:28:50 np0005588919 nova_compute[225855]: 2026-01-20 14:28:50.265 225859 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:28:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:50.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:50 np0005588919 nova_compute[225855]: 2026-01-20 14:28:50.421 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:50.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:28:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:52.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:28:52 np0005588919 nova_compute[225855]: 2026-01-20 14:28:52.816 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Check if temp file /var/lib/nova/instances/tmp27yeses5 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 20 09:28:52 np0005588919 nova_compute[225855]: 2026-01-20 14:28:52.817 225859 DEBUG nova.compute.manager [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp27yeses5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79b5596e-43c9-4085-9829-454fecf59490',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 20 09:28:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:52.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:53 np0005588919 nova_compute[225855]: 2026-01-20 14:28:53.646 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:54.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:54.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:55 np0005588919 nova_compute[225855]: 2026-01-20 14:28:55.425 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.003000085s ======
Jan 20 09:28:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:56.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000085s
Jan 20 09:28:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:28:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:56.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:28:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:28:56 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/731545174' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:28:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:28:58.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:58 np0005588919 nova_compute[225855]: 2026-01-20 14:28:58.684 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:28:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:28:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:28:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:28:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:28:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:28:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:28:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:28:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:28:58.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:28:58 np0005588919 nova_compute[225855]: 2026-01-20 14:28:58.985 225859 DEBUG nova.compute.manager [req-abef9692-ee67-4217-ac01-bdceccb1f562 req-d4ce792c-0a42-49c3-92be-8893d9e5fbbc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-unplugged-bd002580-dd95-49e1-bc34-e85f86272a05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:28:58 np0005588919 nova_compute[225855]: 2026-01-20 14:28:58.986 225859 DEBUG oslo_concurrency.lockutils [req-abef9692-ee67-4217-ac01-bdceccb1f562 req-d4ce792c-0a42-49c3-92be-8893d9e5fbbc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "79b5596e-43c9-4085-9829-454fecf59490-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:28:58 np0005588919 nova_compute[225855]: 2026-01-20 14:28:58.986 225859 DEBUG oslo_concurrency.lockutils [req-abef9692-ee67-4217-ac01-bdceccb1f562 req-d4ce792c-0a42-49c3-92be-8893d9e5fbbc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:28:58 np0005588919 nova_compute[225855]: 2026-01-20 14:28:58.986 225859 DEBUG oslo_concurrency.lockutils [req-abef9692-ee67-4217-ac01-bdceccb1f562 req-d4ce792c-0a42-49c3-92be-8893d9e5fbbc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:28:58 np0005588919 nova_compute[225855]: 2026-01-20 14:28:58.987 225859 DEBUG nova.compute.manager [req-abef9692-ee67-4217-ac01-bdceccb1f562 req-d4ce792c-0a42-49c3-92be-8893d9e5fbbc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] No waiting events found dispatching network-vif-unplugged-bd002580-dd95-49e1-bc34-e85f86272a05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:28:58 np0005588919 nova_compute[225855]: 2026-01-20 14:28:58.987 225859 DEBUG nova.compute.manager [req-abef9692-ee67-4217-ac01-bdceccb1f562 req-d4ce792c-0a42-49c3-92be-8893d9e5fbbc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-unplugged-bd002580-dd95-49e1-bc34-e85f86272a05 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.724 225859 INFO nova.compute.manager [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Took 5.68 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.726 225859 DEBUG nova.compute.manager [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:28:59 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.755 225859 DEBUG nova.compute.manager [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp27yeses5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='79b5596e-43c9-4085-9829-454fecf59490',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=Migration(522850ff-68d3-4cab-8c83-50b3a540cdfa),old_vol_attachment_ids={47e883f3-6efe-40b3-be28-6c01525dfc0c='636146ab-4bc6-4c21-9609-7755eb208c7c'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.759 225859 DEBUG nova.objects.instance [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lazy-loading 'migration_context' on Instance uuid 79b5596e-43c9-4085-9829-454fecf59490 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.761 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 20 09:28:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.764 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.764 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.792 225859 DEBUG nova.virt.libvirt.migration [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Find same serial number: pos=1, serial=47e883f3-6efe-40b3-be28-6c01525dfc0c _update_volume_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:242#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.795 225859 DEBUG nova.virt.libvirt.vif [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:28:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1483268234',display_name='tempest-LiveMigrationTest-server-1483268234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1483268234',id=23,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:28:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d15f60b9e48e4175b5520d1e57ed2d3a',ramdisk_id='',reservation_id='r-jglb1q09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-864280704',owner_user_name='tempest-LiveMigrationTest-864280704-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:28:47Z,user_data=None,user_id='bce7fcbd19554e29bb80c5b93b7dd3c9',uuid=79b5596e-43c9-4085-9829-454fecf59490,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bd002580-dd95-49e1-bc34-e85f86272a05", "address": "fa:16:3e:24:ce:0d", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbd002580-dd", "ovs_interfaceid": "bd002580-dd95-49e1-bc34-e85f86272a05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.796 225859 DEBUG nova.network.os_vif_util [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Converting VIF {"id": "bd002580-dd95-49e1-bc34-e85f86272a05", "address": "fa:16:3e:24:ce:0d", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbd002580-dd", "ovs_interfaceid": "bd002580-dd95-49e1-bc34-e85f86272a05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.797 225859 DEBUG nova.network.os_vif_util [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:ce:0d,bridge_name='br-int',has_traffic_filtering=True,id=bd002580-dd95-49e1-bc34-e85f86272a05,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd002580-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.798 225859 DEBUG nova.virt.libvirt.migration [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Updating guest XML with vif config: <interface type="ethernet">
Jan 20 09:28:59 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:24:ce:0d"/>
Jan 20 09:28:59 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 09:28:59 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:28:59 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 09:28:59 np0005588919 nova_compute[225855]:  <target dev="tapbd002580-dd"/>
Jan 20 09:28:59 np0005588919 nova_compute[225855]: </interface>
Jan 20 09:28:59 np0005588919 nova_compute[225855]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 20 09:28:59 np0005588919 nova_compute[225855]: 2026-01-20 14:28:59.799 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 20 09:29:00 np0005588919 nova_compute[225855]: 2026-01-20 14:29:00.267 225859 DEBUG nova.virt.libvirt.migration [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 20 09:29:00 np0005588919 nova_compute[225855]: 2026-01-20 14:29:00.268 225859 INFO nova.virt.libvirt.migration [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 20 09:29:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:00.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:00 np0005588919 nova_compute[225855]: 2026-01-20 14:29:00.313 225859 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:29:00 np0005588919 nova_compute[225855]: 2026-01-20 14:29:00.426 225859 INFO nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 20 09:29:00 np0005588919 nova_compute[225855]: 2026-01-20 14:29:00.428 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:00 np0005588919 nova_compute[225855]: 2026-01-20 14:29:00.929 225859 DEBUG nova.virt.libvirt.migration [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 20 09:29:00 np0005588919 nova_compute[225855]: 2026-01-20 14:29:00.930 225859 DEBUG nova.virt.libvirt.migration [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 20 09:29:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:00.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.082 225859 DEBUG nova.compute.manager [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.082 225859 DEBUG oslo_concurrency.lockutils [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "79b5596e-43c9-4085-9829-454fecf59490-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.082 225859 DEBUG oslo_concurrency.lockutils [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.083 225859 DEBUG oslo_concurrency.lockutils [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.083 225859 DEBUG nova.compute.manager [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] No waiting events found dispatching network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.083 225859 WARNING nova.compute.manager [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received unexpected event network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 for instance with vm_state active and task_state migrating.#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.083 225859 DEBUG nova.compute.manager [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-changed-bd002580-dd95-49e1-bc34-e85f86272a05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.084 225859 DEBUG nova.compute.manager [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Refreshing instance network info cache due to event network-changed-bd002580-dd95-49e1-bc34-e85f86272a05. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.084 225859 DEBUG oslo_concurrency.lockutils [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-79b5596e-43c9-4085-9829-454fecf59490" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.084 225859 DEBUG oslo_concurrency.lockutils [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-79b5596e-43c9-4085-9829-454fecf59490" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.085 225859 DEBUG nova.network.neutron [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Refreshing network info cache for port bd002580-dd95-49e1-bc34-e85f86272a05 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.433 225859 DEBUG nova.virt.libvirt.migration [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.436 225859 DEBUG nova.virt.libvirt.migration [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.812 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919341.8116124, 79b5596e-43c9-4085-9829-454fecf59490 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.813 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.833 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.840 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.863 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.941 225859 DEBUG nova.virt.libvirt.migration [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Current 50 elapsed 2 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 20 09:29:01 np0005588919 nova_compute[225855]: 2026-01-20 14:29:01.942 225859 DEBUG nova.virt.libvirt.migration [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 20 09:29:01 np0005588919 kernel: tapbd002580-dd (unregistering): left promiscuous mode
Jan 20 09:29:02 np0005588919 NetworkManager[49104]: <info>  [1768919342.0100] device (tapbd002580-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.022 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:02 np0005588919 ovn_controller[130490]: 2026-01-20T14:29:02Z|00098|binding|INFO|Releasing lport bd002580-dd95-49e1-bc34-e85f86272a05 from this chassis (sb_readonly=0)
Jan 20 09:29:02 np0005588919 ovn_controller[130490]: 2026-01-20T14:29:02Z|00099|binding|INFO|Setting lport bd002580-dd95-49e1-bc34-e85f86272a05 down in Southbound
Jan 20 09:29:02 np0005588919 ovn_controller[130490]: 2026-01-20T14:29:02Z|00100|binding|INFO|Removing iface tapbd002580-dd ovn-installed in OVS
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.024 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.035 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:ce:0d 10.100.0.10'], port_security=['fa:16:3e:24:ce:0d 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '367c1a2c-b16a-4828-ab5a-626bb50023b4'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '79b5596e-43c9-4085-9829-454fecf59490', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14f18b27-1594-48d8-a08b-a930f7adbc08', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd15f60b9e48e4175b5520d1e57ed2d3a', 'neutron:revision_number': '18', 'neutron:security_group_ids': '6d729cfd-2f98-4ca5-a524-e543b12b3766', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02983c41-bbec-48cf-910a-84fed1be783f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=bd002580-dd95-49e1-bc34-e85f86272a05) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.038 140354 INFO neutron.agent.ovn.metadata.agent [-] Port bd002580-dd95-49e1-bc34-e85f86272a05 in datapath 14f18b27-1594-48d8-a08b-a930f7adbc08 unbound from our chassis#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.040 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 14f18b27-1594-48d8-a08b-a930f7adbc08#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.052 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.064 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e159633d-5608-4aef-9fdd-2a5b45cb1f1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:02 np0005588919 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000017.scope: Deactivated successfully.
Jan 20 09:29:02 np0005588919 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000017.scope: Consumed 2.761s CPU time.
Jan 20 09:29:02 np0005588919 systemd-machined[194361]: Machine qemu-11-instance-00000017 terminated.
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.108 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a2758760-3225-49fe-9a31-9f4bc01c8497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.111 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[cf938cea-5f6c-462f-983a-b84f1d793cf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:02 np0005588919 podman[238045]: 2026-01-20 14:29:02.123605967 +0000 UTC m=+0.151404868 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.141 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[abad8b4b-d929-4ea6-8d60-1bd313c93011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:02 np0005588919 virtqemud[225396]: Unable to get XATTR trusted.libvirt.security.ref_selinux on volumes/volume-47e883f3-6efe-40b3-be28-6c01525dfc0c: No such file or directory
Jan 20 09:29:02 np0005588919 virtqemud[225396]: Unable to get XATTR trusted.libvirt.security.ref_dac on volumes/volume-47e883f3-6efe-40b3-be28-6c01525dfc0c: No such file or directory
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.161 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[686ced00-f413-4407-a1f4-b0c956035cea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14f18b27-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7d:1f:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 42, 'tx_packets': 7, 'rx_bytes': 2466, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 3, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 42, 'tx_packets': 7, 'rx_bytes': 2466, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 3, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430599, 'reachable_time': 25641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 14, 'inoctets': 1040, 'indelivers': 5, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 14, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1040, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 14, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 5, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238081, 'error': None, 'target': 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.175 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.175 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.176 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.179 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ee257a25-eb13-4a52-af49-73760ecea4bf]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap14f18b27-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430610, 'tstamp': 430610}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238087, 'error': None, 'target': 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap14f18b27-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430612, 'tstamp': 430612}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238087, 'error': None, 'target': 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.181 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14f18b27-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.213 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.217 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.218 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14f18b27-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.218 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.219 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap14f18b27-10, col_values=(('external_ids', {'iface-id': 'aa1c73c5-9761-4457-acdc-9f93220f739f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:02.219 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:29:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:02.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.321 225859 DEBUG nova.compute.manager [req-a8c41029-e8f0-4ebd-a84f-b4655a531645 req-9c5a2d83-f75a-4349-8304-114fe71972bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-unplugged-bd002580-dd95-49e1-bc34-e85f86272a05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.322 225859 DEBUG oslo_concurrency.lockutils [req-a8c41029-e8f0-4ebd-a84f-b4655a531645 req-9c5a2d83-f75a-4349-8304-114fe71972bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "79b5596e-43c9-4085-9829-454fecf59490-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.322 225859 DEBUG oslo_concurrency.lockutils [req-a8c41029-e8f0-4ebd-a84f-b4655a531645 req-9c5a2d83-f75a-4349-8304-114fe71972bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.322 225859 DEBUG oslo_concurrency.lockutils [req-a8c41029-e8f0-4ebd-a84f-b4655a531645 req-9c5a2d83-f75a-4349-8304-114fe71972bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.323 225859 DEBUG nova.compute.manager [req-a8c41029-e8f0-4ebd-a84f-b4655a531645 req-9c5a2d83-f75a-4349-8304-114fe71972bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] No waiting events found dispatching network-vif-unplugged-bd002580-dd95-49e1-bc34-e85f86272a05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.323 225859 DEBUG nova.compute.manager [req-a8c41029-e8f0-4ebd-a84f-b4655a531645 req-9c5a2d83-f75a-4349-8304-114fe71972bd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-unplugged-bd002580-dd95-49e1-bc34-e85f86272a05 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.445 225859 DEBUG nova.virt.libvirt.guest [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '79b5596e-43c9-4085-9829-454fecf59490' (instance-00000017) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.445 225859 INFO nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Migration operation has completed#033[00m
Jan 20 09:29:02 np0005588919 nova_compute[225855]: 2026-01-20 14:29:02.446 225859 INFO nova.compute.manager [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] _post_live_migration() is started..#033[00m
Jan 20 09:29:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:29:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:29:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:29:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:29:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:29:02 np0005588919 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 20 09:29:02 np0005588919 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000018.scope: Consumed 15.206s CPU time.
Jan 20 09:29:02 np0005588919 systemd-machined[194361]: Machine qemu-12-instance-00000018 terminated.
Jan 20 09:29:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:02.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.330 225859 INFO nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.338 225859 INFO nova.virt.libvirt.driver [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance destroyed successfully.#033[00m
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.343 225859 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.344 225859 DEBUG nova.virt.libvirt.driver [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] skipping disk for instance-00000018 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.375 225859 DEBUG nova.network.neutron [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Updated VIF entry in instance network info cache for port bd002580-dd95-49e1-bc34-e85f86272a05. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.376 225859 DEBUG nova.network.neutron [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Updating instance_info_cache with network_info: [{"id": "bd002580-dd95-49e1-bc34-e85f86272a05", "address": "fa:16:3e:24:ce:0d", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd002580-dd", "ovs_interfaceid": "bd002580-dd95-49e1-bc34-e85f86272a05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.429 225859 DEBUG oslo_concurrency.lockutils [req-54663048-335f-4fa6-ad1a-3950f7e92c71 req-8eb0251f-2127-42e9-98a3-61941c4c86ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-79b5596e-43c9-4085-9829-454fecf59490" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.500 225859 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "87fe16d6-774e-4002-8df4-9eb202621ab9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.501 225859 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.501 225859 DEBUG oslo_concurrency.lockutils [None req-21751958-ead6-449d-8a04-ee82f802da6d 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:03 np0005588919 nova_compute[225855]: 2026-01-20 14:29:03.686 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:04.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.500 225859 DEBUG nova.compute.manager [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.500 225859 DEBUG oslo_concurrency.lockutils [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "79b5596e-43c9-4085-9829-454fecf59490-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.501 225859 DEBUG oslo_concurrency.lockutils [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.502 225859 DEBUG oslo_concurrency.lockutils [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.502 225859 DEBUG nova.compute.manager [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] No waiting events found dispatching network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.503 225859 WARNING nova.compute.manager [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received unexpected event network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 for instance with vm_state active and task_state migrating.#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.503 225859 DEBUG nova.compute.manager [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.504 225859 DEBUG oslo_concurrency.lockutils [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "79b5596e-43c9-4085-9829-454fecf59490-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.504 225859 DEBUG oslo_concurrency.lockutils [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.504 225859 DEBUG oslo_concurrency.lockutils [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.505 225859 DEBUG nova.compute.manager [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] No waiting events found dispatching network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.505 225859 WARNING nova.compute.manager [req-f379ab50-2398-4aeb-a00d-d95717a1a867 req-289af249-87b5-4fc0-9118-2d2a68bd7201 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received unexpected event network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 for instance with vm_state active and task_state migrating.#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.692 225859 DEBUG nova.compute.manager [req-2d43485c-63ac-4e27-8096-eb6987ead2c2 req-8ee442b6-d76e-4b99-842e-ba8d53557c71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-unplugged-bd002580-dd95-49e1-bc34-e85f86272a05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.693 225859 DEBUG oslo_concurrency.lockutils [req-2d43485c-63ac-4e27-8096-eb6987ead2c2 req-8ee442b6-d76e-4b99-842e-ba8d53557c71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "79b5596e-43c9-4085-9829-454fecf59490-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.693 225859 DEBUG oslo_concurrency.lockutils [req-2d43485c-63ac-4e27-8096-eb6987ead2c2 req-8ee442b6-d76e-4b99-842e-ba8d53557c71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.693 225859 DEBUG oslo_concurrency.lockutils [req-2d43485c-63ac-4e27-8096-eb6987ead2c2 req-8ee442b6-d76e-4b99-842e-ba8d53557c71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.694 225859 DEBUG nova.compute.manager [req-2d43485c-63ac-4e27-8096-eb6987ead2c2 req-8ee442b6-d76e-4b99-842e-ba8d53557c71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] No waiting events found dispatching network-vif-unplugged-bd002580-dd95-49e1-bc34-e85f86272a05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.694 225859 DEBUG nova.compute.manager [req-2d43485c-63ac-4e27-8096-eb6987ead2c2 req-8ee442b6-d76e-4b99-842e-ba8d53557c71 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-unplugged-bd002580-dd95-49e1-bc34-e85f86272a05 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:29:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:04.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.994 225859 DEBUG nova.network.neutron [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Activated binding for port bd002580-dd95-49e1-bc34-e85f86272a05 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.995 225859 DEBUG nova.compute.manager [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "bd002580-dd95-49e1-bc34-e85f86272a05", "address": "fa:16:3e:24:ce:0d", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd002580-dd", "ovs_interfaceid": "bd002580-dd95-49e1-bc34-e85f86272a05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.997 225859 DEBUG nova.virt.libvirt.vif [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:28:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1483268234',display_name='tempest-LiveMigrationTest-server-1483268234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1483268234',id=23,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:28:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d15f60b9e48e4175b5520d1e57ed2d3a',ramdisk_id='',reservation_id='r-jglb1q09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-864280704',owner_user_name='tempest-LiveMigrationTest-864280704-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:28:51Z,user_data=None,user_id='bce7fcbd19554e29bb80c5b93b7dd3c9',uuid=79b5596e-43c9-4085-9829-454fecf59490,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bd002580-dd95-49e1-bc34-e85f86272a05", "address": "fa:16:3e:24:ce:0d", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd002580-dd", "ovs_interfaceid": "bd002580-dd95-49e1-bc34-e85f86272a05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.997 225859 DEBUG nova.network.os_vif_util [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Converting VIF {"id": "bd002580-dd95-49e1-bc34-e85f86272a05", "address": "fa:16:3e:24:ce:0d", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd002580-dd", "ovs_interfaceid": "bd002580-dd95-49e1-bc34-e85f86272a05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:29:04 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.999 225859 DEBUG nova.network.os_vif_util [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:ce:0d,bridge_name='br-int',has_traffic_filtering=True,id=bd002580-dd95-49e1-bc34-e85f86272a05,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd002580-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:04.999 225859 DEBUG os_vif [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:ce:0d,bridge_name='br-int',has_traffic_filtering=True,id=bd002580-dd95-49e1-bc34-e85f86272a05,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd002580-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.002 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.002 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd002580-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.005 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.006 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.010 225859 INFO os_vif [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:ce:0d,bridge_name='br-int',has_traffic_filtering=True,id=bd002580-dd95-49e1-bc34-e85f86272a05,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd002580-dd')#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.010 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.011 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.011 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.012 225859 DEBUG nova.compute.manager [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.013 225859 INFO nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Deleting instance files /var/lib/nova/instances/79b5596e-43c9-4085-9829-454fecf59490_del#033[00m
Jan 20 09:29:05 np0005588919 nova_compute[225855]: 2026-01-20 14:29:05.013 225859 INFO nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Deletion of /var/lib/nova/instances/79b5596e-43c9-4085-9829-454fecf59490_del complete#033[00m
Jan 20 09:29:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e157 e157: 3 total, 3 up, 3 in
Jan 20 09:29:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:06.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.591 225859 DEBUG nova.compute.manager [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.591 225859 DEBUG oslo_concurrency.lockutils [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "79b5596e-43c9-4085-9829-454fecf59490-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.592 225859 DEBUG oslo_concurrency.lockutils [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.592 225859 DEBUG oslo_concurrency.lockutils [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.592 225859 DEBUG nova.compute.manager [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] No waiting events found dispatching network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.593 225859 WARNING nova.compute.manager [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received unexpected event network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 for instance with vm_state active and task_state migrating.#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.593 225859 DEBUG nova.compute.manager [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received event network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.593 225859 DEBUG oslo_concurrency.lockutils [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "79b5596e-43c9-4085-9829-454fecf59490-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.593 225859 DEBUG oslo_concurrency.lockutils [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.594 225859 DEBUG oslo_concurrency.lockutils [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.594 225859 DEBUG nova.compute.manager [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] No waiting events found dispatching network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:29:06 np0005588919 nova_compute[225855]: 2026-01-20 14:29:06.594 225859 WARNING nova.compute.manager [req-82ca5b27-87b8-4179-b3d9-a7fab695433e req-8378bd0e-ee95-41dd-915b-88d906712153 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Received unexpected event network-vif-plugged-bd002580-dd95-49e1-bc34-e85f86272a05 for instance with vm_state active and task_state migrating.#033[00m
Jan 20 09:29:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:06.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:08.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:29:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:29:08 np0005588919 nova_compute[225855]: 2026-01-20 14:29:08.732 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:08 np0005588919 nova_compute[225855]: 2026-01-20 14:29:08.919 225859 DEBUG oslo_concurrency.lockutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "87fe16d6-774e-4002-8df4-9eb202621ab9" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:08 np0005588919 nova_compute[225855]: 2026-01-20 14:29:08.920 225859 DEBUG oslo_concurrency.lockutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:08 np0005588919 nova_compute[225855]: 2026-01-20 14:29:08.920 225859 DEBUG nova.compute.manager [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Going to confirm migration 7 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 20 09:29:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:08.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:09 np0005588919 nova_compute[225855]: 2026-01-20 14:29:09.117 225859 DEBUG oslo_concurrency.lockutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:29:09 np0005588919 nova_compute[225855]: 2026-01-20 14:29:09.117 225859 DEBUG oslo_concurrency.lockutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquired lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:29:09 np0005588919 nova_compute[225855]: 2026-01-20 14:29:09.118 225859 DEBUG nova.network.neutron [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:29:09 np0005588919 nova_compute[225855]: 2026-01-20 14:29:09.118 225859 DEBUG nova.objects.instance [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'info_cache' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:09 np0005588919 nova_compute[225855]: 2026-01-20 14:29:09.277 225859 DEBUG nova.network.neutron [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:29:09 np0005588919 nova_compute[225855]: 2026-01-20 14:29:09.540 225859 DEBUG nova.network.neutron [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:29:09 np0005588919 nova_compute[225855]: 2026-01-20 14:29:09.557 225859 DEBUG oslo_concurrency.lockutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Releasing lock "refresh_cache-87fe16d6-774e-4002-8df4-9eb202621ab9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:29:09 np0005588919 nova_compute[225855]: 2026-01-20 14:29:09.558 225859 DEBUG nova.objects.instance [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'migration_context' on Instance uuid 87fe16d6-774e-4002-8df4-9eb202621ab9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:09 np0005588919 nova_compute[225855]: 2026-01-20 14:29:09.681 225859 DEBUG nova.storage.rbd_utils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] removing snapshot(nova-resize) on rbd image(87fe16d6-774e-4002-8df4-9eb202621ab9_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:29:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:10 np0005588919 nova_compute[225855]: 2026-01-20 14:29:10.041 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:10.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e158 e158: 3 total, 3 up, 3 in
Jan 20 09:29:10 np0005588919 nova_compute[225855]: 2026-01-20 14:29:10.686 225859 DEBUG oslo_concurrency.lockutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:10 np0005588919 nova_compute[225855]: 2026-01-20 14:29:10.686 225859 DEBUG oslo_concurrency.lockutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:10 np0005588919 nova_compute[225855]: 2026-01-20 14:29:10.805 225859 DEBUG oslo_concurrency.processutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:10 np0005588919 nova_compute[225855]: 2026-01-20 14:29:10.836 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquiring lock "79b5596e-43c9-4085-9829-454fecf59490-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:10 np0005588919 nova_compute[225855]: 2026-01-20 14:29:10.837 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:10 np0005588919 nova_compute[225855]: 2026-01-20 14:29:10.837 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "79b5596e-43c9-4085-9829-454fecf59490-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:10 np0005588919 nova_compute[225855]: 2026-01-20 14:29:10.861 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:10.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/107033582' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.266 225859 DEBUG oslo_concurrency.processutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.275 225859 DEBUG nova.compute.provider_tree [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.293 225859 DEBUG nova.scheduler.client.report [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.341 225859 DEBUG oslo_concurrency.lockutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.346 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.347 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.347 225859 DEBUG nova.compute.resource_tracker [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.348 225859 DEBUG oslo_concurrency.processutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.489 225859 INFO nova.scheduler.client.report [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Deleted allocation for migration 8dc09b06-46b2-4315-8857-eb43dfbe98ff#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.573 225859 DEBUG oslo_concurrency.lockutils [None req-5094afda-e7db-467d-b2b0-a6e5c26bf603 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "87fe16d6-774e-4002-8df4-9eb202621ab9" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/549940775' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.772 225859 DEBUG oslo_concurrency.processutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.872 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.873 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.877 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:29:11 np0005588919 nova_compute[225855]: 2026-01-20 14:29:11.878 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:29:11 np0005588919 podman[238234]: 2026-01-20 14:29:11.920892107 +0000 UTC m=+0.089088155 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.072 225859 WARNING nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.073 225859 DEBUG nova.compute.resource_tracker [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4382MB free_disk=20.845653533935547GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.073 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.074 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.125 225859 DEBUG nova.compute.resource_tracker [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Migration for instance 79b5596e-43c9-4085-9829-454fecf59490 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.146 225859 DEBUG nova.compute.resource_tracker [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.168 225859 DEBUG nova.compute.resource_tracker [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Instance d726266f-b9a6-406b-ad13-f9db3e0dc6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.169 225859 DEBUG nova.compute.resource_tracker [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Instance 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.169 225859 DEBUG nova.compute.resource_tracker [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Migration 522850ff-68d3-4cab-8c83-50b3a540cdfa is active on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.169 225859 DEBUG nova.compute.resource_tracker [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.169 225859 DEBUG nova.compute.resource_tracker [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.285 225859 DEBUG oslo_concurrency.processutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:12.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.394 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1240338787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.784 225859 DEBUG oslo_concurrency.processutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.792 225859 DEBUG nova.compute.provider_tree [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.825 225859 DEBUG nova.scheduler.client.report [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.856 225859 DEBUG nova.compute.resource_tracker [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.856 225859 DEBUG oslo_concurrency.lockutils [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.782s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:12 np0005588919 nova_compute[225855]: 2026-01-20 14:29:12.862 225859 INFO nova.compute.manager [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Jan 20 09:29:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:12.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:13 np0005588919 nova_compute[225855]: 2026-01-20 14:29:13.000 225859 INFO nova.scheduler.client.report [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] Deleted allocation for migration 522850ff-68d3-4cab-8c83-50b3a540cdfa#033[00m
Jan 20 09:29:13 np0005588919 nova_compute[225855]: 2026-01-20 14:29:13.001 225859 DEBUG nova.virt.libvirt.driver [None req-3e6ecc77-61de-4099-8bb3-ee2d276c7579 f59120b8f4004c4fb57448db9dcaa6cd e22b29df381845278c7b679b17d11c8b - - default default] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 20 09:29:13 np0005588919 nova_compute[225855]: 2026-01-20 14:29:13.735 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:14.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.699 225859 DEBUG oslo_concurrency.lockutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Acquiring lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.699 225859 DEBUG oslo_concurrency.lockutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.700 225859 DEBUG oslo_concurrency.lockutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Acquiring lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.700 225859 DEBUG oslo_concurrency.lockutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.700 225859 DEBUG oslo_concurrency.lockutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.702 225859 INFO nova.compute.manager [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Terminating instance#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.703 225859 DEBUG nova.compute.manager [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:29:14 np0005588919 kernel: tape6067076-0f (unregistering): left promiscuous mode
Jan 20 09:29:14 np0005588919 NetworkManager[49104]: <info>  [1768919354.7537] device (tape6067076-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.763 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:29:14Z|00101|binding|INFO|Releasing lport e6067076-0f97-4e9c-9355-353277570e11 from this chassis (sb_readonly=0)
Jan 20 09:29:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:29:14Z|00102|binding|INFO|Setting lport e6067076-0f97-4e9c-9355-353277570e11 down in Southbound
Jan 20 09:29:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:29:14Z|00103|binding|INFO|Releasing lport 9013ed66-b0f2-4a83-b7d4-572f1324f582 from this chassis (sb_readonly=0)
Jan 20 09:29:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:29:14Z|00104|binding|INFO|Setting lport 9013ed66-b0f2-4a83-b7d4-572f1324f582 down in Southbound
Jan 20 09:29:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:29:14Z|00105|binding|INFO|Removing iface tape6067076-0f ovn-installed in OVS
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.768 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:29:14Z|00106|binding|INFO|Releasing lport aa1c73c5-9761-4457-acdc-9f93220f739f from this chassis (sb_readonly=0)
Jan 20 09:29:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:29:14Z|00107|binding|INFO|Releasing lport e10f34be-dfc1-4bfe-806f-f00a84c17390 from this chassis (sb_readonly=0)
Jan 20 09:29:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:14.772 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:74:79 19.80.0.125'], port_security=['fa:16:3e:51:74:79 19.80.0.125'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['e6067076-0f97-4e9c-9355-353277570e11'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1871336558', 'neutron:cidrs': '19.80.0.125/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08e625c5-899c-442a-8ef4-9a3c96892de4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1871336558', 'neutron:project_id': 'd15f60b9e48e4175b5520d1e57ed2d3a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6d729cfd-2f98-4ca5-a524-e543b12b3766', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=62d5dc3b-a6a9-4e55-8632-5a7fe1112862, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9013ed66-b0f2-4a83-b7d4-572f1324f582) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:29:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:14.773 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:cf:b7 10.100.0.12'], port_security=['fa:16:3e:db:cf:b7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-395006048', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd726266f-b9a6-406b-ad13-f9db3e0dc6aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14f18b27-1594-48d8-a08b-a930f7adbc08', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-395006048', 'neutron:project_id': 'd15f60b9e48e4175b5520d1e57ed2d3a', 'neutron:revision_number': '14', 'neutron:security_group_ids': '6d729cfd-2f98-4ca5-a524-e543b12b3766', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02983c41-bbec-48cf-910a-84fed1be783f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=e6067076-0f97-4e9c-9355-353277570e11) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:29:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:14.774 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 9013ed66-b0f2-4a83-b7d4-572f1324f582 in datapath 08e625c5-899c-442a-8ef4-9a3c96892de4 unbound from our chassis#033[00m
Jan 20 09:29:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:14.775 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08e625c5-899c-442a-8ef4-9a3c96892de4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:29:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:14.776 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e01054d3-ec00-4e06-975b-33c22d54b2b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:14.776 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4 namespace which is not needed anymore#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.795 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:14 np0005588919 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 20 09:29:14 np0005588919 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000010.scope: Consumed 6.962s CPU time.
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.868 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:14 np0005588919 systemd-machined[194361]: Machine qemu-9-instance-00000010 terminated.
Jan 20 09:29:14 np0005588919 neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4[235367]: [NOTICE]   (235372) : haproxy version is 2.8.14-c23fe91
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.947 225859 INFO nova.virt.libvirt.driver [-] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Instance destroyed successfully.#033[00m
Jan 20 09:29:14 np0005588919 neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4[235367]: [NOTICE]   (235372) : path to executable is /usr/sbin/haproxy
Jan 20 09:29:14 np0005588919 neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4[235367]: [WARNING]  (235372) : Exiting Master process...
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.947 225859 DEBUG nova.objects.instance [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Lazy-loading 'resources' on Instance uuid d726266f-b9a6-406b-ad13-f9db3e0dc6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:14 np0005588919 neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4[235367]: [ALERT]    (235372) : Current worker (235374) exited with code 143 (Terminated)
Jan 20 09:29:14 np0005588919 neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4[235367]: [WARNING]  (235372) : All workers exited. Exiting... (0)
Jan 20 09:29:14 np0005588919 systemd[1]: libpod-ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395.scope: Deactivated successfully.
Jan 20 09:29:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:14.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:14 np0005588919 podman[238303]: 2026-01-20 14:29:14.971037219 +0000 UTC m=+0.081078471 container died ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.972 225859 DEBUG nova.virt.libvirt.vif [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:27:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1394818615',display_name='tempest-LiveMigrationTest-server-1394818615',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1394818615',id=16,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:27:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d15f60b9e48e4175b5520d1e57ed2d3a',ramdisk_id='',reservation_id='r-pti072hl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-864280704',owner_user_name='tempest-LiveMigrationTest-864280704-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:27:52Z,user_data=None,user_id='bce7fcbd19554e29bb80c5b93b7dd3c9',uuid=d726266f-b9a6-406b-ad13-f9db3e0dc6aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6067076-0f97-4e9c-9355-353277570e11", "address": "fa:16:3e:db:cf:b7", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6067076-0f", "ovs_interfaceid": "e6067076-0f97-4e9c-9355-353277570e11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.973 225859 DEBUG nova.network.os_vif_util [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Converting VIF {"id": "e6067076-0f97-4e9c-9355-353277570e11", "address": "fa:16:3e:db:cf:b7", "network": {"id": "14f18b27-1594-48d8-a08b-a930f7adbc08", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2126108622-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d15f60b9e48e4175b5520d1e57ed2d3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6067076-0f", "ovs_interfaceid": "e6067076-0f97-4e9c-9355-353277570e11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.973 225859 DEBUG nova.network.os_vif_util [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:cf:b7,bridge_name='br-int',has_traffic_filtering=True,id=e6067076-0f97-4e9c-9355-353277570e11,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape6067076-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.974 225859 DEBUG os_vif [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:cf:b7,bridge_name='br-int',has_traffic_filtering=True,id=e6067076-0f97-4e9c-9355-353277570e11,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape6067076-0f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.976 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.976 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6067076-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.978 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.981 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:29:14 np0005588919 nova_compute[225855]: 2026-01-20 14:29:14.983 225859 INFO os_vif [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:cf:b7,bridge_name='br-int',has_traffic_filtering=True,id=e6067076-0f97-4e9c-9355-353277570e11,network=Network(14f18b27-1594-48d8-a08b-a930f7adbc08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape6067076-0f')#033[00m
Jan 20 09:29:15 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395-userdata-shm.mount: Deactivated successfully.
Jan 20 09:29:15 np0005588919 systemd[1]: var-lib-containers-storage-overlay-693c57f2e466814459bb967b5b8379f5bf0326b3ec16540677e4a65e7bbf1a2c-merged.mount: Deactivated successfully.
Jan 20 09:29:15 np0005588919 podman[238303]: 2026-01-20 14:29:15.138176318 +0000 UTC m=+0.248217530 container cleanup ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:29:15 np0005588919 systemd[1]: libpod-conmon-ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395.scope: Deactivated successfully.
Jan 20 09:29:15 np0005588919 podman[238362]: 2026-01-20 14:29:15.232402697 +0000 UTC m=+0.052829377 container remove ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.240 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3d89f1a2-90a0-4378-8ee8-65350243bfa5]: (4, ('Tue Jan 20 02:29:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4 (ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395)\ned48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395\nTue Jan 20 02:29:15 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4 (ed48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395)\ned48cc542c051179becb0bef90fd036c02c3a10fbac0702dc06ee569e76c9395\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.242 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[de2dc939-daa1-485f-ad86-593d5bb5b9f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.242 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08e625c5-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.244 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:15 np0005588919 kernel: tap08e625c5-80: left promiscuous mode
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.264 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.268 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f6261943-daee-4790-9879-6e39edc7604e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.282 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6a176e0d-a0d5-4d48-9481-81131cac3d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.283 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[62981dca-71da-4769-9f6c-5b998bbeb9c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.306 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[71498d26-243e-42eb-a792-5f2d817faa3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430503, 'reachable_time': 36585, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238377, 'error': None, 'target': 'ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 systemd[1]: run-netns-ovnmeta\x2d08e625c5\x2d899c\x2d442a\x2d8ef4\x2d9a3c96892de4.mount: Deactivated successfully.
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.309 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08e625c5-899c-442a-8ef4-9a3c96892de4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.309 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[7bf93d36-9e73-47b6-83ae-840a46422c30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.312 140354 INFO neutron.agent.ovn.metadata.agent [-] Port e6067076-0f97-4e9c-9355-353277570e11 in datapath 14f18b27-1594-48d8-a08b-a930f7adbc08 unbound from our chassis#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.314 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 14f18b27-1594-48d8-a08b-a930f7adbc08, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.315 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aed04eb8-1b06-401b-b529-77f7e472b112]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.316 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08 namespace which is not needed anymore#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.400 225859 DEBUG nova.compute.manager [req-7e1cc6f2-a0e4-423e-894a-7c9081463568 req-de383624-7210-42c5-95aa-7f302944c22c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Received event network-vif-unplugged-e6067076-0f97-4e9c-9355-353277570e11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.401 225859 DEBUG oslo_concurrency.lockutils [req-7e1cc6f2-a0e4-423e-894a-7c9081463568 req-de383624-7210-42c5-95aa-7f302944c22c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.402 225859 DEBUG oslo_concurrency.lockutils [req-7e1cc6f2-a0e4-423e-894a-7c9081463568 req-de383624-7210-42c5-95aa-7f302944c22c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.402 225859 DEBUG oslo_concurrency.lockutils [req-7e1cc6f2-a0e4-423e-894a-7c9081463568 req-de383624-7210-42c5-95aa-7f302944c22c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.402 225859 DEBUG nova.compute.manager [req-7e1cc6f2-a0e4-423e-894a-7c9081463568 req-de383624-7210-42c5-95aa-7f302944c22c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] No waiting events found dispatching network-vif-unplugged-e6067076-0f97-4e9c-9355-353277570e11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.402 225859 DEBUG nova.compute.manager [req-7e1cc6f2-a0e4-423e-894a-7c9081463568 req-de383624-7210-42c5-95aa-7f302944c22c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Received event network-vif-unplugged-e6067076-0f97-4e9c-9355-353277570e11 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.454 225859 INFO nova.virt.libvirt.driver [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Deleting instance files /var/lib/nova/instances/d726266f-b9a6-406b-ad13-f9db3e0dc6aa_del#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.455 225859 INFO nova.virt.libvirt.driver [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Deletion of /var/lib/nova/instances/d726266f-b9a6-406b-ad13-f9db3e0dc6aa_del complete#033[00m
Jan 20 09:29:15 np0005588919 neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08[236640]: [NOTICE]   (236644) : haproxy version is 2.8.14-c23fe91
Jan 20 09:29:15 np0005588919 neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08[236640]: [NOTICE]   (236644) : path to executable is /usr/sbin/haproxy
Jan 20 09:29:15 np0005588919 neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08[236640]: [WARNING]  (236644) : Exiting Master process...
Jan 20 09:29:15 np0005588919 neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08[236640]: [ALERT]    (236644) : Current worker (236646) exited with code 143 (Terminated)
Jan 20 09:29:15 np0005588919 neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08[236640]: [WARNING]  (236644) : All workers exited. Exiting... (0)
Jan 20 09:29:15 np0005588919 systemd[1]: libpod-cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042.scope: Deactivated successfully.
Jan 20 09:29:15 np0005588919 podman[238395]: 2026-01-20 14:29:15.510593388 +0000 UTC m=+0.062349314 container died cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.540 225859 INFO nova.compute.manager [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Took 0.84 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.541 225859 DEBUG oslo.service.loopingcall [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.542 225859 DEBUG nova.compute.manager [-] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.542 225859 DEBUG nova.network.neutron [-] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:29:15 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042-userdata-shm.mount: Deactivated successfully.
Jan 20 09:29:15 np0005588919 systemd[1]: var-lib-containers-storage-overlay-2676fe810d64d27697c2b84fe44f9ab65fef13401e606b1abe81b75e60236b87-merged.mount: Deactivated successfully.
Jan 20 09:29:15 np0005588919 podman[238395]: 2026-01-20 14:29:15.554108511 +0000 UTC m=+0.105864447 container cleanup cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 09:29:15 np0005588919 systemd[1]: libpod-conmon-cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042.scope: Deactivated successfully.
Jan 20 09:29:15 np0005588919 podman[238424]: 2026-01-20 14:29:15.63978024 +0000 UTC m=+0.054424851 container remove cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.646 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ca283b-9baa-42e7-bc22-7d22f58d15f5]: (4, ('Tue Jan 20 02:29:15 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08 (cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042)\ncb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042\nTue Jan 20 02:29:15 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08 (cb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042)\ncb549f3dfd3a2168af12352c43b7f75cb0e219d08716514e5f07f4b105589042\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.648 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d19e8391-8e2f-43cf-8e32-c2987233e00f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.649 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14f18b27-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.651 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:15 np0005588919 kernel: tap14f18b27-10: left promiscuous mode
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.668 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:15 np0005588919 nova_compute[225855]: 2026-01-20 14:29:15.669 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.671 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f80fb3-e9db-469c-a2ca-b9071566a9e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.692 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[87f2cabc-f325-4c6d-84f5-3bd8fa6bd233]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.694 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[acc53794-64d4-4d74-a486-23895cc45456]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.725 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf02f34-ad9d-42ff-bc30-608646578f98]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430592, 'reachable_time': 24162, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238440, 'error': None, 'target': 'ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.727 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-14f18b27-1594-48d8-a08b-a930f7adbc08 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:29:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:15.727 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[d90d7eae-e5e0-4f8f-a116-0a6eb0fc5827]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:29:16 np0005588919 systemd[1]: run-netns-ovnmeta\x2d14f18b27\x2d1594\x2d48d8\x2da08b\x2da930f7adbc08.mount: Deactivated successfully.
Jan 20 09:29:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:16.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:16 np0005588919 nova_compute[225855]: 2026-01-20 14:29:16.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:16 np0005588919 nova_compute[225855]: 2026-01-20 14:29:16.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:29:16 np0005588919 nova_compute[225855]: 2026-01-20 14:29:16.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:29:16 np0005588919 nova_compute[225855]: 2026-01-20 14:29:16.367 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 20 09:29:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:16.388 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:16.389 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:16.389 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:16 np0005588919 nova_compute[225855]: 2026-01-20 14:29:16.582 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:29:16 np0005588919 nova_compute[225855]: 2026-01-20 14:29:16.582 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:29:16 np0005588919 nova_compute[225855]: 2026-01-20 14:29:16.582 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:29:16 np0005588919 nova_compute[225855]: 2026-01-20 14:29:16.583 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:16 np0005588919 nova_compute[225855]: 2026-01-20 14:29:16.759 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:29:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:16.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.174 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919342.172607, 79b5596e-43c9-4085-9829-454fecf59490 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.176 225859 INFO nova.compute.manager [-] [instance: 79b5596e-43c9-4085-9829-454fecf59490] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.195 225859 DEBUG nova.compute.manager [None req-e4966053-ba2b-41d3-b2e0-e6f392d201f8 - - - - - -] [instance: 79b5596e-43c9-4085-9829-454fecf59490] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.465 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.482 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.483 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.484 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.484 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.543 225859 DEBUG nova.compute.manager [req-a80e6f7f-b9fe-4e9d-9720-1081061a09b4 req-92e22e9c-6090-45e1-a6a9-2129992d6e01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Received event network-vif-plugged-e6067076-0f97-4e9c-9355-353277570e11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.543 225859 DEBUG oslo_concurrency.lockutils [req-a80e6f7f-b9fe-4e9d-9720-1081061a09b4 req-92e22e9c-6090-45e1-a6a9-2129992d6e01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.544 225859 DEBUG oslo_concurrency.lockutils [req-a80e6f7f-b9fe-4e9d-9720-1081061a09b4 req-92e22e9c-6090-45e1-a6a9-2129992d6e01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.544 225859 DEBUG oslo_concurrency.lockutils [req-a80e6f7f-b9fe-4e9d-9720-1081061a09b4 req-92e22e9c-6090-45e1-a6a9-2129992d6e01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.544 225859 DEBUG nova.compute.manager [req-a80e6f7f-b9fe-4e9d-9720-1081061a09b4 req-92e22e9c-6090-45e1-a6a9-2129992d6e01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] No waiting events found dispatching network-vif-plugged-e6067076-0f97-4e9c-9355-353277570e11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.545 225859 WARNING nova.compute.manager [req-a80e6f7f-b9fe-4e9d-9720-1081061a09b4 req-92e22e9c-6090-45e1-a6a9-2129992d6e01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Received unexpected event network-vif-plugged-e6067076-0f97-4e9c-9355-353277570e11 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.546 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.547 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.547 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.547 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.548 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.773 225859 DEBUG nova.network.neutron [-] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.788 225859 INFO nova.compute.manager [-] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Took 2.25 seconds to deallocate network for instance.#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.833 225859 DEBUG oslo_concurrency.lockutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.834 225859 DEBUG oslo_concurrency.lockutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.835 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919342.833644, 87fe16d6-774e-4002-8df4-9eb202621ab9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.835 225859 INFO nova.compute.manager [-] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.867 225859 DEBUG nova.compute.manager [None req-81758240-e53e-4bb6-b90e-3e322c35b46f - - - - - -] [instance: 87fe16d6-774e-4002-8df4-9eb202621ab9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:29:17 np0005588919 nova_compute[225855]: 2026-01-20 14:29:17.910 225859 DEBUG oslo_concurrency.processutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/771574584' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.022 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.114 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.115 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.313 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.315 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4553MB free_disk=20.838512420654297GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.316 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:18.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.364 225859 DEBUG oslo_concurrency.processutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.370 225859 DEBUG nova.compute.provider_tree [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.389 225859 DEBUG nova.scheduler.client.report [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.439 225859 DEBUG oslo_concurrency.lockutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.444 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.518 225859 INFO nova.scheduler.client.report [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Deleted allocations for instance d726266f-b9a6-406b-ad13-f9db3e0dc6aa#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.544 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.545 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.545 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.610 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.654 225859 DEBUG oslo_concurrency.lockutils [None req-aaff5fb5-a6be-4311-ae7f-a5adbf7cc1e2 bce7fcbd19554e29bb80c5b93b7dd3c9 d15f60b9e48e4175b5520d1e57ed2d3a - - default default] Lock "d726266f-b9a6-406b-ad13-f9db3e0dc6aa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:18 np0005588919 nova_compute[225855]: 2026-01-20 14:29:18.772 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:18.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:19 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1452425071' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:19 np0005588919 nova_compute[225855]: 2026-01-20 14:29:19.026 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:19 np0005588919 nova_compute[225855]: 2026-01-20 14:29:19.031 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:29:19 np0005588919 nova_compute[225855]: 2026-01-20 14:29:19.060 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:29:19 np0005588919 nova_compute[225855]: 2026-01-20 14:29:19.132 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:29:19 np0005588919 nova_compute[225855]: 2026-01-20 14:29:19.133 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:19 np0005588919 nova_compute[225855]: 2026-01-20 14:29:19.979 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:19 np0005588919 nova_compute[225855]: 2026-01-20 14:29:19.988 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:19 np0005588919 nova_compute[225855]: 2026-01-20 14:29:19.988 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:29:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:20.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:20 np0005588919 nova_compute[225855]: 2026-01-20 14:29:20.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:20 np0005588919 nova_compute[225855]: 2026-01-20 14:29:20.338 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:29:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:20.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e159 e159: 3 total, 3 up, 3 in
Jan 20 09:29:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:22.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:22.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:23 np0005588919 nova_compute[225855]: 2026-01-20 14:29:23.814 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:24.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:24.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:25 np0005588919 nova_compute[225855]: 2026-01-20 14:29:25.024 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:26.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.395165) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366395213, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2463, "num_deletes": 256, "total_data_size": 5766159, "memory_usage": 5862912, "flush_reason": "Manual Compaction"}
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366416445, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3707485, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25425, "largest_seqno": 27883, "table_properties": {"data_size": 3697574, "index_size": 6213, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21636, "raw_average_key_size": 20, "raw_value_size": 3677289, "raw_average_value_size": 3539, "num_data_blocks": 273, "num_entries": 1039, "num_filter_entries": 1039, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919183, "oldest_key_time": 1768919183, "file_creation_time": 1768919366, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 21351 microseconds, and 8039 cpu microseconds.
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.416509) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3707485 bytes OK
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.416534) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.423820) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.423847) EVENT_LOG_v1 {"time_micros": 1768919366423838, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.423910) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 5755142, prev total WAL file size 5755142, number of live WAL files 2.
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.426179) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3620KB)], [51(8856KB)]
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366426246, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 12776082, "oldest_snapshot_seqno": -1}
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5465 keys, 10745733 bytes, temperature: kUnknown
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366495289, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 10745733, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10707139, "index_size": 23828, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13701, "raw_key_size": 137024, "raw_average_key_size": 25, "raw_value_size": 10606575, "raw_average_value_size": 1940, "num_data_blocks": 980, "num_entries": 5465, "num_filter_entries": 5465, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768919366, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.495533) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 10745733 bytes
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.497015) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 184.9 rd, 155.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 8.6 +0.0 blob) out(10.2 +0.0 blob), read-write-amplify(6.3) write-amplify(2.9) OK, records in: 5996, records dropped: 531 output_compression: NoCompression
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.497040) EVENT_LOG_v1 {"time_micros": 1768919366497027, "job": 30, "event": "compaction_finished", "compaction_time_micros": 69115, "compaction_time_cpu_micros": 23127, "output_level": 6, "num_output_files": 1, "total_output_size": 10745733, "num_input_records": 5996, "num_output_records": 5465, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366498025, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919366499853, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.426063) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.499956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.499961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.499963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.499965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:26.499967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:26.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:28.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:28 np0005588919 nova_compute[225855]: 2026-01-20 14:29:28.848 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:28.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.764 225859 DEBUG nova.compute.manager [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 09:29:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.856 225859 DEBUG oslo_concurrency.lockutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.856 225859 DEBUG oslo_concurrency.lockutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.891 225859 DEBUG nova.objects.instance [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'pci_requests' on Instance uuid 9f5c9253-e2bd-42d3-8253-fac568daeda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.906 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.907 225859 INFO nova.compute.claims [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.907 225859 DEBUG nova.objects.instance [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'resources' on Instance uuid 9f5c9253-e2bd-42d3-8253-fac568daeda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.922 225859 DEBUG nova.objects.instance [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9f5c9253-e2bd-42d3-8253-fac568daeda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.944 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919354.9443636, d726266f-b9a6-406b-ad13-f9db3e0dc6aa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.945 225859 INFO nova.compute.manager [-] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.976 225859 DEBUG nova.compute.manager [None req-653c3fd6-92c6-4b47-9117-13c5a68ff91f - - - - - -] [instance: d726266f-b9a6-406b-ad13-f9db3e0dc6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.981 225859 INFO nova.compute.resource_tracker [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Updating resource usage from migration 9ed1b4f4-9705-4902-bd56-a18b9866cbf3#033[00m
Jan 20 09:29:29 np0005588919 nova_compute[225855]: 2026-01-20 14:29:29.982 225859 DEBUG nova.compute.resource_tracker [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Starting to track incoming migration 9ed1b4f4-9705-4902-bd56-a18b9866cbf3 with flavor 30c26a27-d918-46d8-a512-4ef3b4ce5955 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 09:29:30 np0005588919 nova_compute[225855]: 2026-01-20 14:29:30.027 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:30 np0005588919 nova_compute[225855]: 2026-01-20 14:29:30.067 225859 DEBUG oslo_concurrency.processutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:30.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/650463931' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:30 np0005588919 nova_compute[225855]: 2026-01-20 14:29:30.545 225859 DEBUG oslo_concurrency.processutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:30 np0005588919 nova_compute[225855]: 2026-01-20 14:29:30.554 225859 DEBUG nova.compute.provider_tree [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:29:30 np0005588919 nova_compute[225855]: 2026-01-20 14:29:30.571 225859 DEBUG nova.scheduler.client.report [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:29:30 np0005588919 nova_compute[225855]: 2026-01-20 14:29:30.604 225859 DEBUG oslo_concurrency.lockutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:30 np0005588919 nova_compute[225855]: 2026-01-20 14:29:30.605 225859 INFO nova.compute.manager [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Migrating#033[00m
Jan 20 09:29:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:30.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:32 np0005588919 systemd-logind[783]: New session 54 of user nova.
Jan 20 09:29:32 np0005588919 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 09:29:32 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 09:29:32 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 09:29:32 np0005588919 systemd[1]: Starting User Manager for UID 42436...
Jan 20 09:29:32 np0005588919 systemd[238594]: Queued start job for default target Main User Target.
Jan 20 09:29:32 np0005588919 systemd[238594]: Created slice User Application Slice.
Jan 20 09:29:32 np0005588919 systemd[238594]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:29:32 np0005588919 systemd[238594]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 09:29:32 np0005588919 systemd[238594]: Reached target Paths.
Jan 20 09:29:32 np0005588919 systemd[238594]: Reached target Timers.
Jan 20 09:29:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:32.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:32 np0005588919 systemd[238594]: Starting D-Bus User Message Bus Socket...
Jan 20 09:29:32 np0005588919 systemd[238594]: Starting Create User's Volatile Files and Directories...
Jan 20 09:29:32 np0005588919 systemd[238594]: Listening on D-Bus User Message Bus Socket.
Jan 20 09:29:32 np0005588919 systemd[238594]: Reached target Sockets.
Jan 20 09:29:32 np0005588919 systemd[238594]: Finished Create User's Volatile Files and Directories.
Jan 20 09:29:32 np0005588919 systemd[238594]: Reached target Basic System.
Jan 20 09:29:32 np0005588919 systemd[238594]: Reached target Main User Target.
Jan 20 09:29:32 np0005588919 systemd[238594]: Startup finished in 164ms.
Jan 20 09:29:32 np0005588919 systemd[1]: Started User Manager for UID 42436.
Jan 20 09:29:32 np0005588919 podman[238592]: 2026-01-20 14:29:32.361669078 +0000 UTC m=+0.184185660 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:29:32 np0005588919 systemd[1]: Started Session 54 of User nova.
Jan 20 09:29:32 np0005588919 systemd[1]: session-54.scope: Deactivated successfully.
Jan 20 09:29:32 np0005588919 systemd-logind[783]: Session 54 logged out. Waiting for processes to exit.
Jan 20 09:29:32 np0005588919 systemd-logind[783]: Removed session 54.
Jan 20 09:29:32 np0005588919 systemd-logind[783]: New session 56 of user nova.
Jan 20 09:29:32 np0005588919 systemd[1]: Started Session 56 of User nova.
Jan 20 09:29:32 np0005588919 systemd[1]: session-56.scope: Deactivated successfully.
Jan 20 09:29:32 np0005588919 systemd-logind[783]: Session 56 logged out. Waiting for processes to exit.
Jan 20 09:29:32 np0005588919 systemd-logind[783]: Removed session 56.
Jan 20 09:29:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:32.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:33 np0005588919 nova_compute[225855]: 2026-01-20 14:29:33.850 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:34.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:34.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:35 np0005588919 nova_compute[225855]: 2026-01-20 14:29:35.029 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:29:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:36.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:29:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:36.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:38.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:38 np0005588919 nova_compute[225855]: 2026-01-20 14:29:38.852 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:38.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:40 np0005588919 nova_compute[225855]: 2026-01-20 14:29:40.066 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:40.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:41.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:41 np0005588919 nova_compute[225855]: 2026-01-20 14:29:41.327 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:41.328 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:29:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:41.329 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:29:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:42.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:42 np0005588919 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 09:29:42 np0005588919 systemd[238594]: Activating special unit Exit the Session...
Jan 20 09:29:42 np0005588919 systemd[238594]: Stopped target Main User Target.
Jan 20 09:29:42 np0005588919 systemd[238594]: Stopped target Basic System.
Jan 20 09:29:42 np0005588919 systemd[238594]: Stopped target Paths.
Jan 20 09:29:42 np0005588919 systemd[238594]: Stopped target Sockets.
Jan 20 09:29:42 np0005588919 systemd[238594]: Stopped target Timers.
Jan 20 09:29:42 np0005588919 systemd[238594]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:29:42 np0005588919 systemd[238594]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 09:29:42 np0005588919 systemd[238594]: Closed D-Bus User Message Bus Socket.
Jan 20 09:29:42 np0005588919 systemd[238594]: Stopped Create User's Volatile Files and Directories.
Jan 20 09:29:42 np0005588919 systemd[238594]: Removed slice User Application Slice.
Jan 20 09:29:42 np0005588919 systemd[238594]: Reached target Shutdown.
Jan 20 09:29:42 np0005588919 systemd[238594]: Finished Exit the Session.
Jan 20 09:29:42 np0005588919 systemd[238594]: Reached target Exit the Session.
Jan 20 09:29:42 np0005588919 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 09:29:42 np0005588919 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 09:29:42 np0005588919 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 09:29:42 np0005588919 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 09:29:42 np0005588919 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 09:29:42 np0005588919 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 09:29:42 np0005588919 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 09:29:42 np0005588919 podman[238697]: 2026-01-20 14:29:42.760771467 +0000 UTC m=+0.058421214 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 20 09:29:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:43.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:43 np0005588919 nova_compute[225855]: 2026-01-20 14:29:43.888 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:44.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:45.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:45 np0005588919 nova_compute[225855]: 2026-01-20 14:29:45.115 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.167318) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386167509, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 433, "num_deletes": 255, "total_data_size": 518289, "memory_usage": 527672, "flush_reason": "Manual Compaction"}
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386173931, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 342211, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27889, "largest_seqno": 28316, "table_properties": {"data_size": 339776, "index_size": 535, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5437, "raw_average_key_size": 17, "raw_value_size": 335011, "raw_average_value_size": 1053, "num_data_blocks": 24, "num_entries": 318, "num_filter_entries": 318, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919366, "oldest_key_time": 1768919366, "file_creation_time": 1768919386, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 6666 microseconds, and 2360 cpu microseconds.
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.173992) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 342211 bytes OK
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.174011) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.175543) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.175559) EVENT_LOG_v1 {"time_micros": 1768919386175554, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.175578) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 515567, prev total WAL file size 515567, number of live WAL files 2.
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.176115) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353039' seq:72057594037927935, type:22 .. '6C6F676D00373630' seq:0, type:0; will stop at (end)
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(334KB)], [54(10MB)]
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386176177, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 11087944, "oldest_snapshot_seqno": -1}
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5265 keys, 10980131 bytes, temperature: kUnknown
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386281146, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 10980131, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10941988, "index_size": 23889, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13189, "raw_key_size": 134034, "raw_average_key_size": 25, "raw_value_size": 10844074, "raw_average_value_size": 2059, "num_data_blocks": 979, "num_entries": 5265, "num_filter_entries": 5265, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768919386, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.281473) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 10980131 bytes
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.283146) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.5 rd, 104.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 10.2 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(64.5) write-amplify(32.1) OK, records in: 5783, records dropped: 518 output_compression: NoCompression
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.283177) EVENT_LOG_v1 {"time_micros": 1768919386283164, "job": 32, "event": "compaction_finished", "compaction_time_micros": 105071, "compaction_time_cpu_micros": 25804, "output_level": 6, "num_output_files": 1, "total_output_size": 10980131, "num_input_records": 5783, "num_output_records": 5265, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386283449, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919386286989, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.175995) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.287104) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.287109) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.287111) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.287112) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:29:46.287114) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:29:46 np0005588919 nova_compute[225855]: 2026-01-20 14:29:46.291 225859 DEBUG oslo_concurrency.lockutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "refresh_cache-9f5c9253-e2bd-42d3-8253-fac568daeda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:29:46 np0005588919 nova_compute[225855]: 2026-01-20 14:29:46.292 225859 DEBUG oslo_concurrency.lockutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquired lock "refresh_cache-9f5c9253-e2bd-42d3-8253-fac568daeda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:29:46 np0005588919 nova_compute[225855]: 2026-01-20 14:29:46.292 225859 DEBUG nova.network.neutron [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:29:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:46.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:46 np0005588919 nova_compute[225855]: 2026-01-20 14:29:46.525 225859 DEBUG nova.network.neutron [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:29:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:47.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.101 225859 DEBUG nova.network.neutron [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.117 225859 DEBUG oslo_concurrency.lockutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Releasing lock "refresh_cache-9f5c9253-e2bd-42d3-8253-fac568daeda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.254 225859 DEBUG nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.257 225859 DEBUG nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.257 225859 INFO nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Creating image(s)#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.304 225859 DEBUG nova.storage.rbd_utils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] creating snapshot(nova-resize) on rbd image(9f5c9253-e2bd-42d3-8253-fac568daeda7_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:29:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e160 e160: 3 total, 3 up, 3 in
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.664 225859 DEBUG nova.objects.instance [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9f5c9253-e2bd-42d3-8253-fac568daeda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.768 225859 DEBUG nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.768 225859 DEBUG nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Ensure instance console log exists: /var/lib/nova/instances/9f5c9253-e2bd-42d3-8253-fac568daeda7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.769 225859 DEBUG oslo_concurrency.lockutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.770 225859 DEBUG oslo_concurrency.lockutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.770 225859 DEBUG oslo_concurrency.lockutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.772 225859 DEBUG nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.778 225859 WARNING nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.787 225859 DEBUG nova.virt.libvirt.host [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.787 225859 DEBUG nova.virt.libvirt.host [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.791 225859 DEBUG nova.virt.libvirt.host [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.791 225859 DEBUG nova.virt.libvirt.host [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.793 225859 DEBUG nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.793 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.794 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.794 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.794 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.795 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.795 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.795 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.796 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.796 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.796 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.797 225859 DEBUG nova.virt.hardware [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.797 225859 DEBUG nova.objects.instance [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9f5c9253-e2bd-42d3-8253-fac568daeda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:47 np0005588919 nova_compute[225855]: 2026-01-20 14:29:47.812 225859 DEBUG oslo_concurrency.processutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:29:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3115653324' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:29:48 np0005588919 nova_compute[225855]: 2026-01-20 14:29:48.240 225859 DEBUG oslo_concurrency.processutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:48 np0005588919 nova_compute[225855]: 2026-01-20 14:29:48.278 225859 DEBUG oslo_concurrency.processutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:29:48.332 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:29:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:48.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:29:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2245526178' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:29:48 np0005588919 nova_compute[225855]: 2026-01-20 14:29:48.761 225859 DEBUG oslo_concurrency.processutils [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:48 np0005588919 nova_compute[225855]: 2026-01-20 14:29:48.765 225859 DEBUG nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <uuid>9f5c9253-e2bd-42d3-8253-fac568daeda7</uuid>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <name>instance-0000001b</name>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <memory>196608</memory>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <nova:name>tempest-MigrationsAdminTest-server-326963183</nova:name>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:29:47</nova:creationTime>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.micro">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <nova:memory>192</nova:memory>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <nova:user uuid="01a3d712f05049b19d4ecc7051720ad5">tempest-MigrationsAdminTest-1518611738-project-member</nova:user>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <nova:project uuid="f3c2e72a7148496394c8bcd618a19c80">tempest-MigrationsAdminTest-1518611738</nova:project>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <entry name="serial">9f5c9253-e2bd-42d3-8253-fac568daeda7</entry>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <entry name="uuid">9f5c9253-e2bd-42d3-8253-fac568daeda7</entry>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/9f5c9253-e2bd-42d3-8253-fac568daeda7_disk">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/9f5c9253-e2bd-42d3-8253-fac568daeda7_disk.config">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/9f5c9253-e2bd-42d3-8253-fac568daeda7/console.log" append="off"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:29:48 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:29:48 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:29:48 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:29:48 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:29:48 np0005588919 nova_compute[225855]: 2026-01-20 14:29:48.850 225859 DEBUG nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:29:48 np0005588919 nova_compute[225855]: 2026-01-20 14:29:48.850 225859 DEBUG nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:29:48 np0005588919 nova_compute[225855]: 2026-01-20 14:29:48.851 225859 INFO nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Using config drive#033[00m
Jan 20 09:29:48 np0005588919 nova_compute[225855]: 2026-01-20 14:29:48.889 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:48 np0005588919 systemd-machined[194361]: New machine qemu-13-instance-0000001b.
Jan 20 09:29:48 np0005588919 systemd[1]: Started Virtual Machine qemu-13-instance-0000001b.
Jan 20 09:29:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:49.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.587 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919389.586829, 9f5c9253-e2bd-42d3-8253-fac568daeda7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.588 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.590 225859 DEBUG nova.compute.manager [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.594 225859 INFO nova.virt.libvirt.driver [-] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Instance running successfully.#033[00m
Jan 20 09:29:49 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.596 225859 DEBUG nova.virt.libvirt.guest [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.597 225859 DEBUG nova.virt.libvirt.driver [None req-69535ae1-0353-42a7-8c9d-2289163476c0 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.619 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.623 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.652 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.653 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919389.588149, 9f5c9253-e2bd-42d3-8253-fac568daeda7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.654 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] VM Started (Lifecycle Event)#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.702 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.705 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:29:49 np0005588919 nova_compute[225855]: 2026-01-20 14:29:49.749 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:29:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:50 np0005588919 nova_compute[225855]: 2026-01-20 14:29:50.117 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:50.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:51.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:52 np0005588919 nova_compute[225855]: 2026-01-20 14:29:52.104 225859 DEBUG oslo_concurrency.lockutils [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "refresh_cache-9f5c9253-e2bd-42d3-8253-fac568daeda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:29:52 np0005588919 nova_compute[225855]: 2026-01-20 14:29:52.104 225859 DEBUG oslo_concurrency.lockutils [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquired lock "refresh_cache-9f5c9253-e2bd-42d3-8253-fac568daeda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:29:52 np0005588919 nova_compute[225855]: 2026-01-20 14:29:52.104 225859 DEBUG nova.network.neutron [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:29:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:52.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:52 np0005588919 nova_compute[225855]: 2026-01-20 14:29:52.501 225859 DEBUG nova.network.neutron [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:29:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:53.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:53 np0005588919 nova_compute[225855]: 2026-01-20 14:29:53.448 225859 DEBUG nova.network.neutron [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:29:53 np0005588919 nova_compute[225855]: 2026-01-20 14:29:53.892 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:54.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:54 np0005588919 nova_compute[225855]: 2026-01-20 14:29:54.384 225859 DEBUG oslo_concurrency.lockutils [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Releasing lock "refresh_cache-9f5c9253-e2bd-42d3-8253-fac568daeda7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:29:54 np0005588919 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Jan 20 09:29:54 np0005588919 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001b.scope: Consumed 5.640s CPU time.
Jan 20 09:29:54 np0005588919 systemd-machined[194361]: Machine qemu-13-instance-0000001b terminated.
Jan 20 09:29:54 np0005588919 nova_compute[225855]: 2026-01-20 14:29:54.630 225859 INFO nova.virt.libvirt.driver [-] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Instance destroyed successfully.#033[00m
Jan 20 09:29:54 np0005588919 nova_compute[225855]: 2026-01-20 14:29:54.631 225859 DEBUG nova.objects.instance [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'resources' on Instance uuid 9f5c9253-e2bd-42d3-8253-fac568daeda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:54 np0005588919 nova_compute[225855]: 2026-01-20 14:29:54.649 225859 DEBUG oslo_concurrency.lockutils [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:29:54 np0005588919 nova_compute[225855]: 2026-01-20 14:29:54.650 225859 DEBUG oslo_concurrency.lockutils [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:29:54 np0005588919 nova_compute[225855]: 2026-01-20 14:29:54.668 225859 DEBUG nova.objects.instance [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'migration_context' on Instance uuid 9f5c9253-e2bd-42d3-8253-fac568daeda7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:29:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:29:54 np0005588919 nova_compute[225855]: 2026-01-20 14:29:54.797 225859 DEBUG oslo_concurrency.processutils [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:29:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:55.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:55 np0005588919 nova_compute[225855]: 2026-01-20 14:29:55.120 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:29:55 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3861732727' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:29:55 np0005588919 nova_compute[225855]: 2026-01-20 14:29:55.325 225859 DEBUG oslo_concurrency.processutils [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:29:55 np0005588919 nova_compute[225855]: 2026-01-20 14:29:55.333 225859 DEBUG nova.compute.provider_tree [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:29:55 np0005588919 nova_compute[225855]: 2026-01-20 14:29:55.361 225859 DEBUG nova.scheduler.client.report [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:29:55 np0005588919 nova_compute[225855]: 2026-01-20 14:29:55.436 225859 DEBUG oslo_concurrency.lockutils [None req-5125edda-6367-41ce-aa7c-33caccb49490 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:29:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:56.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:57.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:29:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:29:58.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:29:58 np0005588919 nova_compute[225855]: 2026-01-20 14:29:58.981 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:29:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:29:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:29:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:29:59.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:29:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e161 e161: 3 total, 3 up, 3 in
Jan 20 09:29:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 09:30:00 np0005588919 nova_compute[225855]: 2026-01-20 14:30:00.124 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:00.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:01.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:02 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:02Z|00108|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 20 09:30:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:02.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:03.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:03 np0005588919 podman[239013]: 2026-01-20 14:30:03.084806497 +0000 UTC m=+0.115055886 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:30:03 np0005588919 nova_compute[225855]: 2026-01-20 14:30:03.983 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:04.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:05.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:05 np0005588919 nova_compute[225855]: 2026-01-20 14:30:05.126 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:06.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e162 e162: 3 total, 3 up, 3 in
Jan 20 09:30:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:07.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:07 np0005588919 nova_compute[225855]: 2026-01-20 14:30:07.674 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "d95ca690-20e1-4b0c-919b-d64c9af25eba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:07 np0005588919 nova_compute[225855]: 2026-01-20 14:30:07.675 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "d95ca690-20e1-4b0c-919b-d64c9af25eba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:07 np0005588919 nova_compute[225855]: 2026-01-20 14:30:07.842 225859 DEBUG nova.compute.manager [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:30:08 np0005588919 nova_compute[225855]: 2026-01-20 14:30:08.196 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:08 np0005588919 nova_compute[225855]: 2026-01-20 14:30:08.197 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:08 np0005588919 nova_compute[225855]: 2026-01-20 14:30:08.206 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:30:08 np0005588919 nova_compute[225855]: 2026-01-20 14:30:08.207 225859 INFO nova.compute.claims [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:30:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:08.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:08 np0005588919 nova_compute[225855]: 2026-01-20 14:30:08.561 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:08 np0005588919 nova_compute[225855]: 2026-01-20 14:30:08.985 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4260148581' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:30:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:09.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.058 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.066 225859 DEBUG nova.compute.provider_tree [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.167 225859 DEBUG nova.scheduler.client.report [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.289 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.290 225859 DEBUG nova.compute.manager [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.455 225859 DEBUG nova.compute.manager [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.456 225859 DEBUG nova.network.neutron [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.572 225859 INFO nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.618 225859 DEBUG nova.compute.manager [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.628 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919394.6276348, 9f5c9253-e2bd-42d3-8253-fac568daeda7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.629 225859 INFO nova.compute.manager [-] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.683 225859 DEBUG nova.compute.manager [None req-940dff08-c42c-4fc8-a1c6-374c085d1c4c - - - - - -] [instance: 9f5c9253-e2bd-42d3-8253-fac568daeda7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.849 225859 DEBUG nova.compute.manager [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.854 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.855 225859 INFO nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Creating image(s)#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.898 225859 DEBUG nova.storage.rbd_utils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image d95ca690-20e1-4b0c-919b-d64c9af25eba_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.943 225859 DEBUG nova.storage.rbd_utils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image d95ca690-20e1-4b0c-919b-d64c9af25eba_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.970 225859 DEBUG nova.storage.rbd_utils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image d95ca690-20e1-4b0c-919b-d64c9af25eba_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:09 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.974 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:10 np0005588919 nova_compute[225855]: 2026-01-20 14:30:09.999 225859 DEBUG nova.network.neutron [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:30:10 np0005588919 nova_compute[225855]: 2026-01-20 14:30:10.000 225859 DEBUG nova.compute.manager [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:30:10 np0005588919 nova_compute[225855]: 2026-01-20 14:30:10.054 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:10 np0005588919 nova_compute[225855]: 2026-01-20 14:30:10.055 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:10 np0005588919 nova_compute[225855]: 2026-01-20 14:30:10.056 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:10 np0005588919 nova_compute[225855]: 2026-01-20 14:30:10.056 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:10 np0005588919 nova_compute[225855]: 2026-01-20 14:30:10.083 225859 DEBUG nova.storage.rbd_utils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image d95ca690-20e1-4b0c-919b-d64c9af25eba_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:10 np0005588919 nova_compute[225855]: 2026-01-20 14:30:10.087 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d95ca690-20e1-4b0c-919b-d64c9af25eba_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:10 np0005588919 nova_compute[225855]: 2026-01-20 14:30:10.160 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:10.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 09:30:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:30:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:30:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:30:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:11.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:12 np0005588919 nova_compute[225855]: 2026-01-20 14:30:12.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:12.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:12 np0005588919 nova_compute[225855]: 2026-01-20 14:30:12.946 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d95ca690-20e1-4b0c-919b-d64c9af25eba_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.859s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.027 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.028 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.035 225859 DEBUG nova.storage.rbd_utils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] resizing rbd image d95ca690-20e1-4b0c-919b-d64c9af25eba_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:30:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:13.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.089 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "d08682f8-72ef-462c-b4b7-044cf16fc193" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.089 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:13 np0005588919 podman[239291]: 2026-01-20 14:30:13.097372811 +0000 UTC m=+0.108980885 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.154 225859 DEBUG nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.157 225859 DEBUG nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.544 225859 DEBUG nova.objects.instance [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'migration_context' on Instance uuid d95ca690-20e1-4b0c-919b-d64c9af25eba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.577 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.577 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Ensure instance console log exists: /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.579 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.579 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.580 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.582 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.592 225859 WARNING nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.600 225859 DEBUG nova.virt.libvirt.host [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.600 225859 DEBUG nova.virt.libvirt.host [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.604 225859 DEBUG nova.virt.libvirt.host [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.604 225859 DEBUG nova.virt.libvirt.host [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.606 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.606 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.606 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.607 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.607 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.607 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.607 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.608 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.608 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.608 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.608 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.609 225859 DEBUG nova.virt.hardware [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.612 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.638 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.639 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.641 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.649 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.650 225859 INFO nova.compute.claims [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.823 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:13 np0005588919 nova_compute[225855]: 2026-01-20 14:30:13.987 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:30:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1567150312' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.111 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.138 225859 DEBUG nova.storage.rbd_utils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image d95ca690-20e1-4b0c-919b-d64c9af25eba_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.141 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1108007621' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.297 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.304 225859 DEBUG nova.compute.provider_tree [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.327 225859 DEBUG nova.scheduler.client.report [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.352 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.713s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.353 225859 DEBUG nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.355 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.714s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.361 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.362 225859 INFO nova.compute.claims [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:30:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:14.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.595 225859 DEBUG nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.596 225859 DEBUG nova.network.neutron [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:30:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:30:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3859938377' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.637 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.638 225859 DEBUG nova.objects.instance [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'pci_devices' on Instance uuid d95ca690-20e1-4b0c-919b-d64c9af25eba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.687 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <uuid>d95ca690-20e1-4b0c-919b-d64c9af25eba</uuid>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <name>instance-0000001d</name>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <nova:name>tempest-MigrationsAdminTest-server-1542965426</nova:name>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:30:13</nova:creationTime>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <nova:user uuid="01a3d712f05049b19d4ecc7051720ad5">tempest-MigrationsAdminTest-1518611738-project-member</nova:user>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <nova:project uuid="f3c2e72a7148496394c8bcd618a19c80">tempest-MigrationsAdminTest-1518611738</nova:project>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <entry name="serial">d95ca690-20e1-4b0c-919b-d64c9af25eba</entry>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <entry name="uuid">d95ca690-20e1-4b0c-919b-d64c9af25eba</entry>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d95ca690-20e1-4b0c-919b-d64c9af25eba_disk">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d95ca690-20e1-4b0c-919b-d64c9af25eba_disk.config">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/console.log" append="off"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:30:14 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:30:14 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:30:14 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:30:14 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.708 225859 INFO nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.746 225859 DEBUG nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.753 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.753 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.753 225859 INFO nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Using config drive#033[00m
Jan 20 09:30:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.779 225859 DEBUG nova.storage.rbd_utils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image d95ca690-20e1-4b0c-919b-d64c9af25eba_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.803 225859 DEBUG nova.policy [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0cec872a00f742d78563d6d16fc545cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78f151250c04467bb4f6a229dda16fc5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.986 225859 DEBUG nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.987 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:30:14 np0005588919 nova_compute[225855]: 2026-01-20 14:30:14.988 225859 INFO nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Creating image(s)#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.011 225859 DEBUG nova.storage.rbd_utils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image d08682f8-72ef-462c-b4b7-044cf16fc193_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.039 225859 DEBUG nova.storage.rbd_utils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image d08682f8-72ef-462c-b4b7-044cf16fc193_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:15.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.067 225859 DEBUG nova.storage.rbd_utils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image d08682f8-72ef-462c-b4b7-044cf16fc193_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.071 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.099 225859 INFO nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Creating config drive at /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/disk.config#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.107 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb_hdohpe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.130 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.151 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.152 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.153 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.153 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.173 225859 DEBUG nova.storage.rbd_utils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image d08682f8-72ef-462c-b4b7-044cf16fc193_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.176 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d08682f8-72ef-462c-b4b7-044cf16fc193_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.196 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.236 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb_hdohpe" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.270 225859 DEBUG nova.storage.rbd_utils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rbd image d95ca690-20e1-4b0c-919b-d64c9af25eba_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.274 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/disk.config d95ca690-20e1-4b0c-919b-d64c9af25eba_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.425 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d08682f8-72ef-462c-b4b7-044cf16fc193_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.248s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.456 225859 DEBUG oslo_concurrency.processutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/disk.config d95ca690-20e1-4b0c-919b-d64c9af25eba_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.457 225859 INFO nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Deleting local config drive /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/disk.config because it was imported into RBD.#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.499 225859 DEBUG nova.storage.rbd_utils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] resizing rbd image d08682f8-72ef-462c-b4b7-044cf16fc193_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:30:15 np0005588919 systemd-machined[194361]: New machine qemu-14-instance-0000001d.
Jan 20 09:30:15 np0005588919 systemd[1]: Started Virtual Machine qemu-14-instance-0000001d.
Jan 20 09:30:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3056334551' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.564 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.573 225859 DEBUG nova.compute.provider_tree [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.602 225859 DEBUG nova.scheduler.client.report [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.636 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.637 225859 DEBUG nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.644 225859 DEBUG nova.objects.instance [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lazy-loading 'migration_context' on Instance uuid d08682f8-72ef-462c-b4b7-044cf16fc193 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.681 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.681 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Ensure instance console log exists: /var/lib/nova/instances/d08682f8-72ef-462c-b4b7-044cf16fc193/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.682 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.682 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.683 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.744 225859 DEBUG nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.745 225859 DEBUG nova.network.neutron [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.811 225859 INFO nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.913 225859 DEBUG nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:30:15 np0005588919 nova_compute[225855]: 2026-01-20 14:30:15.917 225859 DEBUG nova.network.neutron [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Successfully created port: e848e00f-d594-4d70-9026-cd650417bf47 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.051 225859 DEBUG nova.policy [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a3fbc3f92a849e88cbf34d28ca17e43', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0cee74dd60da4a839bb5eb0ba3137edf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.273 225859 DEBUG nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.275 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.275 225859 INFO nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Creating image(s)#033[00m
Jan 20 09:30:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:16.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:16.389 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:16.390 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:16.390 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.477 225859 DEBUG nova.storage.rbd_utils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] rbd image f3faf996-e066-4b11-b7f3-30aeffff726e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.505 225859 DEBUG nova.storage.rbd_utils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] rbd image f3faf996-e066-4b11-b7f3-30aeffff726e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.530 225859 DEBUG nova.storage.rbd_utils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] rbd image f3faf996-e066-4b11-b7f3-30aeffff726e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.533 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.559 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.559 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.559 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.578 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.578 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.578 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.608 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.608 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.609 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.609 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.635 225859 DEBUG nova.storage.rbd_utils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] rbd image f3faf996-e066-4b11-b7f3-30aeffff726e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.638 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f3faf996-e066-4b11-b7f3-30aeffff726e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.770 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.770 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.770 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.770 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.841 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919416.8409173, d95ca690-20e1-4b0c-919b-d64c9af25eba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.841 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.843 225859 DEBUG nova.compute.manager [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.843 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.850 225859 INFO nova.virt.libvirt.driver [-] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance spawned successfully.#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.851 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.873 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.878 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.883 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.883 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.884 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.884 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.885 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.885 225859 DEBUG nova.virt.libvirt.driver [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.903 225859 DEBUG nova.network.neutron [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Successfully updated port: e848e00f-d594-4d70-9026-cd650417bf47 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.926 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.926 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919416.841072, d95ca690-20e1-4b0c-919b-d64c9af25eba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.927 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] VM Started (Lifecycle Event)#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.940 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.940 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquired lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.940 225859 DEBUG nova.network.neutron [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.972 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.974 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.982 225859 INFO nova.compute.manager [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Took 7.13 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.982 225859 DEBUG nova.compute.manager [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:16 np0005588919 nova_compute[225855]: 2026-01-20 14:30:16.984 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f3faf996-e066-4b11-b7f3-30aeffff726e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.346s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.012 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.048 225859 DEBUG nova.storage.rbd_utils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] resizing rbd image f3faf996-e066-4b11-b7f3-30aeffff726e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:30:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:17.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.140 225859 DEBUG nova.compute.manager [req-a9aeec7f-be7c-489d-bb89-e11c2eb98e93 req-20df941b-e8d7-4122-a27c-481b964e4bca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received event network-changed-e848e00f-d594-4d70-9026-cd650417bf47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.141 225859 DEBUG nova.compute.manager [req-a9aeec7f-be7c-489d-bb89-e11c2eb98e93 req-20df941b-e8d7-4122-a27c-481b964e4bca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Refreshing instance network info cache due to event network-changed-e848e00f-d594-4d70-9026-cd650417bf47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.141 225859 DEBUG oslo_concurrency.lockutils [req-a9aeec7f-be7c-489d-bb89-e11c2eb98e93 req-20df941b-e8d7-4122-a27c-481b964e4bca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.147 225859 DEBUG nova.objects.instance [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lazy-loading 'migration_context' on Instance uuid f3faf996-e066-4b11-b7f3-30aeffff726e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.164 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.170 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.170 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Ensure instance console log exists: /var/lib/nova/instances/f3faf996-e066-4b11-b7f3-30aeffff726e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.170 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.170 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.171 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.247 225859 INFO nova.compute.manager [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Took 9.10 seconds to build instance.#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.313 225859 DEBUG oslo_concurrency.lockutils [None req-9274dcb1-dd36-4dea-af60-f65dbc3dda50 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "d95ca690-20e1-4b0c-919b-d64c9af25eba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.520 225859 DEBUG nova.network.neutron [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.765 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.772 225859 DEBUG nova.network.neutron [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Successfully created port: f65050ac-6a44-490a-b4b9-8c82c1f61630 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.779 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.779 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.779 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.780 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.780 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.799 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.800 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.800 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.800 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:30:17 np0005588919 nova_compute[225855]: 2026-01-20 14:30:17.800 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.256 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.331 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.332 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000016 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.336 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.336 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:30:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:18.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.515 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.516 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4401MB free_disk=20.73217010498047GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.517 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.517 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.592 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.593 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance d95ca690-20e1-4b0c-919b-d64c9af25eba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.593 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance f3faf996-e066-4b11-b7f3-30aeffff726e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.593 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance d08682f8-72ef-462c-b4b7-044cf16fc193 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.594 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.594 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=20GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.644 225859 DEBUG nova.network.neutron [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Updating instance_info_cache with network_info: [{"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.664 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Releasing lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.665 225859 DEBUG nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Instance network_info: |[{"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.666 225859 DEBUG oslo_concurrency.lockutils [req-a9aeec7f-be7c-489d-bb89-e11c2eb98e93 req-20df941b-e8d7-4122-a27c-481b964e4bca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.667 225859 DEBUG nova.network.neutron [req-a9aeec7f-be7c-489d-bb89-e11c2eb98e93 req-20df941b-e8d7-4122-a27c-481b964e4bca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Refreshing network info cache for port e848e00f-d594-4d70-9026-cd650417bf47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.672 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Start _get_guest_xml network_info=[{"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.682 225859 WARNING nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.687 225859 DEBUG nova.virt.libvirt.host [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.688 225859 DEBUG nova.virt.libvirt.host [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.693 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.737 225859 DEBUG nova.virt.libvirt.host [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.739 225859 DEBUG nova.virt.libvirt.host [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.741 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.741 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.742 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.742 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.743 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.743 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.743 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.744 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.744 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.745 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.745 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.745 225859 DEBUG nova.virt.hardware [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.749 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.773 225859 DEBUG nova.network.neutron [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Successfully updated port: f65050ac-6a44-490a-b4b9-8c82c1f61630 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.789 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.790 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquired lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.790 225859 DEBUG nova.network.neutron [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.949 225859 DEBUG nova.compute.manager [req-eb3753cc-2879-474d-9930-bd6934878356 req-9cfeba9d-76b7-48d9-8fbf-17ec4c1531b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-changed-f65050ac-6a44-490a-b4b9-8c82c1f61630 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.950 225859 DEBUG nova.compute.manager [req-eb3753cc-2879-474d-9930-bd6934878356 req-9cfeba9d-76b7-48d9-8fbf-17ec4c1531b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Refreshing instance network info cache due to event network-changed-f65050ac-6a44-490a-b4b9-8c82c1f61630. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:30:18 np0005588919 nova_compute[225855]: 2026-01-20 14:30:18.951 225859 DEBUG oslo_concurrency.lockutils [req-eb3753cc-2879-474d-9930-bd6934878356 req-9cfeba9d-76b7-48d9-8fbf-17ec4c1531b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.020 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:19.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.137 225859 DEBUG nova.network.neutron [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:30:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:19 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2148169791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.160 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.166 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:30:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:30:19 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2203175762' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.189 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.211 225859 DEBUG nova.storage.rbd_utils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image d08682f8-72ef-462c-b4b7-044cf16fc193_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.214 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.236 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.275 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.276 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:30:19 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3241562052' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.629 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.630 225859 DEBUG nova.virt.libvirt.vif [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-952138345',display_name='tempest-FloatingIPsAssociationTestJSON-server-952138345',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-952138345',id=31,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f151250c04467bb4f6a229dda16fc5',ramdisk_id='',reservation_id='r-bkzlkdyv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-146254261',owner_user_n
ame='tempest-FloatingIPsAssociationTestJSON-146254261-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:30:14Z,user_data=None,user_id='0cec872a00f742d78563d6d16fc545cb',uuid=d08682f8-72ef-462c-b4b7-044cf16fc193,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.631 225859 DEBUG nova.network.os_vif_util [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converting VIF {"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.632 225859 DEBUG nova.network.os_vif_util [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:9f:c4,bridge_name='br-int',has_traffic_filtering=True,id=e848e00f-d594-4d70-9026-cd650417bf47,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape848e00f-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.633 225859 DEBUG nova.objects.instance [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lazy-loading 'pci_devices' on Instance uuid d08682f8-72ef-462c-b4b7-044cf16fc193 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.647 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <uuid>d08682f8-72ef-462c-b4b7-044cf16fc193</uuid>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <name>instance-0000001f</name>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-952138345</nova:name>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:30:18</nova:creationTime>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <nova:user uuid="0cec872a00f742d78563d6d16fc545cb">tempest-FloatingIPsAssociationTestJSON-146254261-project-member</nova:user>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <nova:project uuid="78f151250c04467bb4f6a229dda16fc5">tempest-FloatingIPsAssociationTestJSON-146254261</nova:project>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <nova:port uuid="e848e00f-d594-4d70-9026-cd650417bf47">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <entry name="serial">d08682f8-72ef-462c-b4b7-044cf16fc193</entry>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <entry name="uuid">d08682f8-72ef-462c-b4b7-044cf16fc193</entry>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d08682f8-72ef-462c-b4b7-044cf16fc193_disk">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d08682f8-72ef-462c-b4b7-044cf16fc193_disk.config">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:89:9f:c4"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <target dev="tape848e00f-d5"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/d08682f8-72ef-462c-b4b7-044cf16fc193/console.log" append="off"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:30:19 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:30:19 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:30:19 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:30:19 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.649 225859 DEBUG nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Preparing to wait for external event network-vif-plugged-e848e00f-d594-4d70-9026-cd650417bf47 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.650 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.650 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.650 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.651 225859 DEBUG nova.virt.libvirt.vif [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-952138345',display_name='tempest-FloatingIPsAssociationTestJSON-server-952138345',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-952138345',id=31,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f151250c04467bb4f6a229dda16fc5',ramdisk_id='',reservation_id='r-bkzlkdyv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-146254261',owner_user_name='tempest-FloatingIPsAssociationTestJSON-146254261-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:30:14Z,user_data=None,user_id='0cec872a00f742d78563d6d16fc545cb',uuid=d08682f8-72ef-462c-b4b7-044cf16fc193,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.651 225859 DEBUG nova.network.os_vif_util [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converting VIF {"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.652 225859 DEBUG nova.network.os_vif_util [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:9f:c4,bridge_name='br-int',has_traffic_filtering=True,id=e848e00f-d594-4d70-9026-cd650417bf47,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape848e00f-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.652 225859 DEBUG os_vif [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:9f:c4,bridge_name='br-int',has_traffic_filtering=True,id=e848e00f-d594-4d70-9026-cd650417bf47,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape848e00f-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.653 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.653 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.654 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.658 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.658 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape848e00f-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.659 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape848e00f-d5, col_values=(('external_ids', {'iface-id': 'e848e00f-d594-4d70-9026-cd650417bf47', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:9f:c4', 'vm-uuid': 'd08682f8-72ef-462c-b4b7-044cf16fc193'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.660 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:19 np0005588919 NetworkManager[49104]: <info>  [1768919419.6620] manager: (tape848e00f-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.662 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.670 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.671 225859 INFO os_vif [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:9f:c4,bridge_name='br-int',has_traffic_filtering=True,id=e848e00f-d594-4d70-9026-cd650417bf47,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape848e00f-d5')#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.720 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.721 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.721 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] No VIF found with MAC fa:16:3e:89:9f:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.722 225859 INFO nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Using config drive#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.751 225859 DEBUG nova.storage.rbd_utils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image d08682f8-72ef-462c-b4b7-044cf16fc193_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.835 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.835 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.835 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:30:19 np0005588919 nova_compute[225855]: 2026-01-20 14:30:19.992 225859 DEBUG nova.network.neutron [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Updating instance_info_cache with network_info: [{"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.010 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Releasing lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.011 225859 DEBUG nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Instance network_info: |[{"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.012 225859 DEBUG oslo_concurrency.lockutils [req-eb3753cc-2879-474d-9930-bd6934878356 req-9cfeba9d-76b7-48d9-8fbf-17ec4c1531b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.013 225859 DEBUG nova.network.neutron [req-eb3753cc-2879-474d-9930-bd6934878356 req-9cfeba9d-76b7-48d9-8fbf-17ec4c1531b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Refreshing network info cache for port f65050ac-6a44-490a-b4b9-8c82c1f61630 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.018 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Start _get_guest_xml network_info=[{"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.024 225859 WARNING nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.028 225859 DEBUG nova.virt.libvirt.host [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.029 225859 DEBUG nova.virt.libvirt.host [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.034 225859 DEBUG nova.virt.libvirt.host [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.035 225859 DEBUG nova.virt.libvirt.host [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.037 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.038 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.039 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.039 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.040 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.041 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.041 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.042 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.043 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.043 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.044 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.044 225859 DEBUG nova.virt.hardware [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.049 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.163 225859 INFO nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Creating config drive at /var/lib/nova/instances/d08682f8-72ef-462c-b4b7-044cf16fc193/disk.config#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.173 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d08682f8-72ef-462c-b4b7-044cf16fc193/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeqdwh14w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.258 225859 DEBUG nova.network.neutron [req-a9aeec7f-be7c-489d-bb89-e11c2eb98e93 req-20df941b-e8d7-4122-a27c-481b964e4bca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Updated VIF entry in instance network info cache for port e848e00f-d594-4d70-9026-cd650417bf47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.265 225859 DEBUG nova.network.neutron [req-a9aeec7f-be7c-489d-bb89-e11c2eb98e93 req-20df941b-e8d7-4122-a27c-481b964e4bca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Updating instance_info_cache with network_info: [{"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.293 225859 DEBUG oslo_concurrency.lockutils [req-a9aeec7f-be7c-489d-bb89-e11c2eb98e93 req-20df941b-e8d7-4122-a27c-481b964e4bca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.305 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d08682f8-72ef-462c-b4b7-044cf16fc193/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeqdwh14w" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.338 225859 DEBUG nova.storage.rbd_utils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] rbd image d08682f8-72ef-462c-b4b7-044cf16fc193_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.342 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d08682f8-72ef-462c-b4b7-044cf16fc193/disk.config d08682f8-72ef-462c-b4b7-044cf16fc193_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.367 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.368 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:20.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.399 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:30:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:30:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2267082636' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.589 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.633 225859 DEBUG nova.storage.rbd_utils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] rbd image f3faf996-e066-4b11-b7f3-30aeffff726e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:20 np0005588919 nova_compute[225855]: 2026-01-20 14:30:20.639 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:21.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:30:21 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/891766828' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:30:21 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.391 225859 DEBUG oslo_concurrency.processutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d08682f8-72ef-462c-b4b7-044cf16fc193/disk.config d08682f8-72ef-462c-b4b7-044cf16fc193_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.393 225859 INFO nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Deleting local config drive /var/lib/nova/instances/d08682f8-72ef-462c-b4b7-044cf16fc193/disk.config because it was imported into RBD.#033[00m
Jan 20 09:30:21 np0005588919 kernel: tape848e00f-d5: entered promiscuous mode
Jan 20 09:30:21 np0005588919 NetworkManager[49104]: <info>  [1768919421.4445] manager: (tape848e00f-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Jan 20 09:30:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:21Z|00109|binding|INFO|Claiming lport e848e00f-d594-4d70-9026-cd650417bf47 for this chassis.
Jan 20 09:30:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:21Z|00110|binding|INFO|e848e00f-d594-4d70-9026-cd650417bf47: Claiming fa:16:3e:89:9f:c4 10.100.0.5
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.489 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.495 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.502 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.863s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.503 225859 DEBUG nova.virt.libvirt.vif [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-1191836092',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-1191836092',id=30,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH+01n3DJe3yYfRmwifZEomZrLtaFilErLasmr7ze/p0n1d6nPaSWQOHrHfJ9ubgBCwoqlwHjFIWrKKyRcRI1f3OIubHCG4LO7UMySAzmCXBSDkLJPz6Qzoln3dTb/xrow==',key_name='tempest-keypair-696534507',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0cee74dd60da4a839bb5eb0ba3137edf',ramdisk_id='',reservation_id='r-0tyxczv3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-859917658',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-859917658-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:30:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6a3fbc3f92a849e88cbf34d28ca17e43',uuid=f3faf996-e066-4b11-b7f3-30aeffff726e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.505 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:9f:c4 10.100.0.5'], port_security=['fa:16:3e:89:9f:c4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd08682f8-72ef-462c-b4b7-044cf16fc193', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01e6deef-9aca-4d36-8215-4517982a86a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f151250c04467bb4f6a229dda16fc5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dd36e8d2-993a-4618-8fff-62abafaadfd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86700b79-bb44-47f0-88a5-d4c8eda3acbb, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=e848e00f-d594-4d70-9026-cd650417bf47) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.506 140354 INFO neutron.agent.ovn.metadata.agent [-] Port e848e00f-d594-4d70-9026-cd650417bf47 in datapath 01e6deef-9aca-4d36-8215-4517982a86a3 bound to our chassis#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.504 225859 DEBUG nova.network.os_vif_util [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Converting VIF {"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.505 225859 DEBUG nova.network.os_vif_util [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:ae:50,bridge_name='br-int',has_traffic_filtering=True,id=f65050ac-6a44-490a-b4b9-8c82c1f61630,network=Network(02f86d1d-5cad-49c5-9004-3de3e4739ad5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65050ac-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.507 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 01e6deef-9aca-4d36-8215-4517982a86a3#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.506 225859 DEBUG nova.objects.instance [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lazy-loading 'pci_devices' on Instance uuid f3faf996-e066-4b11-b7f3-30aeffff726e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:21 np0005588919 systemd-machined[194361]: New machine qemu-15-instance-0000001f.
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.517 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b136facf-5f1e-4233-991b-c3111542a368]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.518 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap01e6deef-91 in ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.519 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap01e6deef-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.520 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[57b4d52d-dc1c-46f7-8166-a649eda8219b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.520 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bbfed9dc-6deb-4416-9d4a-5e4d3551efd4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.526 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <uuid>f3faf996-e066-4b11-b7f3-30aeffff726e</uuid>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <name>instance-0000001e</name>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <nova:name>tempest-UpdateMultiattachVolumeNegativeTest-server-1191836092</nova:name>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:30:20</nova:creationTime>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <nova:user uuid="6a3fbc3f92a849e88cbf34d28ca17e43">tempest-UpdateMultiattachVolumeNegativeTest-859917658-project-member</nova:user>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <nova:project uuid="0cee74dd60da4a839bb5eb0ba3137edf">tempest-UpdateMultiattachVolumeNegativeTest-859917658</nova:project>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <nova:port uuid="f65050ac-6a44-490a-b4b9-8c82c1f61630">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <entry name="serial">f3faf996-e066-4b11-b7f3-30aeffff726e</entry>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <entry name="uuid">f3faf996-e066-4b11-b7f3-30aeffff726e</entry>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/f3faf996-e066-4b11-b7f3-30aeffff726e_disk">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/f3faf996-e066-4b11-b7f3-30aeffff726e_disk.config">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:fc:ae:50"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <target dev="tapf65050ac-6a"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/f3faf996-e066-4b11-b7f3-30aeffff726e/console.log" append="off"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:30:21 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:30:21 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:30:21 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:30:21 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.526 225859 DEBUG nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Preparing to wait for external event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.527 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.527 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.527 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:21 np0005588919 systemd[1]: Started Virtual Machine qemu-15-instance-0000001f.
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.533 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b16ded-db6c-4734-b048-897b3d8a2767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.534 225859 DEBUG nova.virt.libvirt.vif [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-1191836092',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-1191836092',id=30,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH+01n3DJe3yYfRmwifZEomZrLtaFilErLasmr7ze/p0n1d6nPaSWQOHrHfJ9ubgBCwoqlwHjFIWrKKyRcRI1f3OIubHCG4LO7UMySAzmCXBSDkLJPz6Qzoln3dTb/xrow==',key_name='tempest-keypair-696534507',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0cee74dd60da4a839bb5eb0ba3137edf',ramdisk_id='',reservation_id='r-0tyxczv3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-859917658',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-859917658-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:30:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6a3fbc3f92a849e88cbf34d28ca17e43',uuid=f3faf996-e066-4b11-b7f3-30aeffff726e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.535 225859 DEBUG nova.network.os_vif_util [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Converting VIF {"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.535 225859 DEBUG nova.network.os_vif_util [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:ae:50,bridge_name='br-int',has_traffic_filtering=True,id=f65050ac-6a44-490a-b4b9-8c82c1f61630,network=Network(02f86d1d-5cad-49c5-9004-3de3e4739ad5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65050ac-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.536 225859 DEBUG os_vif [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:ae:50,bridge_name='br-int',has_traffic_filtering=True,id=f65050ac-6a44-490a-b4b9-8c82c1f61630,network=Network(02f86d1d-5cad-49c5-9004-3de3e4739ad5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65050ac-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.536 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.537 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.538 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.542 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.542 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf65050ac-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.543 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf65050ac-6a, col_values=(('external_ids', {'iface-id': 'f65050ac-6a44-490a-b4b9-8c82c1f61630', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:ae:50', 'vm-uuid': 'f3faf996-e066-4b11-b7f3-30aeffff726e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:21 np0005588919 NetworkManager[49104]: <info>  [1768919421.5457] manager: (tapf65050ac-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.550 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.552 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:30:21 np0005588919 systemd-udevd[240284]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.558 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f6ea8044-e2bd-4166-b940-189058460999]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 NetworkManager[49104]: <info>  [1768919421.5682] device (tape848e00f-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:30:21 np0005588919 NetworkManager[49104]: <info>  [1768919421.5688] device (tape848e00f-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.579 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.582 225859 DEBUG oslo_concurrency.lockutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Acquiring lock "refresh_cache-d95ca690-20e1-4b0c-919b-d64c9af25eba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.582 225859 DEBUG oslo_concurrency.lockutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Acquired lock "refresh_cache-d95ca690-20e1-4b0c-919b-d64c9af25eba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.582 225859 DEBUG nova.network.neutron [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.585 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.586 225859 INFO os_vif [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:ae:50,bridge_name='br-int',has_traffic_filtering=True,id=f65050ac-6a44-490a-b4b9-8c82c1f61630,network=Network(02f86d1d-5cad-49c5-9004-3de3e4739ad5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65050ac-6a')#033[00m
Jan 20 09:30:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:21Z|00111|binding|INFO|Setting lport e848e00f-d594-4d70-9026-cd650417bf47 ovn-installed in OVS
Jan 20 09:30:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:21Z|00112|binding|INFO|Setting lport e848e00f-d594-4d70-9026-cd650417bf47 up in Southbound
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.588 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.595 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d3c9050b-3c46-4b63-9ed1-9e4aa4050c0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 NetworkManager[49104]: <info>  [1768919421.6028] manager: (tap01e6deef-90): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Jan 20 09:30:21 np0005588919 systemd-udevd[240292]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.603 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[024d256f-5dd2-46d9-bbf4-935c23bcd510]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.630 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[9a0981e1-161d-4071-9157-1d570366d8c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.633 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8b0adaff-a880-47c6-98b1-c475d6ae0a88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 NetworkManager[49104]: <info>  [1768919421.6524] device (tap01e6deef-90): carrier: link connected
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.658 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[df6c5693-4d8b-4179-bbcb-b9c64bd33c3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.683 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6f346454-e322-4888-ac0e-76ae184fb6ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01e6deef-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:81:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446058, 'reachable_time': 15941, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240320, 'error': None, 'target': 'ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.697 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[73da090c-f7ff-4018-95ba-485308e62d9a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:818c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446058, 'tstamp': 446058}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240321, 'error': None, 'target': 'ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.711 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4210faf5-171e-4123-9c2c-134508ff7036]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01e6deef-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:81:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446058, 'reachable_time': 15941, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240322, 'error': None, 'target': 'ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.740 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[86694319-858e-4ffd-bfa3-f550188d29f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.791 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[19f4afcd-3694-49b2-9b50-fa4c2296cc06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.792 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01e6deef-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.792 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.793 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01e6deef-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:21 np0005588919 NetworkManager[49104]: <info>  [1768919421.7952] manager: (tap01e6deef-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 20 09:30:21 np0005588919 kernel: tap01e6deef-90: entered promiscuous mode
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.794 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.800 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap01e6deef-90, col_values=(('external_ids', {'iface-id': 'b3bfa880-f76c-4bab-98ca-24729b0d77e7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.800 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:21Z|00113|binding|INFO|Releasing lport b3bfa880-f76c-4bab-98ca-24729b0d77e7 from this chassis (sb_readonly=0)
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.802 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.805 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.805 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.805 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] No VIF found with MAC fa:16:3e:fc:ae:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.805 225859 INFO nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Using config drive#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.827 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/01e6deef-9aca-4d36-8215-4517982a86a3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/01e6deef-9aca-4d36-8215-4517982a86a3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.828 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ff0cd508-fcb4-455d-9da9-521ba8d6e7c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.829 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-01e6deef-9aca-4d36-8215-4517982a86a3
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/01e6deef-9aca-4d36-8215-4517982a86a3.pid.haproxy
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 01e6deef-9aca-4d36-8215-4517982a86a3
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:30:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:21.830 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3', 'env', 'PROCESS_TAG=haproxy-01e6deef-9aca-4d36-8215-4517982a86a3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/01e6deef-9aca-4d36-8215-4517982a86a3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.841 225859 DEBUG nova.storage.rbd_utils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] rbd image f3faf996-e066-4b11-b7f3-30aeffff726e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.847 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.963 225859 DEBUG nova.network.neutron [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.968 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919421.968317, d08682f8-72ef-462c-b4b7-044cf16fc193 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.969 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] VM Started (Lifecycle Event)#033[00m
Jan 20 09:30:21 np0005588919 nova_compute[225855]: 2026-01-20 14:30:21.994 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.000 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919421.9706268, d08682f8-72ef-462c-b4b7-044cf16fc193 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.001 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.022 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.026 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.042 225859 DEBUG nova.network.neutron [req-eb3753cc-2879-474d-9930-bd6934878356 req-9cfeba9d-76b7-48d9-8fbf-17ec4c1531b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Updated VIF entry in instance network info cache for port f65050ac-6a44-490a-b4b9-8c82c1f61630. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.042 225859 DEBUG nova.network.neutron [req-eb3753cc-2879-474d-9930-bd6934878356 req-9cfeba9d-76b7-48d9-8fbf-17ec4c1531b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Updating instance_info_cache with network_info: [{"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.079 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.085 225859 DEBUG oslo_concurrency.lockutils [req-eb3753cc-2879-474d-9930-bd6934878356 req-9cfeba9d-76b7-48d9-8fbf-17ec4c1531b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.269 225859 DEBUG nova.compute.manager [req-b2519775-a15d-4431-8676-ad34523086ce req-5a8dcbbd-835c-4ea2-86ae-07ce63dfd60f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received event network-vif-plugged-e848e00f-d594-4d70-9026-cd650417bf47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.270 225859 DEBUG oslo_concurrency.lockutils [req-b2519775-a15d-4431-8676-ad34523086ce req-5a8dcbbd-835c-4ea2-86ae-07ce63dfd60f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.270 225859 DEBUG oslo_concurrency.lockutils [req-b2519775-a15d-4431-8676-ad34523086ce req-5a8dcbbd-835c-4ea2-86ae-07ce63dfd60f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.271 225859 DEBUG oslo_concurrency.lockutils [req-b2519775-a15d-4431-8676-ad34523086ce req-5a8dcbbd-835c-4ea2-86ae-07ce63dfd60f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.271 225859 DEBUG nova.compute.manager [req-b2519775-a15d-4431-8676-ad34523086ce req-5a8dcbbd-835c-4ea2-86ae-07ce63dfd60f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Processing event network-vif-plugged-e848e00f-d594-4d70-9026-cd650417bf47 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.273 225859 DEBUG nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.276 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919422.2761977, d08682f8-72ef-462c-b4b7-044cf16fc193 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.277 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.279 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.285 225859 INFO nova.virt.libvirt.driver [-] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Instance spawned successfully.#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.286 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:30:22 np0005588919 podman[240420]: 2026-01-20 14:30:22.197938504 +0000 UTC m=+0.025662653 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.299 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.304 225859 INFO nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Creating config drive at /var/lib/nova/instances/f3faf996-e066-4b11-b7f3-30aeffff726e/disk.config#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.316 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f3faf996-e066-4b11-b7f3-30aeffff726e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp291v285o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.359 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.362 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.363 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.363 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.363 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.364 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.364 225859 DEBUG nova.virt.libvirt.driver [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.389 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:30:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:22.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:22 np0005588919 podman[240420]: 2026-01-20 14:30:22.408341899 +0000 UTC m=+0.236066028 container create 5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.417 225859 INFO nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Took 7.43 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.418 225859 DEBUG nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:22 np0005588919 systemd[1]: Started libpod-conmon-5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6.scope.
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.462 225859 DEBUG nova.network.neutron [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.464 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f3faf996-e066-4b11-b7f3-30aeffff726e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp291v285o" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:22 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:30:22 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c6a841c35edcc3f8a6f139ec1d95e7afdb44a50568f11f14c28989dc0834ec4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.494 225859 DEBUG nova.storage.rbd_utils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] rbd image f3faf996-e066-4b11-b7f3-30aeffff726e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:30:22 np0005588919 podman[240420]: 2026-01-20 14:30:22.497962199 +0000 UTC m=+0.325686348 container init 5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 20 09:30:22 np0005588919 podman[240420]: 2026-01-20 14:30:22.503977368 +0000 UTC m=+0.331701497 container start 5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.505 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f3faf996-e066-4b11-b7f3-30aeffff726e/disk.config f3faf996-e066-4b11-b7f3-30aeffff726e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.527 225859 DEBUG oslo_concurrency.lockutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Releasing lock "refresh_cache-d95ca690-20e1-4b0c-919b-d64c9af25eba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:22 np0005588919 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[240438]: [NOTICE]   (240460) : New worker (240463) forked
Jan 20 09:30:22 np0005588919 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[240438]: [NOTICE]   (240460) : Loading success.
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.539 225859 INFO nova.compute.manager [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Took 8.97 seconds to build instance.#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.561 225859 DEBUG oslo_concurrency.lockutils [None req-42bf5750-f887-41a7-8e0f-ec5c640634de 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.472s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.615 225859 DEBUG nova.virt.libvirt.driver [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.615 225859 DEBUG nova.virt.libvirt.volume.remotefs [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Creating file /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/35613d1c57984e108f84e93a2c7361cd.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.615 225859 DEBUG oslo_concurrency.processutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/35613d1c57984e108f84e93a2c7361cd.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.797 225859 DEBUG oslo_concurrency.processutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f3faf996-e066-4b11-b7f3-30aeffff726e/disk.config f3faf996-e066-4b11-b7f3-30aeffff726e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.292s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.799 225859 INFO nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Deleting local config drive /var/lib/nova/instances/f3faf996-e066-4b11-b7f3-30aeffff726e/disk.config because it was imported into RBD.#033[00m
Jan 20 09:30:22 np0005588919 kernel: tapf65050ac-6a: entered promiscuous mode
Jan 20 09:30:22 np0005588919 NetworkManager[49104]: <info>  [1768919422.8482] manager: (tapf65050ac-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.849 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:22Z|00114|binding|INFO|Claiming lport f65050ac-6a44-490a-b4b9-8c82c1f61630 for this chassis.
Jan 20 09:30:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:22Z|00115|binding|INFO|f65050ac-6a44-490a-b4b9-8c82c1f61630: Claiming fa:16:3e:fc:ae:50 10.100.0.8
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.858 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:22 np0005588919 NetworkManager[49104]: <info>  [1768919422.8612] device (tapf65050ac-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:30:22 np0005588919 NetworkManager[49104]: <info>  [1768919422.8618] device (tapf65050ac-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:30:22 np0005588919 NetworkManager[49104]: <info>  [1768919422.8702] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 20 09:30:22 np0005588919 NetworkManager[49104]: <info>  [1768919422.8707] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 20 09:30:22 np0005588919 nova_compute[225855]: 2026-01-20 14:30:22.870 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.876 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:ae:50 10.100.0.8'], port_security=['fa:16:3e:fc:ae:50 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f3faf996-e066-4b11-b7f3-30aeffff726e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02f86d1d-5cad-49c5-9004-3de3e4739ad5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0cee74dd60da4a839bb5eb0ba3137edf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e08f10e3-3a95-4e33-b03d-21860ea0dc91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f4fb07a-2698-4a11-a9e3-5a66d678d9d5, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f65050ac-6a44-490a-b4b9-8c82c1f61630) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.877 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f65050ac-6a44-490a-b4b9-8c82c1f61630 in datapath 02f86d1d-5cad-49c5-9004-3de3e4739ad5 bound to our chassis#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.878 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 02f86d1d-5cad-49c5-9004-3de3e4739ad5#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.890 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[07d84e0c-5f30-44c1-a901-b590069c115d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.892 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap02f86d1d-51 in ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.894 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap02f86d1d-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.894 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[06ffd71e-a3a7-4ca8-b17f-ed815d8b429e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.894 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[10cf60b8-fdba-484a-a6aa-06a44107e5dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.906 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[5d1958c6-da89-4dca-8739-164d25649d9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:22 np0005588919 systemd-machined[194361]: New machine qemu-16-instance-0000001e.
Jan 20 09:30:22 np0005588919 systemd[1]: Started Virtual Machine qemu-16-instance-0000001e.
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.931 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f347651b-ccca-4606-adfa-381baf9ac006]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.956 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e337afdc-c1dd-4aeb-ba92-16c2130442e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.966 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[730109c1-a20e-43e5-ade3-a5e3d9357c00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:22 np0005588919 NetworkManager[49104]: <info>  [1768919422.9675] manager: (tap02f86d1d-50): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Jan 20 09:30:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:22.994 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c8eb4e-8e9b-4f0f-82b5-f74176e233a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.006 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[16ec4420-664b-4790-8bc5-6ae3e4b86fd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.025 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:23 np0005588919 NetworkManager[49104]: <info>  [1768919423.0299] device (tap02f86d1d-50): carrier: link connected
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.033 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3151c8eb-f1ef-44fc-b7a7-831cf9f7cae5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.034 225859 DEBUG oslo_concurrency.processutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/35613d1c57984e108f84e93a2c7361cd.tmp" returned: 1 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.035 225859 DEBUG oslo_concurrency.processutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/35613d1c57984e108f84e93a2c7361cd.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.035 225859 DEBUG nova.virt.libvirt.volume.remotefs [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Creating directory /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.035 225859 DEBUG oslo_concurrency.processutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:23 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:23Z|00116|binding|INFO|Releasing lport b3bfa880-f76c-4bab-98ca-24729b0d77e7 from this chassis (sb_readonly=0)
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.048 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e4d00a-4009-4e33-98d4-0d061a7bde45]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02f86d1d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:08:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446195, 'reachable_time': 33326, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240523, 'error': None, 'target': 'ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.053 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.062 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[920a26f1-b1f5-4954-b22a-72d877728b2c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:8de'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446195, 'tstamp': 446195}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240525, 'error': None, 'target': 'ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:23 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:23Z|00117|binding|INFO|Setting lport f65050ac-6a44-490a-b4b9-8c82c1f61630 ovn-installed in OVS
Jan 20 09:30:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:23.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:23 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:23Z|00118|binding|INFO|Setting lport f65050ac-6a44-490a-b4b9-8c82c1f61630 up in Southbound
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.065 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.079 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e774e6a6-1378-4583-afc7-5a8b9538c5aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02f86d1d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:08:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446195, 'reachable_time': 33326, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240526, 'error': None, 'target': 'ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.105 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[efd69c80-714f-4bae-974f-d540412ccf30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.159 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bcd3518d-e520-4628-844c-c7892f7d72b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.160 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02f86d1d-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.160 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.161 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02f86d1d-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.163 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:23 np0005588919 NetworkManager[49104]: <info>  [1768919423.1637] manager: (tap02f86d1d-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 20 09:30:23 np0005588919 kernel: tap02f86d1d-50: entered promiscuous mode
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.165 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.167 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap02f86d1d-50, col_values=(('external_ids', {'iface-id': '2f798c1c-f9b6-4141-904d-4124d05888ca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.169 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:23 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:23Z|00119|binding|INFO|Releasing lport 2f798c1c-f9b6-4141-904d-4124d05888ca from this chassis (sb_readonly=0)
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.170 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02f86d1d-5cad-49c5-9004-3de3e4739ad5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02f86d1d-5cad-49c5-9004-3de3e4739ad5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.171 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9790b14e-2ef5-4a85-8725-433dcc4d80b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.173 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-02f86d1d-5cad-49c5-9004-3de3e4739ad5
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/02f86d1d-5cad-49c5-9004-3de3e4739ad5.pid.haproxy
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 02f86d1d-5cad-49c5-9004-3de3e4739ad5
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:30:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:23.174 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5', 'env', 'PROCESS_TAG=haproxy-02f86d1d-5cad-49c5-9004-3de3e4739ad5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/02f86d1d-5cad-49c5-9004-3de3e4739ad5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.185 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.270 225859 DEBUG oslo_concurrency.processutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba" returned: 0 in 0.235s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.275 225859 DEBUG nova.virt.libvirt.driver [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.414 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919423.4144607, f3faf996-e066-4b11-b7f3-30aeffff726e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.420 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] VM Started (Lifecycle Event)#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.441 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.445 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919423.41456, f3faf996-e066-4b11-b7f3-30aeffff726e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.445 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.601 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:23 np0005588919 podman[240600]: 2026-01-20 14:30:23.507435139 +0000 UTC m=+0.028994196 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.605 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.634 225859 DEBUG nova.compute.manager [req-2ca9a36a-00d8-4ec1-afaa-0687a51a9397 req-50602aab-0cdb-49e4-9c52-2226624bc818 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.635 225859 DEBUG oslo_concurrency.lockutils [req-2ca9a36a-00d8-4ec1-afaa-0687a51a9397 req-50602aab-0cdb-49e4-9c52-2226624bc818 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.635 225859 DEBUG oslo_concurrency.lockutils [req-2ca9a36a-00d8-4ec1-afaa-0687a51a9397 req-50602aab-0cdb-49e4-9c52-2226624bc818 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.635 225859 DEBUG oslo_concurrency.lockutils [req-2ca9a36a-00d8-4ec1-afaa-0687a51a9397 req-50602aab-0cdb-49e4-9c52-2226624bc818 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.635 225859 DEBUG nova.compute.manager [req-2ca9a36a-00d8-4ec1-afaa-0687a51a9397 req-50602aab-0cdb-49e4-9c52-2226624bc818 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Processing event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.636 225859 DEBUG nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.637 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.640 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.640 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919423.6407187, f3faf996-e066-4b11-b7f3-30aeffff726e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.641 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.644 225859 INFO nova.virt.libvirt.driver [-] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Instance spawned successfully.#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.644 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.671 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:23 np0005588919 podman[240600]: 2026-01-20 14:30:23.673448177 +0000 UTC m=+0.195007214 container create d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.688 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.691 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.691 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.692 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.692 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.693 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.693 225859 DEBUG nova.virt.libvirt.driver [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:30:23 np0005588919 systemd[1]: Started libpod-conmon-d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f.scope.
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.725 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:30:23 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:30:23 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c2b8ddd195f8cf3eab1b6717aa28a9f762bb41de1ab6eed44d1d21f47344a69/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:30:23 np0005588919 podman[240600]: 2026-01-20 14:30:23.758057215 +0000 UTC m=+0.279616272 container init d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 09:30:23 np0005588919 podman[240600]: 2026-01-20 14:30:23.763729325 +0000 UTC m=+0.285288362 container start d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 09:30:23 np0005588919 neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5[240617]: [NOTICE]   (240621) : New worker (240623) forked
Jan 20 09:30:23 np0005588919 neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5[240617]: [NOTICE]   (240621) : Loading success.
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.782 225859 INFO nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Took 7.51 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.783 225859 DEBUG nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.843 225859 INFO nova.compute.manager [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Took 10.28 seconds to build instance.#033[00m
Jan 20 09:30:23 np0005588919 nova_compute[225855]: 2026-01-20 14:30:23.881 225859 DEBUG oslo_concurrency.lockutils [None req-ffb5ecbd-9a67-415f-a514-18a4e75e1236 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:24 np0005588919 nova_compute[225855]: 2026-01-20 14:30:24.021 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:24 np0005588919 nova_compute[225855]: 2026-01-20 14:30:24.384 225859 DEBUG nova.compute.manager [req-0300ed0f-b12a-41ea-a302-2b81b3e2e82f req-2da519f9-f123-43e5-8dfd-a2e15bcdfcb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received event network-vif-plugged-e848e00f-d594-4d70-9026-cd650417bf47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:24 np0005588919 nova_compute[225855]: 2026-01-20 14:30:24.385 225859 DEBUG oslo_concurrency.lockutils [req-0300ed0f-b12a-41ea-a302-2b81b3e2e82f req-2da519f9-f123-43e5-8dfd-a2e15bcdfcb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:24 np0005588919 nova_compute[225855]: 2026-01-20 14:30:24.385 225859 DEBUG oslo_concurrency.lockutils [req-0300ed0f-b12a-41ea-a302-2b81b3e2e82f req-2da519f9-f123-43e5-8dfd-a2e15bcdfcb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:24 np0005588919 nova_compute[225855]: 2026-01-20 14:30:24.386 225859 DEBUG oslo_concurrency.lockutils [req-0300ed0f-b12a-41ea-a302-2b81b3e2e82f req-2da519f9-f123-43e5-8dfd-a2e15bcdfcb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:24 np0005588919 nova_compute[225855]: 2026-01-20 14:30:24.386 225859 DEBUG nova.compute.manager [req-0300ed0f-b12a-41ea-a302-2b81b3e2e82f req-2da519f9-f123-43e5-8dfd-a2e15bcdfcb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] No waiting events found dispatching network-vif-plugged-e848e00f-d594-4d70-9026-cd650417bf47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:30:24 np0005588919 nova_compute[225855]: 2026-01-20 14:30:24.387 225859 WARNING nova.compute.manager [req-0300ed0f-b12a-41ea-a302-2b81b3e2e82f req-2da519f9-f123-43e5-8dfd-a2e15bcdfcb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received unexpected event network-vif-plugged-e848e00f-d594-4d70-9026-cd650417bf47 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:30:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:24.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:25.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:25 np0005588919 nova_compute[225855]: 2026-01-20 14:30:25.712 225859 DEBUG nova.compute.manager [req-80abcfc6-797e-4cf1-bd1a-136cccfc9592 req-65fb452a-b561-4f08-be4a-e8034101b770 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:25 np0005588919 nova_compute[225855]: 2026-01-20 14:30:25.712 225859 DEBUG oslo_concurrency.lockutils [req-80abcfc6-797e-4cf1-bd1a-136cccfc9592 req-65fb452a-b561-4f08-be4a-e8034101b770 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:25 np0005588919 nova_compute[225855]: 2026-01-20 14:30:25.713 225859 DEBUG oslo_concurrency.lockutils [req-80abcfc6-797e-4cf1-bd1a-136cccfc9592 req-65fb452a-b561-4f08-be4a-e8034101b770 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:25 np0005588919 nova_compute[225855]: 2026-01-20 14:30:25.713 225859 DEBUG oslo_concurrency.lockutils [req-80abcfc6-797e-4cf1-bd1a-136cccfc9592 req-65fb452a-b561-4f08-be4a-e8034101b770 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:25 np0005588919 nova_compute[225855]: 2026-01-20 14:30:25.713 225859 DEBUG nova.compute.manager [req-80abcfc6-797e-4cf1-bd1a-136cccfc9592 req-65fb452a-b561-4f08-be4a-e8034101b770 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] No waiting events found dispatching network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:30:25 np0005588919 nova_compute[225855]: 2026-01-20 14:30:25.713 225859 WARNING nova.compute.manager [req-80abcfc6-797e-4cf1-bd1a-136cccfc9592 req-65fb452a-b561-4f08-be4a-e8034101b770 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received unexpected event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:30:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:26.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:26 np0005588919 nova_compute[225855]: 2026-01-20 14:30:26.545 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:27.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:27 np0005588919 nova_compute[225855]: 2026-01-20 14:30:27.812 225859 DEBUG nova.compute.manager [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-changed-f65050ac-6a44-490a-b4b9-8c82c1f61630 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:27 np0005588919 nova_compute[225855]: 2026-01-20 14:30:27.812 225859 DEBUG nova.compute.manager [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Refreshing instance network info cache due to event network-changed-f65050ac-6a44-490a-b4b9-8c82c1f61630. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:30:27 np0005588919 nova_compute[225855]: 2026-01-20 14:30:27.812 225859 DEBUG oslo_concurrency.lockutils [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:27 np0005588919 nova_compute[225855]: 2026-01-20 14:30:27.813 225859 DEBUG oslo_concurrency.lockutils [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:27 np0005588919 nova_compute[225855]: 2026-01-20 14:30:27.813 225859 DEBUG nova.network.neutron [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Refreshing network info cache for port f65050ac-6a44-490a-b4b9-8c82c1f61630 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:30:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:28.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:29 np0005588919 nova_compute[225855]: 2026-01-20 14:30:29.024 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:29.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:29 np0005588919 nova_compute[225855]: 2026-01-20 14:30:29.159 225859 DEBUG nova.network.neutron [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Updated VIF entry in instance network info cache for port f65050ac-6a44-490a-b4b9-8c82c1f61630. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:30:29 np0005588919 nova_compute[225855]: 2026-01-20 14:30:29.160 225859 DEBUG nova.network.neutron [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Updating instance_info_cache with network_info: [{"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:29 np0005588919 nova_compute[225855]: 2026-01-20 14:30:29.183 225859 DEBUG oslo_concurrency.lockutils [req-5feb12f5-f148-4dd7-8e9e-4c1e1b23a535 req-cebe04e4-5266-4ccb-a562-4d3cd542d204 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:30.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:31.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:31 np0005588919 nova_compute[225855]: 2026-01-20 14:30:31.334 225859 DEBUG nova.compute.manager [req-00c5853e-97df-4614-8365-d715abbc4c54 req-75d7488c-648d-4568-995b-79396b8ca661 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received event network-changed-e848e00f-d594-4d70-9026-cd650417bf47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:31 np0005588919 nova_compute[225855]: 2026-01-20 14:30:31.336 225859 DEBUG nova.compute.manager [req-00c5853e-97df-4614-8365-d715abbc4c54 req-75d7488c-648d-4568-995b-79396b8ca661 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Refreshing instance network info cache due to event network-changed-e848e00f-d594-4d70-9026-cd650417bf47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:30:31 np0005588919 nova_compute[225855]: 2026-01-20 14:30:31.336 225859 DEBUG oslo_concurrency.lockutils [req-00c5853e-97df-4614-8365-d715abbc4c54 req-75d7488c-648d-4568-995b-79396b8ca661 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:31 np0005588919 nova_compute[225855]: 2026-01-20 14:30:31.337 225859 DEBUG oslo_concurrency.lockutils [req-00c5853e-97df-4614-8365-d715abbc4c54 req-75d7488c-648d-4568-995b-79396b8ca661 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:31 np0005588919 nova_compute[225855]: 2026-01-20 14:30:31.337 225859 DEBUG nova.network.neutron [req-00c5853e-97df-4614-8365-d715abbc4c54 req-75d7488c-648d-4568-995b-79396b8ca661 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Refreshing network info cache for port e848e00f-d594-4d70-9026-cd650417bf47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:30:31 np0005588919 nova_compute[225855]: 2026-01-20 14:30:31.549 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:32.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:32 np0005588919 nova_compute[225855]: 2026-01-20 14:30:32.407 225859 DEBUG nova.network.neutron [req-00c5853e-97df-4614-8365-d715abbc4c54 req-75d7488c-648d-4568-995b-79396b8ca661 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Updated VIF entry in instance network info cache for port e848e00f-d594-4d70-9026-cd650417bf47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:30:32 np0005588919 nova_compute[225855]: 2026-01-20 14:30:32.408 225859 DEBUG nova.network.neutron [req-00c5853e-97df-4614-8365-d715abbc4c54 req-75d7488c-648d-4568-995b-79396b8ca661 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Updating instance_info_cache with network_info: [{"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:32 np0005588919 nova_compute[225855]: 2026-01-20 14:30:32.426 225859 DEBUG oslo_concurrency.lockutils [req-00c5853e-97df-4614-8365-d715abbc4c54 req-75d7488c-648d-4568-995b-79396b8ca661 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:33.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:33 np0005588919 nova_compute[225855]: 2026-01-20 14:30:33.315 225859 DEBUG nova.virt.libvirt.driver [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:30:34 np0005588919 nova_compute[225855]: 2026-01-20 14:30:34.025 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:34 np0005588919 podman[240637]: 2026-01-20 14:30:34.068706119 +0000 UTC m=+0.104554601 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:30:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:34.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:35.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:35 np0005588919 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 20 09:30:35 np0005588919 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001d.scope: Consumed 13.256s CPU time.
Jan 20 09:30:35 np0005588919 systemd-machined[194361]: Machine qemu-14-instance-0000001d terminated.
Jan 20 09:30:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:36Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:89:9f:c4 10.100.0.5
Jan 20 09:30:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:36Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:89:9f:c4 10.100.0.5
Jan 20 09:30:36 np0005588919 nova_compute[225855]: 2026-01-20 14:30:36.328 225859 INFO nova.virt.libvirt.driver [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 09:30:36 np0005588919 nova_compute[225855]: 2026-01-20 14:30:36.335 225859 INFO nova.virt.libvirt.driver [-] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance destroyed successfully.#033[00m
Jan 20 09:30:36 np0005588919 nova_compute[225855]: 2026-01-20 14:30:36.340 225859 DEBUG nova.virt.libvirt.driver [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:30:36 np0005588919 nova_compute[225855]: 2026-01-20 14:30:36.341 225859 DEBUG nova.virt.libvirt.driver [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:30:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:36.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:36 np0005588919 nova_compute[225855]: 2026-01-20 14:30:36.439 225859 DEBUG oslo_concurrency.lockutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Acquiring lock "d95ca690-20e1-4b0c-919b-d64c9af25eba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:36 np0005588919 nova_compute[225855]: 2026-01-20 14:30:36.440 225859 DEBUG oslo_concurrency.lockutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lock "d95ca690-20e1-4b0c-919b-d64c9af25eba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:36 np0005588919 nova_compute[225855]: 2026-01-20 14:30:36.440 225859 DEBUG oslo_concurrency.lockutils [None req-c9b86ee8-0957-4e69-a1fc-4cc0a2f42a3b d4b36d8e19cb4f529d2185f573f5072a 2074a786307f4427bbbbc1103d4a9305 - - default default] Lock "d95ca690-20e1-4b0c-919b-d64c9af25eba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:36 np0005588919 nova_compute[225855]: 2026-01-20 14:30:36.552 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:37.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:38 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:38Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fc:ae:50 10.100.0.8
Jan 20 09:30:38 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:38Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fc:ae:50 10.100.0.8
Jan 20 09:30:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:38.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:38 np0005588919 nova_compute[225855]: 2026-01-20 14:30:38.651 225859 DEBUG nova.compute.manager [req-6fbe3102-c26e-4376-bbf8-1f793a1760f9 req-446c2998-ba43-415f-a04c-3da6ba039f0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received event network-changed-e848e00f-d594-4d70-9026-cd650417bf47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:38 np0005588919 nova_compute[225855]: 2026-01-20 14:30:38.652 225859 DEBUG nova.compute.manager [req-6fbe3102-c26e-4376-bbf8-1f793a1760f9 req-446c2998-ba43-415f-a04c-3da6ba039f0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Refreshing instance network info cache due to event network-changed-e848e00f-d594-4d70-9026-cd650417bf47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:30:38 np0005588919 nova_compute[225855]: 2026-01-20 14:30:38.652 225859 DEBUG oslo_concurrency.lockutils [req-6fbe3102-c26e-4376-bbf8-1f793a1760f9 req-446c2998-ba43-415f-a04c-3da6ba039f0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:38 np0005588919 nova_compute[225855]: 2026-01-20 14:30:38.653 225859 DEBUG oslo_concurrency.lockutils [req-6fbe3102-c26e-4376-bbf8-1f793a1760f9 req-446c2998-ba43-415f-a04c-3da6ba039f0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:38 np0005588919 nova_compute[225855]: 2026-01-20 14:30:38.653 225859 DEBUG nova.network.neutron [req-6fbe3102-c26e-4376-bbf8-1f793a1760f9 req-446c2998-ba43-415f-a04c-3da6ba039f0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Refreshing network info cache for port e848e00f-d594-4d70-9026-cd650417bf47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:30:39 np0005588919 nova_compute[225855]: 2026-01-20 14:30:39.028 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:39.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e163 e163: 3 total, 3 up, 3 in
Jan 20 09:30:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:40.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.808 225859 DEBUG oslo_concurrency.lockutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "d08682f8-72ef-462c-b4b7-044cf16fc193" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.809 225859 DEBUG oslo_concurrency.lockutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.809 225859 DEBUG oslo_concurrency.lockutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.810 225859 DEBUG oslo_concurrency.lockutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.810 225859 DEBUG oslo_concurrency.lockutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.812 225859 INFO nova.compute.manager [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Terminating instance#033[00m
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.814 225859 DEBUG nova.compute.manager [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.865 225859 DEBUG nova.network.neutron [req-6fbe3102-c26e-4376-bbf8-1f793a1760f9 req-446c2998-ba43-415f-a04c-3da6ba039f0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Updated VIF entry in instance network info cache for port e848e00f-d594-4d70-9026-cd650417bf47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.865 225859 DEBUG nova.network.neutron [req-6fbe3102-c26e-4376-bbf8-1f793a1760f9 req-446c2998-ba43-415f-a04c-3da6ba039f0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Updating instance_info_cache with network_info: [{"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:40 np0005588919 kernel: tape848e00f-d5 (unregistering): left promiscuous mode
Jan 20 09:30:40 np0005588919 NetworkManager[49104]: <info>  [1768919440.8779] device (tape848e00f-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:30:40 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:40Z|00120|binding|INFO|Releasing lport e848e00f-d594-4d70-9026-cd650417bf47 from this chassis (sb_readonly=0)
Jan 20 09:30:40 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:40Z|00121|binding|INFO|Setting lport e848e00f-d594-4d70-9026-cd650417bf47 down in Southbound
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.884 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:40 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:40Z|00122|binding|INFO|Removing iface tape848e00f-d5 ovn-installed in OVS
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.888 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.893 225859 DEBUG oslo_concurrency.lockutils [req-6fbe3102-c26e-4376-bbf8-1f793a1760f9 req-446c2998-ba43-415f-a04c-3da6ba039f0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d08682f8-72ef-462c-b4b7-044cf16fc193" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:40.895 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:9f:c4 10.100.0.5'], port_security=['fa:16:3e:89:9f:c4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd08682f8-72ef-462c-b4b7-044cf16fc193', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01e6deef-9aca-4d36-8215-4517982a86a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f151250c04467bb4f6a229dda16fc5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dd36e8d2-993a-4618-8fff-62abafaadfd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86700b79-bb44-47f0-88a5-d4c8eda3acbb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=e848e00f-d594-4d70-9026-cd650417bf47) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:30:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:40.897 140354 INFO neutron.agent.ovn.metadata.agent [-] Port e848e00f-d594-4d70-9026-cd650417bf47 in datapath 01e6deef-9aca-4d36-8215-4517982a86a3 unbound from our chassis#033[00m
Jan 20 09:30:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:40.900 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01e6deef-9aca-4d36-8215-4517982a86a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:30:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:40.902 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b9cc37aa-7132-46d0-84cc-d27cc5b5a9f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:40.903 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3 namespace which is not needed anymore#033[00m
Jan 20 09:30:40 np0005588919 nova_compute[225855]: 2026-01-20 14:30:40.925 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:40 np0005588919 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Jan 20 09:30:40 np0005588919 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001f.scope: Consumed 13.474s CPU time.
Jan 20 09:30:40 np0005588919 systemd-machined[194361]: Machine qemu-15-instance-0000001f terminated.
Jan 20 09:30:40 np0005588919 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.054 225859 INFO nova.virt.libvirt.driver [-] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Instance destroyed successfully.#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.056 225859 DEBUG nova.objects.instance [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lazy-loading 'resources' on Instance uuid d08682f8-72ef-462c-b4b7-044cf16fc193 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.073 225859 DEBUG nova.virt.libvirt.vif [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-952138345',display_name='tempest-FloatingIPsAssociationTestJSON-server-952138345',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-952138345',id=31,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:30:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='78f151250c04467bb4f6a229dda16fc5',ramdisk_id='',reservation_id='r-bkzlkdyv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-146254261',owner_user_name='tempest-FloatingIPsAssociationTestJSON-146254261-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:30:22Z,user_data=None,user_id='0cec872a00f742d78563d6d16fc545cb',uuid=d08682f8-72ef-462c-b4b7-044cf16fc193,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.074 225859 DEBUG nova.network.os_vif_util [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converting VIF {"id": "e848e00f-d594-4d70-9026-cd650417bf47", "address": "fa:16:3e:89:9f:c4", "network": {"id": "01e6deef-9aca-4d36-8215-4517982a86a3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-70171378-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f151250c04467bb4f6a229dda16fc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape848e00f-d5", "ovs_interfaceid": "e848e00f-d594-4d70-9026-cd650417bf47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.075 225859 DEBUG nova.network.os_vif_util [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:89:9f:c4,bridge_name='br-int',has_traffic_filtering=True,id=e848e00f-d594-4d70-9026-cd650417bf47,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape848e00f-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.075 225859 DEBUG os_vif [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:9f:c4,bridge_name='br-int',has_traffic_filtering=True,id=e848e00f-d594-4d70-9026-cd650417bf47,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape848e00f-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.076 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.077 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape848e00f-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.079 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.081 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.083 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.087 225859 INFO os_vif [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:9f:c4,bridge_name='br-int',has_traffic_filtering=True,id=e848e00f-d594-4d70-9026-cd650417bf47,network=Network(01e6deef-9aca-4d36-8215-4517982a86a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape848e00f-d5')#033[00m
Jan 20 09:30:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:41.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:41 np0005588919 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[240438]: [NOTICE]   (240460) : haproxy version is 2.8.14-c23fe91
Jan 20 09:30:41 np0005588919 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[240438]: [NOTICE]   (240460) : path to executable is /usr/sbin/haproxy
Jan 20 09:30:41 np0005588919 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[240438]: [WARNING]  (240460) : Exiting Master process...
Jan 20 09:30:41 np0005588919 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[240438]: [ALERT]    (240460) : Current worker (240463) exited with code 143 (Terminated)
Jan 20 09:30:41 np0005588919 neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3[240438]: [WARNING]  (240460) : All workers exited. Exiting... (0)
Jan 20 09:30:41 np0005588919 systemd[1]: libpod-5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6.scope: Deactivated successfully.
Jan 20 09:30:41 np0005588919 podman[240743]: 2026-01-20 14:30:41.104568905 +0000 UTC m=+0.071630205 container died 5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:30:41 np0005588919 systemd[1]: var-lib-containers-storage-overlay-3c6a841c35edcc3f8a6f139ec1d95e7afdb44a50568f11f14c28989dc0834ec4-merged.mount: Deactivated successfully.
Jan 20 09:30:41 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6-userdata-shm.mount: Deactivated successfully.
Jan 20 09:30:41 np0005588919 podman[240743]: 2026-01-20 14:30:41.157706289 +0000 UTC m=+0.124767569 container cleanup 5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:30:41 np0005588919 systemd[1]: libpod-conmon-5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6.scope: Deactivated successfully.
Jan 20 09:30:41 np0005588919 podman[240801]: 2026-01-20 14:30:41.233001266 +0000 UTC m=+0.052978861 container remove 5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:30:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:41.240 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7d81d470-1833-48db-838c-3f5fd3fcc975]: (4, ('Tue Jan 20 02:30:41 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3 (5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6)\n5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6\nTue Jan 20 02:30:41 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3 (5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6)\n5d6a44e3e459bb355cdf6f9a47a3cf4c054c368c8501bc071f9e3e640145abc6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:41.242 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ea43f60b-752e-4341-97ca-068215ae09f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:41.243 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01e6deef-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.245 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:41 np0005588919 kernel: tap01e6deef-90: left promiscuous mode
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.262 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:41.266 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ffadfbec-5911-4589-9698-4bacd5f3306c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:41.282 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c9592f51-b119-443e-8df2-0707ac53e830]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:41.283 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ee09ed13-b0e7-4d37-9d51-8531b76900e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:41.300 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[97026f3f-30ce-43ed-809c-3f6ccc8ba5dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446051, 'reachable_time': 23571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240816, 'error': None, 'target': 'ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:41.303 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-01e6deef-9aca-4d36-8215-4517982a86a3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:30:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:41.303 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[2afd49a1-396f-49f2-98c1-792f469e4345]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:30:41 np0005588919 systemd[1]: run-netns-ovnmeta\x2d01e6deef\x2d9aca\x2d4d36\x2d8215\x2d4517982a86a3.mount: Deactivated successfully.
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.497 225859 INFO nova.virt.libvirt.driver [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Deleting instance files /var/lib/nova/instances/d08682f8-72ef-462c-b4b7-044cf16fc193_del#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.498 225859 INFO nova.virt.libvirt.driver [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Deletion of /var/lib/nova/instances/d08682f8-72ef-462c-b4b7-044cf16fc193_del complete#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.578 225859 INFO nova.compute.manager [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Took 0.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.579 225859 DEBUG oslo.service.loopingcall [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.579 225859 DEBUG nova.compute.manager [-] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.579 225859 DEBUG nova.network.neutron [-] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.737 225859 DEBUG nova.compute.manager [req-069e3248-ffcc-4ef3-b5c9-20fa54841ad8 req-8559ef34-6398-4700-8212-25fa2d987a16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received event network-vif-unplugged-e848e00f-d594-4d70-9026-cd650417bf47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.737 225859 DEBUG oslo_concurrency.lockutils [req-069e3248-ffcc-4ef3-b5c9-20fa54841ad8 req-8559ef34-6398-4700-8212-25fa2d987a16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.738 225859 DEBUG oslo_concurrency.lockutils [req-069e3248-ffcc-4ef3-b5c9-20fa54841ad8 req-8559ef34-6398-4700-8212-25fa2d987a16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.738 225859 DEBUG oslo_concurrency.lockutils [req-069e3248-ffcc-4ef3-b5c9-20fa54841ad8 req-8559ef34-6398-4700-8212-25fa2d987a16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.739 225859 DEBUG nova.compute.manager [req-069e3248-ffcc-4ef3-b5c9-20fa54841ad8 req-8559ef34-6398-4700-8212-25fa2d987a16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] No waiting events found dispatching network-vif-unplugged-e848e00f-d594-4d70-9026-cd650417bf47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:30:41 np0005588919 nova_compute[225855]: 2026-01-20 14:30:41.739 225859 DEBUG nova.compute.manager [req-069e3248-ffcc-4ef3-b5c9-20fa54841ad8 req-8559ef34-6398-4700-8212-25fa2d987a16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received event network-vif-unplugged-e848e00f-d594-4d70-9026-cd650417bf47 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:30:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:42.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:42 np0005588919 nova_compute[225855]: 2026-01-20 14:30:42.445 225859 DEBUG nova.network.neutron [-] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:42 np0005588919 nova_compute[225855]: 2026-01-20 14:30:42.485 225859 INFO nova.compute.manager [-] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Took 0.91 seconds to deallocate network for instance.#033[00m
Jan 20 09:30:42 np0005588919 nova_compute[225855]: 2026-01-20 14:30:42.558 225859 DEBUG oslo_concurrency.lockutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:42 np0005588919 nova_compute[225855]: 2026-01-20 14:30:42.559 225859 DEBUG oslo_concurrency.lockutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:42 np0005588919 nova_compute[225855]: 2026-01-20 14:30:42.637 225859 DEBUG nova.compute.manager [req-c15082bf-cb54-45d2-99f9-c1112a9f7ec7 req-8085d03c-b330-4294-a9dc-8420f2a28abb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received event network-vif-deleted-e848e00f-d594-4d70-9026-cd650417bf47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:42 np0005588919 nova_compute[225855]: 2026-01-20 14:30:42.669 225859 DEBUG oslo_concurrency.processutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:43.049 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.049 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:43.051 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:30:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2733108454' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.084 225859 DEBUG oslo_concurrency.processutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.090 225859 DEBUG nova.compute.provider_tree [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:30:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:43.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.111 225859 DEBUG nova.scheduler.client.report [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.134 225859 DEBUG oslo_concurrency.lockutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.169 225859 INFO nova.scheduler.client.report [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Deleted allocations for instance d08682f8-72ef-462c-b4b7-044cf16fc193#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.252 225859 DEBUG oslo_concurrency.lockutils [None req-75591914-9108-47b2-a433-16634e89d62b 0cec872a00f742d78563d6d16fc545cb 78f151250c04467bb4f6a229dda16fc5 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.846 225859 DEBUG nova.compute.manager [req-ee16a478-3fbe-470c-9254-d5c8b751d4d9 req-9c62e640-afde-4360-9fb0-56786d0cfbe8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received event network-vif-plugged-e848e00f-d594-4d70-9026-cd650417bf47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.846 225859 DEBUG oslo_concurrency.lockutils [req-ee16a478-3fbe-470c-9254-d5c8b751d4d9 req-9c62e640-afde-4360-9fb0-56786d0cfbe8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.847 225859 DEBUG oslo_concurrency.lockutils [req-ee16a478-3fbe-470c-9254-d5c8b751d4d9 req-9c62e640-afde-4360-9fb0-56786d0cfbe8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.848 225859 DEBUG oslo_concurrency.lockutils [req-ee16a478-3fbe-470c-9254-d5c8b751d4d9 req-9c62e640-afde-4360-9fb0-56786d0cfbe8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d08682f8-72ef-462c-b4b7-044cf16fc193-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.848 225859 DEBUG nova.compute.manager [req-ee16a478-3fbe-470c-9254-d5c8b751d4d9 req-9c62e640-afde-4360-9fb0-56786d0cfbe8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] No waiting events found dispatching network-vif-plugged-e848e00f-d594-4d70-9026-cd650417bf47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:30:43 np0005588919 nova_compute[225855]: 2026-01-20 14:30:43.849 225859 WARNING nova.compute.manager [req-ee16a478-3fbe-470c-9254-d5c8b751d4d9 req-9c62e640-afde-4360-9fb0-56786d0cfbe8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Received unexpected event network-vif-plugged-e848e00f-d594-4d70-9026-cd650417bf47 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:30:44 np0005588919 nova_compute[225855]: 2026-01-20 14:30:44.030 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:44 np0005588919 podman[240842]: 2026-01-20 14:30:44.034682472 +0000 UTC m=+0.067238421 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:30:44 np0005588919 nova_compute[225855]: 2026-01-20 14:30:44.291 225859 INFO nova.compute.manager [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Swapping old allocation on dict_keys(['bbb02880-a710-4ac1-8b2c-5c09765848d1']) held by migration 60518ded-c4ce-45b7-a976-2c06150ab129 for instance#033[00m
Jan 20 09:30:44 np0005588919 nova_compute[225855]: 2026-01-20 14:30:44.375 225859 DEBUG nova.scheduler.client.report [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Overwriting current allocation {'allocations': {'068db7fd-4bd6-45a9-8bd6-a22cfe7596ed': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 24}}, 'project_id': 'f3c2e72a7148496394c8bcd618a19c80', 'user_id': '01a3d712f05049b19d4ecc7051720ad5', 'consumer_generation': 1} on consumer d95ca690-20e1-4b0c-919b-d64c9af25eba move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 20 09:30:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:44.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:44 np0005588919 nova_compute[225855]: 2026-01-20 14:30:44.606 225859 DEBUG oslo_concurrency.lockutils [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "refresh_cache-d95ca690-20e1-4b0c-919b-d64c9af25eba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:44 np0005588919 nova_compute[225855]: 2026-01-20 14:30:44.607 225859 DEBUG oslo_concurrency.lockutils [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquired lock "refresh_cache-d95ca690-20e1-4b0c-919b-d64c9af25eba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:44 np0005588919 nova_compute[225855]: 2026-01-20 14:30:44.607 225859 DEBUG nova.network.neutron [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:30:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:44 np0005588919 nova_compute[225855]: 2026-01-20 14:30:44.829 225859 DEBUG nova.network.neutron [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:30:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:45.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:45 np0005588919 nova_compute[225855]: 2026-01-20 14:30:45.138 225859 DEBUG nova.network.neutron [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:45 np0005588919 nova_compute[225855]: 2026-01-20 14:30:45.153 225859 DEBUG oslo_concurrency.lockutils [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Releasing lock "refresh_cache-d95ca690-20e1-4b0c-919b-d64c9af25eba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:45 np0005588919 nova_compute[225855]: 2026-01-20 14:30:45.154 225859 DEBUG nova.virt.libvirt.driver [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 20 09:30:45 np0005588919 nova_compute[225855]: 2026-01-20 14:30:45.248 225859 DEBUG nova.storage.rbd_utils [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] rolling back rbd image(d95ca690-20e1-4b0c-919b-d64c9af25eba_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 20 09:30:45 np0005588919 nova_compute[225855]: 2026-01-20 14:30:45.374 225859 DEBUG nova.storage.rbd_utils [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] removing snapshot(nova-resize) on rbd image(d95ca690-20e1-4b0c-919b-d64c9af25eba_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.080 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:46.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e164 e164: 3 total, 3 up, 3 in
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.941 225859 DEBUG nova.virt.libvirt.driver [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.948 225859 WARNING nova.virt.libvirt.driver [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.956 225859 DEBUG nova.virt.libvirt.host [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.957 225859 DEBUG nova.virt.libvirt.host [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.961 225859 DEBUG nova.virt.libvirt.host [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.962 225859 DEBUG nova.virt.libvirt.host [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.964 225859 DEBUG nova.virt.libvirt.driver [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.965 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.966 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.966 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.967 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.967 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.968 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.969 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.969 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.970 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.971 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.971 225859 DEBUG nova.virt.hardware [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.972 225859 DEBUG nova.objects.instance [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'vcpu_model' on Instance uuid d95ca690-20e1-4b0c-919b-d64c9af25eba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:46 np0005588919 nova_compute[225855]: 2026-01-20 14:30:46.992 225859 DEBUG oslo_concurrency.processutils [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:47.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:30:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3386445853' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:30:47 np0005588919 nova_compute[225855]: 2026-01-20 14:30:47.479 225859 DEBUG oslo_concurrency.processutils [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:47 np0005588919 nova_compute[225855]: 2026-01-20 14:30:47.510 225859 DEBUG oslo_concurrency.processutils [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:30:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/462858156' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:30:47 np0005588919 nova_compute[225855]: 2026-01-20 14:30:47.932 225859 DEBUG oslo_concurrency.processutils [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:47 np0005588919 nova_compute[225855]: 2026-01-20 14:30:47.937 225859 DEBUG nova.virt.libvirt.driver [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <uuid>d95ca690-20e1-4b0c-919b-d64c9af25eba</uuid>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <name>instance-0000001d</name>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <nova:name>tempest-MigrationsAdminTest-server-1542965426</nova:name>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:30:46</nova:creationTime>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <nova:user uuid="01a3d712f05049b19d4ecc7051720ad5">tempest-MigrationsAdminTest-1518611738-project-member</nova:user>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <nova:project uuid="f3c2e72a7148496394c8bcd618a19c80">tempest-MigrationsAdminTest-1518611738</nova:project>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <entry name="serial">d95ca690-20e1-4b0c-919b-d64c9af25eba</entry>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <entry name="uuid">d95ca690-20e1-4b0c-919b-d64c9af25eba</entry>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d95ca690-20e1-4b0c-919b-d64c9af25eba_disk">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d95ca690-20e1-4b0c-919b-d64c9af25eba_disk.config">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba/console.log" append="off"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <input type="keyboard" bus="usb"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:30:47 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:30:47 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:30:47 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:30:47 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:30:48 np0005588919 systemd-machined[194361]: New machine qemu-17-instance-0000001d.
Jan 20 09:30:48 np0005588919 systemd[1]: Started Virtual Machine qemu-17-instance-0000001d.
Jan 20 09:30:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:48.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.625 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for d95ca690-20e1-4b0c-919b-d64c9af25eba due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.626 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919448.6250632, d95ca690-20e1-4b0c-919b-d64c9af25eba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.626 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.630 225859 DEBUG nova.compute.manager [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.636 225859 INFO nova.virt.libvirt.driver [-] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance running successfully.#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.637 225859 DEBUG nova.virt.libvirt.driver [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.646 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.651 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.674 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.674 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919448.6295831, d95ca690-20e1-4b0c-919b-d64c9af25eba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.675 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] VM Started (Lifecycle Event)#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.712 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.717 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.735 225859 INFO nova.compute.manager [None req-789e21ba-eed3-4ca4-aee2-f814206d5b1e 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Updating instance to original state: 'active'#033[00m
Jan 20 09:30:48 np0005588919 nova_compute[225855]: 2026-01-20 14:30:48.743 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 20 09:30:49 np0005588919 nova_compute[225855]: 2026-01-20 14:30:49.032 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:30:49.052 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:30:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:49.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:50 np0005588919 nova_compute[225855]: 2026-01-20 14:30:50.342 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "d95ca690-20e1-4b0c-919b-d64c9af25eba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:50 np0005588919 nova_compute[225855]: 2026-01-20 14:30:50.343 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "d95ca690-20e1-4b0c-919b-d64c9af25eba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:50 np0005588919 nova_compute[225855]: 2026-01-20 14:30:50.343 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "d95ca690-20e1-4b0c-919b-d64c9af25eba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:50 np0005588919 nova_compute[225855]: 2026-01-20 14:30:50.343 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "d95ca690-20e1-4b0c-919b-d64c9af25eba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:50 np0005588919 nova_compute[225855]: 2026-01-20 14:30:50.343 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "d95ca690-20e1-4b0c-919b-d64c9af25eba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:50 np0005588919 nova_compute[225855]: 2026-01-20 14:30:50.344 225859 INFO nova.compute.manager [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Terminating instance#033[00m
Jan 20 09:30:50 np0005588919 nova_compute[225855]: 2026-01-20 14:30:50.345 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "refresh_cache-d95ca690-20e1-4b0c-919b-d64c9af25eba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:30:50 np0005588919 nova_compute[225855]: 2026-01-20 14:30:50.345 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquired lock "refresh_cache-d95ca690-20e1-4b0c-919b-d64c9af25eba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:30:50 np0005588919 nova_compute[225855]: 2026-01-20 14:30:50.345 225859 DEBUG nova.network.neutron [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:30:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:50.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:50 np0005588919 nova_compute[225855]: 2026-01-20 14:30:50.489 225859 DEBUG nova.network.neutron [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:30:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:51.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:51 np0005588919 nova_compute[225855]: 2026-01-20 14:30:51.136 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:51 np0005588919 nova_compute[225855]: 2026-01-20 14:30:51.461 225859 DEBUG nova.network.neutron [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:51 np0005588919 nova_compute[225855]: 2026-01-20 14:30:51.476 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Releasing lock "refresh_cache-d95ca690-20e1-4b0c-919b-d64c9af25eba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:30:51 np0005588919 nova_compute[225855]: 2026-01-20 14:30:51.477 225859 DEBUG nova.compute.manager [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:30:51 np0005588919 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 20 09:30:51 np0005588919 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001d.scope: Consumed 3.616s CPU time.
Jan 20 09:30:51 np0005588919 systemd-machined[194361]: Machine qemu-17-instance-0000001d terminated.
Jan 20 09:30:51 np0005588919 nova_compute[225855]: 2026-01-20 14:30:51.698 225859 INFO nova.virt.libvirt.driver [-] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance destroyed successfully.#033[00m
Jan 20 09:30:51 np0005588919 nova_compute[225855]: 2026-01-20 14:30:51.699 225859 DEBUG nova.objects.instance [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'resources' on Instance uuid d95ca690-20e1-4b0c-919b-d64c9af25eba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:30:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:52.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:52 np0005588919 nova_compute[225855]: 2026-01-20 14:30:52.608 225859 INFO nova.virt.libvirt.driver [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Deleting instance files /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba_del#033[00m
Jan 20 09:30:52 np0005588919 nova_compute[225855]: 2026-01-20 14:30:52.609 225859 INFO nova.virt.libvirt.driver [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Deletion of /var/lib/nova/instances/d95ca690-20e1-4b0c-919b-d64c9af25eba_del complete#033[00m
Jan 20 09:30:52 np0005588919 nova_compute[225855]: 2026-01-20 14:30:52.691 225859 INFO nova.compute.manager [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Took 1.21 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:30:52 np0005588919 nova_compute[225855]: 2026-01-20 14:30:52.691 225859 DEBUG oslo.service.loopingcall [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:30:52 np0005588919 nova_compute[225855]: 2026-01-20 14:30:52.692 225859 DEBUG nova.compute.manager [-] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:30:52 np0005588919 nova_compute[225855]: 2026-01-20 14:30:52.692 225859 DEBUG nova.network.neutron [-] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.013 225859 DEBUG nova.network.neutron [-] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.031 225859 DEBUG nova.network.neutron [-] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.043 225859 INFO nova.compute.manager [-] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Took 0.35 seconds to deallocate network for instance.#033[00m
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.084 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.084 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:30:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:30:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:53.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.206 225859 DEBUG oslo_concurrency.processutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:30:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:30:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1259635317' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.638 225859 DEBUG oslo_concurrency.processutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.643 225859 DEBUG nova.compute.provider_tree [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.670 225859 DEBUG nova.scheduler.client.report [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.697 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.733 225859 INFO nova.scheduler.client.report [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Deleted allocations for instance d95ca690-20e1-4b0c-919b-d64c9af25eba#033[00m
Jan 20 09:30:53 np0005588919 nova_compute[225855]: 2026-01-20 14:30:53.803 225859 DEBUG oslo_concurrency.lockutils [None req-b4645b9f-f5f3-448f-b29a-cb1a0fe3617b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "d95ca690-20e1-4b0c-919b-d64c9af25eba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:30:54 np0005588919 nova_compute[225855]: 2026-01-20 14:30:54.083 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:30:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:54.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:30:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:30:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:55.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:56 np0005588919 nova_compute[225855]: 2026-01-20 14:30:56.050 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919441.049494, d08682f8-72ef-462c-b4b7-044cf16fc193 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:30:56 np0005588919 nova_compute[225855]: 2026-01-20 14:30:56.051 225859 INFO nova.compute.manager [-] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:30:56 np0005588919 nova_compute[225855]: 2026-01-20 14:30:56.070 225859 DEBUG nova.compute.manager [None req-4108f74c-21b9-4dfb-8410-306005be1d76 - - - - - -] [instance: d08682f8-72ef-462c-b4b7-044cf16fc193] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:30:56 np0005588919 nova_compute[225855]: 2026-01-20 14:30:56.140 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:56.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 e165: 3 total, 3 up, 3 in
Jan 20 09:30:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:57.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:30:58.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:30:58Z|00123|binding|INFO|Releasing lport 2f798c1c-f9b6-4141-904d-4124d05888ca from this chassis (sb_readonly=0)
Jan 20 09:30:59 np0005588919 nova_compute[225855]: 2026-01-20 14:30:59.007 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:59 np0005588919 nova_compute[225855]: 2026-01-20 14:30:59.086 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:30:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:30:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:30:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:30:59.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:30:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:00.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:01.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:01 np0005588919 nova_compute[225855]: 2026-01-20 14:31:01.179 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:31:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3285058670' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:31:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:02.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:02 np0005588919 nova_compute[225855]: 2026-01-20 14:31:02.909 225859 DEBUG oslo_concurrency.lockutils [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:02 np0005588919 nova_compute[225855]: 2026-01-20 14:31:02.909 225859 DEBUG oslo_concurrency.lockutils [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:02 np0005588919 nova_compute[225855]: 2026-01-20 14:31:02.931 225859 DEBUG nova.objects.instance [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lazy-loading 'flavor' on Instance uuid f3faf996-e066-4b11-b7f3-30aeffff726e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:03.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:03 np0005588919 nova_compute[225855]: 2026-01-20 14:31:03.360 225859 DEBUG oslo_concurrency.lockutils [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.450s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.025 225859 DEBUG oslo_concurrency.lockutils [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.025 225859 DEBUG oslo_concurrency.lockutils [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.025 225859 INFO nova.compute.manager [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Attaching volume 73c5b3f0-c4ca-48f3-9dc2-d2c15d3fd745 to /dev/vdb#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.089 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.308 225859 DEBUG os_brick.utils [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.309 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.324 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.324 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[18502c70-0055-46c8-a72b-696d21294f22]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.326 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.335 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.336 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb344e0-43be-46e2-83b5-139ff61a28ed]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.338 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.348 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.348 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[297a0143-ee00-487b-beb5-3b6476bd7f00]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.350 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[aee7ee31-846f-44a4-babf-e372b36e4fd2]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.351 225859 DEBUG oslo_concurrency.processutils [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.380 225859 DEBUG oslo_concurrency.processutils [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.385 225859 DEBUG os_brick.initiator.connectors.lightos [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.386 225859 DEBUG os_brick.initiator.connectors.lightos [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.386 225859 DEBUG os_brick.initiator.connectors.lightos [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.387 225859 DEBUG os_brick.utils [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] <== get_connector_properties: return (78ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:31:04 np0005588919 nova_compute[225855]: 2026-01-20 14:31:04.388 225859 DEBUG nova.virt.block_device [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Updating existing volume attachment record: 1b82c01d-d5c1-48cc-9d9f-078b75fe40c6 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:31:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:04.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:05 np0005588919 podman[241149]: 2026-01-20 14:31:05.105109255 +0000 UTC m=+0.132983888 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:31:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:05.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:05 np0005588919 nova_compute[225855]: 2026-01-20 14:31:05.850 225859 DEBUG nova.objects.instance [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lazy-loading 'flavor' on Instance uuid f3faf996-e066-4b11-b7f3-30aeffff726e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:05 np0005588919 nova_compute[225855]: 2026-01-20 14:31:05.908 225859 DEBUG nova.virt.libvirt.driver [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Attempting to attach volume 73c5b3f0-c4ca-48f3-9dc2-d2c15d3fd745 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 09:31:05 np0005588919 nova_compute[225855]: 2026-01-20 14:31:05.912 225859 DEBUG nova.virt.libvirt.guest [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:31:05 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:31:05 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-73c5b3f0-c4ca-48f3-9dc2-d2c15d3fd745">
Jan 20 09:31:05 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:31:05 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:31:05 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:31:05 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:31:05 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 09:31:05 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:31:05 np0005588919 nova_compute[225855]:  </auth>
Jan 20 09:31:05 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:31:05 np0005588919 nova_compute[225855]:  <serial>73c5b3f0-c4ca-48f3-9dc2-d2c15d3fd745</serial>
Jan 20 09:31:05 np0005588919 nova_compute[225855]:  <shareable/>
Jan 20 09:31:05 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:31:05 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:31:06 np0005588919 nova_compute[225855]: 2026-01-20 14:31:06.183 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:06.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:06 np0005588919 nova_compute[225855]: 2026-01-20 14:31:06.698 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919451.6964307, d95ca690-20e1-4b0c-919b-d64c9af25eba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:31:06 np0005588919 nova_compute[225855]: 2026-01-20 14:31:06.698 225859 INFO nova.compute.manager [-] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:31:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:07.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.388 225859 DEBUG nova.compute.manager [None req-b95d732f-1d9c-4501-b569-5c03b1099505 - - - - - -] [instance: d95ca690-20e1-4b0c-919b-d64c9af25eba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.418 225859 DEBUG nova.virt.libvirt.driver [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.419 225859 DEBUG nova.virt.libvirt.driver [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.419 225859 DEBUG nova.virt.libvirt.driver [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.419 225859 DEBUG nova.virt.libvirt.driver [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] No VIF found with MAC fa:16:3e:fc:ae:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.708 225859 DEBUG oslo_concurrency.lockutils [None req-13fd3e8c-3da6-4d25-a48f-a47724a98b0e 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.912 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.912 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.913 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.913 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.913 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.914 225859 INFO nova.compute.manager [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Terminating instance#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.916 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.916 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquired lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:31:07 np0005588919 nova_compute[225855]: 2026-01-20 14:31:07.916 225859 DEBUG nova.network.neutron [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:31:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:08.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:08 np0005588919 nova_compute[225855]: 2026-01-20 14:31:08.617 225859 DEBUG nova.network.neutron [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:31:09 np0005588919 nova_compute[225855]: 2026-01-20 14:31:09.055 225859 DEBUG nova.network.neutron [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:31:09 np0005588919 nova_compute[225855]: 2026-01-20 14:31:09.078 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Releasing lock "refresh_cache-29f0b4d4-abf0-46e7-bf67-38e71eb42e28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:31:09 np0005588919 nova_compute[225855]: 2026-01-20 14:31:09.079 225859 DEBUG nova.compute.manager [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:31:09 np0005588919 nova_compute[225855]: 2026-01-20 14:31:09.120 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:09.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:09 np0005588919 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 20 09:31:09 np0005588919 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000016.scope: Consumed 19.693s CPU time.
Jan 20 09:31:09 np0005588919 systemd-machined[194361]: Machine qemu-10-instance-00000016 terminated.
Jan 20 09:31:09 np0005588919 nova_compute[225855]: 2026-01-20 14:31:09.299 225859 INFO nova.virt.libvirt.driver [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance destroyed successfully.#033[00m
Jan 20 09:31:09 np0005588919 nova_compute[225855]: 2026-01-20 14:31:09.299 225859 DEBUG nova.objects.instance [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lazy-loading 'resources' on Instance uuid 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:10.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:10 np0005588919 nova_compute[225855]: 2026-01-20 14:31:10.548 225859 INFO nova.virt.libvirt.driver [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Deleting instance files /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28_del#033[00m
Jan 20 09:31:10 np0005588919 nova_compute[225855]: 2026-01-20 14:31:10.549 225859 INFO nova.virt.libvirt.driver [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Deletion of /var/lib/nova/instances/29f0b4d4-abf0-46e7-bf67-38e71eb42e28_del complete#033[00m
Jan 20 09:31:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:11.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:11 np0005588919 nova_compute[225855]: 2026-01-20 14:31:11.176 225859 INFO nova.compute.manager [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Took 2.10 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:31:11 np0005588919 nova_compute[225855]: 2026-01-20 14:31:11.177 225859 DEBUG oslo.service.loopingcall [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:31:11 np0005588919 nova_compute[225855]: 2026-01-20 14:31:11.177 225859 DEBUG nova.compute.manager [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:31:11 np0005588919 nova_compute[225855]: 2026-01-20 14:31:11.177 225859 DEBUG nova.network.neutron [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:31:11 np0005588919 nova_compute[225855]: 2026-01-20 14:31:11.187 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:12.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:13.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:14 np0005588919 nova_compute[225855]: 2026-01-20 14:31:14.177 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:14 np0005588919 nova_compute[225855]: 2026-01-20 14:31:14.282 225859 DEBUG nova.network.neutron [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:31:14 np0005588919 nova_compute[225855]: 2026-01-20 14:31:14.296 225859 DEBUG nova.network.neutron [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:31:14 np0005588919 nova_compute[225855]: 2026-01-20 14:31:14.311 225859 INFO nova.compute.manager [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Took 3.13 seconds to deallocate network for instance.#033[00m
Jan 20 09:31:14 np0005588919 nova_compute[225855]: 2026-01-20 14:31:14.338 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:14 np0005588919 nova_compute[225855]: 2026-01-20 14:31:14.398 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:14 np0005588919 nova_compute[225855]: 2026-01-20 14:31:14.398 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:14.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:14 np0005588919 nova_compute[225855]: 2026-01-20 14:31:14.479 225859 DEBUG oslo_concurrency.processutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:31:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2038386253' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:31:14 np0005588919 nova_compute[225855]: 2026-01-20 14:31:14.953 225859 DEBUG oslo_concurrency.processutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:14 np0005588919 nova_compute[225855]: 2026-01-20 14:31:14.962 225859 DEBUG nova.compute.provider_tree [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:31:15 np0005588919 nova_compute[225855]: 2026-01-20 14:31:15.011 225859 DEBUG nova.scheduler.client.report [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:31:15 np0005588919 podman[241242]: 2026-01-20 14:31:15.014175006 +0000 UTC m=+0.060500797 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:31:15 np0005588919 nova_compute[225855]: 2026-01-20 14:31:15.039 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:15 np0005588919 nova_compute[225855]: 2026-01-20 14:31:15.097 225859 INFO nova.scheduler.client.report [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Deleted allocations for instance 29f0b4d4-abf0-46e7-bf67-38e71eb42e28#033[00m
Jan 20 09:31:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:15.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:15 np0005588919 nova_compute[225855]: 2026-01-20 14:31:15.190 225859 DEBUG oslo_concurrency.lockutils [None req-78211a69-48e4-41e0-bc73-fcd33e4fcd7b 01a3d712f05049b19d4ecc7051720ad5 f3c2e72a7148496394c8bcd618a19c80 - - default default] Lock "29f0b4d4-abf0-46e7-bf67-38e71eb42e28" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:16 np0005588919 nova_compute[225855]: 2026-01-20 14:31:16.191 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:16 np0005588919 nova_compute[225855]: 2026-01-20 14:31:16.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:16 np0005588919 nova_compute[225855]: 2026-01-20 14:31:16.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:31:16 np0005588919 nova_compute[225855]: 2026-01-20 14:31:16.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:31:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:16.390 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:16.391 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:16.391 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:16.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:17.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:31:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:18.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:31:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:19.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:19 np0005588919 nova_compute[225855]: 2026-01-20 14:31:19.180 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:19 np0005588919 nova_compute[225855]: 2026-01-20 14:31:19.545 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:31:19 np0005588919 nova_compute[225855]: 2026-01-20 14:31:19.546 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:31:19 np0005588919 nova_compute[225855]: 2026-01-20 14:31:19.546 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:31:19 np0005588919 nova_compute[225855]: 2026-01-20 14:31:19.547 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f3faf996-e066-4b11-b7f3-30aeffff726e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:19 np0005588919 nova_compute[225855]: 2026-01-20 14:31:19.652 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:20.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:21.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:21 np0005588919 nova_compute[225855]: 2026-01-20 14:31:21.194 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:21 np0005588919 nova_compute[225855]: 2026-01-20 14:31:21.344 225859 DEBUG oslo_concurrency.lockutils [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:21 np0005588919 nova_compute[225855]: 2026-01-20 14:31:21.345 225859 DEBUG oslo_concurrency.lockutils [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:21 np0005588919 nova_compute[225855]: 2026-01-20 14:31:21.403 225859 INFO nova.compute.manager [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Detaching volume 73c5b3f0-c4ca-48f3-9dc2-d2c15d3fd745#033[00m
Jan 20 09:31:21 np0005588919 podman[241487]: 2026-01-20 14:31:21.490971356 +0000 UTC m=+0.066430109 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 09:31:21 np0005588919 podman[241487]: 2026-01-20 14:31:21.588820682 +0000 UTC m=+0.164279455 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:31:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:21.595 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:31:21 np0005588919 nova_compute[225855]: 2026-01-20 14:31:21.596 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:21.596 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:31:22 np0005588919 nova_compute[225855]: 2026-01-20 14:31:22.002 225859 INFO nova.virt.block_device [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Attempting to driver detach volume 73c5b3f0-c4ca-48f3-9dc2-d2c15d3fd745 from mountpoint /dev/vdb#033[00m
Jan 20 09:31:22 np0005588919 nova_compute[225855]: 2026-01-20 14:31:22.012 225859 DEBUG nova.virt.libvirt.driver [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Attempting to detach device vdb from instance f3faf996-e066-4b11-b7f3-30aeffff726e from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:31:22 np0005588919 nova_compute[225855]: 2026-01-20 14:31:22.013 225859 DEBUG nova.virt.libvirt.guest [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-73c5b3f0-c4ca-48f3-9dc2-d2c15d3fd745">
Jan 20 09:31:22 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <serial>73c5b3f0-c4ca-48f3-9dc2-d2c15d3fd745</serial>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <shareable/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:31:22 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:31:22 np0005588919 nova_compute[225855]: 2026-01-20 14:31:22.021 225859 INFO nova.virt.libvirt.driver [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Successfully detached device vdb from instance f3faf996-e066-4b11-b7f3-30aeffff726e from the persistent domain config.#033[00m
Jan 20 09:31:22 np0005588919 nova_compute[225855]: 2026-01-20 14:31:22.021 225859 DEBUG nova.virt.libvirt.driver [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance f3faf996-e066-4b11-b7f3-30aeffff726e from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:31:22 np0005588919 nova_compute[225855]: 2026-01-20 14:31:22.022 225859 DEBUG nova.virt.libvirt.guest [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-73c5b3f0-c4ca-48f3-9dc2-d2c15d3fd745">
Jan 20 09:31:22 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <serial>73c5b3f0-c4ca-48f3-9dc2-d2c15d3fd745</serial>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <shareable/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:31:22 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:31:22 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:31:22 np0005588919 nova_compute[225855]: 2026-01-20 14:31:22.071 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768919482.0713122, f3faf996-e066-4b11-b7f3-30aeffff726e => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:31:22 np0005588919 nova_compute[225855]: 2026-01-20 14:31:22.073 225859 DEBUG nova.virt.libvirt.driver [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance f3faf996-e066-4b11-b7f3-30aeffff726e _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:31:22 np0005588919 nova_compute[225855]: 2026-01-20 14:31:22.075 225859 INFO nova.virt.libvirt.driver [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Successfully detached device vdb from instance f3faf996-e066-4b11-b7f3-30aeffff726e from the live domain config.#033[00m
Jan 20 09:31:22 np0005588919 podman[241642]: 2026-01-20 14:31:22.213143151 +0000 UTC m=+0.050927995 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:31:22 np0005588919 podman[241642]: 2026-01-20 14:31:22.22226055 +0000 UTC m=+0.060045364 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:31:22 np0005588919 podman[241707]: 2026-01-20 14:31:22.397542315 +0000 UTC m=+0.048579840 container exec e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, version=2.2.4, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, release=1793)
Jan 20 09:31:22 np0005588919 podman[241707]: 2026-01-20 14:31:22.409012839 +0000 UTC m=+0.060050354 container exec_died e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, description=keepalived for Ceph, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, vcs-type=git, name=keepalived, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, release=1793)
Jan 20 09:31:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:22.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:23.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:23 np0005588919 nova_compute[225855]: 2026-01-20 14:31:23.320 225859 DEBUG nova.objects.instance [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lazy-loading 'flavor' on Instance uuid f3faf996-e066-4b11-b7f3-30aeffff726e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:23 np0005588919 nova_compute[225855]: 2026-01-20 14:31:23.405 225859 DEBUG oslo_concurrency.lockutils [None req-8cd05146-8312-492a-a9d0-9949b7655b8a 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 2.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.213 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.297 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919469.2962034, 29f0b4d4-abf0-46e7-bf67-38e71eb42e28 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.297 225859 INFO nova.compute.manager [-] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.323 225859 DEBUG nova.compute.manager [None req-96d6591b-4df2-49e1-97f5-a65cc1323c5a - - - - - -] [instance: 29f0b4d4-abf0-46e7-bf67-38e71eb42e28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:24.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.534 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Updating instance_info_cache with network_info: [{"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.566 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-f3faf996-e066-4b11-b7f3-30aeffff726e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.566 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.566 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.567 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.567 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.567 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.567 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.567 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.568 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.602 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.602 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.603 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.603 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:31:24 np0005588919 nova_compute[225855]: 2026-01-20 14:31:24.603 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:31:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:31:25 np0005588919 nova_compute[225855]: 2026-01-20 14:31:25.053 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:25.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:26 np0005588919 nova_compute[225855]: 2026-01-20 14:31:26.197 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:26.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:27.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:27 np0005588919 nova_compute[225855]: 2026-01-20 14:31:27.716 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:31:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:28.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:31:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:28.598 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:31:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:29.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.247 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.317 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000001e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.318 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000001e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.480 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.481 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4504MB free_disk=20.89706039428711GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.481 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.481 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.664 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance f3faf996-e066-4b11-b7f3-30aeffff726e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.665 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.665 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:31:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:29 np0005588919 nova_compute[225855]: 2026-01-20 14:31:29.793 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:31:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1825073402' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:31:30 np0005588919 nova_compute[225855]: 2026-01-20 14:31:30.233 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:30 np0005588919 nova_compute[225855]: 2026-01-20 14:31:30.241 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:31:30 np0005588919 nova_compute[225855]: 2026-01-20 14:31:30.261 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:31:30 np0005588919 nova_compute[225855]: 2026-01-20 14:31:30.326 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:31:30 np0005588919 nova_compute[225855]: 2026-01-20 14:31:30.327 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:30.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:31.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:31 np0005588919 nova_compute[225855]: 2026-01-20 14:31:31.201 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:31:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:31:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:32.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:31:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:33.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:34 np0005588919 nova_compute[225855]: 2026-01-20 14:31:34.296 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:34.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:35.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.322 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.531 225859 DEBUG oslo_concurrency.lockutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.532 225859 DEBUG oslo_concurrency.lockutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.532 225859 DEBUG oslo_concurrency.lockutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.532 225859 DEBUG oslo_concurrency.lockutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.532 225859 DEBUG oslo_concurrency.lockutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.534 225859 INFO nova.compute.manager [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Terminating instance#033[00m
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.535 225859 DEBUG nova.compute.manager [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:31:35 np0005588919 kernel: tapf65050ac-6a (unregistering): left promiscuous mode
Jan 20 09:31:35 np0005588919 NetworkManager[49104]: <info>  [1768919495.7124] device (tapf65050ac-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.724 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:35Z|00124|binding|INFO|Releasing lport f65050ac-6a44-490a-b4b9-8c82c1f61630 from this chassis (sb_readonly=0)
Jan 20 09:31:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:35Z|00125|binding|INFO|Setting lport f65050ac-6a44-490a-b4b9-8c82c1f61630 down in Southbound
Jan 20 09:31:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:35Z|00126|binding|INFO|Removing iface tapf65050ac-6a ovn-installed in OVS
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.726 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:35.732 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:ae:50 10.100.0.8'], port_security=['fa:16:3e:fc:ae:50 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f3faf996-e066-4b11-b7f3-30aeffff726e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02f86d1d-5cad-49c5-9004-3de3e4739ad5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0cee74dd60da4a839bb5eb0ba3137edf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e08f10e3-3a95-4e33-b03d-21860ea0dc91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f4fb07a-2698-4a11-a9e3-5a66d678d9d5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f65050ac-6a44-490a-b4b9-8c82c1f61630) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:31:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:35.733 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f65050ac-6a44-490a-b4b9-8c82c1f61630 in datapath 02f86d1d-5cad-49c5-9004-3de3e4739ad5 unbound from our chassis#033[00m
Jan 20 09:31:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:35.734 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02f86d1d-5cad-49c5-9004-3de3e4739ad5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:31:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:35.736 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[42702bbe-6898-4272-a5a7-c3b8e8dd8ce9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:35.736 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5 namespace which is not needed anymore#033[00m
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.770 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:35 np0005588919 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 20 09:31:35 np0005588919 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001e.scope: Consumed 16.166s CPU time.
Jan 20 09:31:35 np0005588919 systemd-machined[194361]: Machine qemu-16-instance-0000001e terminated.
Jan 20 09:31:35 np0005588919 podman[241978]: 2026-01-20 14:31:35.848971605 +0000 UTC m=+0.098533206 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:31:35 np0005588919 neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5[240617]: [NOTICE]   (240621) : haproxy version is 2.8.14-c23fe91
Jan 20 09:31:35 np0005588919 neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5[240617]: [NOTICE]   (240621) : path to executable is /usr/sbin/haproxy
Jan 20 09:31:35 np0005588919 neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5[240617]: [WARNING]  (240621) : Exiting Master process...
Jan 20 09:31:35 np0005588919 neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5[240617]: [ALERT]    (240621) : Current worker (240623) exited with code 143 (Terminated)
Jan 20 09:31:35 np0005588919 neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5[240617]: [WARNING]  (240621) : All workers exited. Exiting... (0)
Jan 20 09:31:35 np0005588919 systemd[1]: libpod-d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f.scope: Deactivated successfully.
Jan 20 09:31:35 np0005588919 podman[242018]: 2026-01-20 14:31:35.90068208 +0000 UTC m=+0.071875527 container died d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Jan 20 09:31:35 np0005588919 kernel: tapf65050ac-6a: entered promiscuous mode
Jan 20 09:31:35 np0005588919 systemd-udevd[241988]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:31:35 np0005588919 kernel: tapf65050ac-6a (unregistering): left promiscuous mode
Jan 20 09:31:35 np0005588919 NetworkManager[49104]: <info>  [1768919495.9563] manager: (tapf65050ac-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Jan 20 09:31:35 np0005588919 nova_compute[225855]: 2026-01-20 14:31:35.958 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:35 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f-userdata-shm.mount: Deactivated successfully.
Jan 20 09:31:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:35Z|00127|binding|INFO|Claiming lport f65050ac-6a44-490a-b4b9-8c82c1f61630 for this chassis.
Jan 20 09:31:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:35Z|00128|binding|INFO|f65050ac-6a44-490a-b4b9-8c82c1f61630: Claiming fa:16:3e:fc:ae:50 10.100.0.8
Jan 20 09:31:35 np0005588919 systemd[1]: var-lib-containers-storage-overlay-7c2b8ddd195f8cf3eab1b6717aa28a9f762bb41de1ab6eed44d1d21f47344a69-merged.mount: Deactivated successfully.
Jan 20 09:31:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:35.971 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:ae:50 10.100.0.8'], port_security=['fa:16:3e:fc:ae:50 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f3faf996-e066-4b11-b7f3-30aeffff726e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02f86d1d-5cad-49c5-9004-3de3e4739ad5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0cee74dd60da4a839bb5eb0ba3137edf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e08f10e3-3a95-4e33-b03d-21860ea0dc91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f4fb07a-2698-4a11-a9e3-5a66d678d9d5, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f65050ac-6a44-490a-b4b9-8c82c1f61630) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:31:35 np0005588919 podman[242018]: 2026-01-20 14:31:35.999244506 +0000 UTC m=+0.170437993 container cleanup d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:31:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:35Z|00129|binding|INFO|Setting lport f65050ac-6a44-490a-b4b9-8c82c1f61630 ovn-installed in OVS
Jan 20 09:31:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:35Z|00130|binding|INFO|Setting lport f65050ac-6a44-490a-b4b9-8c82c1f61630 up in Southbound
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.001 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:36Z|00131|binding|INFO|Releasing lport f65050ac-6a44-490a-b4b9-8c82c1f61630 from this chassis (sb_readonly=1)
Jan 20 09:31:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:36Z|00132|binding|INFO|Removing iface tapf65050ac-6a ovn-installed in OVS
Jan 20 09:31:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:36Z|00133|if_status|INFO|Not setting lport f65050ac-6a44-490a-b4b9-8c82c1f61630 down as sb is readonly
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.002 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.003 225859 INFO nova.virt.libvirt.driver [-] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Instance destroyed successfully.#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.003 225859 DEBUG nova.objects.instance [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lazy-loading 'resources' on Instance uuid f3faf996-e066-4b11-b7f3-30aeffff726e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:31:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:36Z|00134|binding|INFO|Releasing lport f65050ac-6a44-490a-b4b9-8c82c1f61630 from this chassis (sb_readonly=0)
Jan 20 09:31:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:31:36Z|00135|binding|INFO|Setting lport f65050ac-6a44-490a-b4b9-8c82c1f61630 down in Southbound
Jan 20 09:31:36 np0005588919 systemd[1]: libpod-conmon-d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f.scope: Deactivated successfully.
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.017 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:ae:50 10.100.0.8'], port_security=['fa:16:3e:fc:ae:50 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f3faf996-e066-4b11-b7f3-30aeffff726e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02f86d1d-5cad-49c5-9004-3de3e4739ad5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0cee74dd60da4a839bb5eb0ba3137edf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e08f10e3-3a95-4e33-b03d-21860ea0dc91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f4fb07a-2698-4a11-a9e3-5a66d678d9d5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f65050ac-6a44-490a-b4b9-8c82c1f61630) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.020 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.030 225859 DEBUG nova.virt.libvirt.vif [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-1191836092',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-1191836092',id=30,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH+01n3DJe3yYfRmwifZEomZrLtaFilErLasmr7ze/p0n1d6nPaSWQOHrHfJ9ubgBCwoqlwHjFIWrKKyRcRI1f3OIubHCG4LO7UMySAzmCXBSDkLJPz6Qzoln3dTb/xrow==',key_name='tempest-keypair-696534507',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:30:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0cee74dd60da4a839bb5eb0ba3137edf',ramdisk_id='',reservation_id='r-0tyxczv3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-859917658',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-859917658-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:30:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6a3fbc3f92a849e88cbf34d28ca17e43',uuid=f3faf996-e066-4b11-b7f3-30aeffff726e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.030 225859 DEBUG nova.network.os_vif_util [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Converting VIF {"id": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "address": "fa:16:3e:fc:ae:50", "network": {"id": "02f86d1d-5cad-49c5-9004-3de3e4739ad5", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-889517255-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0cee74dd60da4a839bb5eb0ba3137edf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65050ac-6a", "ovs_interfaceid": "f65050ac-6a44-490a-b4b9-8c82c1f61630", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.031 225859 DEBUG nova.network.os_vif_util [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fc:ae:50,bridge_name='br-int',has_traffic_filtering=True,id=f65050ac-6a44-490a-b4b9-8c82c1f61630,network=Network(02f86d1d-5cad-49c5-9004-3de3e4739ad5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65050ac-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.031 225859 DEBUG os_vif [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:ae:50,bridge_name='br-int',has_traffic_filtering=True,id=f65050ac-6a44-490a-b4b9-8c82c1f61630,network=Network(02f86d1d-5cad-49c5-9004-3de3e4739ad5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65050ac-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.033 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.034 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf65050ac-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.035 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.037 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.040 225859 INFO os_vif [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:ae:50,bridge_name='br-int',has_traffic_filtering=True,id=f65050ac-6a44-490a-b4b9-8c82c1f61630,network=Network(02f86d1d-5cad-49c5-9004-3de3e4739ad5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65050ac-6a')#033[00m
Jan 20 09:31:36 np0005588919 podman[242060]: 2026-01-20 14:31:36.077156497 +0000 UTC m=+0.052868457 container remove d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.084 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ccad529c-f852-4a1e-8b17-2ca942213b4f]: (4, ('Tue Jan 20 02:31:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5 (d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f)\nd8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f\nTue Jan 20 02:31:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5 (d8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f)\nd8a0ee9d6ac43a9bc3bd7cb9d79f6a436d6ee7fbfe9aa9472ca3997bc110a44f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.085 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f315b231-c941-4785-a542-26a8d6d0abd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.087 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02f86d1d-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.089 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:36 np0005588919 kernel: tap02f86d1d-50: left promiscuous mode
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.110 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.111 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.117 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f49f1d71-0929-4592-bc0d-802201e64cc3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.138 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c284b254-7c6c-427a-a58c-75316f3b6e55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.139 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5d84458f-d193-4d6d-9c00-406be7e4c61e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.151 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[78cd529a-5f8c-448a-b5f4-bf99173fa543]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446187, 'reachable_time': 33231, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242093, 'error': None, 'target': 'ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.154 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-02f86d1d-5cad-49c5-9004-3de3e4739ad5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.154 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[c053d965-568e-4f47-bc28-12a2348037a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:36 np0005588919 systemd[1]: run-netns-ovnmeta\x2d02f86d1d\x2d5cad\x2d49c5\x2d9004\x2d3de3e4739ad5.mount: Deactivated successfully.
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.155 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f65050ac-6a44-490a-b4b9-8c82c1f61630 in datapath 02f86d1d-5cad-49c5-9004-3de3e4739ad5 unbound from our chassis#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.156 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02f86d1d-5cad-49c5-9004-3de3e4739ad5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.157 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[31c45d06-2b7b-42c3-838c-06ef399a19a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.157 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f65050ac-6a44-490a-b4b9-8c82c1f61630 in datapath 02f86d1d-5cad-49c5-9004-3de3e4739ad5 unbound from our chassis#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.158 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02f86d1d-5cad-49c5-9004-3de3e4739ad5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:31:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:31:36.159 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1c39594e-3d1c-4297-9a6b-592a711d2187]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:31:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:36.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.470 225859 INFO nova.virt.libvirt.driver [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Deleting instance files /var/lib/nova/instances/f3faf996-e066-4b11-b7f3-30aeffff726e_del#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.471 225859 INFO nova.virt.libvirt.driver [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Deletion of /var/lib/nova/instances/f3faf996-e066-4b11-b7f3-30aeffff726e_del complete#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.568 225859 INFO nova.compute.manager [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Took 1.03 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.569 225859 DEBUG oslo.service.loopingcall [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.569 225859 DEBUG nova.compute.manager [-] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:31:36 np0005588919 nova_compute[225855]: 2026-01-20 14:31:36.569 225859 DEBUG nova.network.neutron [-] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:31:37 np0005588919 nova_compute[225855]: 2026-01-20 14:31:37.054 225859 DEBUG nova.compute.manager [req-3f9e0a72-0907-4a4e-94f7-7230b01add89 req-20cf6565-ffc2-4cf4-9147-7eae857ad506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-vif-unplugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:31:37 np0005588919 nova_compute[225855]: 2026-01-20 14:31:37.055 225859 DEBUG oslo_concurrency.lockutils [req-3f9e0a72-0907-4a4e-94f7-7230b01add89 req-20cf6565-ffc2-4cf4-9147-7eae857ad506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:37 np0005588919 nova_compute[225855]: 2026-01-20 14:31:37.055 225859 DEBUG oslo_concurrency.lockutils [req-3f9e0a72-0907-4a4e-94f7-7230b01add89 req-20cf6565-ffc2-4cf4-9147-7eae857ad506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:37 np0005588919 nova_compute[225855]: 2026-01-20 14:31:37.056 225859 DEBUG oslo_concurrency.lockutils [req-3f9e0a72-0907-4a4e-94f7-7230b01add89 req-20cf6565-ffc2-4cf4-9147-7eae857ad506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:37 np0005588919 nova_compute[225855]: 2026-01-20 14:31:37.056 225859 DEBUG nova.compute.manager [req-3f9e0a72-0907-4a4e-94f7-7230b01add89 req-20cf6565-ffc2-4cf4-9147-7eae857ad506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] No waiting events found dispatching network-vif-unplugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:31:37 np0005588919 nova_compute[225855]: 2026-01-20 14:31:37.057 225859 DEBUG nova.compute.manager [req-3f9e0a72-0907-4a4e-94f7-7230b01add89 req-20cf6565-ffc2-4cf4-9147-7eae857ad506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-vif-unplugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:31:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:37.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:38.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:39.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:39 np0005588919 nova_compute[225855]: 2026-01-20 14:31:39.333 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:40.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.037 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:41.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.700 225859 DEBUG nova.compute.manager [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.701 225859 DEBUG oslo_concurrency.lockutils [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.701 225859 DEBUG oslo_concurrency.lockutils [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.702 225859 DEBUG oslo_concurrency.lockutils [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.702 225859 DEBUG nova.compute.manager [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] No waiting events found dispatching network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.702 225859 WARNING nova.compute.manager [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received unexpected event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.703 225859 DEBUG nova.compute.manager [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.704 225859 DEBUG oslo_concurrency.lockutils [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.704 225859 DEBUG oslo_concurrency.lockutils [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.704 225859 DEBUG oslo_concurrency.lockutils [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.705 225859 DEBUG nova.compute.manager [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] No waiting events found dispatching network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.705 225859 WARNING nova.compute.manager [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received unexpected event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.706 225859 DEBUG nova.compute.manager [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.706 225859 DEBUG oslo_concurrency.lockutils [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.706 225859 DEBUG oslo_concurrency.lockutils [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.707 225859 DEBUG oslo_concurrency.lockutils [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.707 225859 DEBUG nova.compute.manager [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] No waiting events found dispatching network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:31:41 np0005588919 nova_compute[225855]: 2026-01-20 14:31:41.708 225859 WARNING nova.compute.manager [req-3aa8789f-02ae-4d94-a233-8326f12f7be4 req-21a02727-99fe-4332-89ac-e6fc0270f9a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received unexpected event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:31:42 np0005588919 nova_compute[225855]: 2026-01-20 14:31:42.312 225859 DEBUG nova.network.neutron [-] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:31:42 np0005588919 nova_compute[225855]: 2026-01-20 14:31:42.403 225859 INFO nova.compute.manager [-] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Took 5.83 seconds to deallocate network for instance.#033[00m
Jan 20 09:31:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:31:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:42.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:31:42 np0005588919 nova_compute[225855]: 2026-01-20 14:31:42.510 225859 DEBUG oslo_concurrency.lockutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:42 np0005588919 nova_compute[225855]: 2026-01-20 14:31:42.510 225859 DEBUG oslo_concurrency.lockutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:42 np0005588919 nova_compute[225855]: 2026-01-20 14:31:42.642 225859 DEBUG oslo_concurrency.processutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:31:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:31:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2619803401' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:31:43 np0005588919 nova_compute[225855]: 2026-01-20 14:31:43.055 225859 DEBUG oslo_concurrency.processutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:31:43 np0005588919 nova_compute[225855]: 2026-01-20 14:31:43.061 225859 DEBUG nova.compute.provider_tree [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:31:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:43.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:43 np0005588919 nova_compute[225855]: 2026-01-20 14:31:43.210 225859 DEBUG nova.compute.manager [req-665a852b-45e2-41dc-baa5-6dac3582993a req-1bd26dbc-f3f6-456e-9725-9c489e4d6dbc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-vif-deleted-f65050ac-6a44-490a-b4b9-8c82c1f61630 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:31:43 np0005588919 nova_compute[225855]: 2026-01-20 14:31:43.247 225859 DEBUG nova.scheduler.client.report [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:31:43 np0005588919 nova_compute[225855]: 2026-01-20 14:31:43.295 225859 DEBUG oslo_concurrency.lockutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:43 np0005588919 nova_compute[225855]: 2026-01-20 14:31:43.364 225859 INFO nova.scheduler.client.report [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Deleted allocations for instance f3faf996-e066-4b11-b7f3-30aeffff726e#033[00m
Jan 20 09:31:43 np0005588919 nova_compute[225855]: 2026-01-20 14:31:43.514 225859 DEBUG oslo_concurrency.lockutils [None req-cc85b8e0-128a-4116-8005-9a2d9a2e89ef 6a3fbc3f92a849e88cbf34d28ca17e43 0cee74dd60da4a839bb5eb0ba3137edf - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.982s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:44 np0005588919 nova_compute[225855]: 2026-01-20 14:31:44.038 225859 DEBUG nova.compute.manager [req-ece0dc98-2deb-472e-a234-eb2b47d208cb req-f01e7aaf-914b-4d30-a66d-65fb4aa231ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:31:44 np0005588919 nova_compute[225855]: 2026-01-20 14:31:44.038 225859 DEBUG oslo_concurrency.lockutils [req-ece0dc98-2deb-472e-a234-eb2b47d208cb req-f01e7aaf-914b-4d30-a66d-65fb4aa231ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:31:44 np0005588919 nova_compute[225855]: 2026-01-20 14:31:44.039 225859 DEBUG oslo_concurrency.lockutils [req-ece0dc98-2deb-472e-a234-eb2b47d208cb req-f01e7aaf-914b-4d30-a66d-65fb4aa231ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:31:44 np0005588919 nova_compute[225855]: 2026-01-20 14:31:44.039 225859 DEBUG oslo_concurrency.lockutils [req-ece0dc98-2deb-472e-a234-eb2b47d208cb req-f01e7aaf-914b-4d30-a66d-65fb4aa231ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f3faf996-e066-4b11-b7f3-30aeffff726e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:31:44 np0005588919 nova_compute[225855]: 2026-01-20 14:31:44.039 225859 DEBUG nova.compute.manager [req-ece0dc98-2deb-472e-a234-eb2b47d208cb req-f01e7aaf-914b-4d30-a66d-65fb4aa231ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] No waiting events found dispatching network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:31:44 np0005588919 nova_compute[225855]: 2026-01-20 14:31:44.039 225859 WARNING nova.compute.manager [req-ece0dc98-2deb-472e-a234-eb2b47d208cb req-f01e7aaf-914b-4d30-a66d-65fb4aa231ad 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Received unexpected event network-vif-plugged-f65050ac-6a44-490a-b4b9-8c82c1f61630 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:31:44 np0005588919 nova_compute[225855]: 2026-01-20 14:31:44.335 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:44.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:45.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:46 np0005588919 podman[242172]: 2026-01-20 14:31:46.019617272 +0000 UTC m=+0.061332599 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:31:46 np0005588919 nova_compute[225855]: 2026-01-20 14:31:46.040 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:46.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:47.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:48.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:49.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:49 np0005588919 nova_compute[225855]: 2026-01-20 14:31:49.338 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:50.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:51 np0005588919 nova_compute[225855]: 2026-01-20 14:31:51.000 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919495.9989328, f3faf996-e066-4b11-b7f3-30aeffff726e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:31:51 np0005588919 nova_compute[225855]: 2026-01-20 14:31:51.001 225859 INFO nova.compute.manager [-] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:31:51 np0005588919 nova_compute[225855]: 2026-01-20 14:31:51.035 225859 DEBUG nova.compute.manager [None req-86a42ba4-7e30-43e8-8bed-b6a73b5db3aa - - - - - -] [instance: f3faf996-e066-4b11-b7f3-30aeffff726e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:31:51 np0005588919 nova_compute[225855]: 2026-01-20 14:31:51.044 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:51.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:52.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:53.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:54 np0005588919 nova_compute[225855]: 2026-01-20 14:31:54.340 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:54.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:31:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:55.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:56 np0005588919 nova_compute[225855]: 2026-01-20 14:31:56.046 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:56.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:57.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:58 np0005588919 nova_compute[225855]: 2026-01-20 14:31:58.313 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:58 np0005588919 nova_compute[225855]: 2026-01-20 14:31:58.484 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:31:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:31:58.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:31:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:31:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:31:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:31:59.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:31:59 np0005588919 nova_compute[225855]: 2026-01-20 14:31:59.342 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:31:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:00.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:01 np0005588919 nova_compute[225855]: 2026-01-20 14:32:01.049 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:01.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:32:01.784 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:32:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:32:01.784 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:32:01 np0005588919 nova_compute[225855]: 2026-01-20 14:32:01.785 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:32:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:02.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:32:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:03.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:04 np0005588919 nova_compute[225855]: 2026-01-20 14:32:04.345 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:04.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:05.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:06 np0005588919 podman[242254]: 2026-01-20 14:32:06.037888417 +0000 UTC m=+0.089063247 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:32:06 np0005588919 nova_compute[225855]: 2026-01-20 14:32:06.051 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:06.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:07.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:32:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:08.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:32:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:09.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:09 np0005588919 nova_compute[225855]: 2026-01-20 14:32:09.346 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:32:09.786 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:32:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:10.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:11 np0005588919 nova_compute[225855]: 2026-01-20 14:32:11.054 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:11.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:12 np0005588919 nova_compute[225855]: 2026-01-20 14:32:12.033 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Acquiring lock "561d1914-3348-438c-84ba-1205f479c245" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:12 np0005588919 nova_compute[225855]: 2026-01-20 14:32:12.034 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "561d1914-3348-438c-84ba-1205f479c245" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:12 np0005588919 nova_compute[225855]: 2026-01-20 14:32:12.053 225859 DEBUG nova.compute.manager [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:32:12 np0005588919 nova_compute[225855]: 2026-01-20 14:32:12.200 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:12 np0005588919 nova_compute[225855]: 2026-01-20 14:32:12.201 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:12 np0005588919 nova_compute[225855]: 2026-01-20 14:32:12.211 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:32:12 np0005588919 nova_compute[225855]: 2026-01-20 14:32:12.211 225859 INFO nova.compute.claims [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:32:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:12.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:12 np0005588919 nova_compute[225855]: 2026-01-20 14:32:12.622 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:32:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2127417258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.065 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.070 225859 DEBUG nova.compute.provider_tree [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.087 225859 DEBUG nova.scheduler.client.report [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.113 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.912s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.115 225859 DEBUG nova.compute.manager [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.180 225859 DEBUG nova.compute.manager [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.181 225859 DEBUG nova.network.neutron [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:32:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:32:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:13.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.249 225859 INFO nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.269 225859 DEBUG nova.compute.manager [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.410 225859 DEBUG nova.compute.manager [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.411 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.411 225859 INFO nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Creating image(s)#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.441 225859 DEBUG nova.storage.rbd_utils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] rbd image 561d1914-3348-438c-84ba-1205f479c245_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.469 225859 DEBUG nova.storage.rbd_utils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] rbd image 561d1914-3348-438c-84ba-1205f479c245_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.497 225859 DEBUG nova.storage.rbd_utils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] rbd image 561d1914-3348-438c-84ba-1205f479c245_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.501 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.598 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.600 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.601 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.602 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.641 225859 DEBUG nova.storage.rbd_utils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] rbd image 561d1914-3348-438c-84ba-1205f479c245_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.646 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 561d1914-3348-438c-84ba-1205f479c245_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.705 225859 DEBUG nova.network.neutron [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:32:13 np0005588919 nova_compute[225855]: 2026-01-20 14:32:13.706 225859 DEBUG nova.compute.manager [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.030 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 561d1914-3348-438c-84ba-1205f479c245_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.384s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.091 225859 DEBUG nova.storage.rbd_utils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] resizing rbd image 561d1914-3348-438c-84ba-1205f479c245_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.184 225859 DEBUG nova.objects.instance [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lazy-loading 'migration_context' on Instance uuid 561d1914-3348-438c-84ba-1205f479c245 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.202 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.203 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Ensure instance console log exists: /var/lib/nova/instances/561d1914-3348-438c-84ba-1205f479c245/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.203 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.203 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.204 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.206 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.210 225859 WARNING nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.215 225859 DEBUG nova.virt.libvirt.host [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.216 225859 DEBUG nova.virt.libvirt.host [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.220 225859 DEBUG nova.virt.libvirt.host [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.220 225859 DEBUG nova.virt.libvirt.host [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.221 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.221 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.222 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.222 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.222 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.223 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.223 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.223 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.223 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.224 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.224 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.224 225859 DEBUG nova.virt.hardware [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.227 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.347 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:14.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:32:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/651665474' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.671 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.710 225859 DEBUG nova.storage.rbd_utils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] rbd image 561d1914-3348-438c-84ba-1205f479c245_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:14 np0005588919 nova_compute[225855]: 2026-01-20 14:32:14.715 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:32:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1619627399' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.145 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.147 225859 DEBUG nova.objects.instance [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 561d1914-3348-438c-84ba-1205f479c245 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.172 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <uuid>561d1914-3348-438c-84ba-1205f479c245</uuid>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <name>instance-00000024</name>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerDiagnosticsNegativeTest-server-607115172</nova:name>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:32:14</nova:creationTime>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <nova:user uuid="a76f8a7bf01145bf8c953695d87aed2a">tempest-ServerDiagnosticsNegativeTest-1836592071-project-member</nova:user>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <nova:project uuid="16c25c6af46845c8b8f7beaa0a50bd38">tempest-ServerDiagnosticsNegativeTest-1836592071</nova:project>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <entry name="serial">561d1914-3348-438c-84ba-1205f479c245</entry>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <entry name="uuid">561d1914-3348-438c-84ba-1205f479c245</entry>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/561d1914-3348-438c-84ba-1205f479c245_disk">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/561d1914-3348-438c-84ba-1205f479c245_disk.config">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/561d1914-3348-438c-84ba-1205f479c245/console.log" append="off"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:32:15 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:32:15 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:32:15 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:32:15 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:32:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:15.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.234 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.235 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.235 225859 INFO nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Using config drive#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.262 225859 DEBUG nova.storage.rbd_utils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] rbd image 561d1914-3348-438c-84ba-1205f479c245_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.338 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.772 225859 INFO nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Creating config drive at /var/lib/nova/instances/561d1914-3348-438c-84ba-1205f479c245/disk.config#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.779 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/561d1914-3348-438c-84ba-1205f479c245/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa2kmc1d9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.912 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/561d1914-3348-438c-84ba-1205f479c245/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa2kmc1d9" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.947 225859 DEBUG nova.storage.rbd_utils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] rbd image 561d1914-3348-438c-84ba-1205f479c245_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:15 np0005588919 nova_compute[225855]: 2026-01-20 14:32:15.952 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/561d1914-3348-438c-84ba-1205f479c245/disk.config 561d1914-3348-438c-84ba-1205f479c245_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.058 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.093 225859 DEBUG oslo_concurrency.processutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/561d1914-3348-438c-84ba-1205f479c245/disk.config 561d1914-3348-438c-84ba-1205f479c245_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.093 225859 INFO nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Deleting local config drive /var/lib/nova/instances/561d1914-3348-438c-84ba-1205f479c245/disk.config because it was imported into RBD.#033[00m
Jan 20 09:32:16 np0005588919 systemd-machined[194361]: New machine qemu-18-instance-00000024.
Jan 20 09:32:16 np0005588919 systemd[1]: Started Virtual Machine qemu-18-instance-00000024.
Jan 20 09:32:16 np0005588919 podman[242603]: 2026-01-20 14:32:16.21865295 +0000 UTC m=+0.054534953 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 09:32:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:32:16.391 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:32:16.392 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:32:16.392 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:16.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.655 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919536.6547425, 561d1914-3348-438c-84ba-1205f479c245 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.656 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.658 225859 DEBUG nova.compute.manager [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.658 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.662 225859 INFO nova.virt.libvirt.driver [-] [instance: 561d1914-3348-438c-84ba-1205f479c245] Instance spawned successfully.#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.663 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.685 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.691 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.694 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.694 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.694 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.695 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.695 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.696 225859 DEBUG nova.virt.libvirt.driver [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.723 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.724 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919536.6574569, 561d1914-3348-438c-84ba-1205f479c245 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.724 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] VM Started (Lifecycle Event)#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.751 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.755 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.785 225859 INFO nova.compute.manager [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Took 3.38 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.786 225859 DEBUG nova.compute.manager [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.794 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.919 225859 INFO nova.compute.manager [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Took 4.77 seconds to build instance.#033[00m
Jan 20 09:32:16 np0005588919 nova_compute[225855]: 2026-01-20 14:32:16.956 225859 DEBUG oslo_concurrency.lockutils [None req-5a15f14b-d2e2-44fb-839f-cc33c32dc1af a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "561d1914-3348-438c-84ba-1205f479c245" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.923s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:17.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.639 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-561d1914-3348-438c-84ba-1205f479c245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.639 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-561d1914-3348-438c-84ba-1205f479c245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.640 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.640 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 561d1914-3348-438c-84ba-1205f479c245 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.730 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Acquiring lock "561d1914-3348-438c-84ba-1205f479c245" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.731 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "561d1914-3348-438c-84ba-1205f479c245" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.731 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Acquiring lock "561d1914-3348-438c-84ba-1205f479c245-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.731 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "561d1914-3348-438c-84ba-1205f479c245-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.732 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "561d1914-3348-438c-84ba-1205f479c245-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.733 225859 INFO nova.compute.manager [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Terminating instance#033[00m
Jan 20 09:32:17 np0005588919 nova_compute[225855]: 2026-01-20 14:32:17.734 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Acquiring lock "refresh_cache-561d1914-3348-438c-84ba-1205f479c245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:32:18 np0005588919 nova_compute[225855]: 2026-01-20 14:32:18.004 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:32:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:32:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:18.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:32:18 np0005588919 nova_compute[225855]: 2026-01-20 14:32:18.968 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:18 np0005588919 nova_compute[225855]: 2026-01-20 14:32:18.996 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-561d1914-3348-438c-84ba-1205f479c245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:32:18 np0005588919 nova_compute[225855]: 2026-01-20 14:32:18.997 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:32:18 np0005588919 nova_compute[225855]: 2026-01-20 14:32:18.997 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Acquired lock "refresh_cache-561d1914-3348-438c-84ba-1205f479c245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:32:18 np0005588919 nova_compute[225855]: 2026-01-20 14:32:18.998 225859 DEBUG nova.network.neutron [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:32:19 np0005588919 nova_compute[225855]: 2026-01-20 14:32:19.000 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:19 np0005588919 nova_compute[225855]: 2026-01-20 14:32:19.000 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:32:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:19.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:19 np0005588919 nova_compute[225855]: 2026-01-20 14:32:19.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:19 np0005588919 nova_compute[225855]: 2026-01-20 14:32:19.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:19 np0005588919 nova_compute[225855]: 2026-01-20 14:32:19.349 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:19 np0005588919 nova_compute[225855]: 2026-01-20 14:32:19.771 225859 DEBUG nova.network.neutron [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:32:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:20 np0005588919 nova_compute[225855]: 2026-01-20 14:32:20.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:20 np0005588919 nova_compute[225855]: 2026-01-20 14:32:20.369 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:20 np0005588919 nova_compute[225855]: 2026-01-20 14:32:20.369 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:20 np0005588919 nova_compute[225855]: 2026-01-20 14:32:20.369 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:20 np0005588919 nova_compute[225855]: 2026-01-20 14:32:20.369 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:32:20 np0005588919 nova_compute[225855]: 2026-01-20 14:32:20.370 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:20.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:20 np0005588919 nova_compute[225855]: 2026-01-20 14:32:20.762 225859 DEBUG nova.network.neutron [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:20 np0005588919 nova_compute[225855]: 2026-01-20 14:32:20.801 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Releasing lock "refresh_cache-561d1914-3348-438c-84ba-1205f479c245" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:32:20 np0005588919 nova_compute[225855]: 2026-01-20 14:32:20.802 225859 DEBUG nova.compute.manager [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:32:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:32:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1361981238' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.001 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.632s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:21 np0005588919 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000024.scope: Deactivated successfully.
Jan 20 09:32:21 np0005588919 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000024.scope: Consumed 4.845s CPU time.
Jan 20 09:32:21 np0005588919 systemd-machined[194361]: Machine qemu-18-instance-00000024 terminated.
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.061 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.224 225859 INFO nova.virt.libvirt.driver [-] [instance: 561d1914-3348-438c-84ba-1205f479c245] Instance destroyed successfully.#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.225 225859 DEBUG nova.objects.instance [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lazy-loading 'resources' on Instance uuid 561d1914-3348-438c-84ba-1205f479c245 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:21.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.352 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000024 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.352 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000024 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.546 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.548 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4706MB free_disk=20.9010009765625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.548 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.548 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.679 225859 INFO nova.virt.libvirt.driver [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Deleting instance files /var/lib/nova/instances/561d1914-3348-438c-84ba-1205f479c245_del#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.680 225859 INFO nova.virt.libvirt.driver [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Deletion of /var/lib/nova/instances/561d1914-3348-438c-84ba-1205f479c245_del complete#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.686 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 561d1914-3348-438c-84ba-1205f479c245 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.687 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.687 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.730 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.764 225859 INFO nova.compute.manager [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] [instance: 561d1914-3348-438c-84ba-1205f479c245] Took 0.96 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.765 225859 DEBUG oslo.service.loopingcall [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.766 225859 DEBUG nova.compute.manager [-] [instance: 561d1914-3348-438c-84ba-1205f479c245] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.766 225859 DEBUG nova.network.neutron [-] [instance: 561d1914-3348-438c-84ba-1205f479c245] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.971 225859 DEBUG nova.network.neutron [-] [instance: 561d1914-3348-438c-84ba-1205f479c245] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:32:21 np0005588919 nova_compute[225855]: 2026-01-20 14:32:21.989 225859 DEBUG nova.network.neutron [-] [instance: 561d1914-3348-438c-84ba-1205f479c245] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.003 225859 INFO nova.compute.manager [-] [instance: 561d1914-3348-438c-84ba-1205f479c245] Took 0.24 seconds to deallocate network for instance.#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.049 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:32:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3324155321' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.236 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.242 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.271 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.295 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.295 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.296 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.336 225859 DEBUG oslo_concurrency.processutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:32:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:22.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:32:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:32:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2670042754' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.812 225859 DEBUG oslo_concurrency.processutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.819 225859 DEBUG nova.compute.provider_tree [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.835 225859 DEBUG nova.scheduler.client.report [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.863 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.907 225859 INFO nova.scheduler.client.report [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Deleted allocations for instance 561d1914-3348-438c-84ba-1205f479c245#033[00m
Jan 20 09:32:22 np0005588919 nova_compute[225855]: 2026-01-20 14:32:22.987 225859 DEBUG oslo_concurrency.lockutils [None req-15eb9962-6ec7-4032-be04-a45efdd89a5d a76f8a7bf01145bf8c953695d87aed2a 16c25c6af46845c8b8f7beaa0a50bd38 - - default default] Lock "561d1914-3348-438c-84ba-1205f479c245" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:23.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:23 np0005588919 nova_compute[225855]: 2026-01-20 14:32:23.296 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:23 np0005588919 nova_compute[225855]: 2026-01-20 14:32:23.297 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:23 np0005588919 nova_compute[225855]: 2026-01-20 14:32:23.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:24 np0005588919 nova_compute[225855]: 2026-01-20 14:32:24.350 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:24.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:25.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:25 np0005588919 nova_compute[225855]: 2026-01-20 14:32:25.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:32:26 np0005588919 nova_compute[225855]: 2026-01-20 14:32:26.064 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:26.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:27.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:32:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:28.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:32:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:29.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:29 np0005588919 nova_compute[225855]: 2026-01-20 14:32:29.353 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:30.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:31 np0005588919 nova_compute[225855]: 2026-01-20 14:32:31.067 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:32:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:31.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:32:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:32.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:32:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:32:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:32:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:33.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:34 np0005588919 nova_compute[225855]: 2026-01-20 14:32:34.396 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:34.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:35.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.070 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.221 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919541.220913, 561d1914-3348-438c-84ba-1205f479c245 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.222 225859 INFO nova.compute.manager [-] [instance: 561d1914-3348-438c-84ba-1205f479c245] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.256 225859 DEBUG nova.compute.manager [None req-a0129269-2c75-4ad8-8d4f-eec756c10b21 - - - - - -] [instance: 561d1914-3348-438c-84ba-1205f479c245] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:36 np0005588919 podman[242955]: 2026-01-20 14:32:36.332899141 +0000 UTC m=+0.078173909 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:32:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:36.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.758 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Acquiring lock "965c89d3-c13e-442b-8d32-f351bf95dda5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.758 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "965c89d3-c13e-442b-8d32-f351bf95dda5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.780 225859 DEBUG nova.compute.manager [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.863 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.863 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.869 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:32:36 np0005588919 nova_compute[225855]: 2026-01-20 14:32:36.869 225859 INFO nova.compute.claims [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.022 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:32:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:37.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:32:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:32:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3125161546' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.439 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.445 225859 DEBUG nova.compute.provider_tree [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.488 225859 DEBUG nova.scheduler.client.report [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.519 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.520 225859 DEBUG nova.compute.manager [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.639 225859 DEBUG nova.compute.manager [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.640 225859 DEBUG nova.network.neutron [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.674 225859 INFO nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.819 225859 DEBUG nova.compute.manager [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:32:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:32:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.985 225859 DEBUG nova.compute.manager [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.986 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:32:37 np0005588919 nova_compute[225855]: 2026-01-20 14:32:37.987 225859 INFO nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Creating image(s)#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.014 225859 DEBUG nova.storage.rbd_utils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] rbd image 965c89d3-c13e-442b-8d32-f351bf95dda5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.048 225859 DEBUG nova.storage.rbd_utils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] rbd image 965c89d3-c13e-442b-8d32-f351bf95dda5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.076 225859 DEBUG nova.storage.rbd_utils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] rbd image 965c89d3-c13e-442b-8d32-f351bf95dda5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.080 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.139 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.140 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.140 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.141 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.168 225859 DEBUG nova.storage.rbd_utils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] rbd image 965c89d3-c13e-442b-8d32-f351bf95dda5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.171 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 965c89d3-c13e-442b-8d32-f351bf95dda5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:38 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.462 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 965c89d3-c13e-442b-8d32-f351bf95dda5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.290s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.518 225859 DEBUG nova.storage.rbd_utils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] resizing rbd image 965c89d3-c13e-442b-8d32-f351bf95dda5_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:32:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:38.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.603 225859 DEBUG nova.objects.instance [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lazy-loading 'migration_context' on Instance uuid 965c89d3-c13e-442b-8d32-f351bf95dda5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.768 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.769 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Ensure instance console log exists: /var/lib/nova/instances/965c89d3-c13e-442b-8d32-f351bf95dda5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.769 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.769 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.770 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.786 225859 DEBUG nova.network.neutron [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.787 225859 DEBUG nova.compute.manager [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.788 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.791 225859 WARNING nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.794 225859 DEBUG nova.virt.libvirt.host [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.795 225859 DEBUG nova.virt.libvirt.host [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.799 225859 DEBUG nova.virt.libvirt.host [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.799 225859 DEBUG nova.virt.libvirt.host [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.800 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.800 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.801 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.801 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.801 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.801 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.801 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.802 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.802 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.802 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.802 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.802 225859 DEBUG nova.virt.hardware [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:32:38 np0005588919 nova_compute[225855]: 2026-01-20 14:32:38.805 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:32:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/151885523' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:32:39 np0005588919 nova_compute[225855]: 2026-01-20 14:32:39.197 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:39 np0005588919 nova_compute[225855]: 2026-01-20 14:32:39.219 225859 DEBUG nova.storage.rbd_utils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] rbd image 965c89d3-c13e-442b-8d32-f351bf95dda5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:39 np0005588919 nova_compute[225855]: 2026-01-20 14:32:39.224 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:32:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:39.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:32:39 np0005588919 nova_compute[225855]: 2026-01-20 14:32:39.396 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:32:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4006511380' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:32:39 np0005588919 nova_compute[225855]: 2026-01-20 14:32:39.634 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:39 np0005588919 nova_compute[225855]: 2026-01-20 14:32:39.636 225859 DEBUG nova.objects.instance [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lazy-loading 'pci_devices' on Instance uuid 965c89d3-c13e-442b-8d32-f351bf95dda5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.239 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <uuid>965c89d3-c13e-442b-8d32-f351bf95dda5</uuid>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <name>instance-00000025</name>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerExternalEventsTest-server-1702471022</nova:name>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:32:38</nova:creationTime>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <nova:user uuid="360feb9a3f0146f0b84b6c28241e41a9">tempest-ServerExternalEventsTest-1124989951-project-member</nova:user>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <nova:project uuid="bd830773c45f46e3b6fd28d50255c383">tempest-ServerExternalEventsTest-1124989951</nova:project>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <entry name="serial">965c89d3-c13e-442b-8d32-f351bf95dda5</entry>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <entry name="uuid">965c89d3-c13e-442b-8d32-f351bf95dda5</entry>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/965c89d3-c13e-442b-8d32-f351bf95dda5_disk">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/965c89d3-c13e-442b-8d32-f351bf95dda5_disk.config">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/965c89d3-c13e-442b-8d32-f351bf95dda5/console.log" append="off"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:32:40 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:32:40 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:32:40 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:32:40 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.358 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.358 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.359 225859 INFO nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Using config drive#033[00m
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.381 225859 DEBUG nova.storage.rbd_utils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] rbd image 965c89d3-c13e-442b-8d32-f351bf95dda5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:40.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.571 225859 INFO nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Creating config drive at /var/lib/nova/instances/965c89d3-c13e-442b-8d32-f351bf95dda5/disk.config#033[00m
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.577 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/965c89d3-c13e-442b-8d32-f351bf95dda5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph84qe834 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.701 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/965c89d3-c13e-442b-8d32-f351bf95dda5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph84qe834" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.725 225859 DEBUG nova.storage.rbd_utils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] rbd image 965c89d3-c13e-442b-8d32-f351bf95dda5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.728 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/965c89d3-c13e-442b-8d32-f351bf95dda5/disk.config 965c89d3-c13e-442b-8d32-f351bf95dda5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.894 225859 DEBUG oslo_concurrency.processutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/965c89d3-c13e-442b-8d32-f351bf95dda5/disk.config 965c89d3-c13e-442b-8d32-f351bf95dda5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:40 np0005588919 nova_compute[225855]: 2026-01-20 14:32:40.895 225859 INFO nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Deleting local config drive /var/lib/nova/instances/965c89d3-c13e-442b-8d32-f351bf95dda5/disk.config because it was imported into RBD.#033[00m
Jan 20 09:32:40 np0005588919 systemd-machined[194361]: New machine qemu-19-instance-00000025.
Jan 20 09:32:40 np0005588919 systemd[1]: Started Virtual Machine qemu-19-instance-00000025.
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.073 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:32:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:41.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.353 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919561.3523674, 965c89d3-c13e-442b-8d32-f351bf95dda5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.353 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.358 225859 DEBUG nova.compute.manager [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.358 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.363 225859 INFO nova.virt.libvirt.driver [-] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Instance spawned successfully.#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.364 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.385 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.392 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.393 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.394 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.394 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.395 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.395 225859 DEBUG nova.virt.libvirt.driver [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.402 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.451 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.452 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919561.3558192, 965c89d3-c13e-442b-8d32-f351bf95dda5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.452 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] VM Started (Lifecycle Event)#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.474 225859 INFO nova.compute.manager [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Took 3.49 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.475 225859 DEBUG nova.compute.manager [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.477 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.486 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.514 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.552 225859 INFO nova.compute.manager [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Took 4.71 seconds to build instance.#033[00m
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.583662) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561583759, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2490, "num_deletes": 505, "total_data_size": 5062463, "memory_usage": 5140560, "flush_reason": "Manual Compaction"}
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561610250, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2936030, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28321, "largest_seqno": 30806, "table_properties": {"data_size": 2926940, "index_size": 5008, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 24239, "raw_average_key_size": 20, "raw_value_size": 2906012, "raw_average_value_size": 2448, "num_data_blocks": 218, "num_entries": 1187, "num_filter_entries": 1187, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919386, "oldest_key_time": 1768919386, "file_creation_time": 1768919561, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 26682 microseconds, and 6510 cpu microseconds.
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.610359) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2936030 bytes OK
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.610398) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.611493) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.611507) EVENT_LOG_v1 {"time_micros": 1768919561611503, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.611525) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 5050420, prev total WAL file size 5050420, number of live WAL files 2.
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.613219) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(2867KB)], [57(10MB)]
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561613287, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 13916161, "oldest_snapshot_seqno": -1}
Jan 20 09:32:41 np0005588919 nova_compute[225855]: 2026-01-20 14:32:41.621 225859 DEBUG oslo_concurrency.lockutils [None req-bdc1104b-921d-426f-88bf-03a1336d4cae 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "965c89d3-c13e-442b-8d32-f351bf95dda5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5445 keys, 8452471 bytes, temperature: kUnknown
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561707442, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 8452471, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8416859, "index_size": 20910, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 139192, "raw_average_key_size": 25, "raw_value_size": 8319448, "raw_average_value_size": 1527, "num_data_blocks": 843, "num_entries": 5445, "num_filter_entries": 5445, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768919561, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.707752) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 8452471 bytes
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.710197) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 147.6 rd, 89.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.8, 10.5 +0.0 blob) out(8.1 +0.0 blob), read-write-amplify(7.6) write-amplify(2.9) OK, records in: 6452, records dropped: 1007 output_compression: NoCompression
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.710229) EVENT_LOG_v1 {"time_micros": 1768919561710214, "job": 34, "event": "compaction_finished", "compaction_time_micros": 94261, "compaction_time_cpu_micros": 21563, "output_level": 6, "num_output_files": 1, "total_output_size": 8452471, "num_input_records": 6452, "num_output_records": 5445, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561711255, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919561714709, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.613094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.714806) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.714811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.714812) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.714814) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:32:41.714816) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.516 225859 DEBUG nova.compute.manager [None req-c5c0e48a-d44c-438a-b8ed-0cf5dae9c9ae fab698d5ba454eb38c305f93992c91ab 0e2b622671af4e3fbcdb594e88f55f4c - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Received event network-changed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.516 225859 DEBUG nova.compute.manager [None req-c5c0e48a-d44c-438a-b8ed-0cf5dae9c9ae fab698d5ba454eb38c305f93992c91ab 0e2b622671af4e3fbcdb594e88f55f4c - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Refreshing instance network info cache due to event network-changed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.517 225859 DEBUG oslo_concurrency.lockutils [None req-c5c0e48a-d44c-438a-b8ed-0cf5dae9c9ae fab698d5ba454eb38c305f93992c91ab 0e2b622671af4e3fbcdb594e88f55f4c - - default default] Acquiring lock "refresh_cache-965c89d3-c13e-442b-8d32-f351bf95dda5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.517 225859 DEBUG oslo_concurrency.lockutils [None req-c5c0e48a-d44c-438a-b8ed-0cf5dae9c9ae fab698d5ba454eb38c305f93992c91ab 0e2b622671af4e3fbcdb594e88f55f4c - - default default] Acquired lock "refresh_cache-965c89d3-c13e-442b-8d32-f351bf95dda5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.517 225859 DEBUG nova.network.neutron [None req-c5c0e48a-d44c-438a-b8ed-0cf5dae9c9ae fab698d5ba454eb38c305f93992c91ab 0e2b622671af4e3fbcdb594e88f55f4c - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:32:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:42.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.788 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Acquiring lock "965c89d3-c13e-442b-8d32-f351bf95dda5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.789 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "965c89d3-c13e-442b-8d32-f351bf95dda5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.789 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Acquiring lock "965c89d3-c13e-442b-8d32-f351bf95dda5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.790 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "965c89d3-c13e-442b-8d32-f351bf95dda5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.790 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "965c89d3-c13e-442b-8d32-f351bf95dda5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.793 225859 INFO nova.compute.manager [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Terminating instance#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.795 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Acquiring lock "refresh_cache-965c89d3-c13e-442b-8d32-f351bf95dda5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:32:42 np0005588919 nova_compute[225855]: 2026-01-20 14:32:42.841 225859 DEBUG nova.network.neutron [None req-c5c0e48a-d44c-438a-b8ed-0cf5dae9c9ae fab698d5ba454eb38c305f93992c91ab 0e2b622671af4e3fbcdb594e88f55f4c - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:32:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:43.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:43 np0005588919 nova_compute[225855]: 2026-01-20 14:32:43.419 225859 DEBUG nova.network.neutron [None req-c5c0e48a-d44c-438a-b8ed-0cf5dae9c9ae fab698d5ba454eb38c305f93992c91ab 0e2b622671af4e3fbcdb594e88f55f4c - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:43 np0005588919 nova_compute[225855]: 2026-01-20 14:32:43.464 225859 DEBUG oslo_concurrency.lockutils [None req-c5c0e48a-d44c-438a-b8ed-0cf5dae9c9ae fab698d5ba454eb38c305f93992c91ab 0e2b622671af4e3fbcdb594e88f55f4c - - default default] Releasing lock "refresh_cache-965c89d3-c13e-442b-8d32-f351bf95dda5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:32:43 np0005588919 nova_compute[225855]: 2026-01-20 14:32:43.464 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Acquired lock "refresh_cache-965c89d3-c13e-442b-8d32-f351bf95dda5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:32:43 np0005588919 nova_compute[225855]: 2026-01-20 14:32:43.465 225859 DEBUG nova.network.neutron [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:32:43 np0005588919 nova_compute[225855]: 2026-01-20 14:32:43.735 225859 DEBUG nova.network.neutron [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:32:44 np0005588919 nova_compute[225855]: 2026-01-20 14:32:44.399 225859 DEBUG nova.network.neutron [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:44 np0005588919 nova_compute[225855]: 2026-01-20 14:32:44.400 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:44 np0005588919 nova_compute[225855]: 2026-01-20 14:32:44.449 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Releasing lock "refresh_cache-965c89d3-c13e-442b-8d32-f351bf95dda5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:32:44 np0005588919 nova_compute[225855]: 2026-01-20 14:32:44.450 225859 DEBUG nova.compute.manager [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:32:44 np0005588919 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000025.scope: Deactivated successfully.
Jan 20 09:32:44 np0005588919 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000025.scope: Consumed 3.545s CPU time.
Jan 20 09:32:44 np0005588919 systemd-machined[194361]: Machine qemu-19-instance-00000025 terminated.
Jan 20 09:32:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:44.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:44 np0005588919 nova_compute[225855]: 2026-01-20 14:32:44.670 225859 INFO nova.virt.libvirt.driver [-] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Instance destroyed successfully.#033[00m
Jan 20 09:32:44 np0005588919 nova_compute[225855]: 2026-01-20 14:32:44.670 225859 DEBUG nova.objects.instance [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lazy-loading 'resources' on Instance uuid 965c89d3-c13e-442b-8d32-f351bf95dda5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:32:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:32:44.702 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:32:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:32:44.703 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:32:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:32:44.703 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:32:44 np0005588919 nova_compute[225855]: 2026-01-20 14:32:44.704 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:45.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.336 225859 INFO nova.virt.libvirt.driver [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Deleting instance files /var/lib/nova/instances/965c89d3-c13e-442b-8d32-f351bf95dda5_del#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.337 225859 INFO nova.virt.libvirt.driver [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Deletion of /var/lib/nova/instances/965c89d3-c13e-442b-8d32-f351bf95dda5_del complete#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.414 225859 INFO nova.compute.manager [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Took 0.96 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.414 225859 DEBUG oslo.service.loopingcall [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.415 225859 DEBUG nova.compute.manager [-] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.415 225859 DEBUG nova.network.neutron [-] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.708 225859 DEBUG nova.network.neutron [-] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.747 225859 DEBUG nova.network.neutron [-] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.781 225859 INFO nova.compute.manager [-] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Took 0.37 seconds to deallocate network for instance.#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.849 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.850 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:32:45 np0005588919 nova_compute[225855]: 2026-01-20 14:32:45.962 225859 DEBUG oslo_concurrency.processutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:32:46 np0005588919 nova_compute[225855]: 2026-01-20 14:32:46.077 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:32:46 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/260108864' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:32:46 np0005588919 nova_compute[225855]: 2026-01-20 14:32:46.425 225859 DEBUG oslo_concurrency.processutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:32:46 np0005588919 nova_compute[225855]: 2026-01-20 14:32:46.431 225859 DEBUG nova.compute.provider_tree [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:32:46 np0005588919 nova_compute[225855]: 2026-01-20 14:32:46.466 225859 DEBUG nova.scheduler.client.report [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:32:46 np0005588919 nova_compute[225855]: 2026-01-20 14:32:46.511 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:46.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:46 np0005588919 nova_compute[225855]: 2026-01-20 14:32:46.572 225859 INFO nova.scheduler.client.report [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Deleted allocations for instance 965c89d3-c13e-442b-8d32-f351bf95dda5#033[00m
Jan 20 09:32:46 np0005588919 nova_compute[225855]: 2026-01-20 14:32:46.646 225859 DEBUG oslo_concurrency.lockutils [None req-4c37bd00-6c2c-41c6-b3b4-146fc6d36cc7 360feb9a3f0146f0b84b6c28241e41a9 bd830773c45f46e3b6fd28d50255c383 - - default default] Lock "965c89d3-c13e-442b-8d32-f351bf95dda5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:32:47 np0005588919 podman[243497]: 2026-01-20 14:32:47.024505069 +0000 UTC m=+0.055295564 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 20 09:32:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:47.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:48.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:49.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:49 np0005588919 nova_compute[225855]: 2026-01-20 14:32:49.400 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:32:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:50.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:32:51 np0005588919 nova_compute[225855]: 2026-01-20 14:32:51.082 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:51.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:32:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:52.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:32:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:53.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:54 np0005588919 nova_compute[225855]: 2026-01-20 14:32:54.402 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:32:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:54.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:32:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:32:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:32:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:55.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:32:56 np0005588919 nova_compute[225855]: 2026-01-20 14:32:56.085 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:56.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:57.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:32:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:32:58.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:32:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:32:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:32:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:32:59.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:32:59 np0005588919 nova_compute[225855]: 2026-01-20 14:32:59.404 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:32:59 np0005588919 nova_compute[225855]: 2026-01-20 14:32:59.669 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919564.6684756, 965c89d3-c13e-442b-8d32-f351bf95dda5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:32:59 np0005588919 nova_compute[225855]: 2026-01-20 14:32:59.670 225859 INFO nova.compute.manager [-] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:32:59 np0005588919 nova_compute[225855]: 2026-01-20 14:32:59.736 225859 DEBUG nova.compute.manager [None req-9501c16f-bf8b-4821-9210-ef25ad4a15b9 - - - - - -] [instance: 965c89d3-c13e-442b-8d32-f351bf95dda5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:32:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:00.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:01 np0005588919 nova_compute[225855]: 2026-01-20 14:33:01.088 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:01.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:02.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:03.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:04 np0005588919 nova_compute[225855]: 2026-01-20 14:33:04.406 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:33:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:04.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:33:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:33:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:05.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:33:06 np0005588919 nova_compute[225855]: 2026-01-20 14:33:06.092 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:06.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:07 np0005588919 podman[243580]: 2026-01-20 14:33:07.041094848 +0000 UTC m=+0.085486230 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:33:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:07.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:08 np0005588919 nova_compute[225855]: 2026-01-20 14:33:08.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:33:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:08.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:33:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:33:08Z|00136|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 20 09:33:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:09.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:09 np0005588919 nova_compute[225855]: 2026-01-20 14:33:09.408 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:10.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:11 np0005588919 nova_compute[225855]: 2026-01-20 14:33:11.096 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:11.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:12.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:13.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:14 np0005588919 nova_compute[225855]: 2026-01-20 14:33:14.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:14.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:15.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:16 np0005588919 nova_compute[225855]: 2026-01-20 14:33:16.099 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:16.392 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:16.393 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:16.394 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:16.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:17.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:17 np0005588919 nova_compute[225855]: 2026-01-20 14:33:17.372 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:18 np0005588919 podman[243611]: 2026-01-20 14:33:18.038005116 +0000 UTC m=+0.083960668 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:33:18 np0005588919 nova_compute[225855]: 2026-01-20 14:33:18.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:18 np0005588919 nova_compute[225855]: 2026-01-20 14:33:18.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:33:18 np0005588919 nova_compute[225855]: 2026-01-20 14:33:18.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:33:18 np0005588919 nova_compute[225855]: 2026-01-20 14:33:18.357 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:33:18 np0005588919 nova_compute[225855]: 2026-01-20 14:33:18.357 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:18 np0005588919 nova_compute[225855]: 2026-01-20 14:33:18.358 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:33:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:18.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:19.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:19 np0005588919 nova_compute[225855]: 2026-01-20 14:33:19.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:19 np0005588919 nova_compute[225855]: 2026-01-20 14:33:19.412 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:20 np0005588919 nova_compute[225855]: 2026-01-20 14:33:20.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:20.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e166 e166: 3 total, 3 up, 3 in
Jan 20 09:33:21 np0005588919 nova_compute[225855]: 2026-01-20 14:33:21.102 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:21.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:21 np0005588919 nova_compute[225855]: 2026-01-20 14:33:21.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:21 np0005588919 nova_compute[225855]: 2026-01-20 14:33:21.411 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:21 np0005588919 nova_compute[225855]: 2026-01-20 14:33:21.411 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:21 np0005588919 nova_compute[225855]: 2026-01-20 14:33:21.411 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:21 np0005588919 nova_compute[225855]: 2026-01-20 14:33:21.412 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:33:21 np0005588919 nova_compute[225855]: 2026-01-20 14:33:21.412 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:33:21 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1270072728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:33:21 np0005588919 nova_compute[225855]: 2026-01-20 14:33:21.840 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.035 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.036 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4747MB free_disk=20.94301986694336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.037 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.037 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:22.170 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.170 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:22.171 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.572 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.573 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:33:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:22.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.808 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.828 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.828 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.857 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.905 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:33:22 np0005588919 nova_compute[225855]: 2026-01-20 14:33:22.927 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:23.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:33:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4270915150' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:33:23 np0005588919 nova_compute[225855]: 2026-01-20 14:33:23.355 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
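The `ceph df --format=json` run logged above is how the compute service samples pool capacity on an RBD-backed node. A minimal sketch of reading per-pool free space out of that JSON; the helper name and the sample payload below are ours (field names match what `ceph df` emits, the numbers are illustrative only, not taken from this host):

```python
import json

# Synthetic stand-in for `ceph df --format=json` output. Field names
# follow ceph's schema ("stats", "pools", "max_avail"); values are
# illustrative, not from the logged cluster.
SAMPLE = json.dumps({
    "stats": {"total_bytes": 64424509440,
              "total_used_raw_bytes": 3221225472,
              "total_avail_bytes": 61203283968},
    "pools": [{"name": "vms",
               "stats": {"bytes_used": 1073741824,
                         "max_avail": 20401094656}}],
})

def pool_free_gb(ceph_df_json, pool):
    """Return the pool's max_avail converted from bytes to GiB."""
    data = json.loads(ceph_df_json)
    stats = next(p["stats"] for p in data["pools"] if p["name"] == pool)
    return stats["max_avail"] / 1024 ** 3

print(pool_free_gb(SAMPLE, "vms"))  # 19.0
```

The corresponding audit trail is visible on the ceph-mon side as the `{"prefix": "df", "format": "json"}` dispatch from `client.openstack`.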
Jan 20 09:33:23 np0005588919 nova_compute[225855]: 2026-01-20 14:33:23.362 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:33:23 np0005588919 nova_compute[225855]: 2026-01-20 14:33:23.392 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:33:23 np0005588919 nova_compute[225855]: 2026-01-20 14:33:23.418 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:33:23 np0005588919 nova_compute[225855]: 2026-01-20 14:33:23.419 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
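The inventory payloads logged in this update cycle (VCPU ratio 4.0, MEMORY_MB ratio 1.0, DISK_GB ratio 0.9) determine how much capacity placement will actually schedule against: effective capacity is (total - reserved) x allocation_ratio. A minimal recomputation of that arithmetic from the logged values; the helper is ours for illustration, not Nova's internal code:

```python
# Inventory exactly as reported for provider bbb02880-a710-4ac1-8b2c-5c09765848d1
# in the log above (only the fields the capacity formula needs).
INVENTORY = {
    "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 20,   "reserved": 1,   "allocation_ratio": 0.9},
}

def effective_capacity(inventory):
    """Capacity placement schedules against: (total - reserved) * ratio."""
    return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
            for rc, v in inventory.items()}

print(effective_capacity(INVENTORY))
# VCPU -> 32.0 schedulable vCPUs on 8 physical, MEMORY_MB -> 7167.0,
# DISK_GB -> ~17.1 (the 0.9 ratio undercommits disk).
```

This is why the final resource view reports 8 usable vCPUs while the scheduler can still pack up to 32 vCPUs of instances onto the node.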
Jan 20 09:33:23 np0005588919 nova_compute[225855]: 2026-01-20 14:33:23.420 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:23 np0005588919 nova_compute[225855]: 2026-01-20 14:33:23.421 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:33:24 np0005588919 nova_compute[225855]: 2026-01-20 14:33:24.360 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:24 np0005588919 nova_compute[225855]: 2026-01-20 14:33:24.361 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:24 np0005588919 nova_compute[225855]: 2026-01-20 14:33:24.361 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:24 np0005588919 nova_compute[225855]: 2026-01-20 14:33:24.362 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:33:24 np0005588919 nova_compute[225855]: 2026-01-20 14:33:24.386 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:33:24 np0005588919 nova_compute[225855]: 2026-01-20 14:33:24.414 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:24.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e167 e167: 3 total, 3 up, 3 in
Jan 20 09:33:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:25.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:25 np0005588919 nova_compute[225855]: 2026-01-20 14:33:25.361 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:33:26 np0005588919 nova_compute[225855]: 2026-01-20 14:33:26.105 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:26.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:27.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:28.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:29.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:29 np0005588919 nova_compute[225855]: 2026-01-20 14:33:29.415 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:33:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/659662857' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:33:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:30.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:31 np0005588919 nova_compute[225855]: 2026-01-20 14:33:31.109 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:31.172 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:31.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.024 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.024 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.051 225859 DEBUG nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.162 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.162 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.168 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.169 225859 INFO nova.compute.claims [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.307 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 e168: 3 total, 3 up, 3 in
Jan 20 09:33:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:32.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:33:32 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3926106385' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.778 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.786 225859 DEBUG nova.compute.provider_tree [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.816 225859 DEBUG nova.scheduler.client.report [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.845 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.846 225859 DEBUG nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.953 225859 DEBUG nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.953 225859 DEBUG nova.network.neutron [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:33:32 np0005588919 nova_compute[225855]: 2026-01-20 14:33:32.983 225859 INFO nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.004 225859 DEBUG nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.116 225859 DEBUG nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.118 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.119 225859 INFO nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Creating image(s)#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.225 225859 DEBUG nova.storage.rbd_utils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.254 225859 DEBUG nova.storage.rbd_utils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.288 225859 DEBUG nova.storage.rbd_utils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.292 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:33:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:33.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.374 225859 DEBUG nova.policy [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f51c395107c84dbd9067113b84ff01dd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a841e7a1434c488390475174e10bc161', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.391 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.392 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.392 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.393 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.417 225859 DEBUG nova.storage.rbd_utils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.420 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.864 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:33 np0005588919 nova_compute[225855]: 2026-01-20 14:33:33.975 225859 DEBUG nova.storage.rbd_utils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] resizing rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:33:34 np0005588919 nova_compute[225855]: 2026-01-20 14:33:34.105 225859 DEBUG nova.objects.instance [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'migration_context' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:33:34 np0005588919 nova_compute[225855]: 2026-01-20 14:33:34.122 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:33:34 np0005588919 nova_compute[225855]: 2026-01-20 14:33:34.123 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Ensure instance console log exists: /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:33:34 np0005588919 nova_compute[225855]: 2026-01-20 14:33:34.124 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:34 np0005588919 nova_compute[225855]: 2026-01-20 14:33:34.124 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:34 np0005588919 nova_compute[225855]: 2026-01-20 14:33:34.124 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:34 np0005588919 nova_compute[225855]: 2026-01-20 14:33:34.417 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:33:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:34.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:33:34 np0005588919 nova_compute[225855]: 2026-01-20 14:33:34.674 225859 DEBUG nova.network.neutron [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Successfully created port: 73e232f9-3860-4b9a-9cec-535fa2fb0c9f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:33:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:35.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:36 np0005588919 nova_compute[225855]: 2026-01-20 14:33:36.112 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:36 np0005588919 nova_compute[225855]: 2026-01-20 14:33:36.210 225859 DEBUG nova.network.neutron [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Successfully updated port: 73e232f9-3860-4b9a-9cec-535fa2fb0c9f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:33:36 np0005588919 nova_compute[225855]: 2026-01-20 14:33:36.234 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "refresh_cache-2ec7b07d-b593-46b7-9751-b6116e4d2cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:33:36 np0005588919 nova_compute[225855]: 2026-01-20 14:33:36.234 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquired lock "refresh_cache-2ec7b07d-b593-46b7-9751-b6116e4d2cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:33:36 np0005588919 nova_compute[225855]: 2026-01-20 14:33:36.234 225859 DEBUG nova.network.neutron [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:33:36 np0005588919 nova_compute[225855]: 2026-01-20 14:33:36.495 225859 DEBUG nova.compute.manager [req-691c09c7-5da3-4b8f-b87d-e0cc6eca78cf req-1e9ac1fd-0c15-4f3e-9cb9-078609bea9a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-changed-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:33:36 np0005588919 nova_compute[225855]: 2026-01-20 14:33:36.496 225859 DEBUG nova.compute.manager [req-691c09c7-5da3-4b8f-b87d-e0cc6eca78cf req-1e9ac1fd-0c15-4f3e-9cb9-078609bea9a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Refreshing instance network info cache due to event network-changed-73e232f9-3860-4b9a-9cec-535fa2fb0c9f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:33:36 np0005588919 nova_compute[225855]: 2026-01-20 14:33:36.496 225859 DEBUG oslo_concurrency.lockutils [req-691c09c7-5da3-4b8f-b87d-e0cc6eca78cf req-1e9ac1fd-0c15-4f3e-9cb9-078609bea9a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-2ec7b07d-b593-46b7-9751-b6116e4d2cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:33:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:36.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:36 np0005588919 nova_compute[225855]: 2026-01-20 14:33:36.615 225859 DEBUG nova.network.neutron [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:33:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:37.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:38 np0005588919 podman[243925]: 2026-01-20 14:33:38.072801045 +0000 UTC m=+0.111618084 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.145 225859 DEBUG nova.network.neutron [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Updating instance_info_cache with network_info: [{"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.187 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Releasing lock "refresh_cache-2ec7b07d-b593-46b7-9751-b6116e4d2cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.187 225859 DEBUG nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance network_info: |[{"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.188 225859 DEBUG oslo_concurrency.lockutils [req-691c09c7-5da3-4b8f-b87d-e0cc6eca78cf req-1e9ac1fd-0c15-4f3e-9cb9-078609bea9a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-2ec7b07d-b593-46b7-9751-b6116e4d2cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.188 225859 DEBUG nova.network.neutron [req-691c09c7-5da3-4b8f-b87d-e0cc6eca78cf req-1e9ac1fd-0c15-4f3e-9cb9-078609bea9a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Refreshing network info cache for port 73e232f9-3860-4b9a-9cec-535fa2fb0c9f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.191 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Start _get_guest_xml network_info=[{"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.196 225859 WARNING nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.202 225859 DEBUG nova.virt.libvirt.host [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.203 225859 DEBUG nova.virt.libvirt.host [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.206 225859 DEBUG nova.virt.libvirt.host [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.207 225859 DEBUG nova.virt.libvirt.host [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.209 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.209 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.209 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.210 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.210 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.210 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.210 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.211 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.211 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.211 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.212 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.212 225859 DEBUG nova.virt.hardware [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.215 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:38.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:33:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/246388539' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.662 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.688 225859 DEBUG nova.storage.rbd_utils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:38 np0005588919 nova_compute[225855]: 2026-01-20 14:33:38.691 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:33:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1858393990' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.146 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.149 225859 DEBUG nova.virt.libvirt.vif [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1907009380',display_name='tempest-ServersAdminTestJSON-server-1907009380',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1907009380',id=39,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-8k5b63bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:33:33Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=2ec7b07d-b593-46b7-9751-b6116e4d2cec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.150 225859 DEBUG nova.network.os_vif_util [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.151 225859 DEBUG nova.network.os_vif_util [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.154 225859 DEBUG nova.objects.instance [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.180 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <uuid>2ec7b07d-b593-46b7-9751-b6116e4d2cec</uuid>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <name>instance-00000027</name>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServersAdminTestJSON-server-1907009380</nova:name>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:33:38</nova:creationTime>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <nova:user uuid="f51c395107c84dbd9067113b84ff01dd">tempest-ServersAdminTestJSON-1261404595-project-member</nova:user>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <nova:project uuid="a841e7a1434c488390475174e10bc161">tempest-ServersAdminTestJSON-1261404595</nova:project>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <nova:port uuid="73e232f9-3860-4b9a-9cec-535fa2fb0c9f">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <entry name="serial">2ec7b07d-b593-46b7-9751-b6116e4d2cec</entry>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <entry name="uuid">2ec7b07d-b593-46b7-9751-b6116e4d2cec</entry>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:17:6a:15"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <target dev="tap73e232f9-38"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/console.log" append="off"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:33:39 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:33:39 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:33:39 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:33:39 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.181 225859 DEBUG nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Preparing to wait for external event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.182 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.182 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.182 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.183 225859 DEBUG nova.virt.libvirt.vif [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1907009380',display_name='tempest-ServersAdminTestJSON-server-1907009380',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1907009380',id=39,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-8k5b63bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:33:33Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=2ec7b07d-b593-46b7-9751-b6116e4d2cec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.183 225859 DEBUG nova.network.os_vif_util [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.184 225859 DEBUG nova.network.os_vif_util [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.184 225859 DEBUG os_vif [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.185 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.185 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.186 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.189 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.190 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73e232f9-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.190 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap73e232f9-38, col_values=(('external_ids', {'iface-id': '73e232f9-3860-4b9a-9cec-535fa2fb0c9f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:6a:15', 'vm-uuid': '2ec7b07d-b593-46b7-9751-b6116e4d2cec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.192 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:39 np0005588919 NetworkManager[49104]: <info>  [1768919619.1934] manager: (tap73e232f9-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.194 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.198 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.199 225859 INFO os_vif [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38')#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.265 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.266 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.266 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No VIF found with MAC fa:16:3e:17:6a:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.267 225859 INFO nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Using config drive#033[00m
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.294 225859 DEBUG nova.storage.rbd_utils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:39.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:39 np0005588919 nova_compute[225855]: 2026-01-20 14:33:39.419 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:33:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:33:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:33:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:40 np0005588919 nova_compute[225855]: 2026-01-20 14:33:40.148 225859 DEBUG nova.network.neutron [req-691c09c7-5da3-4b8f-b87d-e0cc6eca78cf req-1e9ac1fd-0c15-4f3e-9cb9-078609bea9a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Updated VIF entry in instance network info cache for port 73e232f9-3860-4b9a-9cec-535fa2fb0c9f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:33:40 np0005588919 nova_compute[225855]: 2026-01-20 14:33:40.149 225859 DEBUG nova.network.neutron [req-691c09c7-5da3-4b8f-b87d-e0cc6eca78cf req-1e9ac1fd-0c15-4f3e-9cb9-078609bea9a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Updating instance_info_cache with network_info: [{"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:33:40 np0005588919 nova_compute[225855]: 2026-01-20 14:33:40.165 225859 DEBUG oslo_concurrency.lockutils [req-691c09c7-5da3-4b8f-b87d-e0cc6eca78cf req-1e9ac1fd-0c15-4f3e-9cb9-078609bea9a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-2ec7b07d-b593-46b7-9751-b6116e4d2cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:33:40 np0005588919 nova_compute[225855]: 2026-01-20 14:33:40.458 225859 INFO nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Creating config drive at /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config#033[00m
Jan 20 09:33:40 np0005588919 nova_compute[225855]: 2026-01-20 14:33:40.468 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsa93env5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:40.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:40 np0005588919 nova_compute[225855]: 2026-01-20 14:33:40.601 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsa93env5" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:40 np0005588919 nova_compute[225855]: 2026-01-20 14:33:40.648 225859 DEBUG nova.storage.rbd_utils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:33:40 np0005588919 nova_compute[225855]: 2026-01-20 14:33:40.652 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:33:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:41.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:41 np0005588919 nova_compute[225855]: 2026-01-20 14:33:41.376 225859 DEBUG oslo_concurrency.processutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.724s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:33:41 np0005588919 nova_compute[225855]: 2026-01-20 14:33:41.378 225859 INFO nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Deleting local config drive /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config because it was imported into RBD.#033[00m
Jan 20 09:33:41 np0005588919 kernel: tap73e232f9-38: entered promiscuous mode
Jan 20 09:33:41 np0005588919 NetworkManager[49104]: <info>  [1768919621.4491] manager: (tap73e232f9-38): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Jan 20 09:33:41 np0005588919 ovn_controller[130490]: 2026-01-20T14:33:41Z|00137|binding|INFO|Claiming lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f for this chassis.
Jan 20 09:33:41 np0005588919 ovn_controller[130490]: 2026-01-20T14:33:41Z|00138|binding|INFO|73e232f9-3860-4b9a-9cec-535fa2fb0c9f: Claiming fa:16:3e:17:6a:15 10.100.0.6
Jan 20 09:33:41 np0005588919 nova_compute[225855]: 2026-01-20 14:33:41.450 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:41 np0005588919 nova_compute[225855]: 2026-01-20 14:33:41.453 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.464 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:6a:15 10.100.0.6'], port_security=['fa:16:3e:17:6a:15 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2ec7b07d-b593-46b7-9751-b6116e4d2cec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a841e7a1434c488390475174e10bc161', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0bbdea05-fba7-47c7-ba4e-5dac58212a25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43dea88-ea55-4069-a4be-2c30a432a754, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=73e232f9-3860-4b9a-9cec-535fa2fb0c9f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.466 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 73e232f9-3860-4b9a-9cec-535fa2fb0c9f in datapath 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a bound to our chassis#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.468 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.483 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6a28aced-99c2-4e7b-b7e1-c06012c9b41f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.484 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap33c9a20a-d1 in ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.486 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap33c9a20a-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.486 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[45615288-289e-421e-82c4-19a0aaf45265]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.487 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[767f890e-ead5-46ab-badc-e19461397440]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 systemd-machined[194361]: New machine qemu-20-instance-00000027.
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.502 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[06837b5d-3352-4254-8270-96974dc7cbf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 systemd[1]: Started Virtual Machine qemu-20-instance-00000027.
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.525 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba63c86-38c7-4925-a759-daa1697f1099]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 systemd-udevd[244224]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:33:41 np0005588919 NetworkManager[49104]: <info>  [1768919621.5440] device (tap73e232f9-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:33:41 np0005588919 NetworkManager[49104]: <info>  [1768919621.5449] device (tap73e232f9-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:33:41 np0005588919 nova_compute[225855]: 2026-01-20 14:33:41.549 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:41 np0005588919 ovn_controller[130490]: 2026-01-20T14:33:41Z|00139|binding|INFO|Setting lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f ovn-installed in OVS
Jan 20 09:33:41 np0005588919 ovn_controller[130490]: 2026-01-20T14:33:41Z|00140|binding|INFO|Setting lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f up in Southbound
Jan 20 09:33:41 np0005588919 nova_compute[225855]: 2026-01-20 14:33:41.557 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.563 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8f6fc406-0c5e-406f-be89-8b60e399207a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.568 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f99db80f-a8e7-49d4-9a97-4fa82f9ad190]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 NetworkManager[49104]: <info>  [1768919621.5691] manager: (tap33c9a20a-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.602 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[836acec3-7e29-4a71-9f1c-e296c72f2740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.605 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[afeaf2fb-b75d-46ed-8bd0-31a3d0c18033]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 NetworkManager[49104]: <info>  [1768919621.6249] device (tap33c9a20a-d0): carrier: link connected
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.629 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b01108-15b1-4fdb-9c7d-6c4bfbc3976e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.643 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e53c7bed-6bcb-4841-bb60-2e7b887f45a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33c9a20a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:8e:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466055, 'reachable_time': 29489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244255, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.657 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[42276c09-2b87-4199-80df-f90acaed1401]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe89:8ebd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466055, 'tstamp': 466055}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244256, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.671 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[32288afa-dc8b-410f-b6a7-9403acf6a732]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33c9a20a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:8e:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466055, 'reachable_time': 29489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244257, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.702 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[91b860d9-3574-41ab-a6c4-ec1d1884583f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.757 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[748c4050-38bd-416d-9c36-b5f2af429643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.758 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33c9a20a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.758 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.758 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33c9a20a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:41 np0005588919 kernel: tap33c9a20a-d0: entered promiscuous mode
Jan 20 09:33:41 np0005588919 nova_compute[225855]: 2026-01-20 14:33:41.760 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.763 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap33c9a20a-d0, col_values=(('external_ids', {'iface-id': '90c69687-c788-4dba-881f-3ed4a5ee6007'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:33:41 np0005588919 ovn_controller[130490]: 2026-01-20T14:33:41Z|00141|binding|INFO|Releasing lport 90c69687-c788-4dba-881f-3ed4a5ee6007 from this chassis (sb_readonly=0)
Jan 20 09:33:41 np0005588919 NetworkManager[49104]: <info>  [1768919621.7651] manager: (tap33c9a20a-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 20 09:33:41 np0005588919 nova_compute[225855]: 2026-01-20 14:33:41.766 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:41 np0005588919 nova_compute[225855]: 2026-01-20 14:33:41.777 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.778 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/33c9a20a-d976-42a8-b8bf-f83ddfc97c9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/33c9a20a-d976-42a8-b8bf-f83ddfc97c9a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:33:41 np0005588919 nova_compute[225855]: 2026-01-20 14:33:41.779 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.779 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd7344c-3d09-4004-a222-40748b8cc6ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.780 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/33c9a20a-d976-42a8-b8bf-f83ddfc97c9a.pid.haproxy
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:33:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:33:41.780 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'env', 'PROCESS_TAG=haproxy-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/33c9a20a-d976-42a8-b8bf-f83ddfc97c9a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:33:42 np0005588919 podman[244318]: 2026-01-20 14:33:42.140394803 +0000 UTC m=+0.051027527 container create ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:33:42 np0005588919 systemd[1]: Started libpod-conmon-ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d.scope.
Jan 20 09:33:42 np0005588919 podman[244318]: 2026-01-20 14:33:42.113576149 +0000 UTC m=+0.024208893 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:33:42 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:33:42 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61f71f999ebe55922d4470c26bf6ec7028f2091bfc297c60f9663a1040a21c70/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:33:42 np0005588919 podman[244318]: 2026-01-20 14:33:42.247746929 +0000 UTC m=+0.158379673 container init ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:33:42 np0005588919 podman[244318]: 2026-01-20 14:33:42.257783924 +0000 UTC m=+0.168416648 container start ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:33:42 np0005588919 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[244351]: [NOTICE]   (244356) : New worker (244358) forked
Jan 20 09:33:42 np0005588919 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[244351]: [NOTICE]   (244356) : Loading success.
Jan 20 09:33:42 np0005588919 nova_compute[225855]: 2026-01-20 14:33:42.371 225859 DEBUG nova.compute.manager [req-2b3a0c21-208a-4e3a-94ca-593540b554b0 req-da37091d-4475-4c01-a2f6-3ed8875f23d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:33:42 np0005588919 nova_compute[225855]: 2026-01-20 14:33:42.371 225859 DEBUG oslo_concurrency.lockutils [req-2b3a0c21-208a-4e3a-94ca-593540b554b0 req-da37091d-4475-4c01-a2f6-3ed8875f23d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:42 np0005588919 nova_compute[225855]: 2026-01-20 14:33:42.372 225859 DEBUG oslo_concurrency.lockutils [req-2b3a0c21-208a-4e3a-94ca-593540b554b0 req-da37091d-4475-4c01-a2f6-3ed8875f23d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:42 np0005588919 nova_compute[225855]: 2026-01-20 14:33:42.372 225859 DEBUG oslo_concurrency.lockutils [req-2b3a0c21-208a-4e3a-94ca-593540b554b0 req-da37091d-4475-4c01-a2f6-3ed8875f23d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:42 np0005588919 nova_compute[225855]: 2026-01-20 14:33:42.372 225859 DEBUG nova.compute.manager [req-2b3a0c21-208a-4e3a-94ca-593540b554b0 req-da37091d-4475-4c01-a2f6-3ed8875f23d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Processing event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:33:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:33:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:42.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:33:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:43.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.579 225859 DEBUG nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.580 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919623.5791976, 2ec7b07d-b593-46b7-9751-b6116e4d2cec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.580 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] VM Started (Lifecycle Event)#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.583 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.586 225859 INFO nova.virt.libvirt.driver [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance spawned successfully.#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.587 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.615 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.620 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.620 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.621 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.621 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.622 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.622 225859 DEBUG nova.virt.libvirt.driver [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.626 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.682 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.682 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919623.5800834, 2ec7b07d-b593-46b7-9751-b6116e4d2cec => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.683 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.712 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.715 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919623.58235, 2ec7b07d-b593-46b7-9751-b6116e4d2cec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.715 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.736 225859 INFO nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Took 10.62 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.736 225859 DEBUG nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.742 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.744 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.769 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.804 225859 INFO nova.compute.manager [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Took 11.67 seconds to build instance.#033[00m
Jan 20 09:33:43 np0005588919 nova_compute[225855]: 2026-01-20 14:33:43.828 225859 DEBUG oslo_concurrency.lockutils [None req-36c56da3-e6dd-43b3-8db2-5f7143100150 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:44 np0005588919 nova_compute[225855]: 2026-01-20 14:33:44.194 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:44 np0005588919 nova_compute[225855]: 2026-01-20 14:33:44.421 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:44 np0005588919 nova_compute[225855]: 2026-01-20 14:33:44.510 225859 DEBUG nova.compute.manager [req-3a254025-3885-43c8-811e-99352aa32552 req-bb977258-209d-4e70-9691-3a34a9fe326a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:33:44 np0005588919 nova_compute[225855]: 2026-01-20 14:33:44.510 225859 DEBUG oslo_concurrency.lockutils [req-3a254025-3885-43c8-811e-99352aa32552 req-bb977258-209d-4e70-9691-3a34a9fe326a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:44 np0005588919 nova_compute[225855]: 2026-01-20 14:33:44.511 225859 DEBUG oslo_concurrency.lockutils [req-3a254025-3885-43c8-811e-99352aa32552 req-bb977258-209d-4e70-9691-3a34a9fe326a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:44 np0005588919 nova_compute[225855]: 2026-01-20 14:33:44.511 225859 DEBUG oslo_concurrency.lockutils [req-3a254025-3885-43c8-811e-99352aa32552 req-bb977258-209d-4e70-9691-3a34a9fe326a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:33:44 np0005588919 nova_compute[225855]: 2026-01-20 14:33:44.512 225859 DEBUG nova.compute.manager [req-3a254025-3885-43c8-811e-99352aa32552 req-bb977258-209d-4e70-9691-3a34a9fe326a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:33:44 np0005588919 nova_compute[225855]: 2026-01-20 14:33:44.512 225859 WARNING nova.compute.manager [req-3a254025-3885-43c8-811e-99352aa32552 req-bb977258-209d-4e70-9691-3a34a9fe326a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received unexpected event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with vm_state active and task_state None.#033[00m
Jan 20 09:33:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:44.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:45.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:33:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:46.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:33:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:47.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:48.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:49 np0005588919 podman[244412]: 2026-01-20 14:33:49.023084387 +0000 UTC m=+0.066953372 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:33:49 np0005588919 nova_compute[225855]: 2026-01-20 14:33:49.198 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:49.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:49 np0005588919 nova_compute[225855]: 2026-01-20 14:33:49.423 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:50.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:51.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:52 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:33:52 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:33:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:52.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:53.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:54 np0005588919 nova_compute[225855]: 2026-01-20 14:33:54.202 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:54 np0005588919 nova_compute[225855]: 2026-01-20 14:33:54.458 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:54.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:33:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:33:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:55.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:33:56 np0005588919 ovn_controller[130490]: 2026-01-20T14:33:56Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:6a:15 10.100.0.6
Jan 20 09:33:56 np0005588919 ovn_controller[130490]: 2026-01-20T14:33:56Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:6a:15 10.100.0.6
Jan 20 09:33:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:56.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:33:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:57.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:33:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:33:58.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:59 np0005588919 nova_compute[225855]: 2026-01-20 14:33:59.206 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:33:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:33:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:33:59.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:33:59 np0005588919 nova_compute[225855]: 2026-01-20 14:33:59.460 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:33:59 np0005588919 nova_compute[225855]: 2026-01-20 14:33:59.502 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "fdba30ff-e02a-4857-92f6-1828ce3ab175" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:59 np0005588919 nova_compute[225855]: 2026-01-20 14:33:59.503 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:59 np0005588919 nova_compute[225855]: 2026-01-20 14:33:59.531 225859 DEBUG nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:33:59 np0005588919 nova_compute[225855]: 2026-01-20 14:33:59.679 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:33:59 np0005588919 nova_compute[225855]: 2026-01-20 14:33:59.679 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:33:59 np0005588919 nova_compute[225855]: 2026-01-20 14:33:59.686 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:33:59 np0005588919 nova_compute[225855]: 2026-01-20 14:33:59.687 225859 INFO nova.compute.claims [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:33:59 np0005588919 nova_compute[225855]: 2026-01-20 14:33:59.934 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:34:00 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2444071069' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.472 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.537s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.478 225859 DEBUG nova.compute.provider_tree [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.495 225859 DEBUG nova.scheduler.client.report [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.539 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.540 225859 DEBUG nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.590 225859 DEBUG nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.591 225859 DEBUG nova.network.neutron [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:34:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:00.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.616 225859 INFO nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.634 225859 DEBUG nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.735 225859 DEBUG nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.736 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.736 225859 INFO nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Creating image(s)#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.762 225859 DEBUG nova.storage.rbd_utils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image fdba30ff-e02a-4857-92f6-1828ce3ab175_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.788 225859 DEBUG nova.storage.rbd_utils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image fdba30ff-e02a-4857-92f6-1828ce3ab175_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.826 225859 DEBUG nova.storage.rbd_utils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image fdba30ff-e02a-4857-92f6-1828ce3ab175_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.831 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.909 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.910 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.910 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.911 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.935 225859 DEBUG nova.storage.rbd_utils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image fdba30ff-e02a-4857-92f6-1828ce3ab175_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:00 np0005588919 nova_compute[225855]: 2026-01-20 14:34:00.939 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 fdba30ff-e02a-4857-92f6-1828ce3ab175_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:01 np0005588919 nova_compute[225855]: 2026-01-20 14:34:01.017 225859 DEBUG nova.policy [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f51c395107c84dbd9067113b84ff01dd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a841e7a1434c488390475174e10bc161', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:34:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:01.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:01 np0005588919 nova_compute[225855]: 2026-01-20 14:34:01.824 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 fdba30ff-e02a-4857-92f6-1828ce3ab175_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.886s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:01 np0005588919 nova_compute[225855]: 2026-01-20 14:34:01.894 225859 DEBUG nova.storage.rbd_utils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] resizing rbd image fdba30ff-e02a-4857-92f6-1828ce3ab175_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:34:01 np0005588919 nova_compute[225855]: 2026-01-20 14:34:01.987 225859 DEBUG nova.objects.instance [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'migration_context' on Instance uuid fdba30ff-e02a-4857-92f6-1828ce3ab175 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:02 np0005588919 nova_compute[225855]: 2026-01-20 14:34:02.004 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:34:02 np0005588919 nova_compute[225855]: 2026-01-20 14:34:02.005 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Ensure instance console log exists: /var/lib/nova/instances/fdba30ff-e02a-4857-92f6-1828ce3ab175/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:34:02 np0005588919 nova_compute[225855]: 2026-01-20 14:34:02.005 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:02 np0005588919 nova_compute[225855]: 2026-01-20 14:34:02.006 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:02 np0005588919 nova_compute[225855]: 2026-01-20 14:34:02.006 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:02.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:03 np0005588919 nova_compute[225855]: 2026-01-20 14:34:03.340 225859 DEBUG nova.network.neutron [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Successfully created port: 87a0a5ba-6446-4265-8ada-94d1bd815aed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:34:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:03.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:34:03 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/424104226' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:34:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:34:03 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/424104226' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:34:04 np0005588919 nova_compute[225855]: 2026-01-20 14:34:04.209 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:04 np0005588919 nova_compute[225855]: 2026-01-20 14:34:04.462 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:04.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:04 np0005588919 nova_compute[225855]: 2026-01-20 14:34:04.692 225859 DEBUG nova.network.neutron [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Successfully updated port: 87a0a5ba-6446-4265-8ada-94d1bd815aed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:34:04 np0005588919 nova_compute[225855]: 2026-01-20 14:34:04.707 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "refresh_cache-fdba30ff-e02a-4857-92f6-1828ce3ab175" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:34:04 np0005588919 nova_compute[225855]: 2026-01-20 14:34:04.707 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquired lock "refresh_cache-fdba30ff-e02a-4857-92f6-1828ce3ab175" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:34:04 np0005588919 nova_compute[225855]: 2026-01-20 14:34:04.707 225859 DEBUG nova.network.neutron [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:34:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:05 np0005588919 nova_compute[225855]: 2026-01-20 14:34:05.120 225859 DEBUG nova.compute.manager [req-e6e71d3b-b359-4542-89e0-33c13e68dac2 req-6433cca5-2ea3-4f3f-bfa8-5d7817c303df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Received event network-changed-87a0a5ba-6446-4265-8ada-94d1bd815aed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:05 np0005588919 nova_compute[225855]: 2026-01-20 14:34:05.121 225859 DEBUG nova.compute.manager [req-e6e71d3b-b359-4542-89e0-33c13e68dac2 req-6433cca5-2ea3-4f3f-bfa8-5d7817c303df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Refreshing instance network info cache due to event network-changed-87a0a5ba-6446-4265-8ada-94d1bd815aed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:34:05 np0005588919 nova_compute[225855]: 2026-01-20 14:34:05.121 225859 DEBUG oslo_concurrency.lockutils [req-e6e71d3b-b359-4542-89e0-33c13e68dac2 req-6433cca5-2ea3-4f3f-bfa8-5d7817c303df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-fdba30ff-e02a-4857-92f6-1828ce3ab175" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:34:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:05.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:05 np0005588919 nova_compute[225855]: 2026-01-20 14:34:05.509 225859 DEBUG nova.network.neutron [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:34:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:06.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:07.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:08.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:09 np0005588919 podman[244732]: 2026-01-20 14:34:09.127758117 +0000 UTC m=+0.170242038 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 20 09:34:09 np0005588919 nova_compute[225855]: 2026-01-20 14:34:09.212 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:09.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:09 np0005588919 nova_compute[225855]: 2026-01-20 14:34:09.464 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.248 225859 DEBUG nova.network.neutron [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Updating instance_info_cache with network_info: [{"id": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "address": "fa:16:3e:70:1f:46", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87a0a5ba-64", "ovs_interfaceid": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.294 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Releasing lock "refresh_cache-fdba30ff-e02a-4857-92f6-1828ce3ab175" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.294 225859 DEBUG nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Instance network_info: |[{"id": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "address": "fa:16:3e:70:1f:46", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87a0a5ba-64", "ovs_interfaceid": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.295 225859 DEBUG oslo_concurrency.lockutils [req-e6e71d3b-b359-4542-89e0-33c13e68dac2 req-6433cca5-2ea3-4f3f-bfa8-5d7817c303df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-fdba30ff-e02a-4857-92f6-1828ce3ab175" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.295 225859 DEBUG nova.network.neutron [req-e6e71d3b-b359-4542-89e0-33c13e68dac2 req-6433cca5-2ea3-4f3f-bfa8-5d7817c303df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Refreshing network info cache for port 87a0a5ba-6446-4265-8ada-94d1bd815aed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.298 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Start _get_guest_xml network_info=[{"id": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "address": "fa:16:3e:70:1f:46", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87a0a5ba-64", "ovs_interfaceid": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.302 225859 WARNING nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.313 225859 DEBUG nova.virt.libvirt.host [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.314 225859 DEBUG nova.virt.libvirt.host [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.317 225859 DEBUG nova.virt.libvirt.host [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.318 225859 DEBUG nova.virt.libvirt.host [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.319 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.319 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.320 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.320 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.320 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.321 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.321 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.321 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.321 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.321 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.322 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.322 225859 DEBUG nova.virt.hardware [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:34:10 np0005588919 nova_compute[225855]: 2026-01-20 14:34:10.324 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:10.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:34:10 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3250290687' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.214 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.890s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.247 225859 DEBUG nova.storage.rbd_utils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image fdba30ff-e02a-4857-92f6-1828ce3ab175_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.252 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:11.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:34:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1227090643' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.711 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.713 225859 DEBUG nova.virt.libvirt.vif [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:33:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1832306325',display_name='tempest-ServersAdminTestJSON-server-1832306325',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1832306325',id=42,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-sb3w0f0c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:34:00Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=fdba30ff-e02a-4857-92f6-1828ce3ab175,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "address": "fa:16:3e:70:1f:46", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87a0a5ba-64", "ovs_interfaceid": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.713 225859 DEBUG nova.network.os_vif_util [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "address": "fa:16:3e:70:1f:46", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87a0a5ba-64", "ovs_interfaceid": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.714 225859 DEBUG nova.network.os_vif_util [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:1f:46,bridge_name='br-int',has_traffic_filtering=True,id=87a0a5ba-6446-4265-8ada-94d1bd815aed,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87a0a5ba-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.716 225859 DEBUG nova.objects.instance [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'pci_devices' on Instance uuid fdba30ff-e02a-4857-92f6-1828ce3ab175 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.741 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <uuid>fdba30ff-e02a-4857-92f6-1828ce3ab175</uuid>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <name>instance-0000002a</name>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServersAdminTestJSON-server-1832306325</nova:name>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:34:10</nova:creationTime>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <nova:user uuid="f51c395107c84dbd9067113b84ff01dd">tempest-ServersAdminTestJSON-1261404595-project-member</nova:user>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <nova:project uuid="a841e7a1434c488390475174e10bc161">tempest-ServersAdminTestJSON-1261404595</nova:project>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <nova:port uuid="87a0a5ba-6446-4265-8ada-94d1bd815aed">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <entry name="serial">fdba30ff-e02a-4857-92f6-1828ce3ab175</entry>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <entry name="uuid">fdba30ff-e02a-4857-92f6-1828ce3ab175</entry>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/fdba30ff-e02a-4857-92f6-1828ce3ab175_disk">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/fdba30ff-e02a-4857-92f6-1828ce3ab175_disk.config">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:70:1f:46"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <target dev="tap87a0a5ba-64"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/fdba30ff-e02a-4857-92f6-1828ce3ab175/console.log" append="off"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:34:11 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:34:11 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:34:11 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:34:11 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.744 225859 DEBUG nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Preparing to wait for external event network-vif-plugged-87a0a5ba-6446-4265-8ada-94d1bd815aed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.744 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.745 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.745 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.747 225859 DEBUG nova.virt.libvirt.vif [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:33:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1832306325',display_name='tempest-ServersAdminTestJSON-server-1832306325',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1832306325',id=42,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-sb3w0f0c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminT
estJSON-1261404595-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:34:00Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=fdba30ff-e02a-4857-92f6-1828ce3ab175,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "address": "fa:16:3e:70:1f:46", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87a0a5ba-64", "ovs_interfaceid": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.747 225859 DEBUG nova.network.os_vif_util [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "address": "fa:16:3e:70:1f:46", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87a0a5ba-64", "ovs_interfaceid": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.748 225859 DEBUG nova.network.os_vif_util [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:1f:46,bridge_name='br-int',has_traffic_filtering=True,id=87a0a5ba-6446-4265-8ada-94d1bd815aed,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87a0a5ba-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.749 225859 DEBUG os_vif [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:1f:46,bridge_name='br-int',has_traffic_filtering=True,id=87a0a5ba-6446-4265-8ada-94d1bd815aed,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87a0a5ba-64') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.750 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.751 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.752 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.757 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.758 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87a0a5ba-64, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.759 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap87a0a5ba-64, col_values=(('external_ids', {'iface-id': '87a0a5ba-6446-4265-8ada-94d1bd815aed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:70:1f:46', 'vm-uuid': 'fdba30ff-e02a-4857-92f6-1828ce3ab175'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.803 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:11 np0005588919 NetworkManager[49104]: <info>  [1768919651.8041] manager: (tap87a0a5ba-64): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.806 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.810 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.812 225859 INFO os_vif [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:1f:46,bridge_name='br-int',has_traffic_filtering=True,id=87a0a5ba-6446-4265-8ada-94d1bd815aed,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87a0a5ba-64')#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.870 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.871 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.871 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No VIF found with MAC fa:16:3e:70:1f:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.872 225859 INFO nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Using config drive#033[00m
Jan 20 09:34:11 np0005588919 nova_compute[225855]: 2026-01-20 14:34:11.897 225859 DEBUG nova.storage.rbd_utils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image fdba30ff-e02a-4857-92f6-1828ce3ab175_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:12.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:12 np0005588919 nova_compute[225855]: 2026-01-20 14:34:12.813 225859 INFO nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Creating config drive at /var/lib/nova/instances/fdba30ff-e02a-4857-92f6-1828ce3ab175/disk.config#033[00m
Jan 20 09:34:12 np0005588919 nova_compute[225855]: 2026-01-20 14:34:12.822 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fdba30ff-e02a-4857-92f6-1828ce3ab175/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_b6rd1t4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:12 np0005588919 nova_compute[225855]: 2026-01-20 14:34:12.961 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fdba30ff-e02a-4857-92f6-1828ce3ab175/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_b6rd1t4" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:13 np0005588919 nova_compute[225855]: 2026-01-20 14:34:12.999 225859 DEBUG nova.storage.rbd_utils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image fdba30ff-e02a-4857-92f6-1828ce3ab175_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:13 np0005588919 nova_compute[225855]: 2026-01-20 14:34:13.002 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fdba30ff-e02a-4857-92f6-1828ce3ab175/disk.config fdba30ff-e02a-4857-92f6-1828ce3ab175_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:13 np0005588919 nova_compute[225855]: 2026-01-20 14:34:13.100 225859 DEBUG nova.network.neutron [req-e6e71d3b-b359-4542-89e0-33c13e68dac2 req-6433cca5-2ea3-4f3f-bfa8-5d7817c303df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Updated VIF entry in instance network info cache for port 87a0a5ba-6446-4265-8ada-94d1bd815aed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:34:13 np0005588919 nova_compute[225855]: 2026-01-20 14:34:13.101 225859 DEBUG nova.network.neutron [req-e6e71d3b-b359-4542-89e0-33c13e68dac2 req-6433cca5-2ea3-4f3f-bfa8-5d7817c303df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Updating instance_info_cache with network_info: [{"id": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "address": "fa:16:3e:70:1f:46", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87a0a5ba-64", "ovs_interfaceid": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:34:13 np0005588919 nova_compute[225855]: 2026-01-20 14:34:13.137 225859 DEBUG oslo_concurrency.lockutils [req-e6e71d3b-b359-4542-89e0-33c13e68dac2 req-6433cca5-2ea3-4f3f-bfa8-5d7817c303df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-fdba30ff-e02a-4857-92f6-1828ce3ab175" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:34:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:13.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:34:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/467321397' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:34:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:34:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/467321397' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.375 225859 DEBUG oslo_concurrency.processutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fdba30ff-e02a-4857-92f6-1828ce3ab175/disk.config fdba30ff-e02a-4857-92f6-1828ce3ab175_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.373s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.376 225859 INFO nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Deleting local config drive /var/lib/nova/instances/fdba30ff-e02a-4857-92f6-1828ce3ab175/disk.config because it was imported into RBD.#033[00m
Jan 20 09:34:14 np0005588919 kernel: tap87a0a5ba-64: entered promiscuous mode
Jan 20 09:34:14 np0005588919 NetworkManager[49104]: <info>  [1768919654.4383] manager: (tap87a0a5ba-64): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Jan 20 09:34:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:14Z|00142|binding|INFO|Claiming lport 87a0a5ba-6446-4265-8ada-94d1bd815aed for this chassis.
Jan 20 09:34:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:14Z|00143|binding|INFO|87a0a5ba-6446-4265-8ada-94d1bd815aed: Claiming fa:16:3e:70:1f:46 10.100.0.10
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.441 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.452 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:1f:46 10.100.0.10'], port_security=['fa:16:3e:70:1f:46 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fdba30ff-e02a-4857-92f6-1828ce3ab175', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a841e7a1434c488390475174e10bc161', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0bbdea05-fba7-47c7-ba4e-5dac58212a25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43dea88-ea55-4069-a4be-2c30a432a754, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=87a0a5ba-6446-4265-8ada-94d1bd815aed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.455 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 87a0a5ba-6446-4265-8ada-94d1bd815aed in datapath 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a bound to our chassis#033[00m
Jan 20 09:34:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:14Z|00144|binding|INFO|Setting lport 87a0a5ba-6446-4265-8ada-94d1bd815aed ovn-installed in OVS
Jan 20 09:34:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:14Z|00145|binding|INFO|Setting lport 87a0a5ba-6446-4265-8ada-94d1bd815aed up in Southbound
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.458 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.458 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a#033[00m
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.463 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.465 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:14 np0005588919 systemd-machined[194361]: New machine qemu-21-instance-0000002a.
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.475 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8de91ed2-086f-49d4-841f-14895bcf6adb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:14 np0005588919 systemd[1]: Started Virtual Machine qemu-21-instance-0000002a.
Jan 20 09:34:14 np0005588919 systemd-udevd[244901]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:34:14 np0005588919 NetworkManager[49104]: <info>  [1768919654.5142] device (tap87a0a5ba-64): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:34:14 np0005588919 NetworkManager[49104]: <info>  [1768919654.5153] device (tap87a0a5ba-64): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.516 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[2672983d-b4ba-4f64-a515-f040ee4837c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.519 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6566d4-096f-41e6-bdae-74d806ff7ebc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.554 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4259060a-1af6-40d6-aed6-7dd314c6af57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.575 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1c974546-29f8-4316-8308-23066b2692af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33c9a20a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:8e:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466055, 'reachable_time': 29489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244911, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.594 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c8d0bae0-23e6-488f-ac49-5883a0190f3e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466065, 'tstamp': 466065}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244913, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466068, 'tstamp': 466068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244913, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.596 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33c9a20a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.597 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.599 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.599 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33c9a20a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.600 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.600 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap33c9a20a-d0, col_values=(('external_ids', {'iface-id': '90c69687-c788-4dba-881f-3ed4a5ee6007'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:14.600 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:34:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:14.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.963 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919654.962634, fdba30ff-e02a-4857-92f6-1828ce3ab175 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.963 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] VM Started (Lifecycle Event)#033[00m
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.988 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.993 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919654.9627392, fdba30ff-e02a-4857-92f6-1828ce3ab175 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:34:14 np0005588919 nova_compute[225855]: 2026-01-20 14:34:14.993 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:34:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.012 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.015 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.033 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.103 225859 DEBUG nova.compute.manager [req-369ef029-38d8-4f4f-84c5-bdc8c0258d9a req-4a938ab4-c5d1-4026-a47f-9a7d908ebddd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Received event network-vif-plugged-87a0a5ba-6446-4265-8ada-94d1bd815aed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.104 225859 DEBUG oslo_concurrency.lockutils [req-369ef029-38d8-4f4f-84c5-bdc8c0258d9a req-4a938ab4-c5d1-4026-a47f-9a7d908ebddd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.104 225859 DEBUG oslo_concurrency.lockutils [req-369ef029-38d8-4f4f-84c5-bdc8c0258d9a req-4a938ab4-c5d1-4026-a47f-9a7d908ebddd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.104 225859 DEBUG oslo_concurrency.lockutils [req-369ef029-38d8-4f4f-84c5-bdc8c0258d9a req-4a938ab4-c5d1-4026-a47f-9a7d908ebddd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.105 225859 DEBUG nova.compute.manager [req-369ef029-38d8-4f4f-84c5-bdc8c0258d9a req-4a938ab4-c5d1-4026-a47f-9a7d908ebddd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Processing event network-vif-plugged-87a0a5ba-6446-4265-8ada-94d1bd815aed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.105 225859 DEBUG nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.109 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919655.1083775, fdba30ff-e02a-4857-92f6-1828ce3ab175 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.109 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.110 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.113 225859 INFO nova.virt.libvirt.driver [-] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Instance spawned successfully.#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.114 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.142 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.147 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.147 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.148 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.148 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.149 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.149 225859 DEBUG nova.virt.libvirt.driver [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.153 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.201 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.224 225859 INFO nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Took 14.49 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.225 225859 DEBUG nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.312 225859 INFO nova.compute.manager [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Took 15.73 seconds to build instance.#033[00m
Jan 20 09:34:15 np0005588919 nova_compute[225855]: 2026-01-20 14:34:15.332 225859 DEBUG oslo_concurrency.lockutils [None req-fad19961-02ca-4563-bcb6-95c343d93c0e f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.829s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:15.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:16.392 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:16.393 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:16.393 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:16.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:16 np0005588919 nova_compute[225855]: 2026-01-20 14:34:16.805 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:17 np0005588919 nova_compute[225855]: 2026-01-20 14:34:17.210 225859 DEBUG nova.compute.manager [req-ad17f70f-ff76-489a-a0c9-eeca4723e611 req-854a633c-2da8-4aed-aabf-298119db7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Received event network-vif-plugged-87a0a5ba-6446-4265-8ada-94d1bd815aed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:17 np0005588919 nova_compute[225855]: 2026-01-20 14:34:17.210 225859 DEBUG oslo_concurrency.lockutils [req-ad17f70f-ff76-489a-a0c9-eeca4723e611 req-854a633c-2da8-4aed-aabf-298119db7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:17 np0005588919 nova_compute[225855]: 2026-01-20 14:34:17.211 225859 DEBUG oslo_concurrency.lockutils [req-ad17f70f-ff76-489a-a0c9-eeca4723e611 req-854a633c-2da8-4aed-aabf-298119db7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:17 np0005588919 nova_compute[225855]: 2026-01-20 14:34:17.211 225859 DEBUG oslo_concurrency.lockutils [req-ad17f70f-ff76-489a-a0c9-eeca4723e611 req-854a633c-2da8-4aed-aabf-298119db7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:17 np0005588919 nova_compute[225855]: 2026-01-20 14:34:17.212 225859 DEBUG nova.compute.manager [req-ad17f70f-ff76-489a-a0c9-eeca4723e611 req-854a633c-2da8-4aed-aabf-298119db7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] No waiting events found dispatching network-vif-plugged-87a0a5ba-6446-4265-8ada-94d1bd815aed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:34:17 np0005588919 nova_compute[225855]: 2026-01-20 14:34:17.212 225859 WARNING nova.compute.manager [req-ad17f70f-ff76-489a-a0c9-eeca4723e611 req-854a633c-2da8-4aed-aabf-298119db7c86 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Received unexpected event network-vif-plugged-87a0a5ba-6446-4265-8ada-94d1bd815aed for instance with vm_state active and task_state None.#033[00m
Jan 20 09:34:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:17.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:18 np0005588919 nova_compute[225855]: 2026-01-20 14:34:18.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:18 np0005588919 nova_compute[225855]: 2026-01-20 14:34:18.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:34:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:18.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:19 np0005588919 nova_compute[225855]: 2026-01-20 14:34:19.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:19.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:19 np0005588919 nova_compute[225855]: 2026-01-20 14:34:19.467 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:20 np0005588919 podman[244959]: 2026-01-20 14:34:20.023638502 +0000 UTC m=+0.065904764 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 09:34:20 np0005588919 nova_compute[225855]: 2026-01-20 14:34:20.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:20 np0005588919 nova_compute[225855]: 2026-01-20 14:34:20.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:34:20 np0005588919 nova_compute[225855]: 2026-01-20 14:34:20.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:34:20 np0005588919 nova_compute[225855]: 2026-01-20 14:34:20.590 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-2ec7b07d-b593-46b7-9751-b6116e4d2cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:34:20 np0005588919 nova_compute[225855]: 2026-01-20 14:34:20.590 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-2ec7b07d-b593-46b7-9751-b6116e4d2cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:34:20 np0005588919 nova_compute[225855]: 2026-01-20 14:34:20.591 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:34:20 np0005588919 nova_compute[225855]: 2026-01-20 14:34:20.592 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:20.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:21.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:21 np0005588919 nova_compute[225855]: 2026-01-20 14:34:21.808 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:22.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:23.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:23.630 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:34:23 np0005588919 nova_compute[225855]: 2026-01-20 14:34:23.630 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:23.633 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:34:23 np0005588919 nova_compute[225855]: 2026-01-20 14:34:23.998 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Updating instance_info_cache with network_info: [{"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.024 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-2ec7b07d-b593-46b7-9751-b6116e4d2cec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.024 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.025 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.025 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.026 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.051 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.052 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.052 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.053 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.053 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.521 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:34:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3789156520' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.553 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.628 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000002a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.629 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000002a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.633 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000027 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.633 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000027 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:34:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:24.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.788 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.790 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4375MB free_disk=20.876426696777344GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.791 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.791 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.861 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 2ec7b07d-b593-46b7-9751-b6116e4d2cec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.862 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance fdba30ff-e02a-4857-92f6-1828ce3ab175 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.862 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.862 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:34:24 np0005588919 nova_compute[225855]: 2026-01-20 14:34:24.953 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:34:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1416592101' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:34:25 np0005588919 nova_compute[225855]: 2026-01-20 14:34:25.387 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:25 np0005588919 nova_compute[225855]: 2026-01-20 14:34:25.395 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:34:25 np0005588919 nova_compute[225855]: 2026-01-20 14:34:25.415 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:34:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:25.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:25 np0005588919 nova_compute[225855]: 2026-01-20 14:34:25.446 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:34:25 np0005588919 nova_compute[225855]: 2026-01-20 14:34:25.447 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:25 np0005588919 nova_compute[225855]: 2026-01-20 14:34:25.761 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:25 np0005588919 nova_compute[225855]: 2026-01-20 14:34:25.762 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:26 np0005588919 nova_compute[225855]: 2026-01-20 14:34:26.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:26 np0005588919 nova_compute[225855]: 2026-01-20 14:34:26.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:34:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:26.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:26 np0005588919 nova_compute[225855]: 2026-01-20 14:34:26.858 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:27.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:28.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:28Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:70:1f:46 10.100.0.10
Jan 20 09:34:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:28Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:70:1f:46 10.100.0.10
Jan 20 09:34:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:29.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:29 np0005588919 nova_compute[225855]: 2026-01-20 14:34:29.524 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:30.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:31.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:31.636 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:31 np0005588919 nova_compute[225855]: 2026-01-20 14:34:31.860 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:32.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:33.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:34 np0005588919 nova_compute[225855]: 2026-01-20 14:34:34.526 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:34.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.003000082s ======
Jan 20 09:34:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:35.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000082s
Jan 20 09:34:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:36.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:36 np0005588919 nova_compute[225855]: 2026-01-20 14:34:36.864 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:37 np0005588919 nova_compute[225855]: 2026-01-20 14:34:37.453 225859 INFO nova.compute.manager [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Rebuilding instance#033[00m
Jan 20 09:34:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:37.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:37 np0005588919 nova_compute[225855]: 2026-01-20 14:34:37.764 225859 DEBUG nova.objects.instance [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:37 np0005588919 nova_compute[225855]: 2026-01-20 14:34:37.885 225859 DEBUG nova.compute.manager [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:37 np0005588919 nova_compute[225855]: 2026-01-20 14:34:37.939 225859 DEBUG nova.objects.instance [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:37 np0005588919 nova_compute[225855]: 2026-01-20 14:34:37.953 225859 DEBUG nova.objects.instance [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:37 np0005588919 nova_compute[225855]: 2026-01-20 14:34:37.970 225859 DEBUG nova.objects.instance [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'resources' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:37 np0005588919 nova_compute[225855]: 2026-01-20 14:34:37.985 225859 DEBUG nova.objects.instance [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'migration_context' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:38 np0005588919 nova_compute[225855]: 2026-01-20 14:34:38.002 225859 DEBUG nova.objects.instance [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:34:38 np0005588919 nova_compute[225855]: 2026-01-20 14:34:38.005 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:34:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:38.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e169 e169: 3 total, 3 up, 3 in
Jan 20 09:34:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:39.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:39 np0005588919 nova_compute[225855]: 2026-01-20 14:34:39.530 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:40 np0005588919 podman[245084]: 2026-01-20 14:34:40.100478708 +0000 UTC m=+0.120281061 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 09:34:40 np0005588919 kernel: tap73e232f9-38 (unregistering): left promiscuous mode
Jan 20 09:34:40 np0005588919 NetworkManager[49104]: <info>  [1768919680.5024] device (tap73e232f9-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:34:40 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:40Z|00146|binding|INFO|Releasing lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f from this chassis (sb_readonly=0)
Jan 20 09:34:40 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:40Z|00147|binding|INFO|Setting lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f down in Southbound
Jan 20 09:34:40 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:40Z|00148|binding|INFO|Removing iface tap73e232f9-38 ovn-installed in OVS
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.525 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.528 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.533 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:6a:15 10.100.0.6'], port_security=['fa:16:3e:17:6a:15 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2ec7b07d-b593-46b7-9751-b6116e4d2cec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a841e7a1434c488390475174e10bc161', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0bbdea05-fba7-47c7-ba4e-5dac58212a25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43dea88-ea55-4069-a4be-2c30a432a754, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=73e232f9-3860-4b9a-9cec-535fa2fb0c9f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.534 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 73e232f9-3860-4b9a-9cec-535fa2fb0c9f in datapath 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a unbound from our chassis#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.535 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.545 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.558 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a24ad195-11dc-492e-978a-7fab7eb64e4e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:40 np0005588919 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000027.scope: Deactivated successfully.
Jan 20 09:34:40 np0005588919 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000027.scope: Consumed 16.355s CPU time.
Jan 20 09:34:40 np0005588919 systemd-machined[194361]: Machine qemu-20-instance-00000027 terminated.
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.597 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[bd907b1c-2921-4e4b-9235-d294c88437a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.602 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d8cd6ad1-786f-47c7-a4c9-35b3e180237c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.633 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d2f32a5a-3a38-4346-962f-5b83ffeb46f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 20 09:34:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:40.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.656 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0199eefa-c47f-469c-b4df-29ceaef26d29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33c9a20a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:8e:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466055, 'reachable_time': 29489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245123, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.677 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b9c4ede9-c547-47ec-b9f3-653f018ff56e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466065, 'tstamp': 466065}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245124, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466068, 'tstamp': 466068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245124, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.679 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33c9a20a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.680 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.686 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.687 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33c9a20a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.688 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.688 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap33c9a20a-d0, col_values=(('external_ids', {'iface-id': '90c69687-c788-4dba-881f-3ed4a5ee6007'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:40.689 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.737 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.742 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.908 225859 DEBUG nova.compute.manager [req-74e8a092-04cf-44ad-9d96-951b1c1821b4 req-8325b682-ef38-4fa9-bc7d-1182d037eb70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-unplugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.910 225859 DEBUG oslo_concurrency.lockutils [req-74e8a092-04cf-44ad-9d96-951b1c1821b4 req-8325b682-ef38-4fa9-bc7d-1182d037eb70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.910 225859 DEBUG oslo_concurrency.lockutils [req-74e8a092-04cf-44ad-9d96-951b1c1821b4 req-8325b682-ef38-4fa9-bc7d-1182d037eb70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.911 225859 DEBUG oslo_concurrency.lockutils [req-74e8a092-04cf-44ad-9d96-951b1c1821b4 req-8325b682-ef38-4fa9-bc7d-1182d037eb70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.911 225859 DEBUG nova.compute.manager [req-74e8a092-04cf-44ad-9d96-951b1c1821b4 req-8325b682-ef38-4fa9-bc7d-1182d037eb70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-unplugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:34:40 np0005588919 nova_compute[225855]: 2026-01-20 14:34:40.912 225859 WARNING nova.compute.manager [req-74e8a092-04cf-44ad-9d96-951b1c1821b4 req-8325b682-ef38-4fa9-bc7d-1182d037eb70 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received unexpected event network-vif-unplugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with vm_state error and task_state rebuilding.#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.026 225859 INFO nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.036 225859 INFO nova.virt.libvirt.driver [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance destroyed successfully.#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.044 225859 INFO nova.virt.libvirt.driver [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance destroyed successfully.#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.045 225859 DEBUG nova.virt.libvirt.vif [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1907009380',display_name='tempest-ServersAdminTestJSON-server-1907009380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1907009380',id=39,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:33:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-8k5b63bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:34:36Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=2ec7b07d-b593-46b7-9751-b6116e4d2cec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.046 225859 DEBUG nova.network.os_vif_util [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.046 225859 DEBUG nova.network.os_vif_util [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.047 225859 DEBUG os_vif [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.049 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.049 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73e232f9-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.051 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.053 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:41 np0005588919 nova_compute[225855]: 2026-01-20 14:34:41.056 225859 INFO os_vif [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38')#033[00m
Jan 20 09:34:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:41.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:42.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:43 np0005588919 nova_compute[225855]: 2026-01-20 14:34:42.998 225859 DEBUG nova.compute.manager [req-fc1a4841-4f7d-4d2b-a19c-ce0ab9853f2d req-e28d6c9f-49ac-4838-a5cf-3112179360c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:43 np0005588919 nova_compute[225855]: 2026-01-20 14:34:43.000 225859 DEBUG oslo_concurrency.lockutils [req-fc1a4841-4f7d-4d2b-a19c-ce0ab9853f2d req-e28d6c9f-49ac-4838-a5cf-3112179360c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:43 np0005588919 nova_compute[225855]: 2026-01-20 14:34:43.000 225859 DEBUG oslo_concurrency.lockutils [req-fc1a4841-4f7d-4d2b-a19c-ce0ab9853f2d req-e28d6c9f-49ac-4838-a5cf-3112179360c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:43 np0005588919 nova_compute[225855]: 2026-01-20 14:34:43.001 225859 DEBUG oslo_concurrency.lockutils [req-fc1a4841-4f7d-4d2b-a19c-ce0ab9853f2d req-e28d6c9f-49ac-4838-a5cf-3112179360c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:43 np0005588919 nova_compute[225855]: 2026-01-20 14:34:43.001 225859 DEBUG nova.compute.manager [req-fc1a4841-4f7d-4d2b-a19c-ce0ab9853f2d req-e28d6c9f-49ac-4838-a5cf-3112179360c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:34:43 np0005588919 nova_compute[225855]: 2026-01-20 14:34:43.002 225859 WARNING nova.compute.manager [req-fc1a4841-4f7d-4d2b-a19c-ce0ab9853f2d req-e28d6c9f-49ac-4838-a5cf-3112179360c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received unexpected event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with vm_state error and task_state rebuilding.#033[00m
Jan 20 09:34:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:43.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:44 np0005588919 nova_compute[225855]: 2026-01-20 14:34:44.532 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:44 np0005588919 nova_compute[225855]: 2026-01-20 14:34:44.542 225859 INFO nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Deleting instance files /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec_del#033[00m
Jan 20 09:34:44 np0005588919 nova_compute[225855]: 2026-01-20 14:34:44.543 225859 INFO nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Deletion of /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec_del complete#033[00m
Jan 20 09:34:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:44.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:44 np0005588919 nova_compute[225855]: 2026-01-20 14:34:44.693 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:34:44 np0005588919 nova_compute[225855]: 2026-01-20 14:34:44.694 225859 INFO nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Creating image(s)#033[00m
Jan 20 09:34:44 np0005588919 nova_compute[225855]: 2026-01-20 14:34:44.724 225859 DEBUG nova.storage.rbd_utils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:44 np0005588919 nova_compute[225855]: 2026-01-20 14:34:44.759 225859 DEBUG nova.storage.rbd_utils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:44 np0005588919 nova_compute[225855]: 2026-01-20 14:34:44.789 225859 DEBUG nova.storage.rbd_utils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:44 np0005588919 nova_compute[225855]: 2026-01-20 14:34:44.794 225859 DEBUG oslo_concurrency.lockutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:44 np0005588919 nova_compute[225855]: 2026-01-20 14:34:44.795 225859 DEBUG oslo_concurrency.lockutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:45 np0005588919 nova_compute[225855]: 2026-01-20 14:34:45.211 225859 DEBUG nova.virt.libvirt.imagebackend [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/26699514-f465-4b50-98b7-36f2cfc6a308/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/26699514-f465-4b50-98b7-36f2cfc6a308/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 09:34:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:45.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:46 np0005588919 nova_compute[225855]: 2026-01-20 14:34:46.052 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:46.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:46 np0005588919 nova_compute[225855]: 2026-01-20 14:34:46.788 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:46 np0005588919 nova_compute[225855]: 2026-01-20 14:34:46.857 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.part --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:46 np0005588919 nova_compute[225855]: 2026-01-20 14:34:46.858 225859 DEBUG nova.virt.images [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] 26699514-f465-4b50-98b7-36f2cfc6a308 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 20 09:34:46 np0005588919 nova_compute[225855]: 2026-01-20 14:34:46.859 225859 DEBUG nova.privsep.utils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 20 09:34:46 np0005588919 nova_compute[225855]: 2026-01-20 14:34:46.859 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.part /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e170 e170: 3 total, 3 up, 3 in
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.082 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.part /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.converted" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.087 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.144 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c.converted --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.145 225859 DEBUG oslo_concurrency.lockutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.176 225859 DEBUG nova.storage.rbd_utils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.179 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:47.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.516 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.336s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.601 225859 DEBUG nova.storage.rbd_utils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] resizing rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.704 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.704 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Ensure instance console log exists: /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.705 225859 DEBUG oslo_concurrency.lockutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.705 225859 DEBUG oslo_concurrency.lockutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.705 225859 DEBUG oslo_concurrency.lockutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.707 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Start _get_guest_xml network_info=[{"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.711 225859 WARNING nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.724 225859 DEBUG nova.virt.libvirt.host [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.725 225859 DEBUG nova.virt.libvirt.host [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.737 225859 DEBUG nova.virt.libvirt.host [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.737 225859 DEBUG nova.virt.libvirt.host [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.738 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.738 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.739 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.739 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.739 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.739 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.739 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.740 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.740 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.740 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.740 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.741 225859 DEBUG nova.virt.hardware [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.741 225859 DEBUG nova.objects.instance [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:47 np0005588919 nova_compute[225855]: 2026-01-20 14:34:47.765 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e171 e171: 3 total, 3 up, 3 in
Jan 20 09:34:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:34:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/80779919' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.243 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.271 225859 DEBUG nova.storage.rbd_utils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.275 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:48.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:34:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/578134844' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.698 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.701 225859 DEBUG nova.virt.libvirt.vif [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1907009380',display_name='tempest-ServersAdminTestJSON-server-1907009380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1907009380',id=39,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:33:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-8k5b63bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:34:44Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=2ec7b07d-b593-46b7-9751-b6116e4d2cec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.701 225859 DEBUG nova.network.os_vif_util [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.702 225859 DEBUG nova.network.os_vif_util [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.706 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <uuid>2ec7b07d-b593-46b7-9751-b6116e4d2cec</uuid>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <name>instance-00000027</name>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServersAdminTestJSON-server-1907009380</nova:name>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:34:47</nova:creationTime>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <nova:user uuid="f51c395107c84dbd9067113b84ff01dd">tempest-ServersAdminTestJSON-1261404595-project-member</nova:user>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <nova:project uuid="a841e7a1434c488390475174e10bc161">tempest-ServersAdminTestJSON-1261404595</nova:project>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="26699514-f465-4b50-98b7-36f2cfc6a308"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <nova:port uuid="73e232f9-3860-4b9a-9cec-535fa2fb0c9f">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <entry name="serial">2ec7b07d-b593-46b7-9751-b6116e4d2cec</entry>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <entry name="uuid">2ec7b07d-b593-46b7-9751-b6116e4d2cec</entry>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:17:6a:15"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <target dev="tap73e232f9-38"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/console.log" append="off"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:34:48 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:34:48 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:34:48 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:34:48 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.708 225859 DEBUG nova.virt.libvirt.vif [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1907009380',display_name='tempest-ServersAdminTestJSON-server-1907009380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1907009380',id=39,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:33:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-8k5b63bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:34:44Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=2ec7b07d-b593-46b7-9751-b6116e4d2cec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.709 225859 DEBUG nova.network.os_vif_util [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.710 225859 DEBUG nova.network.os_vif_util [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.710 225859 DEBUG os_vif [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.711 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.712 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.712 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.714 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.715 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73e232f9-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.715 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap73e232f9-38, col_values=(('external_ids', {'iface-id': '73e232f9-3860-4b9a-9cec-535fa2fb0c9f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:6a:15', 'vm-uuid': '2ec7b07d-b593-46b7-9751-b6116e4d2cec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:48 np0005588919 NetworkManager[49104]: <info>  [1768919688.7177] manager: (tap73e232f9-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.719 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.722 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.723 225859 INFO os_vif [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38')#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.777 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.777 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.778 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No VIF found with MAC fa:16:3e:17:6a:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.778 225859 INFO nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Using config drive#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.811 225859 DEBUG nova.storage.rbd_utils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.831 225859 DEBUG nova.objects.instance [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:48 np0005588919 nova_compute[225855]: 2026-01-20 14:34:48.856 225859 DEBUG nova.objects.instance [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'keypairs' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:34:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e172 e172: 3 total, 3 up, 3 in
Jan 20 09:34:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:49.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:49 np0005588919 nova_compute[225855]: 2026-01-20 14:34:49.535 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:49 np0005588919 nova_compute[225855]: 2026-01-20 14:34:49.755 225859 INFO nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Creating config drive at /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config#033[00m
Jan 20 09:34:49 np0005588919 nova_compute[225855]: 2026-01-20 14:34:49.761 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptkclbi5w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:49 np0005588919 nova_compute[225855]: 2026-01-20 14:34:49.902 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptkclbi5w" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:49 np0005588919 nova_compute[225855]: 2026-01-20 14:34:49.940 225859 DEBUG nova.storage.rbd_utils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:34:49 np0005588919 nova_compute[225855]: 2026-01-20 14:34:49.944 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:34:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.142 225859 DEBUG oslo_concurrency.processutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.197s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.143 225859 INFO nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Deleting local config drive /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config because it was imported into RBD.#033[00m
Jan 20 09:34:50 np0005588919 kernel: tap73e232f9-38: entered promiscuous mode
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.215 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:50Z|00149|binding|INFO|Claiming lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f for this chassis.
Jan 20 09:34:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:50Z|00150|binding|INFO|73e232f9-3860-4b9a-9cec-535fa2fb0c9f: Claiming fa:16:3e:17:6a:15 10.100.0.6
Jan 20 09:34:50 np0005588919 NetworkManager[49104]: <info>  [1768919690.2155] manager: (tap73e232f9-38): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.225 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:6a:15 10.100.0.6'], port_security=['fa:16:3e:17:6a:15 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2ec7b07d-b593-46b7-9751-b6116e4d2cec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a841e7a1434c488390475174e10bc161', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0bbdea05-fba7-47c7-ba4e-5dac58212a25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43dea88-ea55-4069-a4be-2c30a432a754, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=73e232f9-3860-4b9a-9cec-535fa2fb0c9f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.228 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 73e232f9-3860-4b9a-9cec-535fa2fb0c9f in datapath 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a bound to our chassis#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.230 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a#033[00m
Jan 20 09:34:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:50Z|00151|binding|INFO|Setting lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f ovn-installed in OVS
Jan 20 09:34:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:34:50Z|00152|binding|INFO|Setting lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f up in Southbound
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.233 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.238 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.247 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[abb1a060-1475-49c3-8c07-62eae2cd9652]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:50 np0005588919 systemd-udevd[245533]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:34:50 np0005588919 systemd-machined[194361]: New machine qemu-22-instance-00000027.
Jan 20 09:34:50 np0005588919 systemd[1]: Started Virtual Machine qemu-22-instance-00000027.
Jan 20 09:34:50 np0005588919 NetworkManager[49104]: <info>  [1768919690.2800] device (tap73e232f9-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:34:50 np0005588919 NetworkManager[49104]: <info>  [1768919690.2810] device (tap73e232f9-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.288 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9fe3f9-894f-4fc7-ade2-948f5e9afe68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.293 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0c6e71-7096-40ff-a05a-04fc9e626ecd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.331 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[cb0c72ab-4057-4705-b901-98becdbad45c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:50 np0005588919 podman[245522]: 2026-01-20 14:34:50.33318228 +0000 UTC m=+0.082209029 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.349 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ced0a71e-445b-4e10-a232-7f3e47d2ac22]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33c9a20a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:8e:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466055, 'reachable_time': 29489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245556, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.369 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dff825a1-5291-47b9-a6d1-f8e0a4f2370a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466065, 'tstamp': 466065}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245557, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466068, 'tstamp': 466068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245557, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.370 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33c9a20a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.372 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.373 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.373 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33c9a20a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.374 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.374 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap33c9a20a-d0, col_values=(('external_ids', {'iface-id': '90c69687-c788-4dba-881f-3ed4a5ee6007'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:34:50.374 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.516 225859 DEBUG nova.compute.manager [req-54fcfc19-c368-470e-834a-fa9cb874c9b7 req-e99bbeea-af60-4fcc-82cb-61f48007815e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.516 225859 DEBUG oslo_concurrency.lockutils [req-54fcfc19-c368-470e-834a-fa9cb874c9b7 req-e99bbeea-af60-4fcc-82cb-61f48007815e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.517 225859 DEBUG oslo_concurrency.lockutils [req-54fcfc19-c368-470e-834a-fa9cb874c9b7 req-e99bbeea-af60-4fcc-82cb-61f48007815e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.517 225859 DEBUG oslo_concurrency.lockutils [req-54fcfc19-c368-470e-834a-fa9cb874c9b7 req-e99bbeea-af60-4fcc-82cb-61f48007815e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.518 225859 DEBUG nova.compute.manager [req-54fcfc19-c368-470e-834a-fa9cb874c9b7 req-e99bbeea-af60-4fcc-82cb-61f48007815e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:34:50 np0005588919 nova_compute[225855]: 2026-01-20 14:34:50.518 225859 WARNING nova.compute.manager [req-54fcfc19-c368-470e-834a-fa9cb874c9b7 req-e99bbeea-af60-4fcc-82cb-61f48007815e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received unexpected event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with vm_state error and task_state rebuild_spawning.#033[00m
Jan 20 09:34:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:50.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e173 e173: 3 total, 3 up, 3 in
Jan 20 09:34:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:51.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.726 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 2ec7b07d-b593-46b7-9751-b6116e4d2cec due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.727 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919691.7263894, 2ec7b07d-b593-46b7-9751-b6116e4d2cec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.727 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.731 225859 DEBUG nova.compute.manager [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.732 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.736 225859 INFO nova.virt.libvirt.driver [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance spawned successfully.#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.736 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.757 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.768 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.772 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.772 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.773 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.773 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.774 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.775 225859 DEBUG nova.virt.libvirt.driver [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.804 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.805 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919691.7274077, 2ec7b07d-b593-46b7-9751-b6116e4d2cec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.805 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] VM Started (Lifecycle Event)#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.832 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.835 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Synchronizing instance power state after lifecycle event "Started"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.842 225859 DEBUG nova.compute.manager [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.852 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.904 225859 DEBUG oslo_concurrency.lockutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.904 225859 DEBUG oslo_concurrency.lockutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:51 np0005588919 nova_compute[225855]: 2026-01-20 14:34:51.905 225859 DEBUG nova.objects.instance [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:34:52 np0005588919 nova_compute[225855]: 2026-01-20 14:34:52.041 225859 DEBUG oslo_concurrency.lockutils [None req-794cd3f2-804a-4c65-baab-dd821341efb7 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e174 e174: 3 total, 3 up, 3 in
Jan 20 09:34:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e175 e175: 3 total, 3 up, 3 in
Jan 20 09:34:52 np0005588919 nova_compute[225855]: 2026-01-20 14:34:52.617 225859 DEBUG nova.compute.manager [req-a0af8d46-3a12-443e-b9a2-8f602873e7e8 req-6ab3ced8-2443-41ba-909e-a2f5350e50a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:34:52 np0005588919 nova_compute[225855]: 2026-01-20 14:34:52.617 225859 DEBUG oslo_concurrency.lockutils [req-a0af8d46-3a12-443e-b9a2-8f602873e7e8 req-6ab3ced8-2443-41ba-909e-a2f5350e50a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:34:52 np0005588919 nova_compute[225855]: 2026-01-20 14:34:52.618 225859 DEBUG oslo_concurrency.lockutils [req-a0af8d46-3a12-443e-b9a2-8f602873e7e8 req-6ab3ced8-2443-41ba-909e-a2f5350e50a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:34:52 np0005588919 nova_compute[225855]: 2026-01-20 14:34:52.618 225859 DEBUG oslo_concurrency.lockutils [req-a0af8d46-3a12-443e-b9a2-8f602873e7e8 req-6ab3ced8-2443-41ba-909e-a2f5350e50a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:34:52 np0005588919 nova_compute[225855]: 2026-01-20 14:34:52.618 225859 DEBUG nova.compute.manager [req-a0af8d46-3a12-443e-b9a2-8f602873e7e8 req-6ab3ced8-2443-41ba-909e-a2f5350e50a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:34:52 np0005588919 nova_compute[225855]: 2026-01-20 14:34:52.618 225859 WARNING nova.compute.manager [req-a0af8d46-3a12-443e-b9a2-8f602873e7e8 req-6ab3ced8-2443-41ba-909e-a2f5350e50a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received unexpected event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with vm_state active and task_state None.#033[00m
Jan 20 09:34:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:52.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:53 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:34:53 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:34:53 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:34:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:53.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:53 np0005588919 nova_compute[225855]: 2026-01-20 14:34:53.718 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e176 e176: 3 total, 3 up, 3 in
Jan 20 09:34:54 np0005588919 nova_compute[225855]: 2026-01-20 14:34:54.537 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:34:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 20 09:34:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:54.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 20 09:34:54 np0005588919 nova_compute[225855]: 2026-01-20 14:34:54.675 225859 INFO nova.compute.manager [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Rebuilding instance
Jan 20 09:34:54 np0005588919 nova_compute[225855]: 2026-01-20 14:34:54.996 225859 DEBUG nova.objects.instance [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:34:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:34:55 np0005588919 nova_compute[225855]: 2026-01-20 14:34:55.015 225859 DEBUG nova.compute.manager [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:34:55 np0005588919 nova_compute[225855]: 2026-01-20 14:34:55.067 225859 DEBUG nova.objects.instance [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:34:55 np0005588919 nova_compute[225855]: 2026-01-20 14:34:55.080 225859 DEBUG nova.objects.instance [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:34:55 np0005588919 nova_compute[225855]: 2026-01-20 14:34:55.091 225859 DEBUG nova.objects.instance [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'resources' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:34:55 np0005588919 nova_compute[225855]: 2026-01-20 14:34:55.101 225859 DEBUG nova.objects.instance [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'migration_context' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:34:55 np0005588919 nova_compute[225855]: 2026-01-20 14:34:55.112 225859 DEBUG nova.objects.instance [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 20 09:34:55 np0005588919 nova_compute[225855]: 2026-01-20 14:34:55.116 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 20 09:34:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:55.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:56.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:57 np0005588919 nova_compute[225855]: 2026-01-20 14:34:57.391 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Acquiring lock "680a9e49-0486-46a0-8857-99a7a56c46e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:34:57 np0005588919 nova_compute[225855]: 2026-01-20 14:34:57.392 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "680a9e49-0486-46a0-8857-99a7a56c46e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:34:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e177 e177: 3 total, 3 up, 3 in
Jan 20 09:34:57 np0005588919 nova_compute[225855]: 2026-01-20 14:34:57.417 225859 DEBUG nova.compute.manager [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 09:34:57 np0005588919 nova_compute[225855]: 2026-01-20 14:34:57.483 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:34:57 np0005588919 nova_compute[225855]: 2026-01-20 14:34:57.484 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:34:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:57.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:57 np0005588919 nova_compute[225855]: 2026-01-20 14:34:57.493 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 09:34:57 np0005588919 nova_compute[225855]: 2026-01-20 14:34:57.494 225859 INFO nova.compute.claims [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Claim successful on node compute-1.ctlplane.example.com
Jan 20 09:34:57 np0005588919 nova_compute[225855]: 2026-01-20 14:34:57.641 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:34:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:34:58 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3207990597' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.109 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.115 225859 DEBUG nova.compute.provider_tree [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.132 225859 DEBUG nova.scheduler.client.report [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.151 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.152 225859 DEBUG nova.compute.manager [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.191 225859 DEBUG nova.compute.manager [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.191 225859 DEBUG nova.network.neutron [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.211 225859 INFO nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.230 225859 DEBUG nova.compute.manager [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.357 225859 DEBUG nova.compute.manager [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.359 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.359 225859 INFO nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Creating image(s)
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.387 225859 DEBUG nova.storage.rbd_utils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] rbd image 680a9e49-0486-46a0-8857-99a7a56c46e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.412 225859 DEBUG nova.storage.rbd_utils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] rbd image 680a9e49-0486-46a0-8857-99a7a56c46e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.612 225859 DEBUG nova.storage.rbd_utils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] rbd image 680a9e49-0486-46a0-8857-99a7a56c46e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.617 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:34:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:34:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:34:58.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.671 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.672 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.673 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.673 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.779 225859 DEBUG nova.storage.rbd_utils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] rbd image 680a9e49-0486-46a0-8857-99a7a56c46e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.783 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 680a9e49-0486-46a0-8857-99a7a56c46e1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.802 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.840 225859 DEBUG nova.network.neutron [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 20 09:34:58 np0005588919 nova_compute[225855]: 2026-01-20 14:34:58.840 225859 DEBUG nova.compute.manager [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.081 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 680a9e49-0486-46a0-8857-99a7a56c46e1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.298s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.163 225859 DEBUG nova.storage.rbd_utils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] resizing rbd image 680a9e49-0486-46a0-8857-99a7a56c46e1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.340 225859 DEBUG nova.objects.instance [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lazy-loading 'migration_context' on Instance uuid 680a9e49-0486-46a0-8857-99a7a56c46e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.358 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.359 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Ensure instance console log exists: /var/lib/nova/instances/680a9e49-0486-46a0-8857-99a7a56c46e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.360 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.361 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.361 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.364 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.370 225859 WARNING nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.377 225859 DEBUG nova.virt.libvirt.host [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.379 225859 DEBUG nova.virt.libvirt.host [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.384 225859 DEBUG nova.virt.libvirt.host [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.385 225859 DEBUG nova.virt.libvirt.host [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.387 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.388 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.389 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.389 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.390 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.390 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.391 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.391 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.392 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.393 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.393 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.394 225859 DEBUG nova.virt.hardware [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.398 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:34:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:34:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:34:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:34:59.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.538 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:34:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:34:59 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2891658671' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.830 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.855 225859 DEBUG nova.storage.rbd_utils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] rbd image 680a9e49-0486-46a0-8857-99a7a56c46e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:34:59 np0005588919 nova_compute[225855]: 2026-01-20 14:34:59.859 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:35:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:35:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:35:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:35:00 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/38576092' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.321 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.323 225859 DEBUG nova.objects.instance [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lazy-loading 'pci_devices' on Instance uuid 680a9e49-0486-46a0-8857-99a7a56c46e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.342 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <uuid>680a9e49-0486-46a0-8857-99a7a56c46e1</uuid>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <name>instance-0000002e</name>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <nova:name>tempest-ListImageFiltersTestJSON-server-1834946582</nova:name>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:34:59</nova:creationTime>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <nova:user uuid="72ad8e217e1348378596753eefca1452">tempest-ListImageFiltersTestJSON-1649594432-project-member</nova:user>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <nova:project uuid="9e10f687e8a14fc3bfa98df19df5befd">tempest-ListImageFiltersTestJSON-1649594432</nova:project>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <entry name="serial">680a9e49-0486-46a0-8857-99a7a56c46e1</entry>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <entry name="uuid">680a9e49-0486-46a0-8857-99a7a56c46e1</entry>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/680a9e49-0486-46a0-8857-99a7a56c46e1_disk">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/680a9e49-0486-46a0-8857-99a7a56c46e1_disk.config">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/680a9e49-0486-46a0-8857-99a7a56c46e1/console.log" append="off"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:35:00 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:35:00 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:35:00 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:35:00 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.469 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.469 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.469 225859 INFO nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Using config drive
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.492 225859 DEBUG nova.storage.rbd_utils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] rbd image 680a9e49-0486-46a0-8857-99a7a56c46e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:35:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:00.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.775 225859 INFO nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Creating config drive at /var/lib/nova/instances/680a9e49-0486-46a0-8857-99a7a56c46e1/disk.config
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.780 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/680a9e49-0486-46a0-8857-99a7a56c46e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq21yowwm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.909 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/680a9e49-0486-46a0-8857-99a7a56c46e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq21yowwm" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.937 225859 DEBUG nova.storage.rbd_utils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] rbd image 680a9e49-0486-46a0-8857-99a7a56c46e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:35:00 np0005588919 nova_compute[225855]: 2026-01-20 14:35:00.941 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/680a9e49-0486-46a0-8857-99a7a56c46e1/disk.config 680a9e49-0486-46a0-8857-99a7a56c46e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.134 225859 DEBUG oslo_concurrency.processutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/680a9e49-0486-46a0-8857-99a7a56c46e1/disk.config 680a9e49-0486-46a0-8857-99a7a56c46e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.135 225859 INFO nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Deleting local config drive /var/lib/nova/instances/680a9e49-0486-46a0-8857-99a7a56c46e1/disk.config because it was imported into RBD.
Jan 20 09:35:01 np0005588919 systemd-machined[194361]: New machine qemu-23-instance-0000002e.
Jan 20 09:35:01 np0005588919 systemd[1]: Started Virtual Machine qemu-23-instance-0000002e.
Jan 20 09:35:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:01.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.617 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919701.6168437, 680a9e49-0486-46a0-8857-99a7a56c46e1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.617 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] VM Resumed (Lifecycle Event)
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.621 225859 DEBUG nova.compute.manager [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.621 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.624 225859 INFO nova.virt.libvirt.driver [-] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Instance spawned successfully.
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.625 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.643 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.647 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.651 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.652 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.652 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.653 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.653 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.654 225859 DEBUG nova.virt.libvirt.driver [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.676 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.676 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919701.6206298, 680a9e49-0486-46a0-8857-99a7a56c46e1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.677 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] VM Started (Lifecycle Event)
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.701 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.705 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.710 225859 INFO nova.compute.manager [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Took 3.35 seconds to spawn the instance on the hypervisor.
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.711 225859 DEBUG nova.compute.manager [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.742 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.791 225859 INFO nova.compute.manager [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Took 4.33 seconds to build instance.
Jan 20 09:35:01 np0005588919 nova_compute[225855]: 2026-01-20 14:35:01.811 225859 DEBUG oslo_concurrency.lockutils [None req-d25f88da-e047-4e24-89f2-d9e5dffd2790 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "680a9e49-0486-46a0-8857-99a7a56c46e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.419s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:35:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e178 e178: 3 total, 3 up, 3 in
Jan 20 09:35:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:02.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:03.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:03 np0005588919 nova_compute[225855]: 2026-01-20 14:35:03.805 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:04 np0005588919 nova_compute[225855]: 2026-01-20 14:35:04.541 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:04.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:05 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:05Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:6a:15 10.100.0.6
Jan 20 09:35:05 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:05Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:6a:15 10.100.0.6
Jan 20 09:35:05 np0005588919 nova_compute[225855]: 2026-01-20 14:35:05.230 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:35:05 np0005588919 nova_compute[225855]: 2026-01-20 14:35:05.334 225859 DEBUG nova.compute.manager [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:05 np0005588919 nova_compute[225855]: 2026-01-20 14:35:05.380 225859 INFO nova.compute.manager [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] instance snapshotting#033[00m
Jan 20 09:35:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:05.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:05 np0005588919 nova_compute[225855]: 2026-01-20 14:35:05.677 225859 INFO nova.virt.libvirt.driver [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Beginning live snapshot process#033[00m
Jan 20 09:35:05 np0005588919 nova_compute[225855]: 2026-01-20 14:35:05.832 225859 DEBUG nova.virt.libvirt.imagebackend [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:35:06 np0005588919 nova_compute[225855]: 2026-01-20 14:35:06.143 225859 DEBUG nova.storage.rbd_utils [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] creating snapshot(e99d7a396ba9481c94fa6ca492217c0c) on rbd image(680a9e49-0486-46a0-8857-99a7a56c46e1_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:35:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:06.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e179 e179: 3 total, 3 up, 3 in
Jan 20 09:35:06 np0005588919 nova_compute[225855]: 2026-01-20 14:35:06.784 225859 DEBUG nova.storage.rbd_utils [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] cloning vms/680a9e49-0486-46a0-8857-99a7a56c46e1_disk@e99d7a396ba9481c94fa6ca492217c0c to images/70955243-a059-4d15-b65b-03ec50f95c21 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:35:06 np0005588919 nova_compute[225855]: 2026-01-20 14:35:06.936 225859 DEBUG nova.storage.rbd_utils [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] flattening images/70955243-a059-4d15-b65b-03ec50f95c21 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.210 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.212 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.211 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:35:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:07.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.504 225859 DEBUG nova.storage.rbd_utils [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] removing snapshot(e99d7a396ba9481c94fa6ca492217c0c) on rbd image(680a9e49-0486-46a0-8857-99a7a56c46e1_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:35:07 np0005588919 kernel: tap73e232f9-38 (unregistering): left promiscuous mode
Jan 20 09:35:07 np0005588919 NetworkManager[49104]: <info>  [1768919707.5139] device (tap73e232f9-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:35:07 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:07Z|00153|binding|INFO|Releasing lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f from this chassis (sb_readonly=0)
Jan 20 09:35:07 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:07Z|00154|binding|INFO|Setting lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f down in Southbound
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.522 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:07 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:07Z|00155|binding|INFO|Removing iface tap73e232f9-38 ovn-installed in OVS
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.524 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.528 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:6a:15 10.100.0.6'], port_security=['fa:16:3e:17:6a:15 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2ec7b07d-b593-46b7-9751-b6116e4d2cec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a841e7a1434c488390475174e10bc161', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0bbdea05-fba7-47c7-ba4e-5dac58212a25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43dea88-ea55-4069-a4be-2c30a432a754, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=73e232f9-3860-4b9a-9cec-535fa2fb0c9f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.530 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 73e232f9-3860-4b9a-9cec-535fa2fb0c9f in datapath 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a unbound from our chassis#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.531 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.539 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.552 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6089a0bd-8d78-4514-8e8b-1881575ad308]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.576 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[489c76ed-5397-4375-989f-386e2ec0913e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.579 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c212f49a-6b9a-4048-8aa4-0bdfabdf8bee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:07 np0005588919 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000027.scope: Deactivated successfully.
Jan 20 09:35:07 np0005588919 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000027.scope: Consumed 14.305s CPU time.
Jan 20 09:35:07 np0005588919 systemd-machined[194361]: Machine qemu-22-instance-00000027 terminated.
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.606 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[05367c25-a7dd-4856-9951-3e1e537181bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.621 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a6894c2f-536c-4299-8e08-4d649c201b35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33c9a20a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:8e:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466055, 'reachable_time': 29489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246342, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.636 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f4f14451-1fff-44a3-9fcb-7d50162e8953]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466065, 'tstamp': 466065}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246343, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466068, 'tstamp': 466068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246343, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.637 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33c9a20a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.638 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.642 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.642 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33c9a20a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.642 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.643 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap33c9a20a-d0, col_values=(('external_ids', {'iface-id': '90c69687-c788-4dba-881f-3ed4a5ee6007'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:07.643 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.721 225859 DEBUG nova.compute.manager [req-5429dd98-3169-4734-8738-233da46ad54c req-a24801e7-8a32-4b74-b90f-7b1698d75453 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-unplugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.721 225859 DEBUG oslo_concurrency.lockutils [req-5429dd98-3169-4734-8738-233da46ad54c req-a24801e7-8a32-4b74-b90f-7b1698d75453 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.721 225859 DEBUG oslo_concurrency.lockutils [req-5429dd98-3169-4734-8738-233da46ad54c req-a24801e7-8a32-4b74-b90f-7b1698d75453 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.721 225859 DEBUG oslo_concurrency.lockutils [req-5429dd98-3169-4734-8738-233da46ad54c req-a24801e7-8a32-4b74-b90f-7b1698d75453 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.722 225859 DEBUG nova.compute.manager [req-5429dd98-3169-4734-8738-233da46ad54c req-a24801e7-8a32-4b74-b90f-7b1698d75453 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-unplugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.722 225859 WARNING nova.compute.manager [req-5429dd98-3169-4734-8738-233da46ad54c req-a24801e7-8a32-4b74-b90f-7b1698d75453 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received unexpected event network-vif-unplugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with vm_state active and task_state rebuilding.#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.757 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:07 np0005588919 nova_compute[225855]: 2026-01-20 14:35:07.765 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.243 225859 INFO nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.250 225859 INFO nova.virt.libvirt.driver [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance destroyed successfully.#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.256 225859 INFO nova.virt.libvirt.driver [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance destroyed successfully.#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.261 225859 DEBUG nova.virt.libvirt.vif [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1907009380',display_name='tempest-ServersAdminTestJSON-server-1907009380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1907009380',id=39,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:34:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-8k5b63bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:34:54Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=2ec7b07d-b593-46b7-9751-b6116e4d2cec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.262 225859 DEBUG nova.network.os_vif_util [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.264 225859 DEBUG nova.network.os_vif_util [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.268 225859 DEBUG os_vif [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.277 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.279 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73e232f9-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.287 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.290 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:35:08 np0005588919 nova_compute[225855]: 2026-01-20 14:35:08.295 225859 INFO os_vif [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38')#033[00m
Jan 20 09:35:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:08.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e180 e180: 3 total, 3 up, 3 in
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.029 225859 DEBUG nova.storage.rbd_utils [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] creating snapshot(snap) on rbd image(70955243-a059-4d15-b65b-03ec50f95c21) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:35:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:09.214 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.336 225859 INFO nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Deleting instance files /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec_del#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.338 225859 INFO nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Deletion of /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec_del complete#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.493 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.493 225859 INFO nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Creating image(s)#033[00m
Jan 20 09:35:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:09.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.528 225859 DEBUG nova.storage.rbd_utils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.560 225859 DEBUG nova.storage.rbd_utils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.594 225859 DEBUG nova.storage.rbd_utils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.601 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.631 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.681 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.682 225859 DEBUG oslo_concurrency.lockutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.684 225859 DEBUG oslo_concurrency.lockutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.685 225859 DEBUG oslo_concurrency.lockutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.716 225859 DEBUG nova.storage.rbd_utils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.721 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e181 e181: 3 total, 3 up, 3 in
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.859 225859 DEBUG nova.compute.manager [req-1bba3f56-9e7f-438d-81f2-50dd564fd169 req-0f2f8756-eecb-4c54-98ea-83e847d243d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.861 225859 DEBUG oslo_concurrency.lockutils [req-1bba3f56-9e7f-438d-81f2-50dd564fd169 req-0f2f8756-eecb-4c54-98ea-83e847d243d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.862 225859 DEBUG oslo_concurrency.lockutils [req-1bba3f56-9e7f-438d-81f2-50dd564fd169 req-0f2f8756-eecb-4c54-98ea-83e847d243d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.862 225859 DEBUG oslo_concurrency.lockutils [req-1bba3f56-9e7f-438d-81f2-50dd564fd169 req-0f2f8756-eecb-4c54-98ea-83e847d243d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.862 225859 DEBUG nova.compute.manager [req-1bba3f56-9e7f-438d-81f2-50dd564fd169 req-0f2f8756-eecb-4c54-98ea-83e847d243d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:09 np0005588919 nova_compute[225855]: 2026-01-20 14:35:09.862 225859 WARNING nova.compute.manager [req-1bba3f56-9e7f-438d-81f2-50dd564fd169 req-0f2f8756-eecb-4c54-98ea-83e847d243d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received unexpected event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 20 09:35:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.049 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.328s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.154 225859 DEBUG nova.storage.rbd_utils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] resizing rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.278 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.278 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Ensure instance console log exists: /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.279 225859 DEBUG oslo_concurrency.lockutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.279 225859 DEBUG oslo_concurrency.lockutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.280 225859 DEBUG oslo_concurrency.lockutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.282 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Start _get_guest_xml network_info=[{"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.287 225859 WARNING nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.312 225859 DEBUG nova.virt.libvirt.host [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.312 225859 DEBUG nova.virt.libvirt.host [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.317 225859 DEBUG nova.virt.libvirt.host [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.318 225859 DEBUG nova.virt.libvirt.host [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.319 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.319 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.320 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.320 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.320 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.321 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.321 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.321 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.321 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.322 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.322 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.322 225859 DEBUG nova.virt.hardware [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.323 225859 DEBUG nova.objects.instance [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.368 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:10.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:35:10 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1346870109' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.838 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.867 225859 DEBUG nova.storage.rbd_utils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:10 np0005588919 nova_compute[225855]: 2026-01-20 14:35:10.872 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:11 np0005588919 podman[246598]: 2026-01-20 14:35:11.072177958 +0000 UTC m=+0.120351578 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:35:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:35:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2312094919' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.328 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.331 225859 DEBUG nova.virt.libvirt.vif [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1907009380',display_name='tempest-ServersAdminTestJSON-server-1907009380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1907009380',id=39,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:34:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-8k5b63bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:35:09Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=2ec7b07d-b593-46b7-9751-b6116e4d2cec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.331 225859 DEBUG nova.network.os_vif_util [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.332 225859 DEBUG nova.network.os_vif_util [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.336 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <uuid>2ec7b07d-b593-46b7-9751-b6116e4d2cec</uuid>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <name>instance-00000027</name>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServersAdminTestJSON-server-1907009380</nova:name>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:35:10</nova:creationTime>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <nova:user uuid="f51c395107c84dbd9067113b84ff01dd">tempest-ServersAdminTestJSON-1261404595-project-member</nova:user>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <nova:project uuid="a841e7a1434c488390475174e10bc161">tempest-ServersAdminTestJSON-1261404595</nova:project>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <nova:port uuid="73e232f9-3860-4b9a-9cec-535fa2fb0c9f">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <entry name="serial">2ec7b07d-b593-46b7-9751-b6116e4d2cec</entry>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <entry name="uuid">2ec7b07d-b593-46b7-9751-b6116e4d2cec</entry>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:17:6a:15"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <target dev="tap73e232f9-38"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/console.log" append="off"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:35:11 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:35:11 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:35:11 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:35:11 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.344 225859 DEBUG nova.virt.libvirt.vif [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1907009380',display_name='tempest-ServersAdminTestJSON-server-1907009380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1907009380',id=39,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:34:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-8k5b63bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:35:09Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=2ec7b07d-b593-46b7-9751-b6116e4d2cec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.344 225859 DEBUG nova.network.os_vif_util [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.345 225859 DEBUG nova.network.os_vif_util [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.346 225859 DEBUG os_vif [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.347 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.348 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.348 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.351 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.351 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73e232f9-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.352 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap73e232f9-38, col_values=(('external_ids', {'iface-id': '73e232f9-3860-4b9a-9cec-535fa2fb0c9f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:6a:15', 'vm-uuid': '2ec7b07d-b593-46b7-9751-b6116e4d2cec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.353 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:11 np0005588919 NetworkManager[49104]: <info>  [1768919711.3543] manager: (tap73e232f9-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.358 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.359 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.360 225859 INFO os_vif [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38')#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.364 225859 INFO nova.virt.libvirt.driver [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Snapshot image upload complete#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.364 225859 INFO nova.compute.manager [None req-20e68329-0731-4f9d-a2b5-67fd5d6960b8 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Took 5.98 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.432 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.433 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.434 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] No VIF found with MAC fa:16:3e:17:6a:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.434 225859 INFO nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Using config drive#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.461 225859 DEBUG nova.storage.rbd_utils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.477 225859 DEBUG nova.objects.instance [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:11.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:11 np0005588919 nova_compute[225855]: 2026-01-20 14:35:11.508 225859 DEBUG nova.objects.instance [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'keypairs' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.092 225859 INFO nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Creating config drive at /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config#033[00m
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.098 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2dbsfxiu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.246 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2dbsfxiu" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.287 225859 DEBUG nova.storage.rbd_utils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] rbd image 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.292 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.487 225859 DEBUG oslo_concurrency.processutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config 2ec7b07d-b593-46b7-9751-b6116e4d2cec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.195s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.488 225859 INFO nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Deleting local config drive /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec/disk.config because it was imported into RBD.#033[00m
Jan 20 09:35:12 np0005588919 kernel: tap73e232f9-38: entered promiscuous mode
Jan 20 09:35:12 np0005588919 NetworkManager[49104]: <info>  [1768919712.5564] manager: (tap73e232f9-38): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Jan 20 09:35:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:12Z|00156|binding|INFO|Claiming lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f for this chassis.
Jan 20 09:35:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:12Z|00157|binding|INFO|73e232f9-3860-4b9a-9cec-535fa2fb0c9f: Claiming fa:16:3e:17:6a:15 10.100.0.6
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.559 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.567 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:6a:15 10.100.0.6'], port_security=['fa:16:3e:17:6a:15 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2ec7b07d-b593-46b7-9751-b6116e4d2cec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a841e7a1434c488390475174e10bc161', 'neutron:revision_number': '7', 'neutron:security_group_ids': '0bbdea05-fba7-47c7-ba4e-5dac58212a25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43dea88-ea55-4069-a4be-2c30a432a754, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=73e232f9-3860-4b9a-9cec-535fa2fb0c9f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.570 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 73e232f9-3860-4b9a-9cec-535fa2fb0c9f in datapath 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a bound to our chassis#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.573 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a#033[00m
Jan 20 09:35:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:12Z|00158|binding|INFO|Setting lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f ovn-installed in OVS
Jan 20 09:35:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:12Z|00159|binding|INFO|Setting lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f up in Southbound
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.601 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.603 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.605 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3c6fe1a0-4d8f-4ab2-8695-0046580916ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:12 np0005588919 systemd-machined[194361]: New machine qemu-24-instance-00000027.
Jan 20 09:35:12 np0005588919 systemd-udevd[246722]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:35:12 np0005588919 systemd[1]: Started Virtual Machine qemu-24-instance-00000027.
Jan 20 09:35:12 np0005588919 NetworkManager[49104]: <info>  [1768919712.6449] device (tap73e232f9-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:35:12 np0005588919 NetworkManager[49104]: <info>  [1768919712.6455] device (tap73e232f9-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.657 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f300e6ac-5de0-4ce6-9927-68bba72a358d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.661 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b4fb90b7-e1c8-4742-91cf-9eccddfbaa54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:12.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.699 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[1043de85-5640-4753-abf8-5421be5a8c07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.763 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5a779436-d0bd-4367-876a-32422ec9d81a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33c9a20a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:8e:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466055, 'reachable_time': 29489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246733, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.782 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[97ddac91-a36e-4dca-ba25-f70c3524dd11]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466065, 'tstamp': 466065}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246735, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466068, 'tstamp': 466068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246735, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.784 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33c9a20a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:12 np0005588919 nova_compute[225855]: 2026-01-20 14:35:12.787 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.789 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33c9a20a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.789 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.790 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap33c9a20a-d0, col_values=(('external_ids', {'iface-id': '90c69687-c788-4dba-881f-3ed4a5ee6007'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:12.790 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.074 225859 DEBUG nova.compute.manager [req-2ae57ba0-92b4-4474-ba32-94190499f283 req-fb00b422-9f35-4267-9b8a-b530fe2e4597 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.075 225859 DEBUG oslo_concurrency.lockutils [req-2ae57ba0-92b4-4474-ba32-94190499f283 req-fb00b422-9f35-4267-9b8a-b530fe2e4597 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.075 225859 DEBUG oslo_concurrency.lockutils [req-2ae57ba0-92b4-4474-ba32-94190499f283 req-fb00b422-9f35-4267-9b8a-b530fe2e4597 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.076 225859 DEBUG oslo_concurrency.lockutils [req-2ae57ba0-92b4-4474-ba32-94190499f283 req-fb00b422-9f35-4267-9b8a-b530fe2e4597 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.076 225859 DEBUG nova.compute.manager [req-2ae57ba0-92b4-4474-ba32-94190499f283 req-fb00b422-9f35-4267-9b8a-b530fe2e4597 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.077 225859 WARNING nova.compute.manager [req-2ae57ba0-92b4-4474-ba32-94190499f283 req-fb00b422-9f35-4267-9b8a-b530fe2e4597 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received unexpected event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 20 09:35:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:13.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.607 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 2ec7b07d-b593-46b7-9751-b6116e4d2cec due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.607 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919713.6066756, 2ec7b07d-b593-46b7-9751-b6116e4d2cec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.608 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.610 225859 DEBUG nova.compute.manager [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.610 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.614 225859 INFO nova.virt.libvirt.driver [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance spawned successfully.#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.614 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.631 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.639 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.640 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.641 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.641 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.641 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.642 225859 DEBUG nova.virt.libvirt.driver [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.647 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.685 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.686 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919713.6102376, 2ec7b07d-b593-46b7-9751-b6116e4d2cec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.686 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] VM Started (Lifecycle Event)#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.711 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.714 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.722 225859 DEBUG nova.compute.manager [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.731 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.776 225859 DEBUG oslo_concurrency.lockutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.777 225859 DEBUG oslo_concurrency.lockutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.777 225859 DEBUG nova.objects.instance [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:35:13 np0005588919 nova_compute[225855]: 2026-01-20 14:35:13.847 225859 DEBUG oslo_concurrency.lockutils [None req-85a922c4-6375-4395-a9cf-d7257192f2c5 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:14 np0005588919 nova_compute[225855]: 2026-01-20 14:35:14.570 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:14.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e182 e182: 3 total, 3 up, 3 in
Jan 20 09:35:15 np0005588919 nova_compute[225855]: 2026-01-20 14:35:15.191 225859 DEBUG nova.compute.manager [req-648a3480-a7a3-4db4-b73d-4b7f6b33b956 req-8c3597ac-22aa-4679-a8fe-f0c922e8b3b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:15 np0005588919 nova_compute[225855]: 2026-01-20 14:35:15.192 225859 DEBUG oslo_concurrency.lockutils [req-648a3480-a7a3-4db4-b73d-4b7f6b33b956 req-8c3597ac-22aa-4679-a8fe-f0c922e8b3b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:15 np0005588919 nova_compute[225855]: 2026-01-20 14:35:15.192 225859 DEBUG oslo_concurrency.lockutils [req-648a3480-a7a3-4db4-b73d-4b7f6b33b956 req-8c3597ac-22aa-4679-a8fe-f0c922e8b3b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:15 np0005588919 nova_compute[225855]: 2026-01-20 14:35:15.192 225859 DEBUG oslo_concurrency.lockutils [req-648a3480-a7a3-4db4-b73d-4b7f6b33b956 req-8c3597ac-22aa-4679-a8fe-f0c922e8b3b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:15 np0005588919 nova_compute[225855]: 2026-01-20 14:35:15.192 225859 DEBUG nova.compute.manager [req-648a3480-a7a3-4db4-b73d-4b7f6b33b956 req-8c3597ac-22aa-4679-a8fe-f0c922e8b3b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:15 np0005588919 nova_compute[225855]: 2026-01-20 14:35:15.192 225859 WARNING nova.compute.manager [req-648a3480-a7a3-4db4-b73d-4b7f6b33b956 req-8c3597ac-22aa-4679-a8fe-f0c922e8b3b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received unexpected event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with vm_state active and task_state None.#033[00m
Jan 20 09:35:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:15.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:16 np0005588919 nova_compute[225855]: 2026-01-20 14:35:16.355 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:16.392 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:16.393 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:16.393 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:35:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:16.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:35:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e183 e183: 3 total, 3 up, 3 in
Jan 20 09:35:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:17.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e184 e184: 3 total, 3 up, 3 in
Jan 20 09:35:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:18.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:19.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:19 np0005588919 nova_compute[225855]: 2026-01-20 14:35:19.573 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:20 np0005588919 nova_compute[225855]: 2026-01-20 14:35:20.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:20 np0005588919 nova_compute[225855]: 2026-01-20 14:35:20.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:35:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:20.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:21 np0005588919 podman[246782]: 2026-01-20 14:35:21.027651737 +0000 UTC m=+0.067810304 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.357 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.516 225859 DEBUG oslo_concurrency.lockutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "fdba30ff-e02a-4857-92f6-1828ce3ab175" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.516 225859 DEBUG oslo_concurrency.lockutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.516 225859 DEBUG oslo_concurrency.lockutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.517 225859 DEBUG oslo_concurrency.lockutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.517 225859 DEBUG oslo_concurrency.lockutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.518 225859 INFO nova.compute.manager [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Terminating instance#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.519 225859 DEBUG nova.compute.manager [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:35:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:21.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:21 np0005588919 kernel: tap87a0a5ba-64 (unregistering): left promiscuous mode
Jan 20 09:35:21 np0005588919 NetworkManager[49104]: <info>  [1768919721.5739] device (tap87a0a5ba-64): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:35:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:21Z|00160|binding|INFO|Releasing lport 87a0a5ba-6446-4265-8ada-94d1bd815aed from this chassis (sb_readonly=0)
Jan 20 09:35:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:21Z|00161|binding|INFO|Setting lport 87a0a5ba-6446-4265-8ada-94d1bd815aed down in Southbound
Jan 20 09:35:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:21Z|00162|binding|INFO|Removing iface tap87a0a5ba-64 ovn-installed in OVS
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.582 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.584 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.599 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.614 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:1f:46 10.100.0.10'], port_security=['fa:16:3e:70:1f:46 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fdba30ff-e02a-4857-92f6-1828ce3ab175', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a841e7a1434c488390475174e10bc161', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0bbdea05-fba7-47c7-ba4e-5dac58212a25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43dea88-ea55-4069-a4be-2c30a432a754, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=87a0a5ba-6446-4265-8ada-94d1bd815aed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.615 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 87a0a5ba-6446-4265-8ada-94d1bd815aed in datapath 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a unbound from our chassis#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.617 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.630 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d2848a-34af-4663-851c-1ab4bdacbbe0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:21 np0005588919 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Jan 20 09:35:21 np0005588919 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000002a.scope: Consumed 15.401s CPU time.
Jan 20 09:35:21 np0005588919 systemd-machined[194361]: Machine qemu-21-instance-0000002a terminated.
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.659 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a106108d-454b-40c8-8ca4-5c97073baca4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.662 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5189f3f5-2926-4a6f-ab69-cc79ebb485a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.692 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f46c9af9-a5cc-4297-824d-6d629f46390d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.709 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e0df43db-303c-4eb5-9939-86e06a2974e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap33c9a20a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:8e:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466055, 'reachable_time': 29489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246814, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.724 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f5a94a23-404e-4a36-83ae-e0e72ffab2f5]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466065, 'tstamp': 466065}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246815, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap33c9a20a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466068, 'tstamp': 466068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246815, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.725 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33c9a20a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.763 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.765 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.770 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.771 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33c9a20a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.771 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.772 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap33c9a20a-d0, col_values=(('external_ids', {'iface-id': '90c69687-c788-4dba-881f-3ed4a5ee6007'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:21.772 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.782 225859 INFO nova.virt.libvirt.driver [-] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Instance destroyed successfully.#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.783 225859 DEBUG nova.objects.instance [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'resources' on Instance uuid fdba30ff-e02a-4857-92f6-1828ce3ab175 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.799 225859 DEBUG nova.virt.libvirt.vif [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:33:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1832306325',display_name='tempest-ServersAdminTestJSON-server-1832306325',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1832306325',id=42,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:34:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-sb3w0f0c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:34:15Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=fdba30ff-e02a-4857-92f6-1828ce3ab175,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "address": "fa:16:3e:70:1f:46", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87a0a5ba-64", "ovs_interfaceid": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.800 225859 DEBUG nova.network.os_vif_util [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "address": "fa:16:3e:70:1f:46", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87a0a5ba-64", "ovs_interfaceid": "87a0a5ba-6446-4265-8ada-94d1bd815aed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.800 225859 DEBUG nova.network.os_vif_util [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:1f:46,bridge_name='br-int',has_traffic_filtering=True,id=87a0a5ba-6446-4265-8ada-94d1bd815aed,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87a0a5ba-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.801 225859 DEBUG os_vif [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:1f:46,bridge_name='br-int',has_traffic_filtering=True,id=87a0a5ba-6446-4265-8ada-94d1bd815aed,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87a0a5ba-64') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.802 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.803 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87a0a5ba-64, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.807 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:35:21 np0005588919 nova_compute[225855]: 2026-01-20 14:35:21.810 225859 INFO os_vif [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:1f:46,bridge_name='br-int',has_traffic_filtering=True,id=87a0a5ba-6446-4265-8ada-94d1bd815aed,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87a0a5ba-64')#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.155 225859 DEBUG nova.compute.manager [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.198 225859 INFO nova.compute.manager [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] instance snapshotting#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.272 225859 INFO nova.virt.libvirt.driver [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Deleting instance files /var/lib/nova/instances/fdba30ff-e02a-4857-92f6-1828ce3ab175_del#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.273 225859 INFO nova.virt.libvirt.driver [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Deletion of /var/lib/nova/instances/fdba30ff-e02a-4857-92f6-1828ce3ab175_del complete#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.342 225859 INFO nova.compute.manager [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Took 0.82 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.342 225859 DEBUG oslo.service.loopingcall [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.343 225859 DEBUG nova.compute.manager [-] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.343 225859 DEBUG nova.network.neutron [-] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.376 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9907#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.377 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.377 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.377 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.398 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.399 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.399 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.399 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.400 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.470 225859 INFO nova.virt.libvirt.driver [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Beginning live snapshot process#033[00m
Jan 20 09:35:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e185 e185: 3 total, 3 up, 3 in
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.623 225859 DEBUG nova.virt.libvirt.imagebackend [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:35:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:22.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.808 225859 DEBUG nova.storage.rbd_utils [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] creating snapshot(336cafe49ed3471a888416cc0350ffb9) on rbd image(680a9e49-0486-46a0-8857-99a7a56c46e1_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:35:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/832753989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.899 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.992 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000002e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.994 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000002e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.998 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000027 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:35:22 np0005588919 nova_compute[225855]: 2026-01-20 14:35:22.999 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000027 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.172 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.173 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4267MB free_disk=20.703880310058594GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.174 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.174 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.287 225859 DEBUG nova.compute.manager [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Received event network-vif-unplugged-87a0a5ba-6446-4265-8ada-94d1bd815aed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.287 225859 DEBUG oslo_concurrency.lockutils [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.288 225859 DEBUG oslo_concurrency.lockutils [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.288 225859 DEBUG oslo_concurrency.lockutils [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.288 225859 DEBUG nova.compute.manager [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] No waiting events found dispatching network-vif-unplugged-87a0a5ba-6446-4265-8ada-94d1bd815aed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.289 225859 DEBUG nova.compute.manager [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Received event network-vif-unplugged-87a0a5ba-6446-4265-8ada-94d1bd815aed for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.289 225859 DEBUG nova.compute.manager [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Received event network-vif-plugged-87a0a5ba-6446-4265-8ada-94d1bd815aed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.289 225859 DEBUG oslo_concurrency.lockutils [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.290 225859 DEBUG oslo_concurrency.lockutils [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.290 225859 DEBUG oslo_concurrency.lockutils [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.290 225859 DEBUG nova.compute.manager [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] No waiting events found dispatching network-vif-plugged-87a0a5ba-6446-4265-8ada-94d1bd815aed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.290 225859 WARNING nova.compute.manager [req-cff80749-54f6-42a9-84e1-a206c503f0bc req-562abf7a-5d07-4765-8066-4a8f96c33af0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Received unexpected event network-vif-plugged-87a0a5ba-6446-4265-8ada-94d1bd815aed for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.315 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 2ec7b07d-b593-46b7-9751-b6116e4d2cec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.315 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance fdba30ff-e02a-4857-92f6-1828ce3ab175 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.316 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 680a9e49-0486-46a0-8857-99a7a56c46e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.316 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.316 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.410 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.482 225859 DEBUG nova.network.neutron [-] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.499 225859 INFO nova.compute.manager [-] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Took 1.16 seconds to deallocate network for instance.#033[00m
Jan 20 09:35:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:23.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.535 225859 DEBUG nova.compute.manager [req-0b0ea41c-d08a-459c-b021-13256e7963b3 req-8f390230-be26-4bac-988f-71dbef7b8e7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Received event network-vif-deleted-87a0a5ba-6446-4265-8ada-94d1bd815aed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.545 225859 DEBUG oslo_concurrency.lockutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e186 e186: 3 total, 3 up, 3 in
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.602 225859 DEBUG nova.storage.rbd_utils [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] cloning vms/680a9e49-0486-46a0-8857-99a7a56c46e1_disk@336cafe49ed3471a888416cc0350ffb9 to images/606f74dc-79ac-4b4e-b154-695b258203bd clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.727 225859 DEBUG nova.storage.rbd_utils [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] flattening images/606f74dc-79ac-4b4e-b154-695b258203bd flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:35:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3777007206' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.852 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.858 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.888 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.919 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.920 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:23 np0005588919 nova_compute[225855]: 2026-01-20 14:35:23.920 225859 DEBUG oslo_concurrency.lockutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.375s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:24 np0005588919 nova_compute[225855]: 2026-01-20 14:35:24.015 225859 DEBUG oslo_concurrency.processutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:24 np0005588919 nova_compute[225855]: 2026-01-20 14:35:24.126 225859 DEBUG nova.storage.rbd_utils [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] removing snapshot(336cafe49ed3471a888416cc0350ffb9) on rbd image(680a9e49-0486-46a0-8857-99a7a56c46e1_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:35:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/685615446' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:24 np0005588919 nova_compute[225855]: 2026-01-20 14:35:24.487 225859 DEBUG oslo_concurrency.processutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:35:24 np0005588919 nova_compute[225855]: 2026-01-20 14:35:24.496 225859 DEBUG nova.compute.provider_tree [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:35:24 np0005588919 nova_compute[225855]: 2026-01-20 14:35:24.512 225859 DEBUG nova.scheduler.client.report [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:35:24 np0005588919 nova_compute[225855]: 2026-01-20 14:35:24.540 225859 DEBUG oslo_concurrency.lockutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:35:24 np0005588919 nova_compute[225855]: 2026-01-20 14:35:24.570 225859 INFO nova.scheduler.client.report [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Deleted allocations for instance fdba30ff-e02a-4857-92f6-1828ce3ab175
Jan 20 09:35:24 np0005588919 nova_compute[225855]: 2026-01-20 14:35:24.578 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:35:24 np0005588919 nova_compute[225855]: 2026-01-20 14:35:24.632 225859 DEBUG oslo_concurrency.lockutils [None req-818c50c4-732e-447e-be1c-00273f386d27 f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "fdba30ff-e02a-4857-92f6-1828ce3ab175" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:35:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:24.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e187 e187: 3 total, 3 up, 3 in
Jan 20 09:35:24 np0005588919 nova_compute[225855]: 2026-01-20 14:35:24.953 225859 DEBUG nova.storage.rbd_utils [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] creating snapshot(snap) on rbd image(606f74dc-79ac-4b4e-b154-695b258203bd) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 20 09:35:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:35:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:25.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:35:25 np0005588919 nova_compute[225855]: 2026-01-20 14:35:25.884 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:35:25 np0005588919 nova_compute[225855]: 2026-01-20 14:35:25.884 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:35:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e188 e188: 3 total, 3 up, 3 in
Jan 20 09:35:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:26Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:6a:15 10.100.0.6
Jan 20 09:35:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:26Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:6a:15 10.100.0.6
Jan 20 09:35:26 np0005588919 nova_compute[225855]: 2026-01-20 14:35:26.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:35:26 np0005588919 nova_compute[225855]: 2026-01-20 14:35:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:35:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:26.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:26 np0005588919 nova_compute[225855]: 2026-01-20 14:35:26.817 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:35:27 np0005588919 nova_compute[225855]: 2026-01-20 14:35:27.463 225859 INFO nova.virt.libvirt.driver [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Snapshot image upload complete
Jan 20 09:35:27 np0005588919 nova_compute[225855]: 2026-01-20 14:35:27.464 225859 INFO nova.compute.manager [None req-101774a7-9f44-44b5-b6ce-5de2fe11cb61 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Took 5.26 seconds to snapshot the instance on the hypervisor.
Jan 20 09:35:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:27.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:28.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.367 225859 DEBUG oslo_concurrency.lockutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.367 225859 DEBUG oslo_concurrency.lockutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.367 225859 DEBUG oslo_concurrency.lockutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.368 225859 DEBUG oslo_concurrency.lockutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.368 225859 DEBUG oslo_concurrency.lockutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.369 225859 INFO nova.compute.manager [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Terminating instance
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.370 225859 DEBUG nova.compute.manager [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 09:35:29 np0005588919 kernel: tap73e232f9-38 (unregistering): left promiscuous mode
Jan 20 09:35:29 np0005588919 NetworkManager[49104]: <info>  [1768919729.4358] device (tap73e232f9-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.445 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:35:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:29Z|00163|binding|INFO|Releasing lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f from this chassis (sb_readonly=0)
Jan 20 09:35:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:29Z|00164|binding|INFO|Setting lport 73e232f9-3860-4b9a-9cec-535fa2fb0c9f down in Southbound
Jan 20 09:35:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:35:29Z|00165|binding|INFO|Removing iface tap73e232f9-38 ovn-installed in OVS
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.448 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.452 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:6a:15 10.100.0.6'], port_security=['fa:16:3e:17:6a:15 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2ec7b07d-b593-46b7-9751-b6116e4d2cec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a841e7a1434c488390475174e10bc161', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0bbdea05-fba7-47c7-ba4e-5dac58212a25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43dea88-ea55-4069-a4be-2c30a432a754, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=73e232f9-3860-4b9a-9cec-535fa2fb0c9f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.454 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 73e232f9-3860-4b9a-9cec-535fa2fb0c9f in datapath 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a unbound from our chassis
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.455 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.457 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[46e86d9c-60e4-469e-a549-feb877e22561]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.457 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a namespace which is not needed anymore
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.481 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:35:29 np0005588919 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000027.scope: Deactivated successfully.
Jan 20 09:35:29 np0005588919 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000027.scope: Consumed 13.575s CPU time.
Jan 20 09:35:29 np0005588919 systemd-machined[194361]: Machine qemu-24-instance-00000027 terminated.
Jan 20 09:35:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:29.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.580 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.589 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.594 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.606 225859 INFO nova.virt.libvirt.driver [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Instance destroyed successfully.
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.607 225859 DEBUG nova.objects.instance [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lazy-loading 'resources' on Instance uuid 2ec7b07d-b593-46b7-9751-b6116e4d2cec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:35:29 np0005588919 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[244351]: [NOTICE]   (244356) : haproxy version is 2.8.14-c23fe91
Jan 20 09:35:29 np0005588919 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[244351]: [NOTICE]   (244356) : path to executable is /usr/sbin/haproxy
Jan 20 09:35:29 np0005588919 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[244351]: [WARNING]  (244356) : Exiting Master process...
Jan 20 09:35:29 np0005588919 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[244351]: [WARNING]  (244356) : Exiting Master process...
Jan 20 09:35:29 np0005588919 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[244351]: [ALERT]    (244356) : Current worker (244358) exited with code 143 (Terminated)
Jan 20 09:35:29 np0005588919 neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a[244351]: [WARNING]  (244356) : All workers exited. Exiting... (0)
Jan 20 09:35:29 np0005588919 systemd[1]: libpod-ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d.scope: Deactivated successfully.
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.620 225859 DEBUG nova.virt.libvirt.vif [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:33:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1907009380',display_name='tempest-ServersAdminTestJSON-server-1907009380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1907009380',id=39,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:35:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a841e7a1434c488390475174e10bc161',ramdisk_id='',reservation_id='r-8k5b63bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1261404595',owner_user_name='tempest-ServersAdminTestJSON-1261404595-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:35:16Z,user_data=None,user_id='f51c395107c84dbd9067113b84ff01dd',uuid=2ec7b07d-b593-46b7-9751-b6116e4d2cec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.621 225859 DEBUG nova.network.os_vif_util [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converting VIF {"id": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "address": "fa:16:3e:17:6a:15", "network": {"id": "33c9a20a-d976-42a8-b8bf-f83ddfc97c9a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-202342440-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a841e7a1434c488390475174e10bc161", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e232f9-38", "ovs_interfaceid": "73e232f9-3860-4b9a-9cec-535fa2fb0c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 20 09:35:29 np0005588919 podman[247132]: 2026-01-20 14:35:29.621629097 +0000 UTC m=+0.058375389 container died ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.622 225859 DEBUG nova.network.os_vif_util [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.623 225859 DEBUG os_vif [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.625 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.625 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73e232f9-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.628 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.632 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.633 225859 INFO os_vif [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:6a:15,bridge_name='br-int',has_traffic_filtering=True,id=73e232f9-3860-4b9a-9cec-535fa2fb0c9f,network=Network(33c9a20a-d976-42a8-b8bf-f83ddfc97c9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e232f9-38')
Jan 20 09:35:29 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d-userdata-shm.mount: Deactivated successfully.
Jan 20 09:35:29 np0005588919 systemd[1]: var-lib-containers-storage-overlay-61f71f999ebe55922d4470c26bf6ec7028f2091bfc297c60f9663a1040a21c70-merged.mount: Deactivated successfully.
Jan 20 09:35:29 np0005588919 podman[247132]: 2026-01-20 14:35:29.677254487 +0000 UTC m=+0.114000779 container cleanup ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 09:35:29 np0005588919 systemd[1]: libpod-conmon-ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d.scope: Deactivated successfully.
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.686 225859 DEBUG nova.compute.manager [req-2478d1ca-616a-4a9f-8baf-f0115f84292d req-8d128d77-2b1f-4992-848c-204bb58fce0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-unplugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.686 225859 DEBUG oslo_concurrency.lockutils [req-2478d1ca-616a-4a9f-8baf-f0115f84292d req-8d128d77-2b1f-4992-848c-204bb58fce0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.686 225859 DEBUG oslo_concurrency.lockutils [req-2478d1ca-616a-4a9f-8baf-f0115f84292d req-8d128d77-2b1f-4992-848c-204bb58fce0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.686 225859 DEBUG oslo_concurrency.lockutils [req-2478d1ca-616a-4a9f-8baf-f0115f84292d req-8d128d77-2b1f-4992-848c-204bb58fce0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.687 225859 DEBUG nova.compute.manager [req-2478d1ca-616a-4a9f-8baf-f0115f84292d req-8d128d77-2b1f-4992-848c-204bb58fce0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-unplugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.687 225859 DEBUG nova.compute.manager [req-2478d1ca-616a-4a9f-8baf-f0115f84292d req-8d128d77-2b1f-4992-848c-204bb58fce0a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-unplugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 20 09:35:29 np0005588919 podman[247187]: 2026-01-20 14:35:29.74006577 +0000 UTC m=+0.042363187 container remove ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.747 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d1bc923d-4dce-4577-a0e4-008fa21eb39c]: (4, ('Tue Jan 20 02:35:29 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a (ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d)\nab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d\nTue Jan 20 02:35:29 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a (ab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d)\nab810e99d7949df31ae8ecec1ce4cb6e08341183c61e4c0ebc73d5a2b21ea83d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.748 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[17df3e05-32af-4710-ae51-1fd1710ba162]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.749 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33c9a20a-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:35:29 np0005588919 kernel: tap33c9a20a-d0: left promiscuous mode
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.751 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:29 np0005588919 nova_compute[225855]: 2026-01-20 14:35:29.765 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.768 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8713dca9-eb06-44c2-a4f6-13b6e74995bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.788 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b5582c61-20f7-4993-a661-f54d12ca6af6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.789 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[283fbf91-f3f4-4461-bb0d-009c3bb6bd48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.808 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b63a1dfd-14ab-409e-af6e-6591abf9dcfa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466048, 'reachable_time': 17265, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247202, 'error': None, 'target': 'ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:29 np0005588919 systemd[1]: run-netns-ovnmeta\x2d33c9a20a\x2dd976\x2d42a8\x2db8bf\x2df83ddfc97c9a.mount: Deactivated successfully.
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.811 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-33c9a20a-d976-42a8-b8bf-f83ddfc97c9a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:35:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:29.811 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6184e2-a0ec-4d78-87c8-9bbca2e0b207]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:35:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:30 np0005588919 nova_compute[225855]: 2026-01-20 14:35:30.117 225859 INFO nova.virt.libvirt.driver [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Deleting instance files /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec_del#033[00m
Jan 20 09:35:30 np0005588919 nova_compute[225855]: 2026-01-20 14:35:30.117 225859 INFO nova.virt.libvirt.driver [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Deletion of /var/lib/nova/instances/2ec7b07d-b593-46b7-9751-b6116e4d2cec_del complete#033[00m
Jan 20 09:35:30 np0005588919 nova_compute[225855]: 2026-01-20 14:35:30.165 225859 INFO nova.compute.manager [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Took 0.80 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:35:30 np0005588919 nova_compute[225855]: 2026-01-20 14:35:30.166 225859 DEBUG oslo.service.loopingcall [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:35:30 np0005588919 nova_compute[225855]: 2026-01-20 14:35:30.166 225859 DEBUG nova.compute.manager [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:35:30 np0005588919 nova_compute[225855]: 2026-01-20 14:35:30.166 225859 DEBUG nova.network.neutron [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:35:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:30.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:31.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.602 225859 DEBUG nova.network.neutron [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.623 225859 INFO nova.compute.manager [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Took 1.46 seconds to deallocate network for instance.#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.665 225859 DEBUG oslo_concurrency.lockutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.665 225859 DEBUG oslo_concurrency.lockutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.725 225859 DEBUG oslo_concurrency.processutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.798 225859 DEBUG nova.compute.manager [req-1d4859ea-c912-4cf8-b487-337fbe4305ad req-66f290ba-03cd-43bd-ba73-500119eb195f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.799 225859 DEBUG oslo_concurrency.lockutils [req-1d4859ea-c912-4cf8-b487-337fbe4305ad req-66f290ba-03cd-43bd-ba73-500119eb195f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.801 225859 DEBUG oslo_concurrency.lockutils [req-1d4859ea-c912-4cf8-b487-337fbe4305ad req-66f290ba-03cd-43bd-ba73-500119eb195f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.801 225859 DEBUG oslo_concurrency.lockutils [req-1d4859ea-c912-4cf8-b487-337fbe4305ad req-66f290ba-03cd-43bd-ba73-500119eb195f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.802 225859 DEBUG nova.compute.manager [req-1d4859ea-c912-4cf8-b487-337fbe4305ad req-66f290ba-03cd-43bd-ba73-500119eb195f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] No waiting events found dispatching network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.802 225859 WARNING nova.compute.manager [req-1d4859ea-c912-4cf8-b487-337fbe4305ad req-66f290ba-03cd-43bd-ba73-500119eb195f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received unexpected event network-vif-plugged-73e232f9-3860-4b9a-9cec-535fa2fb0c9f for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:35:31 np0005588919 nova_compute[225855]: 2026-01-20 14:35:31.803 225859 DEBUG nova.compute.manager [req-1d4859ea-c912-4cf8-b487-337fbe4305ad req-66f290ba-03cd-43bd-ba73-500119eb195f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Received event network-vif-deleted-73e232f9-3860-4b9a-9cec-535fa2fb0c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:35:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:32 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1217843012' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:32 np0005588919 nova_compute[225855]: 2026-01-20 14:35:32.183 225859 DEBUG oslo_concurrency.processutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:32 np0005588919 nova_compute[225855]: 2026-01-20 14:35:32.190 225859 DEBUG nova.compute.provider_tree [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:35:32 np0005588919 nova_compute[225855]: 2026-01-20 14:35:32.202 225859 DEBUG nova.scheduler.client.report [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:35:32 np0005588919 nova_compute[225855]: 2026-01-20 14:35:32.226 225859 DEBUG oslo_concurrency.lockutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:32 np0005588919 nova_compute[225855]: 2026-01-20 14:35:32.256 225859 INFO nova.scheduler.client.report [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Deleted allocations for instance 2ec7b07d-b593-46b7-9751-b6116e4d2cec#033[00m
Jan 20 09:35:32 np0005588919 nova_compute[225855]: 2026-01-20 14:35:32.324 225859 DEBUG oslo_concurrency.lockutils [None req-d3e49ae9-5b42-4e29-8be6-815fdb92fbdb f51c395107c84dbd9067113b84ff01dd a841e7a1434c488390475174e10bc161 - - default default] Lock "2ec7b07d-b593-46b7-9751-b6116e4d2cec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.957s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e189 e189: 3 total, 3 up, 3 in
Jan 20 09:35:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:35:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:32.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:35:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:33.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e190 e190: 3 total, 3 up, 3 in
Jan 20 09:35:34 np0005588919 nova_compute[225855]: 2026-01-20 14:35:34.582 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e191 e191: 3 total, 3 up, 3 in
Jan 20 09:35:34 np0005588919 nova_compute[225855]: 2026-01-20 14:35:34.628 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:35:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:34.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:35:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:35 np0005588919 nova_compute[225855]: 2026-01-20 14:35:35.067 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:35.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e192 e192: 3 total, 3 up, 3 in
Jan 20 09:35:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:35:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:36.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:35:36 np0005588919 nova_compute[225855]: 2026-01-20 14:35:36.781 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919721.7790616, fdba30ff-e02a-4857-92f6-1828ce3ab175 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:36 np0005588919 nova_compute[225855]: 2026-01-20 14:35:36.781 225859 INFO nova.compute.manager [-] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:35:36 np0005588919 nova_compute[225855]: 2026-01-20 14:35:36.821 225859 DEBUG nova.compute.manager [None req-c9863632-9eb7-41f9-8857-36080c01ae35 - - - - - -] [instance: fdba30ff-e02a-4857-92f6-1828ce3ab175] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e193 e193: 3 total, 3 up, 3 in
Jan 20 09:35:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:37.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e194 e194: 3 total, 3 up, 3 in
Jan 20 09:35:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:38.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e195 e195: 3 total, 3 up, 3 in
Jan 20 09:35:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:39.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:39 np0005588919 nova_compute[225855]: 2026-01-20 14:35:39.583 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:39 np0005588919 nova_compute[225855]: 2026-01-20 14:35:39.630 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e196 e196: 3 total, 3 up, 3 in
Jan 20 09:35:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:40.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:41.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:42 np0005588919 podman[247232]: 2026-01-20 14:35:42.080570797 +0000 UTC m=+0.117558569 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:35:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:42.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e197 e197: 3 total, 3 up, 3 in
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.385 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Acquiring lock "680a9e49-0486-46a0-8857-99a7a56c46e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.386 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "680a9e49-0486-46a0-8857-99a7a56c46e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.387 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Acquiring lock "680a9e49-0486-46a0-8857-99a7a56c46e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.389 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "680a9e49-0486-46a0-8857-99a7a56c46e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.389 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "680a9e49-0486-46a0-8857-99a7a56c46e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.391 225859 INFO nova.compute.manager [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Terminating instance#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.394 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Acquiring lock "refresh_cache-680a9e49-0486-46a0-8857-99a7a56c46e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.394 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Acquired lock "refresh_cache-680a9e49-0486-46a0-8857-99a7a56c46e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.395 225859 DEBUG nova.network.neutron [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:35:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:43.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.594 225859 DEBUG nova.network.neutron [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.888 225859 DEBUG nova.network.neutron [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.904 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Releasing lock "refresh_cache-680a9e49-0486-46a0-8857-99a7a56c46e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:35:43 np0005588919 nova_compute[225855]: 2026-01-20 14:35:43.905 225859 DEBUG nova.compute.manager [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:35:44 np0005588919 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Jan 20 09:35:44 np0005588919 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002e.scope: Consumed 14.256s CPU time.
Jan 20 09:35:44 np0005588919 systemd-machined[194361]: Machine qemu-23-instance-0000002e terminated.
Jan 20 09:35:44 np0005588919 nova_compute[225855]: 2026-01-20 14:35:44.332 225859 INFO nova.virt.libvirt.driver [-] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Instance destroyed successfully.#033[00m
Jan 20 09:35:44 np0005588919 nova_compute[225855]: 2026-01-20 14:35:44.333 225859 DEBUG nova.objects.instance [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lazy-loading 'resources' on Instance uuid 680a9e49-0486-46a0-8857-99a7a56c46e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:35:44 np0005588919 nova_compute[225855]: 2026-01-20 14:35:44.586 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:44 np0005588919 nova_compute[225855]: 2026-01-20 14:35:44.605 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919729.6044123, 2ec7b07d-b593-46b7-9751-b6116e4d2cec => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:44 np0005588919 nova_compute[225855]: 2026-01-20 14:35:44.605 225859 INFO nova.compute.manager [-] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:35:44 np0005588919 nova_compute[225855]: 2026-01-20 14:35:44.630 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:44.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:44 np0005588919 nova_compute[225855]: 2026-01-20 14:35:44.865 225859 DEBUG nova.compute.manager [None req-1294c7fe-eb58-4f5d-9aee-24b8391d9120 - - - - - -] [instance: 2ec7b07d-b593-46b7-9751-b6116e4d2cec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:45 np0005588919 nova_compute[225855]: 2026-01-20 14:35:45.316 225859 INFO nova.virt.libvirt.driver [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Deleting instance files /var/lib/nova/instances/680a9e49-0486-46a0-8857-99a7a56c46e1_del#033[00m
Jan 20 09:35:45 np0005588919 nova_compute[225855]: 2026-01-20 14:35:45.317 225859 INFO nova.virt.libvirt.driver [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Deletion of /var/lib/nova/instances/680a9e49-0486-46a0-8857-99a7a56c46e1_del complete#033[00m
Jan 20 09:35:45 np0005588919 nova_compute[225855]: 2026-01-20 14:35:45.554 225859 INFO nova.compute.manager [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Took 1.65 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:35:45 np0005588919 nova_compute[225855]: 2026-01-20 14:35:45.555 225859 DEBUG oslo.service.loopingcall [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:35:45 np0005588919 nova_compute[225855]: 2026-01-20 14:35:45.555 225859 DEBUG nova.compute.manager [-] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:35:45 np0005588919 nova_compute[225855]: 2026-01-20 14:35:45.556 225859 DEBUG nova.network.neutron [-] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:35:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:45.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:45 np0005588919 nova_compute[225855]: 2026-01-20 14:35:45.931 225859 DEBUG nova.network.neutron [-] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:35:45 np0005588919 nova_compute[225855]: 2026-01-20 14:35:45.953 225859 DEBUG nova.network.neutron [-] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:35:45 np0005588919 nova_compute[225855]: 2026-01-20 14:35:45.967 225859 INFO nova.compute.manager [-] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Took 0.41 seconds to deallocate network for instance.#033[00m
Jan 20 09:35:46 np0005588919 nova_compute[225855]: 2026-01-20 14:35:46.293 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:35:46 np0005588919 nova_compute[225855]: 2026-01-20 14:35:46.293 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:35:46 np0005588919 nova_compute[225855]: 2026-01-20 14:35:46.365 225859 DEBUG oslo_concurrency.processutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:35:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:46.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:35:46 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4051494342' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:35:46 np0005588919 nova_compute[225855]: 2026-01-20 14:35:46.804 225859 DEBUG oslo_concurrency.processutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:35:46 np0005588919 nova_compute[225855]: 2026-01-20 14:35:46.810 225859 DEBUG nova.compute.provider_tree [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:35:46 np0005588919 nova_compute[225855]: 2026-01-20 14:35:46.840 225859 DEBUG nova.scheduler.client.report [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:35:46 np0005588919 nova_compute[225855]: 2026-01-20 14:35:46.995 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:47 np0005588919 nova_compute[225855]: 2026-01-20 14:35:47.041 225859 INFO nova.scheduler.client.report [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Deleted allocations for instance 680a9e49-0486-46a0-8857-99a7a56c46e1#033[00m
Jan 20 09:35:47 np0005588919 nova_compute[225855]: 2026-01-20 14:35:47.208 225859 DEBUG oslo_concurrency.lockutils [None req-b9ab3b0c-4f1e-446f-b6f4-8690c5ef6d3d 72ad8e217e1348378596753eefca1452 9e10f687e8a14fc3bfa98df19df5befd - - default default] Lock "680a9e49-0486-46a0-8857-99a7a56c46e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.822s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e198 e198: 3 total, 3 up, 3 in
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.558764) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747558890, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2549, "num_deletes": 263, "total_data_size": 5587969, "memory_usage": 5645232, "flush_reason": "Manual Compaction"}
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 20 09:35:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:47.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747627484, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 3669107, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30811, "largest_seqno": 33355, "table_properties": {"data_size": 3658662, "index_size": 6683, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 22357, "raw_average_key_size": 21, "raw_value_size": 3637541, "raw_average_value_size": 3434, "num_data_blocks": 288, "num_entries": 1059, "num_filter_entries": 1059, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919562, "oldest_key_time": 1768919562, "file_creation_time": 1768919747, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 68772 microseconds, and 10887 cpu microseconds.
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.627532) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 3669107 bytes OK
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.627552) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.636896) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.636933) EVENT_LOG_v1 {"time_micros": 1768919747636924, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.636956) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 5576607, prev total WAL file size 5578306, number of live WAL files 2.
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.638842) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3583KB)], [60(8254KB)]
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747638897, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 12121578, "oldest_snapshot_seqno": -1}
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 5968 keys, 10180835 bytes, temperature: kUnknown
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747758787, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 10180835, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10139950, "index_size": 24839, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14981, "raw_key_size": 151248, "raw_average_key_size": 25, "raw_value_size": 10031679, "raw_average_value_size": 1680, "num_data_blocks": 1003, "num_entries": 5968, "num_filter_entries": 5968, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768919747, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.759011) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 10180835 bytes
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.761512) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 101.1 rd, 84.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 8.1 +0.0 blob) out(9.7 +0.0 blob), read-write-amplify(6.1) write-amplify(2.8) OK, records in: 6504, records dropped: 536 output_compression: NoCompression
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.761547) EVENT_LOG_v1 {"time_micros": 1768919747761536, "job": 36, "event": "compaction_finished", "compaction_time_micros": 119945, "compaction_time_cpu_micros": 22558, "output_level": 6, "num_output_files": 1, "total_output_size": 10180835, "num_input_records": 6504, "num_output_records": 5968, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747762261, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919747763616, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.638542) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.763662) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.763668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.763671) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.763674) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:35:47.763676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:35:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:48.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e199 e199: 3 total, 3 up, 3 in
Jan 20 09:35:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:49.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:49 np0005588919 nova_compute[225855]: 2026-01-20 14:35:49.587 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:49 np0005588919 nova_compute[225855]: 2026-01-20 14:35:49.632 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:50.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e200 e200: 3 total, 3 up, 3 in
Jan 20 09:35:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:51.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e201 e201: 3 total, 3 up, 3 in
Jan 20 09:35:52 np0005588919 podman[247358]: 2026-01-20 14:35:52.016728638 +0000 UTC m=+0.055571129 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 20 09:35:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:52.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:53.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:54 np0005588919 nova_compute[225855]: 2026-01-20 14:35:54.588 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:54 np0005588919 nova_compute[225855]: 2026-01-20 14:35:54.634 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:54.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:35:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:55.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:56.049 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:35:56 np0005588919 nova_compute[225855]: 2026-01-20 14:35:56.049 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:35:56.050 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:35:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:56.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e202 e202: 3 total, 3 up, 3 in
Jan 20 09:35:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:57.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:35:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:35:58.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:35:59 np0005588919 nova_compute[225855]: 2026-01-20 14:35:59.329 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919744.3280523, 680a9e49-0486-46a0-8857-99a7a56c46e1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:35:59 np0005588919 nova_compute[225855]: 2026-01-20 14:35:59.330 225859 INFO nova.compute.manager [-] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:35:59 np0005588919 nova_compute[225855]: 2026-01-20 14:35:59.434 225859 DEBUG nova.compute.manager [None req-a808b2ad-3404-4249-9449-f3120c536a7d - - - - - -] [instance: 680a9e49-0486-46a0-8857-99a7a56c46e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:35:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:35:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:35:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:35:59.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:35:59 np0005588919 nova_compute[225855]: 2026-01-20 14:35:59.592 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:35:59 np0005588919 nova_compute[225855]: 2026-01-20 14:35:59.634 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:00.052 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:36:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:36:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:36:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:36:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:00.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.226 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "eb82fc99-1632-42b0-90d2-7ce2b9d542a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.226 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "eb82fc99-1632-42b0-90d2-7ce2b9d542a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.244 225859 DEBUG nova.compute.manager [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.340 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.341 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.346 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.346 225859 INFO nova.compute.claims [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.458 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e203 e203: 3 total, 3 up, 3 in
Jan 20 09:36:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:01.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:36:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3097242567' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.930 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.937 225859 DEBUG nova.compute.provider_tree [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.957 225859 DEBUG nova.scheduler.client.report [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.983 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:01 np0005588919 nova_compute[225855]: 2026-01-20 14:36:01.983 225859 DEBUG nova.compute.manager [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.039 225859 DEBUG nova.compute.manager [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.039 225859 DEBUG nova.network.neutron [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.059 225859 INFO nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.086 225859 DEBUG nova.compute.manager [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.173 225859 DEBUG nova.compute.manager [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.175 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.176 225859 INFO nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Creating image(s)#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.209 225859 DEBUG nova.storage.rbd_utils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.244 225859 DEBUG nova.storage.rbd_utils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.277 225859 DEBUG nova.storage.rbd_utils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.281 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.357 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.358 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.359 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.359 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.389 225859 DEBUG nova.storage.rbd_utils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.393 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.448 225859 DEBUG nova.network.neutron [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.449 225859 DEBUG nova.compute.manager [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:36:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e204 e204: 3 total, 3 up, 3 in
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.714 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.322s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:02.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.785 225859 DEBUG nova.storage.rbd_utils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] resizing rbd image eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.899 225859 DEBUG nova.objects.instance [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lazy-loading 'migration_context' on Instance uuid eb82fc99-1632-42b0-90d2-7ce2b9d542a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.924 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.924 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Ensure instance console log exists: /var/lib/nova/instances/eb82fc99-1632-42b0-90d2-7ce2b9d542a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.925 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.926 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.927 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.929 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:36:02 np0005588919 nova_compute[225855]: 2026-01-20 14:36:02.935 225859 WARNING nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.080 225859 DEBUG nova.virt.libvirt.host [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.082 225859 DEBUG nova.virt.libvirt.host [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.091 225859 DEBUG nova.virt.libvirt.host [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.092 225859 DEBUG nova.virt.libvirt.host [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.095 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.096 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.098 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.098 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.099 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.100 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.100 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.101 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.102 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.102 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.103 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.104 225859 DEBUG nova.virt.hardware [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.109 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:36:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:36:03 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/791379787' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.581 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:36:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:03.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.614 225859 DEBUG nova.storage.rbd_utils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:36:03 np0005588919 nova_compute[225855]: 2026-01-20 14:36:03.618 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:36:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e205 e205: 3 total, 3 up, 3 in
Jan 20 09:36:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:36:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2849367976' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.098 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.099 225859 DEBUG nova.objects.instance [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lazy-loading 'pci_devices' on Instance uuid eb82fc99-1632-42b0-90d2-7ce2b9d542a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.115 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <uuid>eb82fc99-1632-42b0-90d2-7ce2b9d542a2</uuid>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <name>instance-00000034</name>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-1842602227</nova:name>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:36:02</nova:creationTime>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <nova:user uuid="ecab37cbd7714ddd81e1db5b37ba85b3">tempest-ServersAdminNegativeTestJSON-1522974762-project-member</nova:user>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <nova:project uuid="b6594bd13c35449abc258d30a1a2509b">tempest-ServersAdminNegativeTestJSON-1522974762</nova:project>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <entry name="serial">eb82fc99-1632-42b0-90d2-7ce2b9d542a2</entry>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <entry name="uuid">eb82fc99-1632-42b0-90d2-7ce2b9d542a2</entry>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk.config">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/eb82fc99-1632-42b0-90d2-7ce2b9d542a2/console.log" append="off"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:36:04 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:36:04 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:36:04 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:36:04 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.173 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.173 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.174 225859 INFO nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Using config drive
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.198 225859 DEBUG nova.storage.rbd_utils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.390 225859 INFO nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Creating config drive at /var/lib/nova/instances/eb82fc99-1632-42b0-90d2-7ce2b9d542a2/disk.config
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.395 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eb82fc99-1632-42b0-90d2-7ce2b9d542a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9wgvap5w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.525 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eb82fc99-1632-42b0-90d2-7ce2b9d542a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9wgvap5w" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.560 225859 DEBUG nova.storage.rbd_utils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] rbd image eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.565 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/eb82fc99-1632-42b0-90d2-7ce2b9d542a2/disk.config eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.602 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:36:04 np0005588919 nova_compute[225855]: 2026-01-20 14:36:04.637 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:36:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:04.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e206 e206: 3 total, 3 up, 3 in
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.152 225859 DEBUG oslo_concurrency.processutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/eb82fc99-1632-42b0-90d2-7ce2b9d542a2/disk.config eb82fc99-1632-42b0-90d2-7ce2b9d542a2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.587s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.153 225859 INFO nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Deleting local config drive /var/lib/nova/instances/eb82fc99-1632-42b0-90d2-7ce2b9d542a2/disk.config because it was imported into RBD.
Jan 20 09:36:05 np0005588919 systemd-machined[194361]: New machine qemu-25-instance-00000034.
Jan 20 09:36:05 np0005588919 systemd[1]: Started Virtual Machine qemu-25-instance-00000034.
Jan 20 09:36:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:05.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.858 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919765.8583286, eb82fc99-1632-42b0-90d2-7ce2b9d542a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.860 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] VM Resumed (Lifecycle Event)
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.864 225859 DEBUG nova.compute.manager [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.864 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.868 225859 INFO nova.virt.libvirt.driver [-] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Instance spawned successfully.
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.868 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.899 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.905 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.906 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.906 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.907 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.908 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.908 225859 DEBUG nova.virt.libvirt.driver [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.913 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.955 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.955 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919765.8631897, eb82fc99-1632-42b0-90d2-7ce2b9d542a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.956 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] VM Started (Lifecycle Event)
Jan 20 09:36:05 np0005588919 nova_compute[225855]: 2026-01-20 14:36:05.994 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:36:06 np0005588919 nova_compute[225855]: 2026-01-20 14:36:06.000 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:36:06 np0005588919 nova_compute[225855]: 2026-01-20 14:36:06.017 225859 INFO nova.compute.manager [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Took 3.84 seconds to spawn the instance on the hypervisor.
Jan 20 09:36:06 np0005588919 nova_compute[225855]: 2026-01-20 14:36:06.018 225859 DEBUG nova.compute.manager [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:36:06 np0005588919 nova_compute[225855]: 2026-01-20 14:36:06.049 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:36:06 np0005588919 nova_compute[225855]: 2026-01-20 14:36:06.121 225859 INFO nova.compute.manager [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Took 4.81 seconds to build instance.
Jan 20 09:36:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e207 e207: 3 total, 3 up, 3 in
Jan 20 09:36:06 np0005588919 nova_compute[225855]: 2026-01-20 14:36:06.164 225859 DEBUG oslo_concurrency.lockutils [None req-a10aa867-0a22-4aaa-a7f8-bc926ace5b4c ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "eb82fc99-1632-42b0-90d2-7ce2b9d542a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.938s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:06.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:36:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:36:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e208 e208: 3 total, 3 up, 3 in
Jan 20 09:36:07 np0005588919 nova_compute[225855]: 2026-01-20 14:36:07.458 225859 DEBUG nova.objects.instance [None req-8d12183e-a531-4316-927b-0a95364feca9 3d7010bab0db493e8ba3b1a86ad4cf7d 5202cc9c82134fadb20a0003e1f09cf3 - - default default] Lazy-loading 'pci_devices' on Instance uuid eb82fc99-1632-42b0-90d2-7ce2b9d542a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:07 np0005588919 nova_compute[225855]: 2026-01-20 14:36:07.479 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919767.4789515, eb82fc99-1632-42b0-90d2-7ce2b9d542a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:07 np0005588919 nova_compute[225855]: 2026-01-20 14:36:07.480 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:36:07 np0005588919 nova_compute[225855]: 2026-01-20 14:36:07.508 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:07 np0005588919 nova_compute[225855]: 2026-01-20 14:36:07.514 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:36:07 np0005588919 nova_compute[225855]: 2026-01-20 14:36:07.552 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 20 09:36:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:07.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:08 np0005588919 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000034.scope: Deactivated successfully.
Jan 20 09:36:08 np0005588919 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000034.scope: Consumed 2.295s CPU time.
Jan 20 09:36:08 np0005588919 systemd-machined[194361]: Machine qemu-25-instance-00000034 terminated.
Jan 20 09:36:08 np0005588919 nova_compute[225855]: 2026-01-20 14:36:08.307 225859 DEBUG nova.compute.manager [None req-8d12183e-a531-4316-927b-0a95364feca9 3d7010bab0db493e8ba3b1a86ad4cf7d 5202cc9c82134fadb20a0003e1f09cf3 - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:08.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e209 e209: 3 total, 3 up, 3 in
Jan 20 09:36:09 np0005588919 nova_compute[225855]: 2026-01-20 14:36:09.597 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:09.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:09 np0005588919 nova_compute[225855]: 2026-01-20 14:36:09.638 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e209 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:10.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:11.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e210 e210: 3 total, 3 up, 3 in
Jan 20 09:36:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:36:12Z|00166|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 20 09:36:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e211 e211: 3 total, 3 up, 3 in
Jan 20 09:36:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:12.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:13 np0005588919 podman[247994]: 2026-01-20 14:36:13.065685236 +0000 UTC m=+0.107160425 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 20 09:36:13 np0005588919 nova_compute[225855]: 2026-01-20 14:36:13.481 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "eb82fc99-1632-42b0-90d2-7ce2b9d542a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:13 np0005588919 nova_compute[225855]: 2026-01-20 14:36:13.482 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "eb82fc99-1632-42b0-90d2-7ce2b9d542a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:13 np0005588919 nova_compute[225855]: 2026-01-20 14:36:13.482 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "eb82fc99-1632-42b0-90d2-7ce2b9d542a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:13 np0005588919 nova_compute[225855]: 2026-01-20 14:36:13.482 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "eb82fc99-1632-42b0-90d2-7ce2b9d542a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:13 np0005588919 nova_compute[225855]: 2026-01-20 14:36:13.482 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "eb82fc99-1632-42b0-90d2-7ce2b9d542a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:13 np0005588919 nova_compute[225855]: 2026-01-20 14:36:13.483 225859 INFO nova.compute.manager [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Terminating instance#033[00m
Jan 20 09:36:13 np0005588919 nova_compute[225855]: 2026-01-20 14:36:13.484 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "refresh_cache-eb82fc99-1632-42b0-90d2-7ce2b9d542a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:36:13 np0005588919 nova_compute[225855]: 2026-01-20 14:36:13.484 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquired lock "refresh_cache-eb82fc99-1632-42b0-90d2-7ce2b9d542a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:36:13 np0005588919 nova_compute[225855]: 2026-01-20 14:36:13.484 225859 DEBUG nova.network.neutron [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:36:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:36:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/920590240' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:36:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:36:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/920590240' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:36:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:13.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:13 np0005588919 nova_compute[225855]: 2026-01-20 14:36:13.800 225859 DEBUG nova.network.neutron [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:36:14 np0005588919 nova_compute[225855]: 2026-01-20 14:36:14.599 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:14 np0005588919 nova_compute[225855]: 2026-01-20 14:36:14.639 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:14.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:14 np0005588919 nova_compute[225855]: 2026-01-20 14:36:14.810 225859 DEBUG nova.network.neutron [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:36:14 np0005588919 nova_compute[225855]: 2026-01-20 14:36:14.830 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Releasing lock "refresh_cache-eb82fc99-1632-42b0-90d2-7ce2b9d542a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:36:14 np0005588919 nova_compute[225855]: 2026-01-20 14:36:14.831 225859 DEBUG nova.compute.manager [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:36:14 np0005588919 nova_compute[225855]: 2026-01-20 14:36:14.841 225859 INFO nova.virt.libvirt.driver [-] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Instance destroyed successfully.#033[00m
Jan 20 09:36:14 np0005588919 nova_compute[225855]: 2026-01-20 14:36:14.842 225859 DEBUG nova.objects.instance [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lazy-loading 'resources' on Instance uuid eb82fc99-1632-42b0-90d2-7ce2b9d542a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:15 np0005588919 nova_compute[225855]: 2026-01-20 14:36:15.359 225859 INFO nova.virt.libvirt.driver [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Deleting instance files /var/lib/nova/instances/eb82fc99-1632-42b0-90d2-7ce2b9d542a2_del#033[00m
Jan 20 09:36:15 np0005588919 nova_compute[225855]: 2026-01-20 14:36:15.360 225859 INFO nova.virt.libvirt.driver [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Deletion of /var/lib/nova/instances/eb82fc99-1632-42b0-90d2-7ce2b9d542a2_del complete#033[00m
Jan 20 09:36:15 np0005588919 nova_compute[225855]: 2026-01-20 14:36:15.425 225859 INFO nova.compute.manager [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:36:15 np0005588919 nova_compute[225855]: 2026-01-20 14:36:15.426 225859 DEBUG oslo.service.loopingcall [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:36:15 np0005588919 nova_compute[225855]: 2026-01-20 14:36:15.426 225859 DEBUG nova.compute.manager [-] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:36:15 np0005588919 nova_compute[225855]: 2026-01-20 14:36:15.426 225859 DEBUG nova.network.neutron [-] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:36:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:15.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e212 e212: 3 total, 3 up, 3 in
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.112 225859 DEBUG nova.network.neutron [-] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.128 225859 DEBUG nova.network.neutron [-] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.145 225859 INFO nova.compute.manager [-] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Took 0.72 seconds to deallocate network for instance.#033[00m
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.212 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.212 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.268 225859 DEBUG oslo_concurrency.processutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:16.394 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:16.395 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:16.395 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:36:16 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/717322656' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.728 225859 DEBUG oslo_concurrency.processutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.734 225859 DEBUG nova.compute.provider_tree [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.751 225859 DEBUG nova.scheduler.client.report [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:36:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:16.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.777 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e213 e213: 3 total, 3 up, 3 in
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.821 225859 INFO nova.scheduler.client.report [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Deleted allocations for instance eb82fc99-1632-42b0-90d2-7ce2b9d542a2#033[00m
Jan 20 09:36:16 np0005588919 nova_compute[225855]: 2026-01-20 14:36:16.938 225859 DEBUG oslo_concurrency.lockutils [None req-c61c69e0-76cc-4797-9cf1-d369ea29c6b0 ecab37cbd7714ddd81e1db5b37ba85b3 b6594bd13c35449abc258d30a1a2509b - - default default] Lock "eb82fc99-1632-42b0-90d2-7ce2b9d542a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.456s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e214 e214: 3 total, 3 up, 3 in
Jan 20 09:36:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:17.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:17 np0005588919 nova_compute[225855]: 2026-01-20 14:36:17.691 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "59387c9d-df91-4f43-b389-00174486fc84" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:17 np0005588919 nova_compute[225855]: 2026-01-20 14:36:17.692 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:17 np0005588919 nova_compute[225855]: 2026-01-20 14:36:17.714 225859 DEBUG nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:36:17 np0005588919 nova_compute[225855]: 2026-01-20 14:36:17.783 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:17 np0005588919 nova_compute[225855]: 2026-01-20 14:36:17.784 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:17 np0005588919 nova_compute[225855]: 2026-01-20 14:36:17.793 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:36:17 np0005588919 nova_compute[225855]: 2026-01-20 14:36:17.794 225859 INFO nova.compute.claims [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:36:17 np0005588919 nova_compute[225855]: 2026-01-20 14:36:17.919 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:36:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/169692220' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.340 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.345 225859 DEBUG nova.compute.provider_tree [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.398 225859 DEBUG nova.scheduler.client.report [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.522 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.523 225859 DEBUG nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.571 225859 DEBUG nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.572 225859 DEBUG nova.network.neutron [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.591 225859 INFO nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.615 225859 DEBUG nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.703 225859 DEBUG nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.704 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.704 225859 INFO nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Creating image(s)#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.733 225859 DEBUG nova.storage.rbd_utils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 59387c9d-df91-4f43-b389-00174486fc84_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:18.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.760 225859 DEBUG nova.storage.rbd_utils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 59387c9d-df91-4f43-b389-00174486fc84_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.785 225859 DEBUG nova.storage.rbd_utils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 59387c9d-df91-4f43-b389-00174486fc84_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.789 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.869 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.870 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.871 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.871 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.894 225859 DEBUG nova.storage.rbd_utils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 59387c9d-df91-4f43-b389-00174486fc84_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:18 np0005588919 nova_compute[225855]: 2026-01-20 14:36:18.897 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 59387c9d-df91-4f43-b389-00174486fc84_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:19 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.154 225859 DEBUG nova.policy [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56e2959629114d3d8a48e7a80ed96c4b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3750c56415134773aa9d9880038f1749', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.291 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 59387c9d-df91-4f43-b389-00174486fc84_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.393s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.343 225859 DEBUG nova.storage.rbd_utils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] resizing rbd image 59387c9d-df91-4f43-b389-00174486fc84_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.471 225859 DEBUG nova.objects.instance [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'migration_context' on Instance uuid 59387c9d-df91-4f43-b389-00174486fc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.488 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.488 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Ensure instance console log exists: /var/lib/nova/instances/59387c9d-df91-4f43-b389-00174486fc84/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.489 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.490 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.490 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.600 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:19.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.641 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:19 np0005588919 nova_compute[225855]: 2026-01-20 14:36:19.965 225859 DEBUG nova.network.neutron [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Successfully created port: f9059531-e6dc-4451-9c17-ec3b63e4b85f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:36:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:20.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:21 np0005588919 nova_compute[225855]: 2026-01-20 14:36:21.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:21.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:22 np0005588919 nova_compute[225855]: 2026-01-20 14:36:22.181 225859 DEBUG nova.network.neutron [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Successfully updated port: f9059531-e6dc-4451-9c17-ec3b63e4b85f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:36:22 np0005588919 nova_compute[225855]: 2026-01-20 14:36:22.202 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "refresh_cache-59387c9d-df91-4f43-b389-00174486fc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:36:22 np0005588919 nova_compute[225855]: 2026-01-20 14:36:22.202 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquired lock "refresh_cache-59387c9d-df91-4f43-b389-00174486fc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:36:22 np0005588919 nova_compute[225855]: 2026-01-20 14:36:22.202 225859 DEBUG nova.network.neutron [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:36:22 np0005588919 nova_compute[225855]: 2026-01-20 14:36:22.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:22 np0005588919 nova_compute[225855]: 2026-01-20 14:36:22.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:36:22 np0005588919 nova_compute[225855]: 2026-01-20 14:36:22.456 225859 DEBUG nova.network.neutron [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:36:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e215 e215: 3 total, 3 up, 3 in
Jan 20 09:36:22 np0005588919 nova_compute[225855]: 2026-01-20 14:36:22.739 225859 DEBUG nova.compute.manager [req-42fd0086-13c1-4d18-a974-8ac33f18fb9f req-0561c550-7ff4-4448-aaa6-8ca74d01ae27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Received event network-changed-f9059531-e6dc-4451-9c17-ec3b63e4b85f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:36:22 np0005588919 nova_compute[225855]: 2026-01-20 14:36:22.740 225859 DEBUG nova.compute.manager [req-42fd0086-13c1-4d18-a974-8ac33f18fb9f req-0561c550-7ff4-4448-aaa6-8ca74d01ae27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Refreshing instance network info cache due to event network-changed-f9059531-e6dc-4451-9c17-ec3b63e4b85f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:36:22 np0005588919 nova_compute[225855]: 2026-01-20 14:36:22.740 225859 DEBUG oslo_concurrency.lockutils [req-42fd0086-13c1-4d18-a974-8ac33f18fb9f req-0561c550-7ff4-4448-aaa6-8ca74d01ae27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-59387c9d-df91-4f43-b389-00174486fc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:36:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:36:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:22.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:36:23 np0005588919 podman[248254]: 2026-01-20 14:36:23.02530031 +0000 UTC m=+0.058952065 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.308 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919768.3072345, eb82fc99-1632-42b0-90d2-7ce2b9d542a2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.309 225859 INFO nova.compute.manager [-] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.331 225859 DEBUG nova.compute.manager [None req-69c21bcf-2fd0-42da-9c3e-d33b466a321e - - - - - -] [instance: eb82fc99-1632-42b0-90d2-7ce2b9d542a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:23.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.959 225859 DEBUG nova.network.neutron [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Updating instance_info_cache with network_info: [{"id": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "address": "fa:16:3e:5e:a4:97", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9059531-e6", "ovs_interfaceid": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.985 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Releasing lock "refresh_cache-59387c9d-df91-4f43-b389-00174486fc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.985 225859 DEBUG nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Instance network_info: |[{"id": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "address": "fa:16:3e:5e:a4:97", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9059531-e6", "ovs_interfaceid": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.985 225859 DEBUG oslo_concurrency.lockutils [req-42fd0086-13c1-4d18-a974-8ac33f18fb9f req-0561c550-7ff4-4448-aaa6-8ca74d01ae27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-59387c9d-df91-4f43-b389-00174486fc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.986 225859 DEBUG nova.network.neutron [req-42fd0086-13c1-4d18-a974-8ac33f18fb9f req-0561c550-7ff4-4448-aaa6-8ca74d01ae27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Refreshing network info cache for port f9059531-e6dc-4451-9c17-ec3b63e4b85f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.988 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Start _get_guest_xml network_info=[{"id": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "address": "fa:16:3e:5e:a4:97", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9059531-e6", "ovs_interfaceid": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.993 225859 WARNING nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.998 225859 DEBUG nova.virt.libvirt.host [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:36:23 np0005588919 nova_compute[225855]: 2026-01-20 14:36:23.999 225859 DEBUG nova.virt.libvirt.host [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.005 225859 DEBUG nova.virt.libvirt.host [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.005 225859 DEBUG nova.virt.libvirt.host [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.008 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.008 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.009 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.009 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.009 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.009 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.010 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.010 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.010 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.010 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.010 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.010 225859 DEBUG nova.virt.hardware [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.013 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.364 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.365 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.366 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.367 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.389 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.391 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.391 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.392 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.392 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:36:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/240091767' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.449 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.489 225859 DEBUG nova.storage.rbd_utils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 59387c9d-df91-4f43-b389-00174486fc84_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.493 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.602 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.642 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:36:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:24.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:36:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:36:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3193668944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.831 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:36:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1706595247' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.964 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.966 225859 DEBUG nova.virt.libvirt.vif [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:36:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1053628830',display_name='tempest-ImagesTestJSON-server-1053628830',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1053628830',id=53,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-2v7i5wvb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=T
agList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:36:18Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=59387c9d-df91-4f43-b389-00174486fc84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "address": "fa:16:3e:5e:a4:97", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9059531-e6", "ovs_interfaceid": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.967 225859 DEBUG nova.network.os_vif_util [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "address": "fa:16:3e:5e:a4:97", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9059531-e6", "ovs_interfaceid": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.969 225859 DEBUG nova.network.os_vif_util [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:a4:97,bridge_name='br-int',has_traffic_filtering=True,id=f9059531-e6dc-4451-9c17-ec3b63e4b85f,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9059531-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.972 225859 DEBUG nova.objects.instance [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'pci_devices' on Instance uuid 59387c9d-df91-4f43-b389-00174486fc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.990 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <uuid>59387c9d-df91-4f43-b389-00174486fc84</uuid>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <name>instance-00000035</name>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <nova:name>tempest-ImagesTestJSON-server-1053628830</nova:name>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:36:23</nova:creationTime>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <nova:user uuid="56e2959629114d3d8a48e7a80ed96c4b">tempest-ImagesTestJSON-338390217-project-member</nova:user>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <nova:project uuid="3750c56415134773aa9d9880038f1749">tempest-ImagesTestJSON-338390217</nova:project>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <nova:port uuid="f9059531-e6dc-4451-9c17-ec3b63e4b85f">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <entry name="serial">59387c9d-df91-4f43-b389-00174486fc84</entry>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <entry name="uuid">59387c9d-df91-4f43-b389-00174486fc84</entry>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/59387c9d-df91-4f43-b389-00174486fc84_disk">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/59387c9d-df91-4f43-b389-00174486fc84_disk.config">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:5e:a4:97"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <target dev="tapf9059531-e6"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/59387c9d-df91-4f43-b389-00174486fc84/console.log" append="off"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:36:24 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:36:24 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:36:24 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:36:24 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.992 225859 DEBUG nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Preparing to wait for external event network-vif-plugged-f9059531-e6dc-4451-9c17-ec3b63e4b85f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.993 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "59387c9d-df91-4f43-b389-00174486fc84-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.993 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.993 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.994 225859 DEBUG nova.virt.libvirt.vif [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:36:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1053628830',display_name='tempest-ImagesTestJSON-server-1053628830',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1053628830',id=53,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-2v7i5wvb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:36:18Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=59387c9d-df91-4f43-b389-00174486fc84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "address": "fa:16:3e:5e:a4:97", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9059531-e6", "ovs_interfaceid": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.995 225859 DEBUG nova.network.os_vif_util [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "address": "fa:16:3e:5e:a4:97", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9059531-e6", "ovs_interfaceid": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.995 225859 DEBUG nova.network.os_vif_util [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:a4:97,bridge_name='br-int',has_traffic_filtering=True,id=f9059531-e6dc-4451-9c17-ec3b63e4b85f,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9059531-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.996 225859 DEBUG os_vif [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:a4:97,bridge_name='br-int',has_traffic_filtering=True,id=f9059531-e6dc-4451-9c17-ec3b63e4b85f,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9059531-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.997 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.997 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:24 np0005588919 nova_compute[225855]: 2026-01-20 14:36:24.997 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.002 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.002 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9059531-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.003 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf9059531-e6, col_values=(('external_ids', {'iface-id': 'f9059531-e6dc-4451-9c17-ec3b63e4b85f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:a4:97', 'vm-uuid': '59387c9d-df91-4f43-b389-00174486fc84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.005 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:25 np0005588919 NetworkManager[49104]: <info>  [1768919785.0064] manager: (tapf9059531-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.016 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.017 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.019 225859 INFO os_vif [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:a4:97,bridge_name='br-int',has_traffic_filtering=True,id=f9059531-e6dc-4451-9c17-ec3b63e4b85f,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9059531-e6')#033[00m
Jan 20 09:36:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.057 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.059 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4560MB free_disk=20.912002563476562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.059 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.060 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.070 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.071 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.071 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No VIF found with MAC fa:16:3e:5e:a4:97, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.072 225859 INFO nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Using config drive#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.098 225859 DEBUG nova.storage.rbd_utils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 59387c9d-df91-4f43-b389-00174486fc84_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.160 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 59387c9d-df91-4f43-b389-00174486fc84 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.160 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.160 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.205 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:25.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:36:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2085895492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.675 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.680 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.696 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.711 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:36:25 np0005588919 nova_compute[225855]: 2026-01-20 14:36:25.712 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.131 225859 INFO nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Creating config drive at /var/lib/nova/instances/59387c9d-df91-4f43-b389-00174486fc84/disk.config#033[00m
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.136 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/59387c9d-df91-4f43-b389-00174486fc84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg9qke6yn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.264 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/59387c9d-df91-4f43-b389-00174486fc84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg9qke6yn" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.295 225859 DEBUG nova.storage.rbd_utils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] rbd image 59387c9d-df91-4f43-b389-00174486fc84_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.300 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/59387c9d-df91-4f43-b389-00174486fc84/disk.config 59387c9d-df91-4f43-b389-00174486fc84_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.327 225859 DEBUG nova.network.neutron [req-42fd0086-13c1-4d18-a974-8ac33f18fb9f req-0561c550-7ff4-4448-aaa6-8ca74d01ae27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Updated VIF entry in instance network info cache for port f9059531-e6dc-4451-9c17-ec3b63e4b85f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.329 225859 DEBUG nova.network.neutron [req-42fd0086-13c1-4d18-a974-8ac33f18fb9f req-0561c550-7ff4-4448-aaa6-8ca74d01ae27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Updating instance_info_cache with network_info: [{"id": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "address": "fa:16:3e:5e:a4:97", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9059531-e6", "ovs_interfaceid": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.350 225859 DEBUG oslo_concurrency.lockutils [req-42fd0086-13c1-4d18-a974-8ac33f18fb9f req-0561c550-7ff4-4448-aaa6-8ca74d01ae27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-59387c9d-df91-4f43-b389-00174486fc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.660 225859 DEBUG oslo_concurrency.processutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/59387c9d-df91-4f43-b389-00174486fc84/disk.config 59387c9d-df91-4f43-b389-00174486fc84_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.361s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.661 225859 INFO nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Deleting local config drive /var/lib/nova/instances/59387c9d-df91-4f43-b389-00174486fc84/disk.config because it was imported into RBD.#033[00m
Jan 20 09:36:26 np0005588919 kernel: tapf9059531-e6: entered promiscuous mode
Jan 20 09:36:26 np0005588919 NetworkManager[49104]: <info>  [1768919786.7147] manager: (tapf9059531-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Jan 20 09:36:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:36:26Z|00167|binding|INFO|Claiming lport f9059531-e6dc-4451-9c17-ec3b63e4b85f for this chassis.
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.716 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:36:26Z|00168|binding|INFO|f9059531-e6dc-4451-9c17-ec3b63e4b85f: Claiming fa:16:3e:5e:a4:97 10.100.0.13
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.732 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.739 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:a4:97 10.100.0.13'], port_security=['fa:16:3e:5e:a4:97 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '59387c9d-df91-4f43-b389-00174486fc84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3750c56415134773aa9d9880038f1749', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e302063-2ccd-4f7c-8835-ef521762a486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4125934e-1dea-4e34-a38d-5291c850f0b2, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f9059531-e6dc-4451-9c17-ec3b63e4b85f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.740 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f9059531-e6dc-4451-9c17-ec3b63e4b85f in datapath abb83e3e-0b12-431b-ad86-a1d271b5b46a bound to our chassis#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.741 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network abb83e3e-0b12-431b-ad86-a1d271b5b46a#033[00m
Jan 20 09:36:26 np0005588919 systemd-udevd[248504]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:36:26 np0005588919 systemd-machined[194361]: New machine qemu-26-instance-00000035.
Jan 20 09:36:26 np0005588919 NetworkManager[49104]: <info>  [1768919786.7582] device (tapf9059531-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:36:26 np0005588919 NetworkManager[49104]: <info>  [1768919786.7588] device (tapf9059531-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:36:26 np0005588919 systemd[1]: Started Virtual Machine qemu-26-instance-00000035.
Jan 20 09:36:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:26.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.766 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[48e5a0da-ce41-43a3-b2b5-eda0d0d44fe9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.767 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapabb83e3e-01 in ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.769 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapabb83e3e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.769 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[609cd498-8d46-41d3-9259-df616264300e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.769 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[33fdcc8d-5d70-4afe-8fc8-de39f835bfff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.786 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[7517bff3-2aaa-4117-beac-9a210aa018d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.818 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3ee5e4-125c-4e9e-8268-5af005c73983]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:36:26Z|00169|binding|INFO|Setting lport f9059531-e6dc-4451-9c17-ec3b63e4b85f ovn-installed in OVS
Jan 20 09:36:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:36:26Z|00170|binding|INFO|Setting lport f9059531-e6dc-4451-9c17-ec3b63e4b85f up in Southbound
Jan 20 09:36:26 np0005588919 nova_compute[225855]: 2026-01-20 14:36:26.839 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.853 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3e508503-b09d-4983-be58-96de858c7735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.857 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d533ecc1-9266-4971-a7e3-dd4968cadb0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 systemd-udevd[248508]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:36:26 np0005588919 NetworkManager[49104]: <info>  [1768919786.8586] manager: (tapabb83e3e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/74)
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.889 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ba8995e6-baf5-403f-9c49-b0a03298610c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.892 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[51f40e0a-3b92-42b8-b47b-4717e1cde276]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 NetworkManager[49104]: <info>  [1768919786.9146] device (tapabb83e3e-00): carrier: link connected
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.919 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6c90e64d-c606-4893-b6ac-3c009c820190]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.933 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e86e03-5417-42cf-a326-dd3bd2bfc718]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapabb83e3e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:0b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482584, 'reachable_time': 25192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248538, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.946 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f81fb0c8-ed51-4503-90fc-04f3f5fc3d1d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:bd2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482584, 'tstamp': 482584}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248539, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.960 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0c69b903-9598-4fd0-8c72-43ffb417bc20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapabb83e3e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:0b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482584, 'reachable_time': 25192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248540, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:26.987 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[18d35d01-448c-4661-8134-8a3fa05e3e0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:27.042 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a01cb7-2b72-4619-9f12-d8b57273d427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:27.044 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabb83e3e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:27.045 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:27.045 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapabb83e3e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.048 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:27 np0005588919 kernel: tapabb83e3e-00: entered promiscuous mode
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.051 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:27 np0005588919 NetworkManager[49104]: <info>  [1768919787.0527] manager: (tapabb83e3e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:27.055 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapabb83e3e-00, col_values=(('external_ids', {'iface-id': 'dfacaf19-f896-4c13-a7ad-47b57cf03fc1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.057 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.058 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:36:27Z|00171|binding|INFO|Releasing lport dfacaf19-f896-4c13-a7ad-47b57cf03fc1 from this chassis (sb_readonly=0)
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:27.059 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:27.060 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2c7fbe1f-65b1-4bbd-b8bf-f52eb645a0b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:27.061 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-abb83e3e-0b12-431b-ad86-a1d271b5b46a
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/abb83e3e-0b12-431b-ad86-a1d271b5b46a.pid.haproxy
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID abb83e3e-0b12-431b-ad86-a1d271b5b46a
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:36:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:27.062 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'env', 'PROCESS_TAG=haproxy-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/abb83e3e-0b12-431b-ad86-a1d271b5b46a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.073 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.283 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919787.2827053, 59387c9d-df91-4f43-b389-00174486fc84 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.283 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] VM Started (Lifecycle Event)#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.305 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.310 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919787.2835515, 59387c9d-df91-4f43-b389-00174486fc84 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.310 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.331 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.335 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.359 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:36:27 np0005588919 podman[248614]: 2026-01-20 14:36:27.453952531 +0000 UTC m=+0.062790113 container create a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:36:27 np0005588919 systemd[1]: Started libpod-conmon-a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac.scope.
Jan 20 09:36:27 np0005588919 podman[248614]: 2026-01-20 14:36:27.417678858 +0000 UTC m=+0.026516440 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:36:27 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:36:27 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4be9f6ce425d00a5cc617685f10663b2cda4fb6c10c8d6add8670f4d6e110ceb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:36:27 np0005588919 podman[248614]: 2026-01-20 14:36:27.564381068 +0000 UTC m=+0.173218680 container init a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:36:27 np0005588919 podman[248614]: 2026-01-20 14:36:27.574604507 +0000 UTC m=+0.183442089 container start a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:36:27 np0005588919 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[248629]: [NOTICE]   (248634) : New worker (248636) forked
Jan 20 09:36:27 np0005588919 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[248629]: [NOTICE]   (248634) : Loading success.
Jan 20 09:36:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:27.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.684 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.703 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:27 np0005588919 nova_compute[225855]: 2026-01-20 14:36:27.703 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.353 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.533 225859 DEBUG nova.compute.manager [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Received event network-vif-plugged-f9059531-e6dc-4451-9c17-ec3b63e4b85f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.534 225859 DEBUG oslo_concurrency.lockutils [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "59387c9d-df91-4f43-b389-00174486fc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.534 225859 DEBUG oslo_concurrency.lockutils [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.534 225859 DEBUG oslo_concurrency.lockutils [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.535 225859 DEBUG nova.compute.manager [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Processing event network-vif-plugged-f9059531-e6dc-4451-9c17-ec3b63e4b85f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.535 225859 DEBUG nova.compute.manager [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Received event network-vif-plugged-f9059531-e6dc-4451-9c17-ec3b63e4b85f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.535 225859 DEBUG oslo_concurrency.lockutils [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "59387c9d-df91-4f43-b389-00174486fc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.535 225859 DEBUG oslo_concurrency.lockutils [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.536 225859 DEBUG oslo_concurrency.lockutils [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.536 225859 DEBUG nova.compute.manager [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] No waiting events found dispatching network-vif-plugged-f9059531-e6dc-4451-9c17-ec3b63e4b85f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.536 225859 WARNING nova.compute.manager [req-b03cf418-d8ee-411c-a06e-eb99eef412af req-81e1b353-c8b5-4a1c-92af-585e8882ed8f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Received unexpected event network-vif-plugged-f9059531-e6dc-4451-9c17-ec3b63e4b85f for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.537 225859 DEBUG nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.544 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919788.544152, 59387c9d-df91-4f43-b389-00174486fc84 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.544 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.547 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.550 225859 INFO nova.virt.libvirt.driver [-] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Instance spawned successfully.#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.551 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.578 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.583 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.584 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.585 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.585 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.586 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.586 225859 DEBUG nova.virt.libvirt.driver [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.591 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.623 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.661 225859 INFO nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Took 9.96 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.661 225859 DEBUG nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.721 225859 INFO nova.compute.manager [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Took 10.96 seconds to build instance.#033[00m
Jan 20 09:36:28 np0005588919 nova_compute[225855]: 2026-01-20 14:36:28.748 225859 DEBUG oslo_concurrency.lockutils [None req-728666bb-0e91-4b68-b737-aae579f8c119 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:36:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:36:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:28.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:36:29 np0005588919 nova_compute[225855]: 2026-01-20 14:36:29.604 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:29.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:30 np0005588919 nova_compute[225855]: 2026-01-20 14:36:30.005 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:30 np0005588919 nova_compute[225855]: 2026-01-20 14:36:30.641 225859 DEBUG nova.compute.manager [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:36:30 np0005588919 nova_compute[225855]: 2026-01-20 14:36:30.689 225859 INFO nova.compute.manager [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] instance snapshotting#033[00m
Jan 20 09:36:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:30.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:30 np0005588919 nova_compute[225855]: 2026-01-20 14:36:30.975 225859 INFO nova.virt.libvirt.driver [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Beginning live snapshot process#033[00m
Jan 20 09:36:31 np0005588919 nova_compute[225855]: 2026-01-20 14:36:31.399 225859 DEBUG nova.virt.libvirt.imagebackend [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:36:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:31.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:31 np0005588919 nova_compute[225855]: 2026-01-20 14:36:31.771 225859 DEBUG nova.storage.rbd_utils [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] creating snapshot(541e8cdc347946dcb7aca41472a67483) on rbd image(59387c9d-df91-4f43-b389-00174486fc84_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:36:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e216 e216: 3 total, 3 up, 3 in
Jan 20 09:36:32 np0005588919 nova_compute[225855]: 2026-01-20 14:36:32.387 225859 DEBUG nova.storage.rbd_utils [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] cloning vms/59387c9d-df91-4f43-b389-00174486fc84_disk@541e8cdc347946dcb7aca41472a67483 to images/d9608a6b-abac-47e3-a9dd-70a6230a92c0 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:36:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e217 e217: 3 total, 3 up, 3 in
Jan 20 09:36:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:32.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:32 np0005588919 nova_compute[225855]: 2026-01-20 14:36:32.802 225859 DEBUG nova.storage.rbd_utils [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] flattening images/d9608a6b-abac-47e3-a9dd-70a6230a92c0 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:36:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:33.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:33 np0005588919 nova_compute[225855]: 2026-01-20 14:36:33.946 225859 DEBUG nova.storage.rbd_utils [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] removing snapshot(541e8cdc347946dcb7aca41472a67483) on rbd image(59387c9d-df91-4f43-b389-00174486fc84_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:36:34 np0005588919 nova_compute[225855]: 2026-01-20 14:36:34.605 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:34.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e218 e218: 3 total, 3 up, 3 in
Jan 20 09:36:35 np0005588919 nova_compute[225855]: 2026-01-20 14:36:35.006 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:35 np0005588919 nova_compute[225855]: 2026-01-20 14:36:35.060 225859 DEBUG nova.storage.rbd_utils [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] creating snapshot(snap) on rbd image(d9608a6b-abac-47e3-a9dd-70a6230a92c0) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:36:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:35.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e219 e219: 3 total, 3 up, 3 in
Jan 20 09:36:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:36.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:37.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:38 np0005588919 nova_compute[225855]: 2026-01-20 14:36:38.423 225859 INFO nova.virt.libvirt.driver [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Snapshot image upload complete#033[00m
Jan 20 09:36:38 np0005588919 nova_compute[225855]: 2026-01-20 14:36:38.424 225859 INFO nova.compute.manager [None req-8f0d650b-1046-4a77-a3f4-7f3e4363c0c0 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Took 7.73 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 20 09:36:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:38.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:39 np0005588919 nova_compute[225855]: 2026-01-20 14:36:39.608 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:39.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:40 np0005588919 nova_compute[225855]: 2026-01-20 14:36:40.009 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:40.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:41 np0005588919 ovn_controller[130490]: 2026-01-20T14:36:41Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5e:a4:97 10.100.0.13
Jan 20 09:36:41 np0005588919 ovn_controller[130490]: 2026-01-20T14:36:41Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:a4:97 10.100.0.13
Jan 20 09:36:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:41.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:42.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e220 e220: 3 total, 3 up, 3 in
Jan 20 09:36:43 np0005588919 podman[248817]: 2026-01-20 14:36:43.641425851 +0000 UTC m=+0.150310144 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:36:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:43.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:44 np0005588919 nova_compute[225855]: 2026-01-20 14:36:44.613 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:44.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:45 np0005588919 nova_compute[225855]: 2026-01-20 14:36:45.012 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:45.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:46.189 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:36:46 np0005588919 nova_compute[225855]: 2026-01-20 14:36:46.191 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:46.192 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:36:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:46.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:36:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:47.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:36:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:36:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:48.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:36:49 np0005588919 nova_compute[225855]: 2026-01-20 14:36:49.613 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:49.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:50 np0005588919 nova_compute[225855]: 2026-01-20 14:36:50.014 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:50.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:36:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:51.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:36:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:36:52.195 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:36:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:52.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:53.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:54 np0005588919 podman[248876]: 2026-01-20 14:36:54.023925552 +0000 UTC m=+0.060154308 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:36:54 np0005588919 nova_compute[225855]: 2026-01-20 14:36:54.616 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:36:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:54.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:36:55 np0005588919 nova_compute[225855]: 2026-01-20 14:36:55.016 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:36:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:55.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:56.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:57.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:36:58.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:36:59 np0005588919 nova_compute[225855]: 2026-01-20 14:36:59.620 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:36:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:36:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:36:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:36:59.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:00 np0005588919 nova_compute[225855]: 2026-01-20 14:37:00.018 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:00.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:01.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:02.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:02 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:37:02 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:37:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:03.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:04 np0005588919 nova_compute[225855]: 2026-01-20 14:37:04.685 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:04.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:05 np0005588919 nova_compute[225855]: 2026-01-20 14:37:05.020 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:05.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:06.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e221 e221: 3 total, 3 up, 3 in
Jan 20 09:37:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:07.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:07 np0005588919 nova_compute[225855]: 2026-01-20 14:37:07.963 225859 DEBUG oslo_concurrency.lockutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "59387c9d-df91-4f43-b389-00174486fc84" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:07 np0005588919 nova_compute[225855]: 2026-01-20 14:37:07.963 225859 DEBUG oslo_concurrency.lockutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:07 np0005588919 nova_compute[225855]: 2026-01-20 14:37:07.963 225859 DEBUG oslo_concurrency.lockutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "59387c9d-df91-4f43-b389-00174486fc84-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:07 np0005588919 nova_compute[225855]: 2026-01-20 14:37:07.964 225859 DEBUG oslo_concurrency.lockutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:07 np0005588919 nova_compute[225855]: 2026-01-20 14:37:07.964 225859 DEBUG oslo_concurrency.lockutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:07 np0005588919 nova_compute[225855]: 2026-01-20 14:37:07.965 225859 INFO nova.compute.manager [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Terminating instance#033[00m
Jan 20 09:37:07 np0005588919 nova_compute[225855]: 2026-01-20 14:37:07.966 225859 DEBUG nova.compute.manager [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:37:08 np0005588919 kernel: tapf9059531-e6 (unregistering): left promiscuous mode
Jan 20 09:37:08 np0005588919 NetworkManager[49104]: <info>  [1768919828.0236] device (tapf9059531-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:37:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:08Z|00172|binding|INFO|Releasing lport f9059531-e6dc-4451-9c17-ec3b63e4b85f from this chassis (sb_readonly=0)
Jan 20 09:37:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:08Z|00173|binding|INFO|Setting lport f9059531-e6dc-4451-9c17-ec3b63e4b85f down in Southbound
Jan 20 09:37:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:08Z|00174|binding|INFO|Removing iface tapf9059531-e6 ovn-installed in OVS
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.031 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.034 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.043 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:a4:97 10.100.0.13'], port_security=['fa:16:3e:5e:a4:97 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '59387c9d-df91-4f43-b389-00174486fc84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3750c56415134773aa9d9880038f1749', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e302063-2ccd-4f7c-8835-ef521762a486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4125934e-1dea-4e34-a38d-5291c850f0b2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f9059531-e6dc-4451-9c17-ec3b63e4b85f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.045 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f9059531-e6dc-4451-9c17-ec3b63e4b85f in datapath abb83e3e-0b12-431b-ad86-a1d271b5b46a unbound from our chassis#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.046 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network abb83e3e-0b12-431b-ad86-a1d271b5b46a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.047 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea57fca-8e36-481f-8b74-04e9dbd7f7e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.048 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a namespace which is not needed anymore#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.054 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:08 np0005588919 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000035.scope: Deactivated successfully.
Jan 20 09:37:08 np0005588919 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000035.scope: Consumed 14.303s CPU time.
Jan 20 09:37:08 np0005588919 systemd-machined[194361]: Machine qemu-26-instance-00000035 terminated.
Jan 20 09:37:08 np0005588919 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[248629]: [NOTICE]   (248634) : haproxy version is 2.8.14-c23fe91
Jan 20 09:37:08 np0005588919 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[248629]: [NOTICE]   (248634) : path to executable is /usr/sbin/haproxy
Jan 20 09:37:08 np0005588919 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[248629]: [WARNING]  (248634) : Exiting Master process...
Jan 20 09:37:08 np0005588919 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[248629]: [WARNING]  (248634) : Exiting Master process...
Jan 20 09:37:08 np0005588919 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[248629]: [ALERT]    (248634) : Current worker (248636) exited with code 143 (Terminated)
Jan 20 09:37:08 np0005588919 neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a[248629]: [WARNING]  (248634) : All workers exited. Exiting... (0)
Jan 20 09:37:08 np0005588919 systemd[1]: libpod-a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac.scope: Deactivated successfully.
Jan 20 09:37:08 np0005588919 podman[249107]: 2026-01-20 14:37:08.193097123 +0000 UTC m=+0.055031754 container died a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.206 225859 INFO nova.virt.libvirt.driver [-] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Instance destroyed successfully.#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.208 225859 DEBUG nova.objects.instance [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lazy-loading 'resources' on Instance uuid 59387c9d-df91-4f43-b389-00174486fc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:37:08 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac-userdata-shm.mount: Deactivated successfully.
Jan 20 09:37:08 np0005588919 systemd[1]: var-lib-containers-storage-overlay-4be9f6ce425d00a5cc617685f10663b2cda4fb6c10c8d6add8670f4d6e110ceb-merged.mount: Deactivated successfully.
Jan 20 09:37:08 np0005588919 podman[249107]: 2026-01-20 14:37:08.230353945 +0000 UTC m=+0.092288576 container cleanup a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.236 225859 DEBUG nova.virt.libvirt.vif [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:36:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1053628830',display_name='tempest-ImagesTestJSON-server-1053628830',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1053628830',id=53,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:36:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3750c56415134773aa9d9880038f1749',ramdisk_id='',reservation_id='r-2v7i5wvb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_mi
n_ram='0',owner_project_name='tempest-ImagesTestJSON-338390217',owner_user_name='tempest-ImagesTestJSON-338390217-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:36:38Z,user_data=None,user_id='56e2959629114d3d8a48e7a80ed96c4b',uuid=59387c9d-df91-4f43-b389-00174486fc84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "address": "fa:16:3e:5e:a4:97", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9059531-e6", "ovs_interfaceid": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.237 225859 DEBUG nova.network.os_vif_util [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converting VIF {"id": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "address": "fa:16:3e:5e:a4:97", "network": {"id": "abb83e3e-0b12-431b-ad86-a1d271b5b46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-766235638-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3750c56415134773aa9d9880038f1749", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9059531-e6", "ovs_interfaceid": "f9059531-e6dc-4451-9c17-ec3b63e4b85f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.238 225859 DEBUG nova.network.os_vif_util [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:a4:97,bridge_name='br-int',has_traffic_filtering=True,id=f9059531-e6dc-4451-9c17-ec3b63e4b85f,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9059531-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.238 225859 DEBUG os_vif [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:a4:97,bridge_name='br-int',has_traffic_filtering=True,id=f9059531-e6dc-4451-9c17-ec3b63e4b85f,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9059531-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.240 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.241 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9059531-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.243 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.244 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.247 225859 INFO os_vif [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:a4:97,bridge_name='br-int',has_traffic_filtering=True,id=f9059531-e6dc-4451-9c17-ec3b63e4b85f,network=Network(abb83e3e-0b12-431b-ad86-a1d271b5b46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9059531-e6')#033[00m
Jan 20 09:37:08 np0005588919 systemd[1]: libpod-conmon-a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac.scope: Deactivated successfully.
Jan 20 09:37:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:37:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:37:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:37:08 np0005588919 podman[249150]: 2026-01-20 14:37:08.308027747 +0000 UTC m=+0.051198366 container remove a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.313 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b24ddd82-f78d-441f-aecb-a72f47074a8b]: (4, ('Tue Jan 20 02:37:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a (a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac)\na034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac\nTue Jan 20 02:37:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a (a034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac)\na034a5d55e70bf048ac50890668baf65a5a1a8160f7f28acce0ccefec6537cac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.315 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a1db865d-9d28-422e-83d3-f13e2623bfd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.316 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabb83e3e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:08 np0005588919 kernel: tapabb83e3e-00: left promiscuous mode
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.446 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.461 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.462 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.464 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c369fa64-066b-4c87-8ec2-4fe65b6258a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.478 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[286dde08-fdce-4b7c-8057-11fdaffbea1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.480 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c31d67c2-5f69-45e3-b527-1e63d036dc58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.495 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e061ba13-fb95-4106-ab46-23205a676493]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482577, 'reachable_time': 27519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249184, 'error': None, 'target': 'ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.498 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-abb83e3e-0b12-431b-ad86-a1d271b5b46a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:37:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:08.498 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[f580e933-e19a-4e42-b5d1-45606bb74bfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:08 np0005588919 systemd[1]: run-netns-ovnmeta\x2dabb83e3e\x2d0b12\x2d431b\x2dad86\x2da1d271b5b46a.mount: Deactivated successfully.
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.760 225859 INFO nova.virt.libvirt.driver [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Deleting instance files /var/lib/nova/instances/59387c9d-df91-4f43-b389-00174486fc84_del#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.761 225859 INFO nova.virt.libvirt.driver [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Deletion of /var/lib/nova/instances/59387c9d-df91-4f43-b389-00174486fc84_del complete#033[00m
Jan 20 09:37:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:08.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.840 225859 INFO nova.compute.manager [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Took 0.87 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.840 225859 DEBUG oslo.service.loopingcall [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.841 225859 DEBUG nova.compute.manager [-] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:37:08 np0005588919 nova_compute[225855]: 2026-01-20 14:37:08.841 225859 DEBUG nova.network.neutron [-] [instance: 59387c9d-df91-4f43-b389-00174486fc84] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:37:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:09.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:09 np0005588919 nova_compute[225855]: 2026-01-20 14:37:09.725 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:10 np0005588919 nova_compute[225855]: 2026-01-20 14:37:10.477 225859 DEBUG nova.network.neutron [-] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:37:10 np0005588919 nova_compute[225855]: 2026-01-20 14:37:10.501 225859 INFO nova.compute.manager [-] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Took 1.66 seconds to deallocate network for instance.#033[00m
Jan 20 09:37:10 np0005588919 nova_compute[225855]: 2026-01-20 14:37:10.588 225859 DEBUG oslo_concurrency.lockutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:10 np0005588919 nova_compute[225855]: 2026-01-20 14:37:10.589 225859 DEBUG oslo_concurrency.lockutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:10 np0005588919 nova_compute[225855]: 2026-01-20 14:37:10.634 225859 DEBUG nova.compute.manager [req-6fa74c7a-287b-422b-95b6-e62120fe6281 req-7e30ef4e-162a-40f3-829c-5079f6ed9100 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Received event network-vif-deleted-f9059531-e6dc-4451-9c17-ec3b63e4b85f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:10 np0005588919 nova_compute[225855]: 2026-01-20 14:37:10.655 225859 DEBUG oslo_concurrency.processutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:37:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:10.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:37:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:37:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/998884099' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:37:11 np0005588919 nova_compute[225855]: 2026-01-20 14:37:11.253 225859 DEBUG oslo_concurrency.processutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.598s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:11 np0005588919 nova_compute[225855]: 2026-01-20 14:37:11.259 225859 DEBUG nova.compute.provider_tree [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:37:11 np0005588919 nova_compute[225855]: 2026-01-20 14:37:11.289 225859 DEBUG nova.scheduler.client.report [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:37:11 np0005588919 nova_compute[225855]: 2026-01-20 14:37:11.356 225859 DEBUG oslo_concurrency.lockutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:11 np0005588919 nova_compute[225855]: 2026-01-20 14:37:11.394 225859 INFO nova.scheduler.client.report [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Deleted allocations for instance 59387c9d-df91-4f43-b389-00174486fc84#033[00m
Jan 20 09:37:11 np0005588919 nova_compute[225855]: 2026-01-20 14:37:11.521 225859 DEBUG oslo_concurrency.lockutils [None req-b1a140e3-48f6-4754-b3f1-f895eb3c6fb2 56e2959629114d3d8a48e7a80ed96c4b 3750c56415134773aa9d9880038f1749 - - default default] Lock "59387c9d-df91-4f43-b389-00174486fc84" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:11.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e222 e222: 3 total, 3 up, 3 in
Jan 20 09:37:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:12.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e223 e223: 3 total, 3 up, 3 in
Jan 20 09:37:13 np0005588919 nova_compute[225855]: 2026-01-20 14:37:13.263 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:37:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1783961933' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:37:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:37:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1783961933' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:37:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:13.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e224 e224: 3 total, 3 up, 3 in
Jan 20 09:37:13 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:37:13 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:37:14 np0005588919 podman[249261]: 2026-01-20 14:37:14.037148805 +0000 UTC m=+0.076855041 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:37:14 np0005588919 nova_compute[225855]: 2026-01-20 14:37:14.742 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:37:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:14.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:37:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e225 e225: 3 total, 3 up, 3 in
Jan 20 09:37:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:15.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:16.396 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:16.397 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:16.397 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:37:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:16.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:37:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:17.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e226 e226: 3 total, 3 up, 3 in
Jan 20 09:37:18 np0005588919 nova_compute[225855]: 2026-01-20 14:37:18.265 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:18.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:19.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:19 np0005588919 nova_compute[225855]: 2026-01-20 14:37:19.744 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:37:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:20.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:37:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:21 np0005588919 nova_compute[225855]: 2026-01-20 14:37:21.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:21.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:37:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 23K writes, 92K keys, 23K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.04 MB/s
Cumulative WAL: 23K writes, 7643 syncs, 3.01 writes per sync, written: 0.08 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 11K writes, 49K keys, 11K commit groups, 1.0 writes per commit group, ingest: 48.28 MB, 0.08 MB/s
Interval WAL: 11K writes, 4527 syncs, 2.61 writes per sync, written: 0.05 GB, 0.08 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 09:37:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:22.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e227 e227: 3 total, 3 up, 3 in
Jan 20 09:37:23 np0005588919 nova_compute[225855]: 2026-01-20 14:37:23.205 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919828.2039974, 59387c9d-df91-4f43-b389-00174486fc84 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:37:23 np0005588919 nova_compute[225855]: 2026-01-20 14:37:23.206 225859 INFO nova.compute.manager [-] [instance: 59387c9d-df91-4f43-b389-00174486fc84] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:37:23 np0005588919 nova_compute[225855]: 2026-01-20 14:37:23.229 225859 DEBUG nova.compute.manager [None req-dca99096-b13a-4f5d-a020-2b949e73ed1b - - - - - -] [instance: 59387c9d-df91-4f43-b389-00174486fc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:23 np0005588919 nova_compute[225855]: 2026-01-20 14:37:23.267 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:23.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:24 np0005588919 nova_compute[225855]: 2026-01-20 14:37:24.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:24 np0005588919 nova_compute[225855]: 2026-01-20 14:37:24.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:37:24 np0005588919 nova_compute[225855]: 2026-01-20 14:37:24.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:24 np0005588919 nova_compute[225855]: 2026-01-20 14:37:24.522 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:24 np0005588919 nova_compute[225855]: 2026-01-20 14:37:24.522 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:24 np0005588919 nova_compute[225855]: 2026-01-20 14:37:24.522 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:24 np0005588919 nova_compute[225855]: 2026-01-20 14:37:24.522 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:37:24 np0005588919 nova_compute[225855]: 2026-01-20 14:37:24.523 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:24 np0005588919 nova_compute[225855]: 2026-01-20 14:37:24.746 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:24.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:37:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/754849335' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.006 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:25 np0005588919 podman[249362]: 2026-01-20 14:37:25.026377633 +0000 UTC m=+0.070099919 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.171 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.173 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4628MB free_disk=20.957855224609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.173 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.173 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.270 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.270 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.310 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:25.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:37:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4289147269' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.755 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.760 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.791 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:37:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.950 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:37:25 np0005588919 nova_compute[225855]: 2026-01-20 14:37:25.951 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:26.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:26 np0005588919 nova_compute[225855]: 2026-01-20 14:37:26.952 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:26 np0005588919 nova_compute[225855]: 2026-01-20 14:37:26.952 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:37:26 np0005588919 nova_compute[225855]: 2026-01-20 14:37:26.952 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:37:26 np0005588919 nova_compute[225855]: 2026-01-20 14:37:26.967 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:37:26 np0005588919 nova_compute[225855]: 2026-01-20 14:37:26.968 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:26 np0005588919 nova_compute[225855]: 2026-01-20 14:37:26.968 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:26 np0005588919 nova_compute[225855]: 2026-01-20 14:37:26.969 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e228 e228: 3 total, 3 up, 3 in
Jan 20 09:37:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:27.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:28 np0005588919 nova_compute[225855]: 2026-01-20 14:37:28.269 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:28 np0005588919 nova_compute[225855]: 2026-01-20 14:37:28.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:28.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e229 e229: 3 total, 3 up, 3 in
Jan 20 09:37:29 np0005588919 nova_compute[225855]: 2026-01-20 14:37:29.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:37:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:29.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:29 np0005588919 nova_compute[225855]: 2026-01-20 14:37:29.748 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:29 np0005588919 nova_compute[225855]: 2026-01-20 14:37:29.806 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "98e22622-b8b8-44a5-befe-1bd745f9c946" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:29 np0005588919 nova_compute[225855]: 2026-01-20 14:37:29.807 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:29 np0005588919 nova_compute[225855]: 2026-01-20 14:37:29.823 225859 DEBUG nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:37:29 np0005588919 nova_compute[225855]: 2026-01-20 14:37:29.920 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:29 np0005588919 nova_compute[225855]: 2026-01-20 14:37:29.920 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:29 np0005588919 nova_compute[225855]: 2026-01-20 14:37:29.925 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:37:29 np0005588919 nova_compute[225855]: 2026-01-20 14:37:29.926 225859 INFO nova.compute.claims [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.047 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e230 e230: 3 total, 3 up, 3 in
Jan 20 09:37:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:37:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2696979408' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.467 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.473 225859 DEBUG nova.compute.provider_tree [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.492 225859 DEBUG nova.scheduler.client.report [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.516 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.517 225859 DEBUG nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:37:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:30.533 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:37:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:30.534 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.626 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.633 225859 DEBUG nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.633 225859 DEBUG nova.network.neutron [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.653 225859 INFO nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.673 225859 DEBUG nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.803 225859 DEBUG nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.804 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.805 225859 INFO nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Creating image(s)
Jan 20 09:37:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:30.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.833 225859 DEBUG nova.storage.rbd_utils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image 98e22622-b8b8-44a5-befe-1bd745f9c946_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.858 225859 DEBUG nova.storage.rbd_utils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image 98e22622-b8b8-44a5-befe-1bd745f9c946_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.886 225859 DEBUG nova.storage.rbd_utils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image 98e22622-b8b8-44a5-befe-1bd745f9c946_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.890 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:37:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.948 225859 DEBUG nova.policy [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '592a0204f38a4596ab1ab81774214a6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7d78990d13704d629a8a3e8910d005c5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.956 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.956 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.957 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.957 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.982 225859 DEBUG nova.storage.rbd_utils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image 98e22622-b8b8-44a5-befe-1bd745f9c946_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:37:30 np0005588919 nova_compute[225855]: 2026-01-20 14:37:30.986 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 98e22622-b8b8-44a5-befe-1bd745f9c946_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:37:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e231 e231: 3 total, 3 up, 3 in
Jan 20 09:37:31 np0005588919 nova_compute[225855]: 2026-01-20 14:37:31.267 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 98e22622-b8b8-44a5-befe-1bd745f9c946_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.281s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:37:31 np0005588919 nova_compute[225855]: 2026-01-20 14:37:31.355 225859 DEBUG nova.storage.rbd_utils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] resizing rbd image 98e22622-b8b8-44a5-befe-1bd745f9c946_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 09:37:31 np0005588919 nova_compute[225855]: 2026-01-20 14:37:31.456 225859 DEBUG nova.objects.instance [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lazy-loading 'migration_context' on Instance uuid 98e22622-b8b8-44a5-befe-1bd745f9c946 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:37:31 np0005588919 nova_compute[225855]: 2026-01-20 14:37:31.469 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 09:37:31 np0005588919 nova_compute[225855]: 2026-01-20 14:37:31.470 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Ensure instance console log exists: /var/lib/nova/instances/98e22622-b8b8-44a5-befe-1bd745f9c946/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:37:31 np0005588919 nova_compute[225855]: 2026-01-20 14:37:31.470 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:37:31 np0005588919 nova_compute[225855]: 2026-01-20 14:37:31.471 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:37:31 np0005588919 nova_compute[225855]: 2026-01-20 14:37:31.471 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:37:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:31.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:32 np0005588919 nova_compute[225855]: 2026-01-20 14:37:32.540 225859 DEBUG nova.network.neutron [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Successfully created port: 25ba0729-4796-48e4-9b7a-6c0716d26545 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:37:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:32.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:33 np0005588919 nova_compute[225855]: 2026-01-20 14:37:33.271 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:37:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:33.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:33 np0005588919 nova_compute[225855]: 2026-01-20 14:37:33.848 225859 DEBUG nova.network.neutron [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Successfully updated port: 25ba0729-4796-48e4-9b7a-6c0716d26545 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 09:37:33 np0005588919 nova_compute[225855]: 2026-01-20 14:37:33.881 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "refresh_cache-98e22622-b8b8-44a5-befe-1bd745f9c946" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:37:33 np0005588919 nova_compute[225855]: 2026-01-20 14:37:33.881 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquired lock "refresh_cache-98e22622-b8b8-44a5-befe-1bd745f9c946" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:37:33 np0005588919 nova_compute[225855]: 2026-01-20 14:37:33.881 225859 DEBUG nova.network.neutron [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 09:37:34 np0005588919 nova_compute[225855]: 2026-01-20 14:37:34.297 225859 DEBUG nova.network.neutron [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 09:37:34 np0005588919 nova_compute[225855]: 2026-01-20 14:37:34.520 225859 DEBUG nova.compute.manager [req-38e57a0d-2523-429c-8006-28c6843ec8f2 req-2f3b7776-5a31-4a72-ab50-2a44f88596ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Received event network-changed-25ba0729-4796-48e4-9b7a-6c0716d26545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:37:34 np0005588919 nova_compute[225855]: 2026-01-20 14:37:34.521 225859 DEBUG nova.compute.manager [req-38e57a0d-2523-429c-8006-28c6843ec8f2 req-2f3b7776-5a31-4a72-ab50-2a44f88596ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Refreshing instance network info cache due to event network-changed-25ba0729-4796-48e4-9b7a-6c0716d26545. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:37:34 np0005588919 nova_compute[225855]: 2026-01-20 14:37:34.521 225859 DEBUG oslo_concurrency.lockutils [req-38e57a0d-2523-429c-8006-28c6843ec8f2 req-2f3b7776-5a31-4a72-ab50-2a44f88596ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-98e22622-b8b8-44a5-befe-1bd745f9c946" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:37:34 np0005588919 nova_compute[225855]: 2026-01-20 14:37:34.749 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:37:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:34.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.487 225859 DEBUG nova.network.neutron [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Updating instance_info_cache with network_info: [{"id": "25ba0729-4796-48e4-9b7a-6c0716d26545", "address": "fa:16:3e:23:1a:21", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ba0729-47", "ovs_interfaceid": "25ba0729-4796-48e4-9b7a-6c0716d26545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.557 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Releasing lock "refresh_cache-98e22622-b8b8-44a5-befe-1bd745f9c946" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.557 225859 DEBUG nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Instance network_info: |[{"id": "25ba0729-4796-48e4-9b7a-6c0716d26545", "address": "fa:16:3e:23:1a:21", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ba0729-47", "ovs_interfaceid": "25ba0729-4796-48e4-9b7a-6c0716d26545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.557 225859 DEBUG oslo_concurrency.lockutils [req-38e57a0d-2523-429c-8006-28c6843ec8f2 req-2f3b7776-5a31-4a72-ab50-2a44f88596ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-98e22622-b8b8-44a5-befe-1bd745f9c946" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.558 225859 DEBUG nova.network.neutron [req-38e57a0d-2523-429c-8006-28c6843ec8f2 req-2f3b7776-5a31-4a72-ab50-2a44f88596ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Refreshing network info cache for port 25ba0729-4796-48e4-9b7a-6c0716d26545 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.560 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Start _get_guest_xml network_info=[{"id": "25ba0729-4796-48e4-9b7a-6c0716d26545", "address": "fa:16:3e:23:1a:21", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ba0729-47", "ovs_interfaceid": "25ba0729-4796-48e4-9b7a-6c0716d26545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.566 225859 WARNING nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.570 225859 DEBUG nova.virt.libvirt.host [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.570 225859 DEBUG nova.virt.libvirt.host [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.573 225859 DEBUG nova.virt.libvirt.host [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.573 225859 DEBUG nova.virt.libvirt.host [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.574 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.574 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.575 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.575 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.575 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.575 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.576 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.576 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.576 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.576 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.577 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.577 225859 DEBUG nova.virt.hardware [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 09:37:35 np0005588919 nova_compute[225855]: 2026-01-20 14:37:35.579 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:37:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:35.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:37:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2472477865' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.012 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.040 225859 DEBUG nova.storage.rbd_utils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image 98e22622-b8b8-44a5-befe-1bd745f9c946_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.045 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:37:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3403050555' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.524 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.528 225859 DEBUG nova.virt.libvirt.vif [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:37:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-127847800',display_name='tempest-ImagesOneServerNegativeTestJSON-server-127847800',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-127847800',id=58,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d78990d13704d629a8a3e8910d005c5',ramdisk_id='',reservation_id='r-5l39qm5d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-866315696',owner_us
er_name='tempest-ImagesOneServerNegativeTestJSON-866315696-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:37:30Z,user_data=None,user_id='592a0204f38a4596ab1ab81774214a6d',uuid=98e22622-b8b8-44a5-befe-1bd745f9c946,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25ba0729-4796-48e4-9b7a-6c0716d26545", "address": "fa:16:3e:23:1a:21", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ba0729-47", "ovs_interfaceid": "25ba0729-4796-48e4-9b7a-6c0716d26545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.529 225859 DEBUG nova.network.os_vif_util [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converting VIF {"id": "25ba0729-4796-48e4-9b7a-6c0716d26545", "address": "fa:16:3e:23:1a:21", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ba0729-47", "ovs_interfaceid": "25ba0729-4796-48e4-9b7a-6c0716d26545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.531 225859 DEBUG nova.network.os_vif_util [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:1a:21,bridge_name='br-int',has_traffic_filtering=True,id=25ba0729-4796-48e4-9b7a-6c0716d26545,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ba0729-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.534 225859 DEBUG nova.objects.instance [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 98e22622-b8b8-44a5-befe-1bd745f9c946 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:37:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:36.536 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.565 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <uuid>98e22622-b8b8-44a5-befe-1bd745f9c946</uuid>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <name>instance-0000003a</name>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-127847800</nova:name>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:37:35</nova:creationTime>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <nova:user uuid="592a0204f38a4596ab1ab81774214a6d">tempest-ImagesOneServerNegativeTestJSON-866315696-project-member</nova:user>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <nova:project uuid="7d78990d13704d629a8a3e8910d005c5">tempest-ImagesOneServerNegativeTestJSON-866315696</nova:project>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <nova:port uuid="25ba0729-4796-48e4-9b7a-6c0716d26545">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <entry name="serial">98e22622-b8b8-44a5-befe-1bd745f9c946</entry>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <entry name="uuid">98e22622-b8b8-44a5-befe-1bd745f9c946</entry>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/98e22622-b8b8-44a5-befe-1bd745f9c946_disk">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/98e22622-b8b8-44a5-befe-1bd745f9c946_disk.config">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:23:1a:21"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <target dev="tap25ba0729-47"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/98e22622-b8b8-44a5-befe-1bd745f9c946/console.log" append="off"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:37:36 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:37:36 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:37:36 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:37:36 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.567 225859 DEBUG nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Preparing to wait for external event network-vif-plugged-25ba0729-4796-48e4-9b7a-6c0716d26545 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.568 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.568 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.568 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.569 225859 DEBUG nova.virt.libvirt.vif [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:37:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-127847800',display_name='tempest-ImagesOneServerNegativeTestJSON-server-127847800',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-127847800',id=58,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d78990d13704d629a8a3e8910d005c5',ramdisk_id='',reservation_id='r-5l39qm5d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-866315696
',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-866315696-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:37:30Z,user_data=None,user_id='592a0204f38a4596ab1ab81774214a6d',uuid=98e22622-b8b8-44a5-befe-1bd745f9c946,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25ba0729-4796-48e4-9b7a-6c0716d26545", "address": "fa:16:3e:23:1a:21", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ba0729-47", "ovs_interfaceid": "25ba0729-4796-48e4-9b7a-6c0716d26545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.569 225859 DEBUG nova.network.os_vif_util [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converting VIF {"id": "25ba0729-4796-48e4-9b7a-6c0716d26545", "address": "fa:16:3e:23:1a:21", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ba0729-47", "ovs_interfaceid": "25ba0729-4796-48e4-9b7a-6c0716d26545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.569 225859 DEBUG nova.network.os_vif_util [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:1a:21,bridge_name='br-int',has_traffic_filtering=True,id=25ba0729-4796-48e4-9b7a-6c0716d26545,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ba0729-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.570 225859 DEBUG os_vif [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:1a:21,bridge_name='br-int',has_traffic_filtering=True,id=25ba0729-4796-48e4-9b7a-6c0716d26545,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ba0729-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.570 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.571 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.571 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.574 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.574 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25ba0729-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.575 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap25ba0729-47, col_values=(('external_ids', {'iface-id': '25ba0729-4796-48e4-9b7a-6c0716d26545', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:1a:21', 'vm-uuid': '98e22622-b8b8-44a5-befe-1bd745f9c946'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.576 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:36 np0005588919 NetworkManager[49104]: <info>  [1768919856.5772] manager: (tap25ba0729-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.578 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.582 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.582 225859 INFO os_vif [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:1a:21,bridge_name='br-int',has_traffic_filtering=True,id=25ba0729-4796-48e4-9b7a-6c0716d26545,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ba0729-47')#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.623 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.624 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.624 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] No VIF found with MAC fa:16:3e:23:1a:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.624 225859 INFO nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Using config drive#033[00m
Jan 20 09:37:36 np0005588919 nova_compute[225855]: 2026-01-20 14:37:36.651 225859 DEBUG nova.storage.rbd_utils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image 98e22622-b8b8-44a5-befe-1bd745f9c946_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:37:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 09:37:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:36.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.176 225859 INFO nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Creating config drive at /var/lib/nova/instances/98e22622-b8b8-44a5-befe-1bd745f9c946/disk.config#033[00m
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.180 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/98e22622-b8b8-44a5-befe-1bd745f9c946/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1zyaq7rr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.328 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/98e22622-b8b8-44a5-befe-1bd745f9c946/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1zyaq7rr" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.362 225859 DEBUG nova.storage.rbd_utils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] rbd image 98e22622-b8b8-44a5-befe-1bd745f9c946_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.367 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/98e22622-b8b8-44a5-befe-1bd745f9c946/disk.config 98e22622-b8b8-44a5-befe-1bd745f9c946_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.432 225859 DEBUG nova.network.neutron [req-38e57a0d-2523-429c-8006-28c6843ec8f2 req-2f3b7776-5a31-4a72-ab50-2a44f88596ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Updated VIF entry in instance network info cache for port 25ba0729-4796-48e4-9b7a-6c0716d26545. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.433 225859 DEBUG nova.network.neutron [req-38e57a0d-2523-429c-8006-28c6843ec8f2 req-2f3b7776-5a31-4a72-ab50-2a44f88596ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Updating instance_info_cache with network_info: [{"id": "25ba0729-4796-48e4-9b7a-6c0716d26545", "address": "fa:16:3e:23:1a:21", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ba0729-47", "ovs_interfaceid": "25ba0729-4796-48e4-9b7a-6c0716d26545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.452 225859 DEBUG oslo_concurrency.lockutils [req-38e57a0d-2523-429c-8006-28c6843ec8f2 req-2f3b7776-5a31-4a72-ab50-2a44f88596ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-98e22622-b8b8-44a5-befe-1bd745f9c946" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.624 225859 DEBUG oslo_concurrency.processutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/98e22622-b8b8-44a5-befe-1bd745f9c946/disk.config 98e22622-b8b8-44a5-befe-1bd745f9c946_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.257s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.624 225859 INFO nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Deleting local config drive /var/lib/nova/instances/98e22622-b8b8-44a5-befe-1bd745f9c946/disk.config because it was imported into RBD.#033[00m
Jan 20 09:37:37 np0005588919 kernel: tap25ba0729-47: entered promiscuous mode
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.685 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:37 np0005588919 NetworkManager[49104]: <info>  [1768919857.6877] manager: (tap25ba0729-47): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Jan 20 09:37:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:37Z|00175|binding|INFO|Claiming lport 25ba0729-4796-48e4-9b7a-6c0716d26545 for this chassis.
Jan 20 09:37:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:37Z|00176|binding|INFO|25ba0729-4796-48e4-9b7a-6c0716d26545: Claiming fa:16:3e:23:1a:21 10.100.0.11
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.691 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.697 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:1a:21 10.100.0.11'], port_security=['fa:16:3e:23:1a:21 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '98e22622-b8b8-44a5-befe-1bd745f9c946', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d78990d13704d629a8a3e8910d005c5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3763ece7-c739-40ca-8e07-6dde1584ba85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a613141e-df34-49c4-9712-c3d232327d6b, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=25ba0729-4796-48e4-9b7a-6c0716d26545) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.698 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 25ba0729-4796-48e4-9b7a-6c0716d26545 in datapath b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 bound to our chassis#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.700 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.713 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1de23175-9516-4af4-adb4-f0f726f43e66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.714 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb1f372f9-f1 in ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:37:37 np0005588919 systemd-machined[194361]: New machine qemu-27-instance-0000003a.
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.716 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb1f372f9-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.716 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e9d187-d127-4f37-b064-a15c2d1b163a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.717 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0977e0-c27e-4bf3-8e46-8e6bcdc5f8bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:37.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.734 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[1e90133e-ee08-4cff-bca8-6f1952d369ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.757 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[18d913dd-867f-4190-adb6-0b3d46834648]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 systemd[1]: Started Virtual Machine qemu-27-instance-0000003a.
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.790 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:37Z|00177|binding|INFO|Setting lport 25ba0729-4796-48e4-9b7a-6c0716d26545 ovn-installed in OVS
Jan 20 09:37:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:37Z|00178|binding|INFO|Setting lport 25ba0729-4796-48e4-9b7a-6c0716d26545 up in Southbound
Jan 20 09:37:37 np0005588919 nova_compute[225855]: 2026-01-20 14:37:37.798 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.802 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[727c94d8-09ff-4286-8b27-c1debbae1728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.806 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d5bf822d-2f62-449e-a813-3bec8b04ac1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 NetworkManager[49104]: <info>  [1768919857.8085] manager: (tapb1f372f9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/78)
Jan 20 09:37:37 np0005588919 systemd-udevd[249742]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:37:37 np0005588919 systemd-udevd[249743]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:37:37 np0005588919 NetworkManager[49104]: <info>  [1768919857.8285] device (tap25ba0729-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:37:37 np0005588919 NetworkManager[49104]: <info>  [1768919857.8294] device (tap25ba0729-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.840 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1aff8a-12b7-4970-a598-71710c1fc827]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.844 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f3310ea2-f7f2-4bcc-b029-f963ca566261]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 e232: 3 total, 3 up, 3 in
Jan 20 09:37:37 np0005588919 NetworkManager[49104]: <info>  [1768919857.8805] device (tapb1f372f9-f0): carrier: link connected
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.888 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e0772fcb-cf5c-4c9b-a75d-c34cada52911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.906 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2d73acf6-f5f7-49d0-8391-fa4622203461]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1f372f9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:d0:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489680, 'reachable_time': 22290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249769, 'error': None, 'target': 'ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.923 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f0615591-7411-40c5-a82e-220d4760102e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:d0c5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489680, 'tstamp': 489680}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249771, 'error': None, 'target': 'ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.945 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cc732bf7-09f4-4ed9-a123-d2283da7ebfb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1f372f9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:d0:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489680, 'reachable_time': 22290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249772, 'error': None, 'target': 'ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:37.980 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[db0dfd15-56a9-4396-990a-be9daca7154e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:38.040 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ac919d1e-1d9e-4935-b6d0-16b723386a36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:38.041 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1f372f9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:38.041 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:38.042 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1f372f9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.044 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:38 np0005588919 NetworkManager[49104]: <info>  [1768919858.0448] manager: (tapb1f372f9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 20 09:37:38 np0005588919 kernel: tapb1f372f9-f0: entered promiscuous mode
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.050 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:38.051 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1f372f9-f0, col_values=(('external_ids', {'iface-id': 'f0137d70-4bff-4646-9f70-7e0c82ac1e88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.054 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:38 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:38Z|00179|binding|INFO|Releasing lport f0137d70-4bff-4646-9f70-7e0c82ac1e88 from this chassis (sb_readonly=0)
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.075 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.076 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:38.078 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:38.079 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[35c2221a-b970-4061-a6c1-94afdca79965]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:38.079 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23.pid.haproxy
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:37:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:38.080 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'env', 'PROCESS_TAG=haproxy-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.431 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919858.4316707, 98e22622-b8b8-44a5-befe-1bd745f9c946 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.432 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] VM Started (Lifecycle Event)#033[00m
Jan 20 09:37:38 np0005588919 podman[249845]: 2026-01-20 14:37:38.450493335 +0000 UTC m=+0.053788989 container create 184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:37:38 np0005588919 systemd[1]: Started libpod-conmon-184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0.scope.
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.493 225859 DEBUG nova.compute.manager [req-d5282684-34f6-48d7-b01a-d8c5462da80b req-a46a1206-fc7d-491e-9271-5c4fc2ce1f01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Received event network-vif-plugged-25ba0729-4796-48e4-9b7a-6c0716d26545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.494 225859 DEBUG oslo_concurrency.lockutils [req-d5282684-34f6-48d7-b01a-d8c5462da80b req-a46a1206-fc7d-491e-9271-5c4fc2ce1f01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.494 225859 DEBUG oslo_concurrency.lockutils [req-d5282684-34f6-48d7-b01a-d8c5462da80b req-a46a1206-fc7d-491e-9271-5c4fc2ce1f01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.495 225859 DEBUG oslo_concurrency.lockutils [req-d5282684-34f6-48d7-b01a-d8c5462da80b req-a46a1206-fc7d-491e-9271-5c4fc2ce1f01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.495 225859 DEBUG nova.compute.manager [req-d5282684-34f6-48d7-b01a-d8c5462da80b req-a46a1206-fc7d-491e-9271-5c4fc2ce1f01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Processing event network-vif-plugged-25ba0729-4796-48e4-9b7a-6c0716d26545 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.495 225859 DEBUG nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.500 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.502 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.506 225859 INFO nova.virt.libvirt.driver [-] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Instance spawned successfully.#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.506 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.510 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:37:38 np0005588919 podman[249845]: 2026-01-20 14:37:38.422142395 +0000 UTC m=+0.025438069 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:37:38 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:37:38 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3ebbad909f05951835ad49a3aeee9e1168ceb25c4d16b8266150b05185476/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.530 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.531 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.531 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.532 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.532 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.533 225859 DEBUG nova.virt.libvirt.driver [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.536 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.536 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919858.4323697, 98e22622-b8b8-44a5-befe-1bd745f9c946 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.536 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:37:38 np0005588919 podman[249845]: 2026-01-20 14:37:38.539081256 +0000 UTC m=+0.142376930 container init 184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:37:38 np0005588919 podman[249845]: 2026-01-20 14:37:38.544917451 +0000 UTC m=+0.148213125 container start 184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:37:38 np0005588919 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[249861]: [NOTICE]   (249865) : New worker (249867) forked
Jan 20 09:37:38 np0005588919 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[249861]: [NOTICE]   (249865) : Loading success.
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.570 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.575 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919858.4996648, 98e22622-b8b8-44a5-befe-1bd745f9c946 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.575 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.693 225859 INFO nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Took 7.89 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.693 225859 DEBUG nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.723 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.726 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.753 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.766 225859 INFO nova.compute.manager [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Took 8.87 seconds to build instance.#033[00m
Jan 20 09:37:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:37:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:38.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:37:38 np0005588919 nova_compute[225855]: 2026-01-20 14:37:38.908 225859 DEBUG oslo_concurrency.lockutils [None req-59c3a9a4-7848-4b64-b142-2499305cdad5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:39.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:39 np0005588919 nova_compute[225855]: 2026-01-20 14:37:39.781 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:40 np0005588919 nova_compute[225855]: 2026-01-20 14:37:40.657 225859 DEBUG nova.compute.manager [req-62bc4a7d-1291-4b41-b1c5-019b6b572702 req-79e4a9d5-61c6-40c0-b143-8a43d128927a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Received event network-vif-plugged-25ba0729-4796-48e4-9b7a-6c0716d26545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:40 np0005588919 nova_compute[225855]: 2026-01-20 14:37:40.658 225859 DEBUG oslo_concurrency.lockutils [req-62bc4a7d-1291-4b41-b1c5-019b6b572702 req-79e4a9d5-61c6-40c0-b143-8a43d128927a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:40 np0005588919 nova_compute[225855]: 2026-01-20 14:37:40.659 225859 DEBUG oslo_concurrency.lockutils [req-62bc4a7d-1291-4b41-b1c5-019b6b572702 req-79e4a9d5-61c6-40c0-b143-8a43d128927a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:40 np0005588919 nova_compute[225855]: 2026-01-20 14:37:40.659 225859 DEBUG oslo_concurrency.lockutils [req-62bc4a7d-1291-4b41-b1c5-019b6b572702 req-79e4a9d5-61c6-40c0-b143-8a43d128927a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:40 np0005588919 nova_compute[225855]: 2026-01-20 14:37:40.660 225859 DEBUG nova.compute.manager [req-62bc4a7d-1291-4b41-b1c5-019b6b572702 req-79e4a9d5-61c6-40c0-b143-8a43d128927a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] No waiting events found dispatching network-vif-plugged-25ba0729-4796-48e4-9b7a-6c0716d26545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:37:40 np0005588919 nova_compute[225855]: 2026-01-20 14:37:40.660 225859 WARNING nova.compute.manager [req-62bc4a7d-1291-4b41-b1c5-019b6b572702 req-79e4a9d5-61c6-40c0-b143-8a43d128927a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Received unexpected event network-vif-plugged-25ba0729-4796-48e4-9b7a-6c0716d26545 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:37:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:40.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:41 np0005588919 nova_compute[225855]: 2026-01-20 14:37:41.397 225859 DEBUG nova.compute.manager [None req-a9a54335-d43d-496a-b97d-426d1d18ffe5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:41 np0005588919 nova_compute[225855]: 2026-01-20 14:37:41.575 225859 INFO nova.compute.manager [None req-a9a54335-d43d-496a-b97d-426d1d18ffe5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] instance snapshotting#033[00m
Jan 20 09:37:41 np0005588919 nova_compute[225855]: 2026-01-20 14:37:41.579 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:41.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:41 np0005588919 nova_compute[225855]: 2026-01-20 14:37:41.938 225859 WARNING nova.compute.manager [None req-a9a54335-d43d-496a-b97d-426d1d18ffe5 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Image not found during snapshot: nova.exception.ImageNotFound: Image d6eb065b-6bd9-4a87-ab49-a63678d86cff could not be found.#033[00m
Jan 20 09:37:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:42.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:42 np0005588919 nova_compute[225855]: 2026-01-20 14:37:42.978 225859 DEBUG oslo_concurrency.lockutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "98e22622-b8b8-44a5-befe-1bd745f9c946" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:42 np0005588919 nova_compute[225855]: 2026-01-20 14:37:42.978 225859 DEBUG oslo_concurrency.lockutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:42 np0005588919 nova_compute[225855]: 2026-01-20 14:37:42.978 225859 DEBUG oslo_concurrency.lockutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:42 np0005588919 nova_compute[225855]: 2026-01-20 14:37:42.979 225859 DEBUG oslo_concurrency.lockutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:42 np0005588919 nova_compute[225855]: 2026-01-20 14:37:42.979 225859 DEBUG oslo_concurrency.lockutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:42 np0005588919 nova_compute[225855]: 2026-01-20 14:37:42.980 225859 INFO nova.compute.manager [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Terminating instance#033[00m
Jan 20 09:37:42 np0005588919 nova_compute[225855]: 2026-01-20 14:37:42.981 225859 DEBUG nova.compute.manager [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:37:43 np0005588919 kernel: tap25ba0729-47 (unregistering): left promiscuous mode
Jan 20 09:37:43 np0005588919 NetworkManager[49104]: <info>  [1768919863.0149] device (tap25ba0729-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.019 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:43Z|00180|binding|INFO|Releasing lport 25ba0729-4796-48e4-9b7a-6c0716d26545 from this chassis (sb_readonly=0)
Jan 20 09:37:43 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:43Z|00181|binding|INFO|Setting lport 25ba0729-4796-48e4-9b7a-6c0716d26545 down in Southbound
Jan 20 09:37:43 np0005588919 ovn_controller[130490]: 2026-01-20T14:37:43Z|00182|binding|INFO|Removing iface tap25ba0729-47 ovn-installed in OVS
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.022 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.027 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:1a:21 10.100.0.11'], port_security=['fa:16:3e:23:1a:21 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '98e22622-b8b8-44a5-befe-1bd745f9c946', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d78990d13704d629a8a3e8910d005c5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3763ece7-c739-40ca-8e07-6dde1584ba85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a613141e-df34-49c4-9712-c3d232327d6b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=25ba0729-4796-48e4-9b7a-6c0716d26545) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.028 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 25ba0729-4796-48e4-9b7a-6c0716d26545 in datapath b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 unbound from our chassis#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.030 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.032 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[497523bb-f590-4d1c-a90d-58959fc76151]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.034 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 namespace which is not needed anymore#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.044 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588919 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000003a.scope: Deactivated successfully.
Jan 20 09:37:43 np0005588919 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000003a.scope: Consumed 5.263s CPU time.
Jan 20 09:37:43 np0005588919 systemd-machined[194361]: Machine qemu-27-instance-0000003a terminated.
Jan 20 09:37:43 np0005588919 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[249861]: [NOTICE]   (249865) : haproxy version is 2.8.14-c23fe91
Jan 20 09:37:43 np0005588919 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[249861]: [NOTICE]   (249865) : path to executable is /usr/sbin/haproxy
Jan 20 09:37:43 np0005588919 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[249861]: [WARNING]  (249865) : Exiting Master process...
Jan 20 09:37:43 np0005588919 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[249861]: [ALERT]    (249865) : Current worker (249867) exited with code 143 (Terminated)
Jan 20 09:37:43 np0005588919 neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23[249861]: [WARNING]  (249865) : All workers exited. Exiting... (0)
Jan 20 09:37:43 np0005588919 systemd[1]: libpod-184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0.scope: Deactivated successfully.
Jan 20 09:37:43 np0005588919 podman[249902]: 2026-01-20 14:37:43.168181675 +0000 UTC m=+0.041409730 container died 184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:37:43 np0005588919 systemd[1]: var-lib-containers-storage-overlay-2be3ebbad909f05951835ad49a3aeee9e1168ceb25c4d16b8266150b05185476-merged.mount: Deactivated successfully.
Jan 20 09:37:43 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0-userdata-shm.mount: Deactivated successfully.
Jan 20 09:37:43 np0005588919 podman[249902]: 2026-01-20 14:37:43.212805235 +0000 UTC m=+0.086033280 container cleanup 184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.218 225859 INFO nova.virt.libvirt.driver [-] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Instance destroyed successfully.#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.219 225859 DEBUG nova.objects.instance [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lazy-loading 'resources' on Instance uuid 98e22622-b8b8-44a5-befe-1bd745f9c946 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:37:43 np0005588919 systemd[1]: libpod-conmon-184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0.scope: Deactivated successfully.
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.239 225859 DEBUG nova.virt.libvirt.vif [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:37:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-127847800',display_name='tempest-ImagesOneServerNegativeTestJSON-server-127847800',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-127847800',id=58,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:37:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7d78990d13704d629a8a3e8910d005c5',ramdisk_id='',reservation_id='r-5l39qm5d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-866315696',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-866315696-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:37:41Z,user_data=None,user_id='592a0204f38a4596ab1ab81774214a6d',uuid=98e22622-b8b8-44a5-befe-1bd745f9c946,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25ba0729-4796-48e4-9b7a-6c0716d26545", "address": "fa:16:3e:23:1a:21", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ba0729-47", "ovs_interfaceid": "25ba0729-4796-48e4-9b7a-6c0716d26545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.240 225859 DEBUG nova.network.os_vif_util [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converting VIF {"id": "25ba0729-4796-48e4-9b7a-6c0716d26545", "address": "fa:16:3e:23:1a:21", "network": {"id": "b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-462971735-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d78990d13704d629a8a3e8910d005c5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25ba0729-47", "ovs_interfaceid": "25ba0729-4796-48e4-9b7a-6c0716d26545", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.241 225859 DEBUG nova.network.os_vif_util [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:1a:21,bridge_name='br-int',has_traffic_filtering=True,id=25ba0729-4796-48e4-9b7a-6c0716d26545,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ba0729-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.241 225859 DEBUG os_vif [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:1a:21,bridge_name='br-int',has_traffic_filtering=True,id=25ba0729-4796-48e4-9b7a-6c0716d26545,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ba0729-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.243 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.244 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25ba0729-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.262 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.264 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.266 225859 INFO os_vif [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:1a:21,bridge_name='br-int',has_traffic_filtering=True,id=25ba0729-4796-48e4-9b7a-6c0716d26545,network=Network(b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25ba0729-47')#033[00m
Jan 20 09:37:43 np0005588919 podman[249943]: 2026-01-20 14:37:43.293365569 +0000 UTC m=+0.056424424 container remove 184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.300 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf4b644-7401-49b3-aabf-c72e8accda74]: (4, ('Tue Jan 20 02:37:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 (184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0)\n184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0\nTue Jan 20 02:37:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 (184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0)\n184b158e09711f91160511cc9d1e573ae340b527e39ab76701e8a51b9a334aa0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.302 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0d3c93-c5ab-45fe-b34c-3df22d03e875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.304 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1f372f9-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.305 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588919 kernel: tapb1f372f9-f0: left promiscuous mode
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.321 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.323 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9e752905-f366-46fd-853b-2f96250c9205]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.343 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[00ff6918-7666-492c-a63e-8046fbe27a47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.345 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5e9de64a-e5d8-4aa3-8d7f-913531ad7141]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.357 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[38cf2ee7-487f-4329-a3bd-7a371e31d252]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489671, 'reachable_time': 30057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249976, 'error': None, 'target': 'ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.359 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b1f372f9-fbd1-4ef7-9be7-ace7ce14bb23 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:37:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:37:43.359 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[520f0b2b-72b8-425d-a806-e92e43592525]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:37:43 np0005588919 systemd[1]: run-netns-ovnmeta\x2db1f372f9\x2dfbd1\x2d4ef7\x2d9be7\x2dace7ce14bb23.mount: Deactivated successfully.
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.405 225859 DEBUG nova.compute.manager [req-cf1312e4-9450-4142-8a57-f16ad715ee07 req-dfbd023c-25bd-434b-8a30-36fe27392125 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Received event network-vif-unplugged-25ba0729-4796-48e4-9b7a-6c0716d26545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.405 225859 DEBUG oslo_concurrency.lockutils [req-cf1312e4-9450-4142-8a57-f16ad715ee07 req-dfbd023c-25bd-434b-8a30-36fe27392125 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.406 225859 DEBUG oslo_concurrency.lockutils [req-cf1312e4-9450-4142-8a57-f16ad715ee07 req-dfbd023c-25bd-434b-8a30-36fe27392125 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.406 225859 DEBUG oslo_concurrency.lockutils [req-cf1312e4-9450-4142-8a57-f16ad715ee07 req-dfbd023c-25bd-434b-8a30-36fe27392125 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.406 225859 DEBUG nova.compute.manager [req-cf1312e4-9450-4142-8a57-f16ad715ee07 req-dfbd023c-25bd-434b-8a30-36fe27392125 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] No waiting events found dispatching network-vif-unplugged-25ba0729-4796-48e4-9b7a-6c0716d26545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.406 225859 DEBUG nova.compute.manager [req-cf1312e4-9450-4142-8a57-f16ad715ee07 req-dfbd023c-25bd-434b-8a30-36fe27392125 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Received event network-vif-unplugged-25ba0729-4796-48e4-9b7a-6c0716d26545 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:37:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:43.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.810 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.965 225859 INFO nova.virt.libvirt.driver [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Deleting instance files /var/lib/nova/instances/98e22622-b8b8-44a5-befe-1bd745f9c946_del#033[00m
Jan 20 09:37:43 np0005588919 nova_compute[225855]: 2026-01-20 14:37:43.966 225859 INFO nova.virt.libvirt.driver [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Deletion of /var/lib/nova/instances/98e22622-b8b8-44a5-befe-1bd745f9c946_del complete#033[00m
Jan 20 09:37:44 np0005588919 nova_compute[225855]: 2026-01-20 14:37:44.020 225859 INFO nova.compute.manager [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Took 1.04 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:37:44 np0005588919 nova_compute[225855]: 2026-01-20 14:37:44.021 225859 DEBUG oslo.service.loopingcall [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:37:44 np0005588919 nova_compute[225855]: 2026-01-20 14:37:44.021 225859 DEBUG nova.compute.manager [-] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:37:44 np0005588919 nova_compute[225855]: 2026-01-20 14:37:44.021 225859 DEBUG nova.network.neutron [-] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:37:44 np0005588919 nova_compute[225855]: 2026-01-20 14:37:44.783 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:44.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:44 np0005588919 nova_compute[225855]: 2026-01-20 14:37:44.963 225859 DEBUG nova.network.neutron [-] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.005 225859 INFO nova.compute.manager [-] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Took 0.98 seconds to deallocate network for instance.#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.066 225859 DEBUG oslo_concurrency.lockutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.067 225859 DEBUG oslo_concurrency.lockutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:45 np0005588919 podman[250029]: 2026-01-20 14:37:45.076023405 +0000 UTC m=+0.120290467 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.117 225859 DEBUG oslo_concurrency.processutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:37:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:37:45 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1773620489' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.605 225859 DEBUG oslo_concurrency.processutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.611 225859 DEBUG nova.compute.provider_tree [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.631 225859 DEBUG nova.scheduler.client.report [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.655 225859 DEBUG oslo_concurrency.lockutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.706 225859 INFO nova.scheduler.client.report [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Deleted allocations for instance 98e22622-b8b8-44a5-befe-1bd745f9c946#033[00m
Jan 20 09:37:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:45.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.788 225859 DEBUG nova.compute.manager [req-9a69a638-ed08-43c2-b400-a01496fc2638 req-5d87e747-a072-4c17-a20e-6c6dc45d05f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Received event network-vif-plugged-25ba0729-4796-48e4-9b7a-6c0716d26545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.789 225859 DEBUG oslo_concurrency.lockutils [req-9a69a638-ed08-43c2-b400-a01496fc2638 req-5d87e747-a072-4c17-a20e-6c6dc45d05f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.789 225859 DEBUG oslo_concurrency.lockutils [req-9a69a638-ed08-43c2-b400-a01496fc2638 req-5d87e747-a072-4c17-a20e-6c6dc45d05f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.789 225859 DEBUG oslo_concurrency.lockutils [req-9a69a638-ed08-43c2-b400-a01496fc2638 req-5d87e747-a072-4c17-a20e-6c6dc45d05f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.790 225859 DEBUG nova.compute.manager [req-9a69a638-ed08-43c2-b400-a01496fc2638 req-5d87e747-a072-4c17-a20e-6c6dc45d05f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] No waiting events found dispatching network-vif-plugged-25ba0729-4796-48e4-9b7a-6c0716d26545 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.790 225859 WARNING nova.compute.manager [req-9a69a638-ed08-43c2-b400-a01496fc2638 req-5d87e747-a072-4c17-a20e-6c6dc45d05f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Received unexpected event network-vif-plugged-25ba0729-4796-48e4-9b7a-6c0716d26545 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:37:45 np0005588919 nova_compute[225855]: 2026-01-20 14:37:45.809 225859 DEBUG oslo_concurrency.lockutils [None req-35de1d14-6ccc-4867-8f50-13b31c1a82ab 592a0204f38a4596ab1ab81774214a6d 7d78990d13704d629a8a3e8910d005c5 - - default default] Lock "98e22622-b8b8-44a5-befe-1bd745f9c946" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:37:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:46 np0005588919 nova_compute[225855]: 2026-01-20 14:37:46.780 225859 DEBUG nova.compute.manager [req-adaa3bc1-cf1e-43d1-aa74-31e517b73dbe req-9b3e73bc-0df7-440c-90ba-3b382dd0c099 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Received event network-vif-deleted-25ba0729-4796-48e4-9b7a-6c0716d26545 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:37:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:46.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:47.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:48 np0005588919 nova_compute[225855]: 2026-01-20 14:37:48.264 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:48.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:49.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:49 np0005588919 nova_compute[225855]: 2026-01-20 14:37:49.785 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:50.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:51.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:37:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:52.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:37:53 np0005588919 nova_compute[225855]: 2026-01-20 14:37:53.266 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:37:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:53.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:37:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:37:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6847 writes, 35K keys, 6847 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.03 MB/s#012Cumulative WAL: 6846 writes, 6846 syncs, 1.00 writes per sync, written: 0.07 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1811 writes, 8924 keys, 1811 commit groups, 1.0 writes per commit group, ingest: 17.43 MB, 0.03 MB/s#012Interval WAL: 1810 writes, 1810 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     68.2      0.58              0.14        18    0.032       0      0       0.0       0.0#012  L6      1/0    9.71 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   3.6    111.4     92.0      1.55              0.48        17    0.091     86K   9392       0.0       0.0#012 Sum      1/0    9.71 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   4.6     80.9     85.5      2.14              0.63        35    0.061     86K   9392       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.8     93.0     95.0      0.51              0.12         8    0.064     24K   2592       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0    111.4     92.0      1.55              0.48        17    0.091     86K   9392       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     68.5      0.58              0.14        17    0.034       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.039, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.18 GB write, 0.08 MB/s write, 0.17 GB read, 0.07 MB/s read, 2.1 seconds#012Interval compaction: 0.05 GB write, 0.08 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 304.00 MB usage: 19.97 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000149 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(1154,19.29 MB,6.34496%) FilterBlock(35,250.05 KB,0.0803245%) IndexBlock(35,444.61 KB,0.142825%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:37:54 np0005588919 nova_compute[225855]: 2026-01-20 14:37:54.787 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:54.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:55.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:37:56 np0005588919 podman[250086]: 2026-01-20 14:37:56.01555912 +0000 UTC m=+0.064361378 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:37:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:37:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:56.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:37:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:57.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:58 np0005588919 nova_compute[225855]: 2026-01-20 14:37:58.215 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919863.2139645, 98e22622-b8b8-44a5-befe-1bd745f9c946 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:37:58 np0005588919 nova_compute[225855]: 2026-01-20 14:37:58.215 225859 INFO nova.compute.manager [-] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:37:58 np0005588919 nova_compute[225855]: 2026-01-20 14:37:58.255 225859 DEBUG nova.compute.manager [None req-ac9f89af-dfa1-4fc6-b423-866abea6a614 - - - - - -] [instance: 98e22622-b8b8-44a5-befe-1bd745f9c946] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:37:58 np0005588919 nova_compute[225855]: 2026-01-20 14:37:58.303 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:37:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:37:58.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:37:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:37:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:37:59.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:37:59 np0005588919 nova_compute[225855]: 2026-01-20 14:37:59.790 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:00.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:01.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:02.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:03 np0005588919 nova_compute[225855]: 2026-01-20 14:38:03.305 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:03.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:04 np0005588919 nova_compute[225855]: 2026-01-20 14:38:04.792 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:04.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:05.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:06.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:07.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:08 np0005588919 nova_compute[225855]: 2026-01-20 14:38:08.307 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:08.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:09.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:09 np0005588919 nova_compute[225855]: 2026-01-20 14:38:09.839 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:10.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:11 np0005588919 nova_compute[225855]: 2026-01-20 14:38:11.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:38:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:11.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:38:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:12.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:13 np0005588919 nova_compute[225855]: 2026-01-20 14:38:13.310 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:38:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1969300733' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:38:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:38:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1969300733' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:38:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:13.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:14 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:38:14 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:38:14 np0005588919 nova_compute[225855]: 2026-01-20 14:38:14.849 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:14.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:38:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:38:15 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:38:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:15.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:16 np0005588919 podman[250298]: 2026-01-20 14:38:16.115521601 +0000 UTC m=+0.150174860 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 09:38:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:38:16.397 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:38:16.398 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:38:16.398 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:16.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:17.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:18 np0005588919 nova_compute[225855]: 2026-01-20 14:38:18.312 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:18.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:19.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:19 np0005588919 nova_compute[225855]: 2026-01-20 14:38:19.899 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:20.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:21.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:22 np0005588919 nova_compute[225855]: 2026-01-20 14:38:22.353 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:38:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:38:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:22.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:23 np0005588919 nova_compute[225855]: 2026-01-20 14:38:23.315 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:23.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:24 np0005588919 nova_compute[225855]: 2026-01-20 14:38:24.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:24 np0005588919 nova_compute[225855]: 2026-01-20 14:38:24.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:38:24 np0005588919 nova_compute[225855]: 2026-01-20 14:38:24.360 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:38:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:24.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:24 np0005588919 nova_compute[225855]: 2026-01-20 14:38:24.936 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:25 np0005588919 nova_compute[225855]: 2026-01-20 14:38:25.360 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:25 np0005588919 nova_compute[225855]: 2026-01-20 14:38:25.361 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:25.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.379 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.380 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.381 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.382 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.382 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.425 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.426 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.426 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.426 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.427 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:26.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:38:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2281049994' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:38:26 np0005588919 nova_compute[225855]: 2026-01-20 14:38:26.926 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:27 np0005588919 podman[250452]: 2026-01-20 14:38:27.02437264 +0000 UTC m=+0.063839853 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.174 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.175 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4678MB free_disk=20.942676544189453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.176 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.176 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.254 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.254 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.417 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.436 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.436 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.450 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.470 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:38:27 np0005588919 nova_compute[225855]: 2026-01-20 14:38:27.616 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:38:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:27.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:38:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1324154779' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:38:28 np0005588919 nova_compute[225855]: 2026-01-20 14:38:28.062 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:38:28 np0005588919 nova_compute[225855]: 2026-01-20 14:38:28.068 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:38:28 np0005588919 nova_compute[225855]: 2026-01-20 14:38:28.091 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:38:28 np0005588919 nova_compute[225855]: 2026-01-20 14:38:28.113 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:38:28 np0005588919 nova_compute[225855]: 2026-01-20 14:38:28.113 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:38:28 np0005588919 nova_compute[225855]: 2026-01-20 14:38:28.317 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:28.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:29 np0005588919 nova_compute[225855]: 2026-01-20 14:38:29.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:29 np0005588919 nova_compute[225855]: 2026-01-20 14:38:29.365 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:29 np0005588919 nova_compute[225855]: 2026-01-20 14:38:29.366 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:29 np0005588919 nova_compute[225855]: 2026-01-20 14:38:29.366 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:38:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:38:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:29.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:38:29 np0005588919 nova_compute[225855]: 2026-01-20 14:38:29.938 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:30.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:38:31.005 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:38:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:38:31.006 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:38:31 np0005588919 nova_compute[225855]: 2026-01-20 14:38:31.088 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:31 np0005588919 nova_compute[225855]: 2026-01-20 14:38:31.374 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:38:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:31.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:38:32.008 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:38:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:32.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:33 np0005588919 nova_compute[225855]: 2026-01-20 14:38:33.318 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:33.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:34.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:34 np0005588919 nova_compute[225855]: 2026-01-20 14:38:34.990 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:35.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:36.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:37.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:38 np0005588919 nova_compute[225855]: 2026-01-20 14:38:38.320 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:38.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:39.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:39 np0005588919 nova_compute[225855]: 2026-01-20 14:38:39.992 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:40.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:41.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:42.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:43 np0005588919 nova_compute[225855]: 2026-01-20 14:38:43.322 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:43.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:44.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:45 np0005588919 nova_compute[225855]: 2026-01-20 14:38:45.042 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:45.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:46.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:47 np0005588919 podman[250554]: 2026-01-20 14:38:47.044044085 +0000 UTC m=+0.088079897 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:38:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:47.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:47.887048) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919927887519, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2829, "num_deletes": 526, "total_data_size": 5727781, "memory_usage": 5817688, "flush_reason": "Manual Compaction"}
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919927926680, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 3737600, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33360, "largest_seqno": 36184, "table_properties": {"data_size": 3726364, "index_size": 6834, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3333, "raw_key_size": 26949, "raw_average_key_size": 20, "raw_value_size": 3701618, "raw_average_value_size": 2791, "num_data_blocks": 296, "num_entries": 1326, "num_filter_entries": 1326, "num_deletions": 526, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919747, "oldest_key_time": 1768919747, "file_creation_time": 1768919927, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 39703 microseconds, and 7470 cpu microseconds.
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:47.926751) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 3737600 bytes OK
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:47.926783) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:47.928922) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:47.928940) EVENT_LOG_v1 {"time_micros": 1768919927928934, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:47.928961) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 5714208, prev total WAL file size 5714208, number of live WAL files 2.
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:47.930699) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(3650KB)], [63(9942KB)]
Jan 20 09:38:47 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919927930730, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 13918435, "oldest_snapshot_seqno": -1}
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6238 keys, 11936636 bytes, temperature: kUnknown
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919928061241, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 11936636, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11891289, "index_size": 28661, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15621, "raw_key_size": 159267, "raw_average_key_size": 25, "raw_value_size": 11775603, "raw_average_value_size": 1887, "num_data_blocks": 1154, "num_entries": 6238, "num_filter_entries": 6238, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768919927, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:48.061527) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 11936636 bytes
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:48.063424) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.6 rd, 91.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 9.7 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(6.9) write-amplify(3.2) OK, records in: 7294, records dropped: 1056 output_compression: NoCompression
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:48.063444) EVENT_LOG_v1 {"time_micros": 1768919928063434, "job": 38, "event": "compaction_finished", "compaction_time_micros": 130624, "compaction_time_cpu_micros": 25289, "output_level": 6, "num_output_files": 1, "total_output_size": 11936636, "num_input_records": 7294, "num_output_records": 6238, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919928064568, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919928066698, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:47.930570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:48.066776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:48.066780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:48.066783) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:48.066784) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:38:48.066786) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:38:48 np0005588919 nova_compute[225855]: 2026-01-20 14:38:48.324 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:48.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:49.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:50 np0005588919 nova_compute[225855]: 2026-01-20 14:38:50.044 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:38:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:50.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:38:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:51.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:52.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:53 np0005588919 nova_compute[225855]: 2026-01-20 14:38:53.326 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:53.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:54.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:55 np0005588919 nova_compute[225855]: 2026-01-20 14:38:55.046 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:38:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:55.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:38:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:38:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:56.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:57 np0005588919 ovn_controller[130490]: 2026-01-20T14:38:57Z|00183|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 20 09:38:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:57.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:58 np0005588919 podman[250586]: 2026-01-20 14:38:58.018064239 +0000 UTC m=+0.060561541 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:38:58 np0005588919 nova_compute[225855]: 2026-01-20 14:38:58.327 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:38:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:38:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:38:58.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:38:59 np0005588919 nova_compute[225855]: 2026-01-20 14:38:59.665 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Acquiring lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:59 np0005588919 nova_compute[225855]: 2026-01-20 14:38:59.666 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:59 np0005588919 nova_compute[225855]: 2026-01-20 14:38:59.692 225859 DEBUG nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:38:59 np0005588919 nova_compute[225855]: 2026-01-20 14:38:59.797 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:38:59 np0005588919 nova_compute[225855]: 2026-01-20 14:38:59.798 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:38:59 np0005588919 nova_compute[225855]: 2026-01-20 14:38:59.805 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:38:59 np0005588919 nova_compute[225855]: 2026-01-20 14:38:59.805 225859 INFO nova.compute.claims [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:38:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:38:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:38:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:38:59.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:38:59 np0005588919 nova_compute[225855]: 2026-01-20 14:38:59.906 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.048 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:39:00 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/616156782' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.384 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.391 225859 DEBUG nova.compute.provider_tree [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.420 225859 DEBUG nova.scheduler.client.report [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.444 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.445 225859 DEBUG nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.502 225859 DEBUG nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.503 225859 DEBUG nova.network.neutron [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.523 225859 INFO nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.544 225859 DEBUG nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.658 225859 DEBUG nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.660 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.661 225859 INFO nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Creating image(s)#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.699 225859 DEBUG nova.storage.rbd_utils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] rbd image c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.730 225859 DEBUG nova.storage.rbd_utils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] rbd image c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.758 225859 DEBUG nova.storage.rbd_utils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] rbd image c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.761 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.838 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.839 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.839 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.840 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.863 225859 DEBUG nova.storage.rbd_utils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] rbd image c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:00 np0005588919 nova_compute[225855]: 2026-01-20 14:39:00.866 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:00.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.190 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.324s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.262 225859 DEBUG nova.storage.rbd_utils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] resizing rbd image c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.372 225859 DEBUG nova.policy [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ddee6eb6c32d451ca50c9ea499a23c1a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4dadd5f5212f432693d35e765126f4df', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.379 225859 DEBUG nova.objects.instance [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lazy-loading 'migration_context' on Instance uuid c12d0bd2-ff69-4827-a5a0-8bf5e44094f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.392 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.393 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Ensure instance console log exists: /var/lib/nova/instances/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.393 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.394 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.394 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:01.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.888 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.922 225859 WARNING nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.923 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Triggering sync for uuid c12d0bd2-ff69-4827-a5a0-8bf5e44094f7 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 09:39:01 np0005588919 nova_compute[225855]: 2026-01-20 14:39:01.924 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:02 np0005588919 nova_compute[225855]: 2026-01-20 14:39:02.433 225859 DEBUG nova.network.neutron [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Successfully created port: 722de795-61c5-4a11-ade3-6c19621e1054 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:39:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:02.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:03 np0005588919 nova_compute[225855]: 2026-01-20 14:39:03.328 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:03 np0005588919 nova_compute[225855]: 2026-01-20 14:39:03.462 225859 DEBUG nova.network.neutron [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Successfully updated port: 722de795-61c5-4a11-ade3-6c19621e1054 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:39:03 np0005588919 nova_compute[225855]: 2026-01-20 14:39:03.482 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Acquiring lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:03 np0005588919 nova_compute[225855]: 2026-01-20 14:39:03.482 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Acquired lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:03 np0005588919 nova_compute[225855]: 2026-01-20 14:39:03.482 225859 DEBUG nova.network.neutron [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:39:03 np0005588919 nova_compute[225855]: 2026-01-20 14:39:03.554 225859 DEBUG nova.compute.manager [req-a13f5730-2c44-4baf-8ad3-87ea95078a0b req-638fa99b-ce39-4b86-a40e-f22791c5e373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Received event network-changed-722de795-61c5-4a11-ade3-6c19621e1054 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:03 np0005588919 nova_compute[225855]: 2026-01-20 14:39:03.555 225859 DEBUG nova.compute.manager [req-a13f5730-2c44-4baf-8ad3-87ea95078a0b req-638fa99b-ce39-4b86-a40e-f22791c5e373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Refreshing instance network info cache due to event network-changed-722de795-61c5-4a11-ade3-6c19621e1054. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:39:03 np0005588919 nova_compute[225855]: 2026-01-20 14:39:03.556 225859 DEBUG oslo_concurrency.lockutils [req-a13f5730-2c44-4baf-8ad3-87ea95078a0b req-638fa99b-ce39-4b86-a40e-f22791c5e373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:03 np0005588919 nova_compute[225855]: 2026-01-20 14:39:03.635 225859 DEBUG nova.network.neutron [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:39:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:39:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:03.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.382 225859 DEBUG nova.network.neutron [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Updating instance_info_cache with network_info: [{"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.400 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Releasing lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.400 225859 DEBUG nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Instance network_info: |[{"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.401 225859 DEBUG oslo_concurrency.lockutils [req-a13f5730-2c44-4baf-8ad3-87ea95078a0b req-638fa99b-ce39-4b86-a40e-f22791c5e373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.401 225859 DEBUG nova.network.neutron [req-a13f5730-2c44-4baf-8ad3-87ea95078a0b req-638fa99b-ce39-4b86-a40e-f22791c5e373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Refreshing network info cache for port 722de795-61c5-4a11-ade3-6c19621e1054 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.406 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Start _get_guest_xml network_info=[{"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.412 225859 WARNING nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.419 225859 DEBUG nova.virt.libvirt.host [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.420 225859 DEBUG nova.virt.libvirt.host [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.428 225859 DEBUG nova.virt.libvirt.host [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.429 225859 DEBUG nova.virt.libvirt.host [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.431 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.432 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.432 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.433 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.433 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.434 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.434 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.434 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.435 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.435 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.436 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.436 225859 DEBUG nova.virt.hardware [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.441 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:39:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3372781718' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.925 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:04.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.953 225859 DEBUG nova.storage.rbd_utils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] rbd image c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:04 np0005588919 nova_compute[225855]: 2026-01-20 14:39:04.957 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.049 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:39:05 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2327636316' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.385 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.387 225859 DEBUG nova.virt.libvirt.vif [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:38:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1848412501',display_name='tempest-ServersTestManualDisk-server-1848412501',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1848412501',id=63,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN1H3ohoasKNqW9/qCu5grGBs3IV04fg9gxxB4grih5Zd5WxPzj2gaQyovrov9cUlcTcdLXAKoF+QUCFPVVxhI1Y4NXPI0qz/O7wrYwAYL2Je6ImmzeATRgxmFMwN+zj/A==',key_name='tempest-keypair-1139962663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4dadd5f5212f432693d35e765126f4df',ramdisk_id='',reservation_id='r-l9tyz3ue',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-665340674',owner_user_name='tempest-ServersTestManualDisk-665340674-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ddee6eb6c32d451ca50c9ea499a23c1a',uuid=c12d0bd2-ff69-4827-a5a0-8bf5e44094f7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.388 225859 DEBUG nova.network.os_vif_util [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Converting VIF {"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.389 225859 DEBUG nova.network.os_vif_util [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:da,bridge_name='br-int',has_traffic_filtering=True,id=722de795-61c5-4a11-ade3-6c19621e1054,network=Network(c5a9008d-9eea-43f2-a495-bf2e645a81fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722de795-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.390 225859 DEBUG nova.objects.instance [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lazy-loading 'pci_devices' on Instance uuid c12d0bd2-ff69-4827-a5a0-8bf5e44094f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.408 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <uuid>c12d0bd2-ff69-4827-a5a0-8bf5e44094f7</uuid>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <name>instance-0000003f</name>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServersTestManualDisk-server-1848412501</nova:name>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:39:04</nova:creationTime>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <nova:user uuid="ddee6eb6c32d451ca50c9ea499a23c1a">tempest-ServersTestManualDisk-665340674-project-member</nova:user>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <nova:project uuid="4dadd5f5212f432693d35e765126f4df">tempest-ServersTestManualDisk-665340674</nova:project>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <nova:port uuid="722de795-61c5-4a11-ade3-6c19621e1054">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <entry name="serial">c12d0bd2-ff69-4827-a5a0-8bf5e44094f7</entry>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <entry name="uuid">c12d0bd2-ff69-4827-a5a0-8bf5e44094f7</entry>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk.config">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:be:4b:da"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <target dev="tap722de795-61"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7/console.log" append="off"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:39:05 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:39:05 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:39:05 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:39:05 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.410 225859 DEBUG nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Preparing to wait for external event network-vif-plugged-722de795-61c5-4a11-ade3-6c19621e1054 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.410 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Acquiring lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.410 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.410 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.411 225859 DEBUG nova.virt.libvirt.vif [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:38:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1848412501',display_name='tempest-ServersTestManualDisk-server-1848412501',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1848412501',id=63,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN1H3ohoasKNqW9/qCu5grGBs3IV04fg9gxxB4grih5Zd5WxPzj2gaQyovrov9cUlcTcdLXAKoF+QUCFPVVxhI1Y4NXPI0qz/O7wrYwAYL2Je6ImmzeATRgxmFMwN+zj/A==',key_name='tempest-keypair-1139962663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4dadd5f5212f432693d35e765126f4df',ramdisk_id='',reservation_id='r-l9tyz3ue',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-665340674',owner_user_name='tempest-ServersTestManualDisk-665340674-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ddee6eb6c32d451ca50c9ea499a23c1a',uuid=c12d0bd2-ff69-4827-a5a0-8bf5e44094f7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.411 225859 DEBUG nova.network.os_vif_util [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Converting VIF {"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.412 225859 DEBUG nova.network.os_vif_util [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:da,bridge_name='br-int',has_traffic_filtering=True,id=722de795-61c5-4a11-ade3-6c19621e1054,network=Network(c5a9008d-9eea-43f2-a495-bf2e645a81fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722de795-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.412 225859 DEBUG os_vif [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:da,bridge_name='br-int',has_traffic_filtering=True,id=722de795-61c5-4a11-ade3-6c19621e1054,network=Network(c5a9008d-9eea-43f2-a495-bf2e645a81fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722de795-61') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.413 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.413 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.413 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.416 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.416 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap722de795-61, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.417 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap722de795-61, col_values=(('external_ids', {'iface-id': '722de795-61c5-4a11-ade3-6c19621e1054', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:4b:da', 'vm-uuid': 'c12d0bd2-ff69-4827-a5a0-8bf5e44094f7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.418 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:05 np0005588919 NetworkManager[49104]: <info>  [1768919945.4190] manager: (tap722de795-61): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.423 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.425 225859 INFO os_vif [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:4b:da,bridge_name='br-int',has_traffic_filtering=True,id=722de795-61c5-4a11-ade3-6c19621e1054,network=Network(c5a9008d-9eea-43f2-a495-bf2e645a81fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722de795-61')
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.465 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.465 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.465 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] No VIF found with MAC fa:16:3e:be:4b:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.466 225859 INFO nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Using config drive
Jan 20 09:39:05 np0005588919 nova_compute[225855]: 2026-01-20 14:39:05.487 225859 DEBUG nova.storage.rbd_utils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] rbd image c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:39:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:05.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.327 225859 DEBUG nova.network.neutron [req-a13f5730-2c44-4baf-8ad3-87ea95078a0b req-638fa99b-ce39-4b86-a40e-f22791c5e373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Updated VIF entry in instance network info cache for port 722de795-61c5-4a11-ade3-6c19621e1054. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.328 225859 DEBUG nova.network.neutron [req-a13f5730-2c44-4baf-8ad3-87ea95078a0b req-638fa99b-ce39-4b86-a40e-f22791c5e373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Updating instance_info_cache with network_info: [{"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.353 225859 INFO nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Creating config drive at /var/lib/nova/instances/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7/disk.config
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.364 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq_kelpup execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.393 225859 DEBUG oslo_concurrency.lockutils [req-a13f5730-2c44-4baf-8ad3-87ea95078a0b req-638fa99b-ce39-4b86-a40e-f22791c5e373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.498 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq_kelpup" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.530 225859 DEBUG nova.storage.rbd_utils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] rbd image c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.534 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7/disk.config c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.678 225859 DEBUG oslo_concurrency.processutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7/disk.config c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.679 225859 INFO nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Deleting local config drive /var/lib/nova/instances/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7/disk.config because it was imported into RBD.
Jan 20 09:39:06 np0005588919 kernel: tap722de795-61: entered promiscuous mode
Jan 20 09:39:06 np0005588919 NetworkManager[49104]: <info>  [1768919946.7339] manager: (tap722de795-61): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Jan 20 09:39:06 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:06Z|00184|binding|INFO|Claiming lport 722de795-61c5-4a11-ade3-6c19621e1054 for this chassis.
Jan 20 09:39:06 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:06Z|00185|binding|INFO|722de795-61c5-4a11-ade3-6c19621e1054: Claiming fa:16:3e:be:4b:da 10.100.0.4
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.734 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.738 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.740 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.751 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:4b:da 10.100.0.4'], port_security=['fa:16:3e:be:4b:da 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c12d0bd2-ff69-4827-a5a0-8bf5e44094f7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5a9008d-9eea-43f2-a495-bf2e645a81fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4dadd5f5212f432693d35e765126f4df', 'neutron:revision_number': '2', 'neutron:security_group_ids': '333884db-2591-4fb4-b140-3b52543605e8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b4a1cca-fd45-4fc9-bc45-b55a7f22b84a, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=722de795-61c5-4a11-ade3-6c19621e1054) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.754 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 722de795-61c5-4a11-ade3-6c19621e1054 in datapath c5a9008d-9eea-43f2-a495-bf2e645a81fb bound to our chassis
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.758 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c5a9008d-9eea-43f2-a495-bf2e645a81fb
Jan 20 09:39:06 np0005588919 systemd-machined[194361]: New machine qemu-28-instance-0000003f.
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.767 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[84c95e56-e503-4026-a321-e0faf72904d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.769 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc5a9008d-91 in ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 20 09:39:06 np0005588919 systemd-udevd[250984]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.771 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc5a9008d-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.771 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[976e543c-60f2-4f58-81ed-77f3c7fad264]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.772 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca2c753-b802-46a9-a798-35555242802f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 NetworkManager[49104]: <info>  [1768919946.7802] device (tap722de795-61): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:39:06 np0005588919 NetworkManager[49104]: <info>  [1768919946.7811] device (tap722de795-61): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.783 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[c76002d9-93be-4988-91d5-9ca4d622bb56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.803 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:06 np0005588919 systemd[1]: Started Virtual Machine qemu-28-instance-0000003f.
Jan 20 09:39:06 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:06Z|00186|binding|INFO|Setting lport 722de795-61c5-4a11-ade3-6c19621e1054 ovn-installed in OVS
Jan 20 09:39:06 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:06Z|00187|binding|INFO|Setting lport 722de795-61c5-4a11-ade3-6c19621e1054 up in Southbound
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.807 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ea492631-f7d0-46cd-8362-ac13bdce486e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 nova_compute[225855]: 2026-01-20 14:39:06.808 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.835 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[9840bbcf-2b57-46e7-987f-8bf035d44d1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.840 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d303b835-bc0d-428a-82e8-364afab305b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 systemd-udevd[250988]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:39:06 np0005588919 NetworkManager[49104]: <info>  [1768919946.8411] manager: (tapc5a9008d-90): new Veth device (/org/freedesktop/NetworkManager/Devices/82)
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.869 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e33ad1ac-bdf1-4b8e-ad0d-d056653c8b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.872 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[0404ea2c-ad28-4755-9064-17a2f34fb7b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 NetworkManager[49104]: <info>  [1768919946.8943] device (tapc5a9008d-90): carrier: link connected
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.899 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f93ba5-66e1-4175-899d-cf9f6b4daaa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.914 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0cd223-22d8-493b-b334-932e2e7c31b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc5a9008d-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fa:2d:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498582, 'reachable_time': 19181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251017, 'error': None, 'target': 'ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.927 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[51fa788e-3843-4a44-bf1c-c61e5c275485]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefa:2dfa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498582, 'tstamp': 498582}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251018, 'error': None, 'target': 'ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:39:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:06.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.943 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e22893c6-b2aa-4e0e-8c4e-cf074020d2a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc5a9008d-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fa:2d:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498582, 'reachable_time': 19181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251019, 'error': None, 'target': 'ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:06.966 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6510ed5b-5061-41a1-9527-330ec295c697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:07.018 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ce57d9d2-3988-41f5-a168-913e4b822915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:07.020 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5a9008d-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:07.020 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:07.020 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5a9008d-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.022 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:07 np0005588919 NetworkManager[49104]: <info>  [1768919947.0226] manager: (tapc5a9008d-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 20 09:39:07 np0005588919 kernel: tapc5a9008d-90: entered promiscuous mode
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:07.024 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc5a9008d-90, col_values=(('external_ids', {'iface-id': 'f398fb65-c4f7-4041-baf6-a23646124813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.025 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:07 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:07Z|00188|binding|INFO|Releasing lport f398fb65-c4f7-4041-baf6-a23646124813 from this chassis (sb_readonly=0)
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.039 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:07.040 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c5a9008d-9eea-43f2-a495-bf2e645a81fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c5a9008d-9eea-43f2-a495-bf2e645a81fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:07.041 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[22bbd639-aaf4-4b81-860d-482241bf5aa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:07.042 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-c5a9008d-9eea-43f2-a495-bf2e645a81fb
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/c5a9008d-9eea-43f2-a495-bf2e645a81fb.pid.haproxy
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID c5a9008d-9eea-43f2-a495-bf2e645a81fb
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:39:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:07.046 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb', 'env', 'PROCESS_TAG=haproxy-c5a9008d-9eea-43f2-a495-bf2e645a81fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c5a9008d-9eea-43f2-a495-bf2e645a81fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:39:07 np0005588919 podman[251051]: 2026-01-20 14:39:07.435988424 +0000 UTC m=+0.052041920 container create 940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.446 225859 DEBUG nova.compute.manager [req-e889511f-ece4-45fe-ba9d-2e048063f26e req-0af0c0bf-3b31-4b56-bb79-af67e26d7b32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Received event network-vif-plugged-722de795-61c5-4a11-ade3-6c19621e1054 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.446 225859 DEBUG oslo_concurrency.lockutils [req-e889511f-ece4-45fe-ba9d-2e048063f26e req-0af0c0bf-3b31-4b56-bb79-af67e26d7b32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.447 225859 DEBUG oslo_concurrency.lockutils [req-e889511f-ece4-45fe-ba9d-2e048063f26e req-0af0c0bf-3b31-4b56-bb79-af67e26d7b32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.447 225859 DEBUG oslo_concurrency.lockutils [req-e889511f-ece4-45fe-ba9d-2e048063f26e req-0af0c0bf-3b31-4b56-bb79-af67e26d7b32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.447 225859 DEBUG nova.compute.manager [req-e889511f-ece4-45fe-ba9d-2e048063f26e req-0af0c0bf-3b31-4b56-bb79-af67e26d7b32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Processing event network-vif-plugged-722de795-61c5-4a11-ade3-6c19621e1054 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:39:07 np0005588919 systemd[1]: Started libpod-conmon-940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809.scope.
Jan 20 09:39:07 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:39:07 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ced956dcc0e0a49545b20b665d5cc72b5ad48cfb671bd1d0627044fdd6837f48/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:39:07 np0005588919 podman[251051]: 2026-01-20 14:39:07.407409307 +0000 UTC m=+0.023462803 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:39:07 np0005588919 podman[251051]: 2026-01-20 14:39:07.515748295 +0000 UTC m=+0.131801811 container init 940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:39:07 np0005588919 podman[251051]: 2026-01-20 14:39:07.520387596 +0000 UTC m=+0.136441082 container start 940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 20 09:39:07 np0005588919 neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb[251107]: [NOTICE]   (251112) : New worker (251116) forked
Jan 20 09:39:07 np0005588919 neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb[251107]: [NOTICE]   (251112) : Loading success.
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.588 225859 DEBUG nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.590 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919947.5888853, c12d0bd2-ff69-4827-a5a0-8bf5e44094f7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.590 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] VM Started (Lifecycle Event)#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.592 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.596 225859 INFO nova.virt.libvirt.driver [-] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Instance spawned successfully.#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.596 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.619 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.624 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.629 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.630 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.630 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.631 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.631 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.632 225859 DEBUG nova.virt.libvirt.driver [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.660 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.661 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919947.5891743, c12d0bd2-ff69-4827-a5a0-8bf5e44094f7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.661 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.704 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.710 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919947.5918474, c12d0bd2-ff69-4827-a5a0-8bf5e44094f7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.710 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.746 225859 INFO nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Took 7.09 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.746 225859 DEBUG nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.748 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.755 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.796 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.829 225859 INFO nova.compute.manager [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Took 8.07 seconds to build instance.#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.845 225859 DEBUG oslo_concurrency.lockutils [None req-d8f41aaa-9211-42c1-843f-fe24927da5f5 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.847 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 5.923s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.847 225859 INFO nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:39:07 np0005588919 nova_compute[225855]: 2026-01-20 14:39:07.847 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:07.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.132792) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948132832, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 459, "num_deletes": 251, "total_data_size": 527068, "memory_usage": 535360, "flush_reason": "Manual Compaction"}
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948149701, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 285520, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36189, "largest_seqno": 36643, "table_properties": {"data_size": 283148, "index_size": 472, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6627, "raw_average_key_size": 20, "raw_value_size": 278274, "raw_average_value_size": 856, "num_data_blocks": 21, "num_entries": 325, "num_filter_entries": 325, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919928, "oldest_key_time": 1768919928, "file_creation_time": 1768919948, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 16950 microseconds, and 2510 cpu microseconds.
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.149738) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 285520 bytes OK
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.149758) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.183298) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.183339) EVENT_LOG_v1 {"time_micros": 1768919948183328, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.183366) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 524240, prev total WAL file size 524240, number of live WAL files 2.
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.183921) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303032' seq:72057594037927935, type:22 .. '6D6772737461740031323534' seq:0, type:0; will stop at (end)
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(278KB)], [66(11MB)]
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948183951, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 12222156, "oldest_snapshot_seqno": -1}
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6056 keys, 8440354 bytes, temperature: kUnknown
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948291051, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 8440354, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8400921, "index_size": 23195, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15173, "raw_key_size": 155690, "raw_average_key_size": 25, "raw_value_size": 8293075, "raw_average_value_size": 1369, "num_data_blocks": 926, "num_entries": 6056, "num_filter_entries": 6056, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768919948, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.291263) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 8440354 bytes
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.412925) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 114.1 rd, 78.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 11.4 +0.0 blob) out(8.0 +0.0 blob), read-write-amplify(72.4) write-amplify(29.6) OK, records in: 6563, records dropped: 507 output_compression: NoCompression
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.412990) EVENT_LOG_v1 {"time_micros": 1768919948412966, "job": 40, "event": "compaction_finished", "compaction_time_micros": 107163, "compaction_time_cpu_micros": 38022, "output_level": 6, "num_output_files": 1, "total_output_size": 8440354, "num_input_records": 6563, "num_output_records": 6056, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948413427, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768919948417772, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.183831) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.417832) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.417839) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.417841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.417843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:39:08.417845) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:39:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:08.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:09 np0005588919 nova_compute[225855]: 2026-01-20 14:39:09.542 225859 DEBUG nova.compute.manager [req-b11ea75b-9b5c-4d46-8fd4-897161483b4b req-ebadc5b0-918c-4248-b1e0-fa1a0d134c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Received event network-vif-plugged-722de795-61c5-4a11-ade3-6c19621e1054 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:09 np0005588919 nova_compute[225855]: 2026-01-20 14:39:09.543 225859 DEBUG oslo_concurrency.lockutils [req-b11ea75b-9b5c-4d46-8fd4-897161483b4b req-ebadc5b0-918c-4248-b1e0-fa1a0d134c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:09 np0005588919 nova_compute[225855]: 2026-01-20 14:39:09.543 225859 DEBUG oslo_concurrency.lockutils [req-b11ea75b-9b5c-4d46-8fd4-897161483b4b req-ebadc5b0-918c-4248-b1e0-fa1a0d134c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:09 np0005588919 nova_compute[225855]: 2026-01-20 14:39:09.543 225859 DEBUG oslo_concurrency.lockutils [req-b11ea75b-9b5c-4d46-8fd4-897161483b4b req-ebadc5b0-918c-4248-b1e0-fa1a0d134c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:09 np0005588919 nova_compute[225855]: 2026-01-20 14:39:09.544 225859 DEBUG nova.compute.manager [req-b11ea75b-9b5c-4d46-8fd4-897161483b4b req-ebadc5b0-918c-4248-b1e0-fa1a0d134c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] No waiting events found dispatching network-vif-plugged-722de795-61c5-4a11-ade3-6c19621e1054 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:09 np0005588919 nova_compute[225855]: 2026-01-20 14:39:09.544 225859 WARNING nova.compute.manager [req-b11ea75b-9b5c-4d46-8fd4-897161483b4b req-ebadc5b0-918c-4248-b1e0-fa1a0d134c52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Received unexpected event network-vif-plugged-722de795-61c5-4a11-ade3-6c19621e1054 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:39:09 np0005588919 NetworkManager[49104]: <info>  [1768919949.5682] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 20 09:39:09 np0005588919 NetworkManager[49104]: <info>  [1768919949.5698] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Jan 20 09:39:09 np0005588919 nova_compute[225855]: 2026-01-20 14:39:09.567 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:09 np0005588919 nova_compute[225855]: 2026-01-20 14:39:09.763 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:09 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:09Z|00189|binding|INFO|Releasing lport f398fb65-c4f7-4041-baf6-a23646124813 from this chassis (sb_readonly=0)
Jan 20 09:39:09 np0005588919 nova_compute[225855]: 2026-01-20 14:39:09.798 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:09.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:10 np0005588919 nova_compute[225855]: 2026-01-20 14:39:10.052 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:10 np0005588919 nova_compute[225855]: 2026-01-20 14:39:10.419 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:10.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:11 np0005588919 nova_compute[225855]: 2026-01-20 14:39:11.686 225859 DEBUG nova.compute.manager [req-597f19f7-5dd2-493f-a826-3d3a02e718a7 req-49ec243d-2387-49f9-9ba5-04ba91dd1c72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Received event network-changed-722de795-61c5-4a11-ade3-6c19621e1054 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:11 np0005588919 nova_compute[225855]: 2026-01-20 14:39:11.687 225859 DEBUG nova.compute.manager [req-597f19f7-5dd2-493f-a826-3d3a02e718a7 req-49ec243d-2387-49f9-9ba5-04ba91dd1c72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Refreshing instance network info cache due to event network-changed-722de795-61c5-4a11-ade3-6c19621e1054. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:39:11 np0005588919 nova_compute[225855]: 2026-01-20 14:39:11.687 225859 DEBUG oslo_concurrency.lockutils [req-597f19f7-5dd2-493f-a826-3d3a02e718a7 req-49ec243d-2387-49f9-9ba5-04ba91dd1c72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:11 np0005588919 nova_compute[225855]: 2026-01-20 14:39:11.687 225859 DEBUG oslo_concurrency.lockutils [req-597f19f7-5dd2-493f-a826-3d3a02e718a7 req-49ec243d-2387-49f9-9ba5-04ba91dd1c72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:11 np0005588919 nova_compute[225855]: 2026-01-20 14:39:11.688 225859 DEBUG nova.network.neutron [req-597f19f7-5dd2-493f-a826-3d3a02e718a7 req-49ec243d-2387-49f9-9ba5-04ba91dd1c72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Refreshing network info cache for port 722de795-61c5-4a11-ade3-6c19621e1054 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:39:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:39:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:11.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:39:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:12.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:13 np0005588919 nova_compute[225855]: 2026-01-20 14:39:13.203 225859 DEBUG nova.network.neutron [req-597f19f7-5dd2-493f-a826-3d3a02e718a7 req-49ec243d-2387-49f9-9ba5-04ba91dd1c72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Updated VIF entry in instance network info cache for port 722de795-61c5-4a11-ade3-6c19621e1054. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:39:13 np0005588919 nova_compute[225855]: 2026-01-20 14:39:13.211 225859 DEBUG nova.network.neutron [req-597f19f7-5dd2-493f-a826-3d3a02e718a7 req-49ec243d-2387-49f9-9ba5-04ba91dd1c72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Updating instance_info_cache with network_info: [{"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:13 np0005588919 nova_compute[225855]: 2026-01-20 14:39:13.242 225859 DEBUG oslo_concurrency.lockutils [req-597f19f7-5dd2-493f-a826-3d3a02e718a7 req-49ec243d-2387-49f9-9ba5-04ba91dd1c72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:13.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:39:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:14.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:39:15 np0005588919 nova_compute[225855]: 2026-01-20 14:39:15.055 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:15 np0005588919 nova_compute[225855]: 2026-01-20 14:39:15.459 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:15.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:16.399 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:16.400 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:16.401 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:16.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:39:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:17.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:39:18 np0005588919 podman[251132]: 2026-01-20 14:39:18.052191511 +0000 UTC m=+0.085704370 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:39:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:18.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:39:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:19.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:39:20 np0005588919 nova_compute[225855]: 2026-01-20 14:39:20.054 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:20Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:be:4b:da 10.100.0.4
Jan 20 09:39:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:20Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:be:4b:da 10.100.0.4
Jan 20 09:39:20 np0005588919 nova_compute[225855]: 2026-01-20 14:39:20.509 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:39:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:20.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:39:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:21.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:22 np0005588919 nova_compute[225855]: 2026-01-20 14:39:22.375 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:22.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:39:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:39:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:23.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:39:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:39:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:24.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:25 np0005588919 nova_compute[225855]: 2026-01-20 14:39:25.056 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:25 np0005588919 nova_compute[225855]: 2026-01-20 14:39:25.554 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:25.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:26 np0005588919 nova_compute[225855]: 2026-01-20 14:39:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:26 np0005588919 nova_compute[225855]: 2026-01-20 14:39:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:26 np0005588919 nova_compute[225855]: 2026-01-20 14:39:26.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:26 np0005588919 nova_compute[225855]: 2026-01-20 14:39:26.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:39:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:26.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:27 np0005588919 nova_compute[225855]: 2026-01-20 14:39:27.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:27 np0005588919 nova_compute[225855]: 2026-01-20 14:39:27.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:39:27 np0005588919 nova_compute[225855]: 2026-01-20 14:39:27.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:39:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:27.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.274 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.274 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.274 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.274 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c12d0bd2-ff69-4827-a5a0-8bf5e44094f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.687 225859 DEBUG oslo_concurrency.lockutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Acquiring lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.687 225859 DEBUG oslo_concurrency.lockutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.688 225859 DEBUG oslo_concurrency.lockutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Acquiring lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.688 225859 DEBUG oslo_concurrency.lockutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.688 225859 DEBUG oslo_concurrency.lockutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.689 225859 INFO nova.compute.manager [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Terminating instance#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.691 225859 DEBUG nova.compute.manager [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:39:28 np0005588919 kernel: tap722de795-61 (unregistering): left promiscuous mode
Jan 20 09:39:28 np0005588919 NetworkManager[49104]: <info>  [1768919968.7992] device (tap722de795-61): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.849 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:28Z|00190|binding|INFO|Releasing lport 722de795-61c5-4a11-ade3-6c19621e1054 from this chassis (sb_readonly=0)
Jan 20 09:39:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:28Z|00191|binding|INFO|Setting lport 722de795-61c5-4a11-ade3-6c19621e1054 down in Southbound
Jan 20 09:39:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:28Z|00192|binding|INFO|Removing iface tap722de795-61 ovn-installed in OVS
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.852 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:28.870 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:4b:da 10.100.0.4'], port_security=['fa:16:3e:be:4b:da 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c12d0bd2-ff69-4827-a5a0-8bf5e44094f7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5a9008d-9eea-43f2-a495-bf2e645a81fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4dadd5f5212f432693d35e765126f4df', 'neutron:revision_number': '4', 'neutron:security_group_ids': '333884db-2591-4fb4-b140-3b52543605e8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b4a1cca-fd45-4fc9-bc45-b55a7f22b84a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=722de795-61c5-4a11-ade3-6c19621e1054) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:28.872 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 722de795-61c5-4a11-ade3-6c19621e1054 in datapath c5a9008d-9eea-43f2-a495-bf2e645a81fb unbound from our chassis#033[00m
Jan 20 09:39:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:28.873 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c5a9008d-9eea-43f2-a495-bf2e645a81fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:39:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:28.874 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[98d30263-ee8a-44bf-ac37-1043477b9bbc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:28.875 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb namespace which is not needed anymore#033[00m
Jan 20 09:39:28 np0005588919 nova_compute[225855]: 2026-01-20 14:39:28.884 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:28 np0005588919 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Jan 20 09:39:28 np0005588919 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000003f.scope: Consumed 13.404s CPU time.
Jan 20 09:39:28 np0005588919 systemd-machined[194361]: Machine qemu-28-instance-0000003f terminated.
Jan 20 09:39:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:39:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:28.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:39:28 np0005588919 podman[251468]: 2026-01-20 14:39:28.963930164 +0000 UTC m=+0.078530475 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:39:28 np0005588919 neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb[251107]: [NOTICE]   (251112) : haproxy version is 2.8.14-c23fe91
Jan 20 09:39:28 np0005588919 neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb[251107]: [NOTICE]   (251112) : path to executable is /usr/sbin/haproxy
Jan 20 09:39:28 np0005588919 neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb[251107]: [WARNING]  (251112) : Exiting Master process...
Jan 20 09:39:28 np0005588919 neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb[251107]: [WARNING]  (251112) : Exiting Master process...
Jan 20 09:39:28 np0005588919 neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb[251107]: [ALERT]    (251112) : Current worker (251116) exited with code 143 (Terminated)
Jan 20 09:39:28 np0005588919 neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb[251107]: [WARNING]  (251112) : All workers exited. Exiting... (0)
Jan 20 09:39:28 np0005588919 systemd[1]: libpod-940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809.scope: Deactivated successfully.
Jan 20 09:39:29 np0005588919 podman[251510]: 2026-01-20 14:39:29.005784069 +0000 UTC m=+0.043591326 container died 940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:39:29 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809-userdata-shm.mount: Deactivated successfully.
Jan 20 09:39:29 np0005588919 systemd[1]: var-lib-containers-storage-overlay-ced956dcc0e0a49545b20b665d5cc72b5ad48cfb671bd1d0627044fdd6837f48-merged.mount: Deactivated successfully.
Jan 20 09:39:29 np0005588919 podman[251510]: 2026-01-20 14:39:29.042719265 +0000 UTC m=+0.080526522 container cleanup 940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:39:29 np0005588919 systemd[1]: libpod-conmon-940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809.scope: Deactivated successfully.
Jan 20 09:39:29 np0005588919 podman[251541]: 2026-01-20 14:39:29.102660613 +0000 UTC m=+0.040498258 container remove 940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:39:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:29.110 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[54423d33-90ab-4c7a-b4a1-6a687ae99e5a]: (4, ('Tue Jan 20 02:39:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb (940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809)\n940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809\nTue Jan 20 02:39:29 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb (940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809)\n940eb6ae8742c65c6962999621b5cbe09e821ea1435acfd2e7c484c1e998c809\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:29.112 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4871bb-e188-4b00-b11e-6b6e519fa423]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:29.113 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5a9008d-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.115 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.132 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588919 kernel: tapc5a9008d-90: left promiscuous mode
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.133 225859 INFO nova.virt.libvirt.driver [-] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Instance destroyed successfully.#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.133 225859 DEBUG nova.objects.instance [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lazy-loading 'resources' on Instance uuid c12d0bd2-ff69-4827-a5a0-8bf5e44094f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.137 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:29.140 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d28e9fa4-31bb-4420-a0f3-2f1529c2f1cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.147 225859 DEBUG nova.virt.libvirt.vif [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:38:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1848412501',display_name='tempest-ServersTestManualDisk-server-1848412501',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1848412501',id=63,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN1H3ohoasKNqW9/qCu5grGBs3IV04fg9gxxB4grih5Zd5WxPzj2gaQyovrov9cUlcTcdLXAKoF+QUCFPVVxhI1Y4NXPI0qz/O7wrYwAYL2Je6ImmzeATRgxmFMwN+zj/A==',key_name='tempest-keypair-1139962663',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4dadd5f5212f432693d35e765126f4df',ramdisk_id='',reservation_id='r-l9tyz3ue',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-665340674',owner_user_name='tempest-ServersTestManualDisk-665340674-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ddee6eb6c32d451ca50c9ea499a23c1a',uuid=c12d0bd2-ff69-4827-a5a0-8bf5e44094f7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.148 225859 DEBUG nova.network.os_vif_util [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Converting VIF {"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.149 225859 DEBUG nova.network.os_vif_util [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:be:4b:da,bridge_name='br-int',has_traffic_filtering=True,id=722de795-61c5-4a11-ade3-6c19621e1054,network=Network(c5a9008d-9eea-43f2-a495-bf2e645a81fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722de795-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.149 225859 DEBUG os_vif [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:4b:da,bridge_name='br-int',has_traffic_filtering=True,id=722de795-61c5-4a11-ade3-6c19621e1054,network=Network(c5a9008d-9eea-43f2-a495-bf2e645a81fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722de795-61') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.152 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.152 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap722de795-61, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.154 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:29.155 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d15dcefd-4d3d-4229-9d9e-0c242e4e8b9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.156 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:29.156 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aad12c12-1571-4a45-8d28-68dcbc697565]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.157 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.161 225859 INFO os_vif [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:4b:da,bridge_name='br-int',has_traffic_filtering=True,id=722de795-61c5-4a11-ade3-6c19621e1054,network=Network(c5a9008d-9eea-43f2-a495-bf2e645a81fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722de795-61')#033[00m
Jan 20 09:39:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:29.172 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5d84890d-9f8c-4c37-a145-aa4f0b57330c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498575, 'reachable_time': 35908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251571, 'error': None, 'target': 'ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:29 np0005588919 systemd[1]: run-netns-ovnmeta\x2dc5a9008d\x2d9eea\x2d43f2\x2da495\x2dbf2e645a81fb.mount: Deactivated successfully.
Jan 20 09:39:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:29.176 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c5a9008d-9eea-43f2-a495-bf2e645a81fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:39:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:29.176 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3c907e-3157-413a-b6e5-e892c16d526c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.541 225859 INFO nova.virt.libvirt.driver [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Deleting instance files /var/lib/nova/instances/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_del#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.542 225859 INFO nova.virt.libvirt.driver [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Deletion of /var/lib/nova/instances/c12d0bd2-ff69-4827-a5a0-8bf5e44094f7_del complete#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.596 225859 INFO nova.compute.manager [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Took 0.91 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.597 225859 DEBUG oslo.service.loopingcall [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.598 225859 DEBUG nova.compute.manager [-] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.598 225859 DEBUG nova.network.neutron [-] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:39:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:39:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:29.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.918 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Updating instance_info_cache with network_info: [{"id": "722de795-61c5-4a11-ade3-6c19621e1054", "address": "fa:16:3e:be:4b:da", "network": {"id": "c5a9008d-9eea-43f2-a495-bf2e645a81fb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2127234184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4dadd5f5212f432693d35e765126f4df", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722de795-61", "ovs_interfaceid": "722de795-61c5-4a11-ade3-6c19621e1054", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.936 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.937 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.937 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.937 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.940 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.961 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.961 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.962 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.962 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:39:29 np0005588919 nova_compute[225855]: 2026-01-20 14:39:29.963 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.059 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:39:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/837437866' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.403 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.570 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.571 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4593MB free_disk=20.83075714111328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.572 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.572 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.649 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance c12d0bd2-ff69-4827-a5a0-8bf5e44094f7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.650 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.650 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.668 225859 DEBUG nova.network.neutron [-] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.696 225859 INFO nova.compute.manager [-] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Took 1.10 seconds to deallocate network for instance.#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.736 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.772 225859 DEBUG oslo_concurrency.lockutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:30 np0005588919 nova_compute[225855]: 2026-01-20 14:39:30.808 225859 DEBUG nova.compute.manager [req-70eb112b-a7c4-414a-9961-505215ac1462 req-dfcc9e80-7dae-4df3-86c7-2d4281107499 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Received event network-vif-deleted-722de795-61c5-4a11-ade3-6c19621e1054 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:30.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:39:31 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/219743222' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.138 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.144 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.163 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.193 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.193 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.193 225859 DEBUG oslo_concurrency.lockutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.231 225859 DEBUG nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Received event network-vif-unplugged-722de795-61c5-4a11-ade3-6c19621e1054 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.231 225859 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.231 225859 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.231 225859 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.232 225859 DEBUG nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] No waiting events found dispatching network-vif-unplugged-722de795-61c5-4a11-ade3-6c19621e1054 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.232 225859 WARNING nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Received unexpected event network-vif-unplugged-722de795-61c5-4a11-ade3-6c19621e1054 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.232 225859 DEBUG nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Received event network-vif-plugged-722de795-61c5-4a11-ade3-6c19621e1054 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.232 225859 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.232 225859 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.233 225859 DEBUG oslo_concurrency.lockutils [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.233 225859 DEBUG nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] No waiting events found dispatching network-vif-plugged-722de795-61c5-4a11-ade3-6c19621e1054 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.233 225859 WARNING nova.compute.manager [req-5567d833-541e-4ddc-9254-13c63f724321 req-2f09011f-d27c-4140-a389-8f7136e7da16 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Received unexpected event network-vif-plugged-722de795-61c5-4a11-ade3-6c19621e1054 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.250 225859 DEBUG oslo_concurrency.processutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.670 225859 DEBUG oslo_concurrency.processutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.676 225859 DEBUG nova.compute.provider_tree [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.695 225859 DEBUG nova.scheduler.client.report [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.719 225859 DEBUG oslo_concurrency.lockutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:31 np0005588919 nova_compute[225855]: 2026-01-20 14:39:31.753 225859 INFO nova.scheduler.client.report [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Deleted allocations for instance c12d0bd2-ff69-4827-a5a0-8bf5e44094f7#033[00m
Jan 20 09:39:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:31.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:32 np0005588919 nova_compute[225855]: 2026-01-20 14:39:32.022 225859 DEBUG oslo_concurrency.lockutils [None req-b4b8d585-b3c9-4471-ae80-c1c039e1eef1 ddee6eb6c32d451ca50c9ea499a23c1a 4dadd5f5212f432693d35e765126f4df - - default default] Lock "c12d0bd2-ff69-4827-a5a0-8bf5e44094f7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:32.079 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:32 np0005588919 nova_compute[225855]: 2026-01-20 14:39:32.080 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:32.082 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:39:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:32.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:33.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:34 np0005588919 nova_compute[225855]: 2026-01-20 14:39:34.154 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:34.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:35 np0005588919 nova_compute[225855]: 2026-01-20 14:39:35.061 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:35 np0005588919 nova_compute[225855]: 2026-01-20 14:39:35.189 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:39:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:35.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:36.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:37.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:38.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:39.084 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:39 np0005588919 nova_compute[225855]: 2026-01-20 14:39:39.155 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:39.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:40 np0005588919 nova_compute[225855]: 2026-01-20 14:39:40.063 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:40 np0005588919 nova_compute[225855]: 2026-01-20 14:39:40.377 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:40 np0005588919 nova_compute[225855]: 2026-01-20 14:39:40.588 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:39:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:40.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:39:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:41.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:42.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:43 np0005588919 nova_compute[225855]: 2026-01-20 14:39:43.496 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:43 np0005588919 nova_compute[225855]: 2026-01-20 14:39:43.496 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:43 np0005588919 nova_compute[225855]: 2026-01-20 14:39:43.526 225859 DEBUG nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:39:43 np0005588919 nova_compute[225855]: 2026-01-20 14:39:43.604 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:43 np0005588919 nova_compute[225855]: 2026-01-20 14:39:43.605 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:43 np0005588919 nova_compute[225855]: 2026-01-20 14:39:43.611 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:39:43 np0005588919 nova_compute[225855]: 2026-01-20 14:39:43.611 225859 INFO nova.compute.claims [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:39:43 np0005588919 nova_compute[225855]: 2026-01-20 14:39:43.708 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:43.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:39:44 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1901219903' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.129 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768919969.128188, c12d0bd2-ff69-4827-a5a0-8bf5e44094f7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.130 225859 INFO nova.compute.manager [-] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.145 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.150 225859 DEBUG nova.compute.provider_tree [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.157 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.730 225859 DEBUG nova.compute.manager [None req-00d7702b-bd2e-42a2-bfb2-eb448dd896eb - - - - - -] [instance: c12d0bd2-ff69-4827-a5a0-8bf5e44094f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.732 225859 DEBUG nova.scheduler.client.report [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.831 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.832 225859 DEBUG nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.891 225859 DEBUG nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.892 225859 DEBUG nova.network.neutron [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.947 225859 INFO nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:39:44 np0005588919 nova_compute[225855]: 2026-01-20 14:39:44.963 225859 DEBUG nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:39:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:44.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.064 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.284 225859 DEBUG nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.287 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.288 225859 INFO nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Creating image(s)#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.328 225859 DEBUG nova.storage.rbd_utils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image d5c2df9d-748f-4df2-9392-b45741975f65_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.362 225859 DEBUG nova.storage.rbd_utils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image d5c2df9d-748f-4df2-9392-b45741975f65_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.395 225859 DEBUG nova.storage.rbd_utils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image d5c2df9d-748f-4df2-9392-b45741975f65_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.399 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.431 225859 DEBUG nova.policy [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a9fb458d27434495a77a94827b6097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3f93fd4b2154dda9f38e62334904303', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.589 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.190s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.590 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.591 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.591 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.615 225859 DEBUG nova.storage.rbd_utils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image d5c2df9d-748f-4df2-9392-b45741975f65_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.619 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d5c2df9d-748f-4df2-9392-b45741975f65_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:45.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:45 np0005588919 nova_compute[225855]: 2026-01-20 14:39:45.940 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d5c2df9d-748f-4df2-9392-b45741975f65_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.321s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:46 np0005588919 nova_compute[225855]: 2026-01-20 14:39:46.054 225859 DEBUG nova.storage.rbd_utils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] resizing rbd image d5c2df9d-748f-4df2-9392-b45741975f65_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:39:46 np0005588919 nova_compute[225855]: 2026-01-20 14:39:46.595 225859 DEBUG nova.objects.instance [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'migration_context' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:46 np0005588919 nova_compute[225855]: 2026-01-20 14:39:46.616 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:39:46 np0005588919 nova_compute[225855]: 2026-01-20 14:39:46.617 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Ensure instance console log exists: /var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:39:46 np0005588919 nova_compute[225855]: 2026-01-20 14:39:46.618 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:46 np0005588919 nova_compute[225855]: 2026-01-20 14:39:46.618 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:46 np0005588919 nova_compute[225855]: 2026-01-20 14:39:46.619 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:46 np0005588919 nova_compute[225855]: 2026-01-20 14:39:46.910 225859 DEBUG nova.network.neutron [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Successfully created port: b48170b0-717d-48f0-8172-742a4a8596e9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:39:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:46.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:47 np0005588919 nova_compute[225855]: 2026-01-20 14:39:47.792 225859 DEBUG nova.network.neutron [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Successfully updated port: b48170b0-717d-48f0-8172-742a4a8596e9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:39:47 np0005588919 nova_compute[225855]: 2026-01-20 14:39:47.812 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:47 np0005588919 nova_compute[225855]: 2026-01-20 14:39:47.812 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:47 np0005588919 nova_compute[225855]: 2026-01-20 14:39:47.812 225859 DEBUG nova.network.neutron [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:39:47 np0005588919 nova_compute[225855]: 2026-01-20 14:39:47.912 225859 DEBUG nova.compute.manager [req-c581aa3d-73ba-44e7-aa5d-6b87cb19210d req-4b156c00-7c0c-43c2-8f0f-abb11063dbcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-changed-b48170b0-717d-48f0-8172-742a4a8596e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:47 np0005588919 nova_compute[225855]: 2026-01-20 14:39:47.912 225859 DEBUG nova.compute.manager [req-c581aa3d-73ba-44e7-aa5d-6b87cb19210d req-4b156c00-7c0c-43c2-8f0f-abb11063dbcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Refreshing instance network info cache due to event network-changed-b48170b0-717d-48f0-8172-742a4a8596e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:39:47 np0005588919 nova_compute[225855]: 2026-01-20 14:39:47.913 225859 DEBUG oslo_concurrency.lockutils [req-c581aa3d-73ba-44e7-aa5d-6b87cb19210d req-4b156c00-7c0c-43c2-8f0f-abb11063dbcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:47.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:47 np0005588919 nova_compute[225855]: 2026-01-20 14:39:47.981 225859 DEBUG nova.network.neutron [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:39:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:48.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:49 np0005588919 podman[251957]: 2026-01-20 14:39:49.037449258 +0000 UTC m=+0.081546631 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:39:49 np0005588919 nova_compute[225855]: 2026-01-20 14:39:49.158 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:49.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.034 225859 DEBUG nova.network.neutron [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.050 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.051 225859 DEBUG nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Instance network_info: |[{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.051 225859 DEBUG oslo_concurrency.lockutils [req-c581aa3d-73ba-44e7-aa5d-6b87cb19210d req-4b156c00-7c0c-43c2-8f0f-abb11063dbcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.051 225859 DEBUG nova.network.neutron [req-c581aa3d-73ba-44e7-aa5d-6b87cb19210d req-4b156c00-7c0c-43c2-8f0f-abb11063dbcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Refreshing network info cache for port b48170b0-717d-48f0-8172-742a4a8596e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.053 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Start _get_guest_xml network_info=[{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.057 225859 WARNING nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.062 225859 DEBUG nova.virt.libvirt.host [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.062 225859 DEBUG nova.virt.libvirt.host [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.065 225859 DEBUG nova.virt.libvirt.host [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.065 225859 DEBUG nova.virt.libvirt.host [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.066 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.066 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.067 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.067 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.067 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.067 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.067 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.067 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.068 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.068 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.068 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.068 225859 DEBUG nova.virt.hardware [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.070 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.093 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:39:50 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4224668871' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.479 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.508 225859 DEBUG nova.storage.rbd_utils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image d5c2df9d-748f-4df2-9392-b45741975f65_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.512 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:39:50 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2535468181' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.970 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:50.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.973 225859 DEBUG nova.virt.libvirt.vif [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.974 225859 DEBUG nova.network.os_vif_util [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.975 225859 DEBUG nova.network.os_vif_util [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:35:f2,bridge_name='br-int',has_traffic_filtering=True,id=b48170b0-717d-48f0-8172-742a4a8596e9,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb48170b0-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:50 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.977 225859 DEBUG nova.objects.instance [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'pci_devices' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:50.999 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <uuid>d5c2df9d-748f-4df2-9392-b45741975f65</uuid>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <name>instance-00000042</name>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:39:50</nova:creationTime>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <entry name="serial">d5c2df9d-748f-4df2-9392-b45741975f65</entry>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <entry name="uuid">d5c2df9d-748f-4df2-9392-b45741975f65</entry>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d5c2df9d-748f-4df2-9392-b45741975f65_disk">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d5c2df9d-748f-4df2-9392-b45741975f65_disk.config">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:3b:35:f2"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <target dev="tapb48170b0-71"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/console.log" append="off"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:39:51 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:39:51 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:39:51 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:39:51 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.001 225859 DEBUG nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Preparing to wait for external event network-vif-plugged-b48170b0-717d-48f0-8172-742a4a8596e9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.001 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.001 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.002 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.002 225859 DEBUG nova.virt.libvirt.vif [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:39:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.003 225859 DEBUG nova.network.os_vif_util [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.003 225859 DEBUG nova.network.os_vif_util [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:35:f2,bridge_name='br-int',has_traffic_filtering=True,id=b48170b0-717d-48f0-8172-742a4a8596e9,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb48170b0-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.004 225859 DEBUG os_vif [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:35:f2,bridge_name='br-int',has_traffic_filtering=True,id=b48170b0-717d-48f0-8172-742a4a8596e9,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb48170b0-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.005 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.005 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.006 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.010 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.011 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb48170b0-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.012 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb48170b0-71, col_values=(('external_ids', {'iface-id': 'b48170b0-717d-48f0-8172-742a4a8596e9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:35:f2', 'vm-uuid': 'd5c2df9d-748f-4df2-9392-b45741975f65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:51 np0005588919 NetworkManager[49104]: <info>  [1768919991.0553] manager: (tapb48170b0-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.054 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.059 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.061 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.062 225859 INFO os_vif [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:35:f2,bridge_name='br-int',has_traffic_filtering=True,id=b48170b0-717d-48f0-8172-742a4a8596e9,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb48170b0-71')#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.137 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.138 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.138 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:3b:35:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.139 225859 INFO nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Using config drive#033[00m
Jan 20 09:39:51 np0005588919 nova_compute[225855]: 2026-01-20 14:39:51.172 225859 DEBUG nova.storage.rbd_utils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image d5c2df9d-748f-4df2-9392-b45741975f65_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:51.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:52 np0005588919 nova_compute[225855]: 2026-01-20 14:39:52.564 225859 INFO nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Creating config drive at /var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/disk.config#033[00m
Jan 20 09:39:52 np0005588919 nova_compute[225855]: 2026-01-20 14:39:52.573 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc7a84mwk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:52 np0005588919 nova_compute[225855]: 2026-01-20 14:39:52.712 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc7a84mwk" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:52 np0005588919 nova_compute[225855]: 2026-01-20 14:39:52.742 225859 DEBUG nova.storage.rbd_utils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image d5c2df9d-748f-4df2-9392-b45741975f65_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:39:52 np0005588919 nova_compute[225855]: 2026-01-20 14:39:52.745 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/disk.config d5c2df9d-748f-4df2-9392-b45741975f65_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:39:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:52.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.103 225859 DEBUG oslo_concurrency.processutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/disk.config d5c2df9d-748f-4df2-9392-b45741975f65_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.357s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.104 225859 INFO nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Deleting local config drive /var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/disk.config because it was imported into RBD.#033[00m
Jan 20 09:39:53 np0005588919 kernel: tapb48170b0-71: entered promiscuous mode
Jan 20 09:39:53 np0005588919 NetworkManager[49104]: <info>  [1768919993.1734] manager: (tapb48170b0-71): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Jan 20 09:39:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:53Z|00193|binding|INFO|Claiming lport b48170b0-717d-48f0-8172-742a4a8596e9 for this chassis.
Jan 20 09:39:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:53Z|00194|binding|INFO|b48170b0-717d-48f0-8172-742a4a8596e9: Claiming fa:16:3e:3b:35:f2 10.100.0.13
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.174 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.179 225859 DEBUG nova.network.neutron [req-c581aa3d-73ba-44e7-aa5d-6b87cb19210d req-4b156c00-7c0c-43c2-8f0f-abb11063dbcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updated VIF entry in instance network info cache for port b48170b0-717d-48f0-8172-742a4a8596e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.180 225859 DEBUG nova.network.neutron [req-c581aa3d-73ba-44e7-aa5d-6b87cb19210d req-4b156c00-7c0c-43c2-8f0f-abb11063dbcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.183 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.202 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:35:f2 10.100.0.13'], port_security=['fa:16:3e:3b:35:f2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'd5c2df9d-748f-4df2-9392-b45741975f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b03ad0a9-4e4a-464d-b7d2-84d77d6554bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=b48170b0-717d-48f0-8172-742a4a8596e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.204 140354 INFO neutron.agent.ovn.metadata.agent [-] Port b48170b0-717d-48f0-8172-742a4a8596e9 in datapath fc21b99b-4e34-422c-be05-0a440009dac4 bound to our chassis#033[00m
Jan 20 09:39:53 np0005588919 systemd-udevd[252119]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.206 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:39:53 np0005588919 systemd-machined[194361]: New machine qemu-29-instance-00000042.
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.214 225859 DEBUG oslo_concurrency.lockutils [req-c581aa3d-73ba-44e7-aa5d-6b87cb19210d req-4b156c00-7c0c-43c2-8f0f-abb11063dbcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:39:53 np0005588919 NetworkManager[49104]: <info>  [1768919993.2184] device (tapb48170b0-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:39:53 np0005588919 NetworkManager[49104]: <info>  [1768919993.2196] device (tapb48170b0-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.219 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8a120482-2f70-4ea2-980a-a00c853da07c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.220 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfc21b99b-41 in ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.222 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfc21b99b-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.222 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2d6795bf-5883-4c27-99fb-7e50539ffd47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.223 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd754d9-0603-4b68-876e-0db00d5a14b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 systemd[1]: Started Virtual Machine qemu-29-instance-00000042.
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.238 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[5b050288-b431-454a-ac37-1514e860c620]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:53Z|00195|binding|INFO|Setting lport b48170b0-717d-48f0-8172-742a4a8596e9 ovn-installed in OVS
Jan 20 09:39:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:53Z|00196|binding|INFO|Setting lport b48170b0-717d-48f0-8172-742a4a8596e9 up in Southbound
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.295 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.308 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aed39129-127d-4ec6-9dd1-6b91268aa5d4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.339 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3756f0c5-21bb-4eec-84ad-63f0c401b381]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.345 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1e17b826-19dd-4086-b1d1-6516b75ce132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 NetworkManager[49104]: <info>  [1768919993.3465] manager: (tapfc21b99b-40): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.377 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[98fcb32b-3e1e-4f00-ad42-62f3aaa9f936]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.381 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[74a5e366-f194-400c-9eb8-aae384e5d29e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 NetworkManager[49104]: <info>  [1768919993.4109] device (tapfc21b99b-40): carrier: link connected
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.412 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[78c36fc3-3ce7-4e21-8f8d-236d23e1ad08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.428 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ff0a808e-ef11-4160-85a2-8c09b99fec49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503233, 'reachable_time': 15488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252153, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.443 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4e52b0b1-4ee3-4319-ad94-2d424ec06b59]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:5bd2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503233, 'tstamp': 503233}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252154, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.462 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a7dde585-17be-4b71-956d-16f05c9d896f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503233, 'reachable_time': 15488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252155, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.494 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5bbf51-4a3b-4bba-a65e-1df499d43df0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.496 225859 DEBUG nova.compute.manager [req-20a05fac-2422-4b0a-a0af-0a7fbd614100 req-1250e884-0fb0-4ead-9c23-0ed937f394e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-b48170b0-717d-48f0-8172-742a4a8596e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.497 225859 DEBUG oslo_concurrency.lockutils [req-20a05fac-2422-4b0a-a0af-0a7fbd614100 req-1250e884-0fb0-4ead-9c23-0ed937f394e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.497 225859 DEBUG oslo_concurrency.lockutils [req-20a05fac-2422-4b0a-a0af-0a7fbd614100 req-1250e884-0fb0-4ead-9c23-0ed937f394e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.497 225859 DEBUG oslo_concurrency.lockutils [req-20a05fac-2422-4b0a-a0af-0a7fbd614100 req-1250e884-0fb0-4ead-9c23-0ed937f394e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.498 225859 DEBUG nova.compute.manager [req-20a05fac-2422-4b0a-a0af-0a7fbd614100 req-1250e884-0fb0-4ead-9c23-0ed937f394e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Processing event network-vif-plugged-b48170b0-717d-48f0-8172-742a4a8596e9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.567 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[083c7f52-1c8f-4779-a206-6a6220076016]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.568 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.568 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.569 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.570 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:53 np0005588919 NetworkManager[49104]: <info>  [1768919993.5711] manager: (tapfc21b99b-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Jan 20 09:39:53 np0005588919 kernel: tapfc21b99b-40: entered promiscuous mode
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.574 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.573 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:53Z|00197|binding|INFO|Releasing lport 583df905-1d9f-49c1-b209-4b7fad1599f6 from this chassis (sb_readonly=0)
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.590 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.592 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fc21b99b-4e34-422c-be05-0a440009dac4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fc21b99b-4e34-422c-be05-0a440009dac4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.592 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf6dd0e-b96d-4336-ad73-8c8b5c627f7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.593 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-fc21b99b-4e34-422c-be05-0a440009dac4
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/fc21b99b-4e34-422c-be05-0a440009dac4.pid.haproxy
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID fc21b99b-4e34-422c-be05-0a440009dac4
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:39:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:39:53.594 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'env', 'PROCESS_TAG=haproxy-fc21b99b-4e34-422c-be05-0a440009dac4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fc21b99b-4e34-422c-be05-0a440009dac4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.694 225859 DEBUG nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.695 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919993.6939595, d5c2df9d-748f-4df2-9392-b45741975f65 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.695 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] VM Started (Lifecycle Event)#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.701 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.704 225859 INFO nova.virt.libvirt.driver [-] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Instance spawned successfully.#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.705 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.713 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.716 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.723 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.724 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.724 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.725 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.725 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.726 225859 DEBUG nova.virt.libvirt.driver [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.758 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.759 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919993.6946414, d5c2df9d-748f-4df2-9392-b45741975f65 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.759 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.785 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.789 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768919993.700576, d5c2df9d-748f-4df2-9392-b45741975f65 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.789 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.798 225859 INFO nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Took 8.51 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.798 225859 DEBUG nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.807 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:39:53 np0005588919 nova_compute[225855]: 2026-01-20 14:39:53.809 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:39:53 np0005588919 podman[252230]: 2026-01-20 14:39:53.942346769 +0000 UTC m=+0.053245419 container create ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 20 09:39:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:53.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:53 np0005588919 systemd[1]: Started libpod-conmon-ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48.scope.
Jan 20 09:39:54 np0005588919 podman[252230]: 2026-01-20 14:39:53.914655585 +0000 UTC m=+0.025554245 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:39:54 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:39:54 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1daf9cb422d92ee82699c7df2fb191ba24031863f4d9fd4cf2bcb41f474f37a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:39:54 np0005588919 podman[252230]: 2026-01-20 14:39:54.082510419 +0000 UTC m=+0.193409099 container init ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:39:54 np0005588919 podman[252230]: 2026-01-20 14:39:54.091490043 +0000 UTC m=+0.202388683 container start ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:39:54 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[252245]: [NOTICE]   (252249) : New worker (252251) forked
Jan 20 09:39:54 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[252245]: [NOTICE]   (252249) : Loading success.
Jan 20 09:39:54 np0005588919 nova_compute[225855]: 2026-01-20 14:39:54.170 225859 INFO nova.compute.manager [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Took 10.59 seconds to build instance.#033[00m
Jan 20 09:39:54 np0005588919 nova_compute[225855]: 2026-01-20 14:39:54.203 225859 DEBUG oslo_concurrency.lockutils [None req-bc9b2d72-ce4e-43cd-962e-9257da3c9038 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:54.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:55 np0005588919 nova_compute[225855]: 2026-01-20 14:39:55.067 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:55 np0005588919 nova_compute[225855]: 2026-01-20 14:39:55.621 225859 DEBUG nova.compute.manager [req-815f991c-db2b-4a4d-b4c2-9b0bfeb10db6 req-2708a757-a18b-4e0d-945f-e9c7f4ee9504 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-b48170b0-717d-48f0-8172-742a4a8596e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:55 np0005588919 nova_compute[225855]: 2026-01-20 14:39:55.622 225859 DEBUG oslo_concurrency.lockutils [req-815f991c-db2b-4a4d-b4c2-9b0bfeb10db6 req-2708a757-a18b-4e0d-945f-e9c7f4ee9504 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:39:55 np0005588919 nova_compute[225855]: 2026-01-20 14:39:55.623 225859 DEBUG oslo_concurrency.lockutils [req-815f991c-db2b-4a4d-b4c2-9b0bfeb10db6 req-2708a757-a18b-4e0d-945f-e9c7f4ee9504 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:39:55 np0005588919 nova_compute[225855]: 2026-01-20 14:39:55.623 225859 DEBUG oslo_concurrency.lockutils [req-815f991c-db2b-4a4d-b4c2-9b0bfeb10db6 req-2708a757-a18b-4e0d-945f-e9c7f4ee9504 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:39:55 np0005588919 nova_compute[225855]: 2026-01-20 14:39:55.623 225859 DEBUG nova.compute.manager [req-815f991c-db2b-4a4d-b4c2-9b0bfeb10db6 req-2708a757-a18b-4e0d-945f-e9c7f4ee9504 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-plugged-b48170b0-717d-48f0-8172-742a4a8596e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:39:55 np0005588919 nova_compute[225855]: 2026-01-20 14:39:55.624 225859 WARNING nova.compute.manager [req-815f991c-db2b-4a4d-b4c2-9b0bfeb10db6 req-2708a757-a18b-4e0d-945f-e9c7f4ee9504 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-plugged-b48170b0-717d-48f0-8172-742a4a8596e9 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:39:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:39:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:55.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:56 np0005588919 nova_compute[225855]: 2026-01-20 14:39:56.054 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:56.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:39:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:57.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:39:58 np0005588919 nova_compute[225855]: 2026-01-20 14:39:58.541 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:58 np0005588919 NetworkManager[49104]: <info>  [1768919998.5473] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 20 09:39:58 np0005588919 NetworkManager[49104]: <info>  [1768919998.5504] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 20 09:39:58 np0005588919 nova_compute[225855]: 2026-01-20 14:39:58.700 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:39:58Z|00198|binding|INFO|Releasing lport 583df905-1d9f-49c1-b209-4b7fad1599f6 from this chassis (sb_readonly=0)
Jan 20 09:39:58 np0005588919 nova_compute[225855]: 2026-01-20 14:39:58.716 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:39:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:39:58.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:39:59 np0005588919 nova_compute[225855]: 2026-01-20 14:39:59.151 225859 DEBUG nova.compute.manager [req-e6f14b1c-a0be-43e3-9ece-38c4b95ee32b req-6dd3ee31-83b2-401e-9264-719bc6bd8306 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-changed-b48170b0-717d-48f0-8172-742a4a8596e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:39:59 np0005588919 nova_compute[225855]: 2026-01-20 14:39:59.152 225859 DEBUG nova.compute.manager [req-e6f14b1c-a0be-43e3-9ece-38c4b95ee32b req-6dd3ee31-83b2-401e-9264-719bc6bd8306 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Refreshing instance network info cache due to event network-changed-b48170b0-717d-48f0-8172-742a4a8596e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:39:59 np0005588919 nova_compute[225855]: 2026-01-20 14:39:59.152 225859 DEBUG oslo_concurrency.lockutils [req-e6f14b1c-a0be-43e3-9ece-38c4b95ee32b req-6dd3ee31-83b2-401e-9264-719bc6bd8306 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:39:59 np0005588919 nova_compute[225855]: 2026-01-20 14:39:59.152 225859 DEBUG oslo_concurrency.lockutils [req-e6f14b1c-a0be-43e3-9ece-38c4b95ee32b req-6dd3ee31-83b2-401e-9264-719bc6bd8306 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:39:59 np0005588919 nova_compute[225855]: 2026-01-20 14:39:59.152 225859 DEBUG nova.network.neutron [req-e6f14b1c-a0be-43e3-9ece-38c4b95ee32b req-6dd3ee31-83b2-401e-9264-719bc6bd8306 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Refreshing network info cache for port b48170b0-717d-48f0-8172-742a4a8596e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:39:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:39:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:39:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:39:59.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:00 np0005588919 podman[252264]: 2026-01-20 14:40:00.008101832 +0000 UTC m=+0.056825251 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 09:40:00 np0005588919 nova_compute[225855]: 2026-01-20 14:40:00.069 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:00.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:01 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 09:40:01 np0005588919 nova_compute[225855]: 2026-01-20 14:40:01.057 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:01 np0005588919 nova_compute[225855]: 2026-01-20 14:40:01.264 225859 DEBUG nova.network.neutron [req-e6f14b1c-a0be-43e3-9ece-38c4b95ee32b req-6dd3ee31-83b2-401e-9264-719bc6bd8306 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updated VIF entry in instance network info cache for port b48170b0-717d-48f0-8172-742a4a8596e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:40:01 np0005588919 nova_compute[225855]: 2026-01-20 14:40:01.265 225859 DEBUG nova.network.neutron [req-e6f14b1c-a0be-43e3-9ece-38c4b95ee32b req-6dd3ee31-83b2-401e-9264-719bc6bd8306 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:01 np0005588919 nova_compute[225855]: 2026-01-20 14:40:01.294 225859 DEBUG oslo_concurrency.lockutils [req-e6f14b1c-a0be-43e3-9ece-38c4b95ee32b req-6dd3ee31-83b2-401e-9264-719bc6bd8306 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:01.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:02.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:03.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:40:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:04.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:40:05 np0005588919 nova_compute[225855]: 2026-01-20 14:40:05.070 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:05 np0005588919 nova_compute[225855]: 2026-01-20 14:40:05.600 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:05.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:06 np0005588919 nova_compute[225855]: 2026-01-20 14:40:06.118 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:06.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:07 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:07Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3b:35:f2 10.100.0.13
Jan 20 09:40:07 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:07Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3b:35:f2 10.100.0.13
Jan 20 09:40:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:40:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:07.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:40:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:08.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:09 np0005588919 nova_compute[225855]: 2026-01-20 14:40:09.659 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:09.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:10 np0005588919 nova_compute[225855]: 2026-01-20 14:40:10.073 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:10.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:11 np0005588919 nova_compute[225855]: 2026-01-20 14:40:11.122 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:11.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:12.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:40:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1177646554' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:40:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:40:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1177646554' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:40:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:13.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:14 np0005588919 nova_compute[225855]: 2026-01-20 14:40:14.719 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Acquiring lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:14 np0005588919 nova_compute[225855]: 2026-01-20 14:40:14.719 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:14 np0005588919 nova_compute[225855]: 2026-01-20 14:40:14.750 225859 DEBUG nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:40:14 np0005588919 nova_compute[225855]: 2026-01-20 14:40:14.840 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:14 np0005588919 nova_compute[225855]: 2026-01-20 14:40:14.841 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:14 np0005588919 nova_compute[225855]: 2026-01-20 14:40:14.849 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:40:14 np0005588919 nova_compute[225855]: 2026-01-20 14:40:14.849 225859 INFO nova.compute.claims [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:40:14 np0005588919 nova_compute[225855]: 2026-01-20 14:40:14.983 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:14.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.076 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:40:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2441771306' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.421 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.427 225859 DEBUG nova.compute.provider_tree [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.442 225859 DEBUG nova.scheduler.client.report [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.461 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.462 225859 DEBUG nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.509 225859 DEBUG nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.510 225859 DEBUG nova.network.neutron [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.526 225859 INFO nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.542 225859 DEBUG nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.621 225859 DEBUG nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.622 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.623 225859 INFO nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Creating image(s)#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.654 225859 DEBUG nova.storage.rbd_utils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] rbd image a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.684 225859 DEBUG nova.storage.rbd_utils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] rbd image a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.713 225859 DEBUG nova.storage.rbd_utils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] rbd image a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.717 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.757 225859 DEBUG nova.policy [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16c05e1ac16f428bab6b36346856235e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b50ce2f25e8943e28ddf8bf69c721e75', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.784 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.785 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.786 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.786 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.811 225859 DEBUG nova.storage.rbd_utils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] rbd image a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.815 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.963 225859 DEBUG oslo_concurrency.lockutils [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.964 225859 DEBUG oslo_concurrency.lockutils [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.964 225859 DEBUG nova.objects.instance [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'flavor' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:15.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:15 np0005588919 nova_compute[225855]: 2026-01-20 14:40:15.988 225859 DEBUG nova.objects.instance [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'pci_requests' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.001 225859 DEBUG nova.network.neutron [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.124 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.151 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.336s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.221 225859 DEBUG nova.storage.rbd_utils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] resizing rbd image a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.326 225859 DEBUG nova.objects.instance [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lazy-loading 'migration_context' on Instance uuid a96ccadd-ac1d-4040-8bcc-bebb460ee233 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.352 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.353 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Ensure instance console log exists: /var/lib/nova/instances/a96ccadd-ac1d-4040-8bcc-bebb460ee233/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.353 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.353 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.354 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.362 225859 DEBUG nova.policy [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a9fb458d27434495a77a94827b6097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3f93fd4b2154dda9f38e62334904303', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:40:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:16.400 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:16.401 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:16.402 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:16 np0005588919 nova_compute[225855]: 2026-01-20 14:40:16.926 225859 DEBUG nova.network.neutron [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Successfully created port: 19a89daa-770c-4c3f-970c-a9a462503b06 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:40:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:16.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:17 np0005588919 nova_compute[225855]: 2026-01-20 14:40:17.021 225859 DEBUG nova.network.neutron [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Successfully created port: db46acd4-809b-4127-ad48-870ae429b4d6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:40:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:17.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.029 225859 DEBUG nova.network.neutron [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Successfully updated port: db46acd4-809b-4127-ad48-870ae429b4d6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.059 225859 DEBUG oslo_concurrency.lockutils [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.060 225859 DEBUG oslo_concurrency.lockutils [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.060 225859 DEBUG nova.network.neutron [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.103 225859 DEBUG nova.compute.manager [req-d7555cd0-1901-4197-b5d5-bed5a20edf60 req-68a01821-fc6f-463a-9132-16e9cd02bc48 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-changed-db46acd4-809b-4127-ad48-870ae429b4d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.104 225859 DEBUG nova.compute.manager [req-d7555cd0-1901-4197-b5d5-bed5a20edf60 req-68a01821-fc6f-463a-9132-16e9cd02bc48 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Refreshing instance network info cache due to event network-changed-db46acd4-809b-4127-ad48-870ae429b4d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.104 225859 DEBUG oslo_concurrency.lockutils [req-d7555cd0-1901-4197-b5d5-bed5a20edf60 req-68a01821-fc6f-463a-9132-16e9cd02bc48 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.298 225859 WARNING nova.network.neutron [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] fc21b99b-4e34-422c-be05-0a440009dac4 already exists in list: networks containing: ['fc21b99b-4e34-422c-be05-0a440009dac4']. ignoring it#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.568 225859 DEBUG nova.network.neutron [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Successfully updated port: 19a89daa-770c-4c3f-970c-a9a462503b06 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.595 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Acquiring lock "refresh_cache-a96ccadd-ac1d-4040-8bcc-bebb460ee233" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.595 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Acquired lock "refresh_cache-a96ccadd-ac1d-4040-8bcc-bebb460ee233" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.595 225859 DEBUG nova.network.neutron [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:40:18 np0005588919 nova_compute[225855]: 2026-01-20 14:40:18.840 225859 DEBUG nova.network.neutron [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:40:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:19.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.760 225859 DEBUG nova.compute.manager [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Received event network-changed-19a89daa-770c-4c3f-970c-a9a462503b06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.760 225859 DEBUG nova.compute.manager [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Refreshing instance network info cache due to event network-changed-19a89daa-770c-4c3f-970c-a9a462503b06. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.760 225859 DEBUG oslo_concurrency.lockutils [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a96ccadd-ac1d-4040-8bcc-bebb460ee233" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.776 225859 DEBUG nova.network.neutron [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Updating instance_info_cache with network_info: [{"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.803 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Releasing lock "refresh_cache-a96ccadd-ac1d-4040-8bcc-bebb460ee233" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.803 225859 DEBUG nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Instance network_info: |[{"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.804 225859 DEBUG oslo_concurrency.lockutils [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a96ccadd-ac1d-4040-8bcc-bebb460ee233" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.804 225859 DEBUG nova.network.neutron [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Refreshing network info cache for port 19a89daa-770c-4c3f-970c-a9a462503b06 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.807 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Start _get_guest_xml network_info=[{"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.811 225859 WARNING nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.815 225859 DEBUG nova.virt.libvirt.host [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.815 225859 DEBUG nova.virt.libvirt.host [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.819 225859 DEBUG nova.virt.libvirt.host [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.819 225859 DEBUG nova.virt.libvirt.host [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.820 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.820 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.821 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.821 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.822 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.822 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.822 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.822 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.823 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.823 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.824 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.824 225859 DEBUG nova.virt.hardware [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:40:19 np0005588919 nova_compute[225855]: 2026-01-20 14:40:19.827 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:19.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.051 225859 DEBUG nova.network.neutron [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.073 225859 DEBUG oslo_concurrency.lockutils [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.074 225859 DEBUG oslo_concurrency.lockutils [req-d7555cd0-1901-4197-b5d5-bed5a20edf60 req-68a01821-fc6f-463a-9132-16e9cd02bc48 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.074 225859 DEBUG nova.network.neutron [req-d7555cd0-1901-4197-b5d5-bed5a20edf60 req-68a01821-fc6f-463a-9132-16e9cd02bc48 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Refreshing network info cache for port db46acd4-809b-4127-ad48-870ae429b4d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.078 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.083 225859 DEBUG nova.virt.libvirt.vif [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.084 225859 DEBUG nova.network.os_vif_util [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.085 225859 DEBUG nova.network.os_vif_util [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.087 225859 DEBUG os_vif [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.088 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.088 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.089 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.093 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.093 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb46acd4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.094 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdb46acd4-80, col_values=(('external_ids', {'iface-id': 'db46acd4-809b-4127-ad48-870ae429b4d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:df:88', 'vm-uuid': 'd5c2df9d-748f-4df2-9392-b45741975f65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.095 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 NetworkManager[49104]: <info>  [1768920020.0962] manager: (tapdb46acd4-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.098 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.104 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.105 225859 INFO os_vif [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80')#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.105 225859 DEBUG nova.virt.libvirt.vif [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.106 225859 DEBUG nova.network.os_vif_util [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.106 225859 DEBUG nova.network.os_vif_util [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.109 225859 DEBUG nova.virt.libvirt.guest [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] attach device xml: <interface type="ethernet">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:24:df:88"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <target dev="tapdb46acd4-80"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]: </interface>
Jan 20 09:40:20 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:40:20 np0005588919 podman[252533]: 2026-01-20 14:40:20.113997781 +0000 UTC m=+0.140360406 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 20 09:40:20 np0005588919 kernel: tapdb46acd4-80: entered promiscuous mode
Jan 20 09:40:20 np0005588919 NetworkManager[49104]: <info>  [1768920020.1213] manager: (tapdb46acd4-80): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Jan 20 09:40:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:20Z|00199|binding|INFO|Claiming lport db46acd4-809b-4127-ad48-870ae429b4d6 for this chassis.
Jan 20 09:40:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:20Z|00200|binding|INFO|db46acd4-809b-4127-ad48-870ae429b4d6: Claiming fa:16:3e:24:df:88 10.100.0.12
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.125 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.131 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:df:88 10.100.0.12'], port_security=['fa:16:3e:24:df:88 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd5c2df9d-748f-4df2-9392-b45741975f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '2', 'neutron:security_group_ids': '52cb9fb4-4318-4f53-9b5a-002d95792517', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=db46acd4-809b-4127-ad48-870ae429b4d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.132 140354 INFO neutron.agent.ovn.metadata.agent [-] Port db46acd4-809b-4127-ad48-870ae429b4d6 in datapath fc21b99b-4e34-422c-be05-0a440009dac4 bound to our chassis#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.134 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:40:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:20Z|00201|binding|INFO|Setting lport db46acd4-809b-4127-ad48-870ae429b4d6 ovn-installed in OVS
Jan 20 09:40:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:20Z|00202|binding|INFO|Setting lport db46acd4-809b-4127-ad48-870ae429b4d6 up in Southbound
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.142 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.163 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2cf441-3877-49ac-9507-4c443d1b940f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:20 np0005588919 systemd-udevd[252585]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:40:20 np0005588919 NetworkManager[49104]: <info>  [1768920020.1925] device (tapdb46acd4-80): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:40:20 np0005588919 NetworkManager[49104]: <info>  [1768920020.1935] device (tapdb46acd4-80): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.199 225859 DEBUG nova.virt.libvirt.driver [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.199 225859 DEBUG nova.virt.libvirt.driver [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.199 225859 DEBUG nova.virt.libvirt.driver [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:3b:35:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.200 225859 DEBUG nova.virt.libvirt.driver [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:24:df:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.199 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[bef15fdd-ff0f-427c-b842-aaef978e6fd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.202 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f953f56c-5726-4e48-b3dd-b13438ab0689]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.225 225859 DEBUG nova.virt.libvirt.guest [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:40:20</nova:creationTime>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <nova:port uuid="db46acd4-809b-4127-ad48-870ae429b4d6">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:40:20 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:40:20 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.233 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[0e8eadce-0c54-4324-b474-92960aa1cb23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.248 225859 DEBUG oslo_concurrency.lockutils [None req-bc0698b6-4f44-4581-89b2-1332a8a98461 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 4.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.249 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aec06468-7fdb-4a3f-8c9f-eeafa279ca7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503233, 'reachable_time': 15488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252592, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.266 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[13309b0f-81bb-4914-86aa-6e25371b19fa]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503245, 'tstamp': 503245}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252593, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503248, 'tstamp': 503248}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252593, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.268 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.270 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.271 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.272 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.272 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:20.273 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:40:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2413042588' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.306 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.331 225859 DEBUG nova.storage.rbd_utils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] rbd image a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.335 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:40:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/564781149' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.772 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.774 225859 DEBUG nova.virt.libvirt.vif [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:40:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=68,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbOWirsAXlTIoevt4kXzpqBpapeg8X6KpUPmzDXnXlw7wqoLKHnmHfUIYL+FmHPJoWs+SV643EEJY+tqAkcrZlCPnWit4UcMgPhE0LGoYJ6xDnxZGwNzSj5VV503kGh5A==',key_name='tempest-keypair-1553012660',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b50ce2f25e8943e28ddf8bf69c721e75',ramdisk_id='',reservation_id='r-w26et9o4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-626473092',owner_user_name='tempest-ServersTestFqdnHostnames-626473092-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:40:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='16c05e1ac16f428bab6b36346856235e',uuid=a96ccadd-ac1d-4040-8bcc-bebb460ee233,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.775 225859 DEBUG nova.network.os_vif_util [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Converting VIF {"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.776 225859 DEBUG nova.network.os_vif_util [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:c6:66,bridge_name='br-int',has_traffic_filtering=True,id=19a89daa-770c-4c3f-970c-a9a462503b06,network=Network(1002188b-c6a7-4b59-9326-3a1a837a00fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a89daa-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.777 225859 DEBUG nova.objects.instance [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lazy-loading 'pci_devices' on Instance uuid a96ccadd-ac1d-4040-8bcc-bebb460ee233 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.795 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <uuid>a96ccadd-ac1d-4040-8bcc-bebb460ee233</uuid>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <name>instance-00000044</name>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <nova:name>guest-instance-1.domain.com</nova:name>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:40:19</nova:creationTime>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <nova:user uuid="16c05e1ac16f428bab6b36346856235e">tempest-ServersTestFqdnHostnames-626473092-project-member</nova:user>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <nova:project uuid="b50ce2f25e8943e28ddf8bf69c721e75">tempest-ServersTestFqdnHostnames-626473092</nova:project>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <nova:port uuid="19a89daa-770c-4c3f-970c-a9a462503b06">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <entry name="serial">a96ccadd-ac1d-4040-8bcc-bebb460ee233</entry>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <entry name="uuid">a96ccadd-ac1d-4040-8bcc-bebb460ee233</entry>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk.config">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:7c:c6:66"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <target dev="tap19a89daa-77"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/a96ccadd-ac1d-4040-8bcc-bebb460ee233/console.log" append="off"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:40:20 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:40:20 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:40:20 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:40:20 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.797 225859 DEBUG nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Preparing to wait for external event network-vif-plugged-19a89daa-770c-4c3f-970c-a9a462503b06 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.797 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Acquiring lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.797 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.798 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.798 225859 DEBUG nova.virt.libvirt.vif [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:40:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=68,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbOWirsAXlTIoevt4kXzpqBpapeg8X6KpUPmzDXnXlw7wqoLKHnmHfUIYL+FmHPJoWs+SV643EEJY+tqAkcrZlCPnWit4UcMgPhE0LGoYJ6xDnxZGwNzSj5VV503kGh5A==',key_name='tempest-keypair-1553012660',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b50ce2f25e8943e28ddf8bf69c721e75',ramdisk_id='',reservation_id='r-w26et9o4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-626473092',owner_user_name='tempest-ServersTestFqdnHostnames-626473092-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:40:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='16c05e1ac16f428bab6b36346856235e',uuid=a96ccadd-ac1d-4040-8bcc-bebb460ee233,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.799 225859 DEBUG nova.network.os_vif_util [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Converting VIF {"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.799 225859 DEBUG nova.network.os_vif_util [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:c6:66,bridge_name='br-int',has_traffic_filtering=True,id=19a89daa-770c-4c3f-970c-a9a462503b06,network=Network(1002188b-c6a7-4b59-9326-3a1a837a00fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a89daa-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.799 225859 DEBUG os_vif [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:c6:66,bridge_name='br-int',has_traffic_filtering=True,id=19a89daa-770c-4c3f-970c-a9a462503b06,network=Network(1002188b-c6a7-4b59-9326-3a1a837a00fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a89daa-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.800 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.800 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.801 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.803 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.804 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19a89daa-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.804 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap19a89daa-77, col_values=(('external_ids', {'iface-id': '19a89daa-770c-4c3f-970c-a9a462503b06', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:c6:66', 'vm-uuid': 'a96ccadd-ac1d-4040-8bcc-bebb460ee233'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.806 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.808 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:20 np0005588919 NetworkManager[49104]: <info>  [1768920020.8087] manager: (tap19a89daa-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.811 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.813 225859 INFO os_vif [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:c6:66,bridge_name='br-int',has_traffic_filtering=True,id=19a89daa-770c-4c3f-970c-a9a462503b06,network=Network(1002188b-c6a7-4b59-9326-3a1a837a00fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a89daa-77')#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.893 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.894 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.894 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] No VIF found with MAC fa:16:3e:7c:c6:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.894 225859 INFO nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Using config drive#033[00m
Jan 20 09:40:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.923 225859 DEBUG nova.storage.rbd_utils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] rbd image a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.976 225859 DEBUG oslo_concurrency.lockutils [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.976 225859 DEBUG oslo_concurrency.lockutils [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:20 np0005588919 nova_compute[225855]: 2026-01-20 14:40:20.977 225859 DEBUG nova.objects.instance [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'flavor' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:21.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:21Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:df:88 10.100.0.12
Jan 20 09:40:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:21Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:df:88 10.100.0.12
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.338 225859 INFO nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Creating config drive at /var/lib/nova/instances/a96ccadd-ac1d-4040-8bcc-bebb460ee233/disk.config#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.345 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a96ccadd-ac1d-4040-8bcc-bebb460ee233/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnlmyqo5i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.421 225859 DEBUG nova.network.neutron [req-d7555cd0-1901-4197-b5d5-bed5a20edf60 req-68a01821-fc6f-463a-9132-16e9cd02bc48 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updated VIF entry in instance network info cache for port db46acd4-809b-4127-ad48-870ae429b4d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.421 225859 DEBUG nova.network.neutron [req-d7555cd0-1901-4197-b5d5-bed5a20edf60 req-68a01821-fc6f-463a-9132-16e9cd02bc48 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.432 225859 DEBUG nova.objects.instance [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'pci_requests' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.448 225859 DEBUG oslo_concurrency.lockutils [req-d7555cd0-1901-4197-b5d5-bed5a20edf60 req-68a01821-fc6f-463a-9132-16e9cd02bc48 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.450 225859 DEBUG nova.network.neutron [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.478 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a96ccadd-ac1d-4040-8bcc-bebb460ee233/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnlmyqo5i" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.505 225859 DEBUG nova.storage.rbd_utils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] rbd image a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.510 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a96ccadd-ac1d-4040-8bcc-bebb460ee233/disk.config a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.533 225859 DEBUG nova.network.neutron [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Updated VIF entry in instance network info cache for port 19a89daa-770c-4c3f-970c-a9a462503b06. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.533 225859 DEBUG nova.network.neutron [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Updating instance_info_cache with network_info: [{"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.552 225859 DEBUG oslo_concurrency.lockutils [req-9aa08c3b-df5d-455d-a6c7-b122e663321f req-3850f364-33e1-4a03-8ef0-db34ae5d7263 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a96ccadd-ac1d-4040-8bcc-bebb460ee233" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.638 225859 DEBUG nova.policy [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a9fb458d27434495a77a94827b6097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3f93fd4b2154dda9f38e62334904303', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.861 225859 DEBUG nova.compute.manager [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-db46acd4-809b-4127-ad48-870ae429b4d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.862 225859 DEBUG oslo_concurrency.lockutils [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.862 225859 DEBUG oslo_concurrency.lockutils [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.862 225859 DEBUG oslo_concurrency.lockutils [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.862 225859 DEBUG nova.compute.manager [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-plugged-db46acd4-809b-4127-ad48-870ae429b4d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.863 225859 WARNING nova.compute.manager [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-plugged-db46acd4-809b-4127-ad48-870ae429b4d6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.863 225859 DEBUG nova.compute.manager [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-db46acd4-809b-4127-ad48-870ae429b4d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.863 225859 DEBUG oslo_concurrency.lockutils [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.866 225859 DEBUG oslo_concurrency.lockutils [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.866 225859 DEBUG oslo_concurrency.lockutils [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.867 225859 DEBUG nova.compute.manager [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-plugged-db46acd4-809b-4127-ad48-870ae429b4d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:21 np0005588919 nova_compute[225855]: 2026-01-20 14:40:21.867 225859 WARNING nova.compute.manager [req-08521e48-03a6-4901-9492-78424ce68bbd req-4dac17cb-4548-4fa9-abc6-6a883c1f4e8a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-plugged-db46acd4-809b-4127-ad48-870ae429b4d6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:40:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:21.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.192 225859 DEBUG nova.network.neutron [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Successfully created port: 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.225 225859 DEBUG oslo_concurrency.processutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a96ccadd-ac1d-4040-8bcc-bebb460ee233/disk.config a96ccadd-ac1d-4040-8bcc-bebb460ee233_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.716s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.226 225859 INFO nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Deleting local config drive /var/lib/nova/instances/a96ccadd-ac1d-4040-8bcc-bebb460ee233/disk.config because it was imported into RBD.#033[00m
Jan 20 09:40:22 np0005588919 kernel: tap19a89daa-77: entered promiscuous mode
Jan 20 09:40:22 np0005588919 NetworkManager[49104]: <info>  [1768920022.2876] manager: (tap19a89daa-77): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Jan 20 09:40:22 np0005588919 NetworkManager[49104]: <info>  [1768920022.3120] device (tap19a89daa-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:40:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:22Z|00203|binding|INFO|Claiming lport 19a89daa-770c-4c3f-970c-a9a462503b06 for this chassis.
Jan 20 09:40:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:22Z|00204|binding|INFO|19a89daa-770c-4c3f-970c-a9a462503b06: Claiming fa:16:3e:7c:c6:66 10.100.0.12
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.312 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:22 np0005588919 NetworkManager[49104]: <info>  [1768920022.3148] device (tap19a89daa-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.318 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:c6:66 10.100.0.12'], port_security=['fa:16:3e:7c:c6:66 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a96ccadd-ac1d-4040-8bcc-bebb460ee233', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1002188b-c6a7-4b59-9326-3a1a837a00fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b50ce2f25e8943e28ddf8bf69c721e75', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b713641f-ba5a-4dd0-8917-baab1ca61007', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01ad0ee9-ef23-4072-99f7-406bf8559611, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=19a89daa-770c-4c3f-970c-a9a462503b06) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.319 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 19a89daa-770c-4c3f-970c-a9a462503b06 in datapath 1002188b-c6a7-4b59-9326-3a1a837a00fd bound to our chassis#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.321 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1002188b-c6a7-4b59-9326-3a1a837a00fd#033[00m
Jan 20 09:40:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:22Z|00205|binding|INFO|Setting lport 19a89daa-770c-4c3f-970c-a9a462503b06 ovn-installed in OVS
Jan 20 09:40:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:22Z|00206|binding|INFO|Setting lport 19a89daa-770c-4c3f-970c-a9a462503b06 up in Southbound
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.331 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.331 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[68768ec4-5faf-45cc-8ef9-98e2e72ab28f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.331 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1002188b-c1 in ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.333 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1002188b-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.334 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cf763fe0-7daf-4ff8-9184-4227016539d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.334 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[75d3cae2-1f87-4e12-aba9-6a35c4b1dc37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:22 np0005588919 systemd-machined[194361]: New machine qemu-30-instance-00000044.
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.352 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[fffa861e-2f6f-4b7d-8b03-98460518d36d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 systemd[1]: Started Virtual Machine qemu-30-instance-00000044.
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.381 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f20c6513-3daa-4c5f-808d-10655ee99497]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.412 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[cc856307-acf6-4afb-a125-b8e7bf691c48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 NetworkManager[49104]: <info>  [1768920022.4192] manager: (tap1002188b-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.418 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6bcfe5a6-97c4-4769-8873-e3a719febdd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.448 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f72cf4-db2f-4c15-bbce-8d8f4d74b2fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.452 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b2de6250-bf99-46a9-96eb-b9bc334db59b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 NetworkManager[49104]: <info>  [1768920022.4747] device (tap1002188b-c0): carrier: link connected
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.480 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c8fec8-9633-46e3-a49c-da00c34f27ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.494 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[07dfeb92-1cf1-4002-b9bf-f6a409f6be4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1002188b-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:44:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506140, 'reachable_time': 26689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252741, 'error': None, 'target': 'ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.511 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[77715749-e12e-46d7-8b4e-67cdd88dc28c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:443e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506140, 'tstamp': 506140}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252742, 'error': None, 'target': 'ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.528 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe43366-cbb2-4664-8c27-5ac23e68341e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1002188b-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:44:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506140, 'reachable_time': 26689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252743, 'error': None, 'target': 'ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.564 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[78cf3aa7-8ad8-4ab0-8f48-0a30c5e16824]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.631 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c62c9dd0-3031-4bec-90ea-550fd2664d44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.632 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1002188b-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.633 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.633 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1002188b-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.633 225859 DEBUG nova.compute.manager [req-22bed124-98d5-413c-b58b-10aefd809095 req-6e81b693-7275-4900-8236-aec7c1c2a0e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Received event network-vif-plugged-19a89daa-770c-4c3f-970c-a9a462503b06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.633 225859 DEBUG oslo_concurrency.lockutils [req-22bed124-98d5-413c-b58b-10aefd809095 req-6e81b693-7275-4900-8236-aec7c1c2a0e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.633 225859 DEBUG oslo_concurrency.lockutils [req-22bed124-98d5-413c-b58b-10aefd809095 req-6e81b693-7275-4900-8236-aec7c1c2a0e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.634 225859 DEBUG oslo_concurrency.lockutils [req-22bed124-98d5-413c-b58b-10aefd809095 req-6e81b693-7275-4900-8236-aec7c1c2a0e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.634 225859 DEBUG nova.compute.manager [req-22bed124-98d5-413c-b58b-10aefd809095 req-6e81b693-7275-4900-8236-aec7c1c2a0e2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Processing event network-vif-plugged-19a89daa-770c-4c3f-970c-a9a462503b06 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.634 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:22 np0005588919 kernel: tap1002188b-c0: entered promiscuous mode
Jan 20 09:40:22 np0005588919 NetworkManager[49104]: <info>  [1768920022.6354] manager: (tap1002188b-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.637 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.639 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1002188b-c0, col_values=(('external_ids', {'iface-id': '8cff17f5-b792-4e6f-8f1e-6c48322af961'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.640 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:22Z|00207|binding|INFO|Releasing lport 8cff17f5-b792-4e6f-8f1e-6c48322af961 from this chassis (sb_readonly=0)
Jan 20 09:40:22 np0005588919 nova_compute[225855]: 2026-01-20 14:40:22.656 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.658 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1002188b-c6a7-4b59-9326-3a1a837a00fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1002188b-c6a7-4b59-9326-3a1a837a00fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.659 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[03c88e3e-8394-44ba-a481-46d61cee9f38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.659 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-1002188b-c6a7-4b59-9326-3a1a837a00fd
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/1002188b-c6a7-4b59-9326-3a1a837a00fd.pid.haproxy
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 1002188b-c6a7-4b59-9326-3a1a837a00fd
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:40:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:22.660 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd', 'env', 'PROCESS_TAG=haproxy-1002188b-c6a7-4b59-9326-3a1a837a00fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1002188b-c6a7-4b59-9326-3a1a837a00fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:40:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:23.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.060 225859 DEBUG nova.network.neutron [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Successfully updated port: 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.074 225859 DEBUG oslo_concurrency.lockutils [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.074 225859 DEBUG oslo_concurrency.lockutils [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.074 225859 DEBUG nova.network.neutron [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:40:23 np0005588919 podman[252800]: 2026-01-20 14:40:22.987161504 +0000 UTC m=+0.021448409 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:40:23 np0005588919 podman[252800]: 2026-01-20 14:40:23.092338132 +0000 UTC m=+0.126625037 container create 38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:40:23 np0005588919 systemd[1]: Started libpod-conmon-38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b.scope.
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.159 225859 DEBUG nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.162 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920023.1591918, a96ccadd-ac1d-4040-8bcc-bebb460ee233 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.163 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] VM Started (Lifecycle Event)#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.167 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.171 225859 INFO nova.virt.libvirt.driver [-] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Instance spawned successfully.#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.172 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:40:23 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:40:23 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc585d326ff1df1a671eac1b099fad631006b99f0da45b11e11943fba074a4f1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.190 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.196 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.200 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.200 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.201 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.201 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.201 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.202 225859 DEBUG nova.virt.libvirt.driver [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:40:23 np0005588919 podman[252800]: 2026-01-20 14:40:23.214015378 +0000 UTC m=+0.248302373 container init 38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.215 225859 WARNING nova.network.neutron [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] fc21b99b-4e34-422c-be05-0a440009dac4 already exists in list: networks containing: ['fc21b99b-4e34-422c-be05-0a440009dac4']. ignoring it#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.216 225859 WARNING nova.network.neutron [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] fc21b99b-4e34-422c-be05-0a440009dac4 already exists in list: networks containing: ['fc21b99b-4e34-422c-be05-0a440009dac4']. ignoring it#033[00m
Jan 20 09:40:23 np0005588919 podman[252800]: 2026-01-20 14:40:23.219299748 +0000 UTC m=+0.253586643 container start 38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.227 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.227 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920023.1603715, a96ccadd-ac1d-4040-8bcc-bebb460ee233 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.228 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:40:23 np0005588919 neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd[252832]: [NOTICE]   (252836) : New worker (252838) forked
Jan 20 09:40:23 np0005588919 neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd[252832]: [NOTICE]   (252836) : Loading success.
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.251 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.255 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920023.1641107, a96ccadd-ac1d-4040-8bcc-bebb460ee233 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.255 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.259 225859 INFO nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Took 7.64 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.259 225859 DEBUG nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.269 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.273 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.303 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.323 225859 INFO nova.compute.manager [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Took 8.52 seconds to build instance.#033[00m
Jan 20 09:40:23 np0005588919 nova_compute[225855]: 2026-01-20 14:40:23.338 225859 DEBUG oslo_concurrency.lockutils [None req-30feda58-4032-4227-b08f-128214a95310 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:23.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:25.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.083 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.146 225859 DEBUG nova.compute.manager [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Received event network-vif-plugged-19a89daa-770c-4c3f-970c-a9a462503b06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.147 225859 DEBUG oslo_concurrency.lockutils [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.148 225859 DEBUG oslo_concurrency.lockutils [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.149 225859 DEBUG oslo_concurrency.lockutils [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.149 225859 DEBUG nova.compute.manager [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] No waiting events found dispatching network-vif-plugged-19a89daa-770c-4c3f-970c-a9a462503b06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.151 225859 WARNING nova.compute.manager [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Received unexpected event network-vif-plugged-19a89daa-770c-4c3f-970c-a9a462503b06 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.152 225859 DEBUG nova.compute.manager [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-changed-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.153 225859 DEBUG nova.compute.manager [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Refreshing instance network info cache due to event network-changed-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.154 225859 DEBUG oslo_concurrency.lockutils [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:25 np0005588919 nova_compute[225855]: 2026-01-20 14:40:25.847 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:25.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:26 np0005588919 nova_compute[225855]: 2026-01-20 14:40:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:26 np0005588919 nova_compute[225855]: 2026-01-20 14:40:26.346 225859 DEBUG nova.compute.manager [req-384dc47d-bdd2-4048-ba5c-cefdac094c99 req-03abf074-ef8e-429f-80fc-ad9e57739eb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Received event network-changed-19a89daa-770c-4c3f-970c-a9a462503b06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:26 np0005588919 nova_compute[225855]: 2026-01-20 14:40:26.347 225859 DEBUG nova.compute.manager [req-384dc47d-bdd2-4048-ba5c-cefdac094c99 req-03abf074-ef8e-429f-80fc-ad9e57739eb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Refreshing instance network info cache due to event network-changed-19a89daa-770c-4c3f-970c-a9a462503b06. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:40:26 np0005588919 nova_compute[225855]: 2026-01-20 14:40:26.347 225859 DEBUG oslo_concurrency.lockutils [req-384dc47d-bdd2-4048-ba5c-cefdac094c99 req-03abf074-ef8e-429f-80fc-ad9e57739eb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a96ccadd-ac1d-4040-8bcc-bebb460ee233" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:26 np0005588919 nova_compute[225855]: 2026-01-20 14:40:26.347 225859 DEBUG oslo_concurrency.lockutils [req-384dc47d-bdd2-4048-ba5c-cefdac094c99 req-03abf074-ef8e-429f-80fc-ad9e57739eb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a96ccadd-ac1d-4040-8bcc-bebb460ee233" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:26 np0005588919 nova_compute[225855]: 2026-01-20 14:40:26.348 225859 DEBUG nova.network.neutron [req-384dc47d-bdd2-4048-ba5c-cefdac094c99 req-03abf074-ef8e-429f-80fc-ad9e57739eb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Refreshing network info cache for port 19a89daa-770c-4c3f-970c-a9a462503b06 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:40:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:27.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.375 225859 DEBUG nova.network.neutron [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.394 225859 DEBUG oslo_concurrency.lockutils [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.395 225859 DEBUG oslo_concurrency.lockutils [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.396 225859 DEBUG nova.network.neutron [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Refreshing network info cache for port 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.399 225859 DEBUG nova.virt.libvirt.vif [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.399 225859 DEBUG nova.network.os_vif_util [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.400 225859 DEBUG nova.network.os_vif_util [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:51:f5,bridge_name='br-int',has_traffic_filtering=True,id=5fd7b3ad-0cf1-4294-b552-6141c8ee85bd,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5fd7b3ad-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.401 225859 DEBUG os_vif [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:51:f5,bridge_name='br-int',has_traffic_filtering=True,id=5fd7b3ad-0cf1-4294-b552-6141c8ee85bd,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5fd7b3ad-0c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.402 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.402 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.403 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.406 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.407 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fd7b3ad-0c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.407 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5fd7b3ad-0c, col_values=(('external_ids', {'iface-id': '5fd7b3ad-0cf1-4294-b552-6141c8ee85bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:51:f5', 'vm-uuid': 'd5c2df9d-748f-4df2-9392-b45741975f65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.408 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:27 np0005588919 NetworkManager[49104]: <info>  [1768920027.4096] manager: (tap5fd7b3ad-0c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.413 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.415 225859 INFO os_vif [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:51:f5,bridge_name='br-int',has_traffic_filtering=True,id=5fd7b3ad-0cf1-4294-b552-6141c8ee85bd,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5fd7b3ad-0c')#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.416 225859 DEBUG nova.virt.libvirt.vif [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.417 225859 DEBUG nova.network.os_vif_util [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.417 225859 DEBUG nova.network.os_vif_util [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:51:f5,bridge_name='br-int',has_traffic_filtering=True,id=5fd7b3ad-0cf1-4294-b552-6141c8ee85bd,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5fd7b3ad-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.420 225859 DEBUG nova.virt.libvirt.guest [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] attach device xml: <interface type="ethernet">
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:d2:51:f5"/>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <target dev="tap5fd7b3ad-0c"/>
Jan 20 09:40:27 np0005588919 nova_compute[225855]: </interface>
Jan 20 09:40:27 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:40:27 np0005588919 kernel: tap5fd7b3ad-0c: entered promiscuous mode
Jan 20 09:40:27 np0005588919 NetworkManager[49104]: <info>  [1768920027.4311] manager: (tap5fd7b3ad-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Jan 20 09:40:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:27Z|00208|binding|INFO|Claiming lport 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd for this chassis.
Jan 20 09:40:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:27Z|00209|binding|INFO|5fd7b3ad-0cf1-4294-b552-6141c8ee85bd: Claiming fa:16:3e:d2:51:f5 10.100.0.11
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.433 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.440 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:51:f5 10.100.0.11'], port_security=['fa:16:3e:d2:51:f5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd5c2df9d-748f-4df2-9392-b45741975f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '2', 'neutron:security_group_ids': '52cb9fb4-4318-4f53-9b5a-002d95792517', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=5fd7b3ad-0cf1-4294-b552-6141c8ee85bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.442 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd in datapath fc21b99b-4e34-422c-be05-0a440009dac4 bound to our chassis#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.444 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:40:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:27Z|00210|binding|INFO|Setting lport 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd ovn-installed in OVS
Jan 20 09:40:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:27Z|00211|binding|INFO|Setting lport 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd up in Southbound
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.454 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.456 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:27 np0005588919 systemd-udevd[252907]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.465 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[63f2a438-816c-4917-954f-db4318e013e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:27 np0005588919 NetworkManager[49104]: <info>  [1768920027.4793] device (tap5fd7b3ad-0c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:40:27 np0005588919 NetworkManager[49104]: <info>  [1768920027.4798] device (tap5fd7b3ad-0c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.496 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a841517a-18da-4bb1-a06e-1f4f01d6e35e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.505 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[12a52736-1dbd-4237-a30d-3555ec5398b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.532 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[0c193ff8-e27d-4bc4-8b5f-c2bbeac0dd09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.550 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[79d8e360-40f1-42e4-a31a-c04125a43466]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503233, 'reachable_time': 15488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252914, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.569 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf84414-535c-494a-8733-aa65806d727e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503245, 'tstamp': 503245}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252916, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503248, 'tstamp': 503248}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252916, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.571 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.572 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.575 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.576 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.576 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:27.577 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.653 225859 DEBUG nova.virt.libvirt.driver [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.653 225859 DEBUG nova.virt.libvirt.driver [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.654 225859 DEBUG nova.virt.libvirt.driver [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:3b:35:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.654 225859 DEBUG nova.virt.libvirt.driver [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:24:df:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.654 225859 DEBUG nova.virt.libvirt.driver [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:d2:51:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.687 225859 DEBUG nova.virt.libvirt.guest [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:40:27</nova:creationTime>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:40:27 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    <nova:port uuid="db46acd4-809b-4127-ad48-870ae429b4d6">
Jan 20 09:40:27 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    <nova:port uuid="5fd7b3ad-0cf1-4294-b552-6141c8ee85bd">
Jan 20 09:40:27 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:27 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:40:27 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:40:27 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:40:27 np0005588919 nova_compute[225855]: 2026-01-20 14:40:27.711 225859 DEBUG oslo_concurrency.lockutils [None req-de3a07a7-560e-4eaf-8d3f-454117b11be7 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:27.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:28 np0005588919 nova_compute[225855]: 2026-01-20 14:40:28.321 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:29.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:29Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d2:51:f5 10.100.0.11
Jan 20 09:40:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:29Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d2:51:f5 10.100.0.11
Jan 20 09:40:29 np0005588919 nova_compute[225855]: 2026-01-20 14:40:29.471 225859 DEBUG nova.compute.manager [req-0ab3b4c9-b364-4cb4-a72c-283ae197b3b0 req-7957c781-b55e-44a2-ae5d-021a274452f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:29 np0005588919 nova_compute[225855]: 2026-01-20 14:40:29.472 225859 DEBUG oslo_concurrency.lockutils [req-0ab3b4c9-b364-4cb4-a72c-283ae197b3b0 req-7957c781-b55e-44a2-ae5d-021a274452f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:29 np0005588919 nova_compute[225855]: 2026-01-20 14:40:29.472 225859 DEBUG oslo_concurrency.lockutils [req-0ab3b4c9-b364-4cb4-a72c-283ae197b3b0 req-7957c781-b55e-44a2-ae5d-021a274452f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:29 np0005588919 nova_compute[225855]: 2026-01-20 14:40:29.472 225859 DEBUG oslo_concurrency.lockutils [req-0ab3b4c9-b364-4cb4-a72c-283ae197b3b0 req-7957c781-b55e-44a2-ae5d-021a274452f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:29 np0005588919 nova_compute[225855]: 2026-01-20 14:40:29.472 225859 DEBUG nova.compute.manager [req-0ab3b4c9-b364-4cb4-a72c-283ae197b3b0 req-7957c781-b55e-44a2-ae5d-021a274452f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-plugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:29 np0005588919 nova_compute[225855]: 2026-01-20 14:40:29.473 225859 WARNING nova.compute.manager [req-0ab3b4c9-b364-4cb4-a72c-283ae197b3b0 req-7957c781-b55e-44a2-ae5d-021a274452f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-plugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd for instance with vm_state active and task_state None.#033[00m
Jan 20 09:40:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:29.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:30 np0005588919 nova_compute[225855]: 2026-01-20 14:40:30.084 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:30 np0005588919 podman[252942]: 2026-01-20 14:40:30.174603273 +0000 UTC m=+0.055269277 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:40:30 np0005588919 nova_compute[225855]: 2026-01-20 14:40:30.192 225859 DEBUG nova.network.neutron [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updated VIF entry in instance network info cache for port 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:40:30 np0005588919 nova_compute[225855]: 2026-01-20 14:40:30.193 225859 DEBUG nova.network.neutron [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:30 np0005588919 nova_compute[225855]: 2026-01-20 14:40:30.225 225859 DEBUG oslo_concurrency.lockutils [req-0099f7ca-40af-4b82-8d82-e1f9572f203a req-ff92b3ce-8adf-4275-a7bf-0c706845dd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:30 np0005588919 nova_compute[225855]: 2026-01-20 14:40:30.225 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:30 np0005588919 nova_compute[225855]: 2026-01-20 14:40:30.226 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:40:30 np0005588919 nova_compute[225855]: 2026-01-20 14:40:30.226 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:30 np0005588919 nova_compute[225855]: 2026-01-20 14:40:30.243 225859 DEBUG nova.network.neutron [req-384dc47d-bdd2-4048-ba5c-cefdac094c99 req-03abf074-ef8e-429f-80fc-ad9e57739eb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Updated VIF entry in instance network info cache for port 19a89daa-770c-4c3f-970c-a9a462503b06. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:40:30 np0005588919 nova_compute[225855]: 2026-01-20 14:40:30.244 225859 DEBUG nova.network.neutron [req-384dc47d-bdd2-4048-ba5c-cefdac094c99 req-03abf074-ef8e-429f-80fc-ad9e57739eb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Updating instance_info_cache with network_info: [{"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:30 np0005588919 nova_compute[225855]: 2026-01-20 14:40:30.270 225859 DEBUG oslo_concurrency.lockutils [req-384dc47d-bdd2-4048-ba5c-cefdac094c99 req-03abf074-ef8e-429f-80fc-ad9e57739eb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a96ccadd-ac1d-4040-8bcc-bebb460ee233" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:31.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:31 np0005588919 nova_compute[225855]: 2026-01-20 14:40:31.631 225859 DEBUG nova.compute.manager [req-c138b668-5d51-41c2-a531-a4cfdd15618a req-f680e24f-c065-48bb-b7c4-ef0fb9ff6c94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:31 np0005588919 nova_compute[225855]: 2026-01-20 14:40:31.633 225859 DEBUG oslo_concurrency.lockutils [req-c138b668-5d51-41c2-a531-a4cfdd15618a req-f680e24f-c065-48bb-b7c4-ef0fb9ff6c94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:31 np0005588919 nova_compute[225855]: 2026-01-20 14:40:31.633 225859 DEBUG oslo_concurrency.lockutils [req-c138b668-5d51-41c2-a531-a4cfdd15618a req-f680e24f-c065-48bb-b7c4-ef0fb9ff6c94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:31 np0005588919 nova_compute[225855]: 2026-01-20 14:40:31.633 225859 DEBUG oslo_concurrency.lockutils [req-c138b668-5d51-41c2-a531-a4cfdd15618a req-f680e24f-c065-48bb-b7c4-ef0fb9ff6c94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:31 np0005588919 nova_compute[225855]: 2026-01-20 14:40:31.634 225859 DEBUG nova.compute.manager [req-c138b668-5d51-41c2-a531-a4cfdd15618a req-f680e24f-c065-48bb-b7c4-ef0fb9ff6c94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-plugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:31 np0005588919 nova_compute[225855]: 2026-01-20 14:40:31.634 225859 WARNING nova.compute.manager [req-c138b668-5d51-41c2-a531-a4cfdd15618a req-f680e24f-c065-48bb-b7c4-ef0fb9ff6c94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-plugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd for instance with vm_state active and task_state None.#033[00m
Jan 20 09:40:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 09:40:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:40:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:40:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:40:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:31.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:32 np0005588919 nova_compute[225855]: 2026-01-20 14:40:32.107 225859 DEBUG oslo_concurrency.lockutils [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-0b83ac2a-727a-4db9-91f2-69f939deeb69" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:32 np0005588919 nova_compute[225855]: 2026-01-20 14:40:32.108 225859 DEBUG oslo_concurrency.lockutils [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-0b83ac2a-727a-4db9-91f2-69f939deeb69" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:32 np0005588919 nova_compute[225855]: 2026-01-20 14:40:32.108 225859 DEBUG nova.objects.instance [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'flavor' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:32 np0005588919 nova_compute[225855]: 2026-01-20 14:40:32.411 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:32 np0005588919 nova_compute[225855]: 2026-01-20 14:40:32.643 225859 DEBUG nova.objects.instance [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'pci_requests' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:32 np0005588919 nova_compute[225855]: 2026-01-20 14:40:32.659 225859 DEBUG nova.network.neutron [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:40:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:33.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:33 np0005588919 nova_compute[225855]: 2026-01-20 14:40:33.247 225859 DEBUG nova.policy [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a9fb458d27434495a77a94827b6097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3f93fd4b2154dda9f38e62334904303', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:40:33 np0005588919 nova_compute[225855]: 2026-01-20 14:40:33.958 225859 DEBUG nova.network.neutron [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Successfully updated port: 0b83ac2a-727a-4db9-91f2-69f939deeb69 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:40:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:33.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.002 225859 DEBUG oslo_concurrency.lockutils [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.056 225859 DEBUG nova.compute.manager [req-29ab351e-d7f1-45a4-be9d-33e98aaab4ff req-46720557-2704-4b58-bf50-c27aa2c9e14d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-changed-0b83ac2a-727a-4db9-91f2-69f939deeb69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.057 225859 DEBUG nova.compute.manager [req-29ab351e-d7f1-45a4-be9d-33e98aaab4ff req-46720557-2704-4b58-bf50-c27aa2c9e14d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Refreshing instance network info cache due to event network-changed-0b83ac2a-727a-4db9-91f2-69f939deeb69. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.057 225859 DEBUG oslo_concurrency.lockutils [req-29ab351e-d7f1-45a4-be9d-33e98aaab4ff req-46720557-2704-4b58-bf50-c27aa2c9e14d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.342 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.363 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.364 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.365 225859 DEBUG oslo_concurrency.lockutils [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.365 225859 DEBUG nova.network.neutron [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.366 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.367 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.367 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.367 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.368 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.368 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.387 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.387 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.387 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.388 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.388 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.514 225859 WARNING nova.network.neutron [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] fc21b99b-4e34-422c-be05-0a440009dac4 already exists in list: networks containing: ['fc21b99b-4e34-422c-be05-0a440009dac4']. ignoring it#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.516 225859 WARNING nova.network.neutron [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] fc21b99b-4e34-422c-be05-0a440009dac4 already exists in list: networks containing: ['fc21b99b-4e34-422c-be05-0a440009dac4']. ignoring it#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.516 225859 WARNING nova.network.neutron [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] fc21b99b-4e34-422c-be05-0a440009dac4 already exists in list: networks containing: ['fc21b99b-4e34-422c-be05-0a440009dac4']. ignoring it#033[00m
Jan 20 09:40:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:40:34 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/164278318' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.860 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.936 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000042 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.937 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000042 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.941 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000044 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:40:34 np0005588919 nova_compute[225855]: 2026-01-20 14:40:34.941 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000044 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:40:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:40:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:35.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.086 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.124 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.126 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4149MB free_disk=20.8763427734375GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.126 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.127 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.206 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance d5c2df9d-748f-4df2-9392-b45741975f65 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.206 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance a96ccadd-ac1d-4040-8bcc-bebb460ee233 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.206 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.207 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.350 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:40:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2698970955' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.820 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.826 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.842 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.866 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:40:35 np0005588919 nova_compute[225855]: 2026-01-20 14:40:35.866 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.740s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:35.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:37.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:37 np0005588919 nova_compute[225855]: 2026-01-20 14:40:37.424 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:37Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:c6:66 10.100.0.12
Jan 20 09:40:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:37Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:c6:66 10.100.0.12
Jan 20 09:40:37 np0005588919 nova_compute[225855]: 2026-01-20 14:40:37.855 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:37.855 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:37.856 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:40:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:37.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:39.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:40:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.889 225859 DEBUG nova.network.neutron [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.909 225859 DEBUG oslo_concurrency.lockutils [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.910 225859 DEBUG oslo_concurrency.lockutils [req-29ab351e-d7f1-45a4-be9d-33e98aaab4ff req-46720557-2704-4b58-bf50-c27aa2c9e14d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.911 225859 DEBUG nova.network.neutron [req-29ab351e-d7f1-45a4-be9d-33e98aaab4ff req-46720557-2704-4b58-bf50-c27aa2c9e14d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Refreshing network info cache for port 0b83ac2a-727a-4db9-91f2-69f939deeb69 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.914 225859 DEBUG nova.virt.libvirt.vif [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.914 225859 DEBUG nova.network.os_vif_util [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.915 225859 DEBUG nova.network.os_vif_util [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.915 225859 DEBUG os_vif [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.916 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.916 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.917 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.919 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.919 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b83ac2a-72, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.919 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b83ac2a-72, col_values=(('external_ids', {'iface-id': '0b83ac2a-727a-4db9-91f2-69f939deeb69', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:0e:b2', 'vm-uuid': 'd5c2df9d-748f-4df2-9392-b45741975f65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.921 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:39 np0005588919 NetworkManager[49104]: <info>  [1768920039.9221] manager: (tap0b83ac2a-72): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.923 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.937 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.941 225859 INFO os_vif [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72')#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.942 225859 DEBUG nova.virt.libvirt.vif [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.942 225859 DEBUG nova.network.os_vif_util [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.943 225859 DEBUG nova.network.os_vif_util [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.945 225859 DEBUG nova.virt.libvirt.guest [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] attach device xml: <interface type="ethernet">
Jan 20 09:40:39 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:e7:0e:b2"/>
Jan 20 09:40:39 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 09:40:39 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:40:39 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 09:40:39 np0005588919 nova_compute[225855]:  <target dev="tap0b83ac2a-72"/>
Jan 20 09:40:39 np0005588919 nova_compute[225855]: </interface>
Jan 20 09:40:39 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:40:39 np0005588919 kernel: tap0b83ac2a-72: entered promiscuous mode
Jan 20 09:40:39 np0005588919 NetworkManager[49104]: <info>  [1768920039.9546] manager: (tap0b83ac2a-72): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.957 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:39 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:39Z|00212|binding|INFO|Claiming lport 0b83ac2a-727a-4db9-91f2-69f939deeb69 for this chassis.
Jan 20 09:40:39 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:39Z|00213|binding|INFO|0b83ac2a-727a-4db9-91f2-69f939deeb69: Claiming fa:16:3e:e7:0e:b2 10.100.0.4
Jan 20 09:40:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:39.965 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:0e:b2 10.100.0.4'], port_security=['fa:16:3e:e7:0e:b2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-2013995474', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd5c2df9d-748f-4df2-9392-b45741975f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-2013995474', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '2', 'neutron:security_group_ids': '52cb9fb4-4318-4f53-9b5a-002d95792517', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=0b83ac2a-727a-4db9-91f2-69f939deeb69) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:39.967 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 0b83ac2a-727a-4db9-91f2-69f939deeb69 in datapath fc21b99b-4e34-422c-be05-0a440009dac4 bound to our chassis#033[00m
Jan 20 09:40:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:39.969 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:40:39 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:39Z|00214|binding|INFO|Setting lport 0b83ac2a-727a-4db9-91f2-69f939deeb69 ovn-installed in OVS
Jan 20 09:40:39 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:39Z|00215|binding|INFO|Setting lport 0b83ac2a-727a-4db9-91f2-69f939deeb69 up in Southbound
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.977 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:39 np0005588919 systemd-udevd[253176]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:40:39 np0005588919 nova_compute[225855]: 2026-01-20 14:40:39.983 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:39.986 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[efe96add-e1c0-4e02-951b-9931c2090042]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:39 np0005588919 NetworkManager[49104]: <info>  [1768920039.9990] device (tap0b83ac2a-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:40:39 np0005588919 NetworkManager[49104]: <info>  [1768920039.9998] device (tap0b83ac2a-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:40:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:40.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:40.012 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e928b5f4-93b9-4370-ad33-474b13163ea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:40.015 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c3408f-b408-4171-9f40-fd345a07cd6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.028 225859 DEBUG nova.virt.libvirt.driver [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.029 225859 DEBUG nova.virt.libvirt.driver [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.029 225859 DEBUG nova.virt.libvirt.driver [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:3b:35:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.029 225859 DEBUG nova.virt.libvirt.driver [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:24:df:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.029 225859 DEBUG nova.virt.libvirt.driver [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:d2:51:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.029 225859 DEBUG nova.virt.libvirt.driver [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:e7:0e:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:40:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:40.044 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ced0e286-95d2-4f45-ba1a-457ba2ee96a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.054 225859 DEBUG nova.virt.libvirt.guest [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:40 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:  <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:40:40</nova:creationTime>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:40:40 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:port uuid="db46acd4-809b-4127-ad48-870ae429b4d6">
Jan 20 09:40:40 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:port uuid="5fd7b3ad-0cf1-4294-b552-6141c8ee85bd">
Jan 20 09:40:40 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    <nova:port uuid="0b83ac2a-727a-4db9-91f2-69f939deeb69">
Jan 20 09:40:40 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:40 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:40:40 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:40:40 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:40:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:40.061 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d24fb849-2302-4a35-9fd0-eaa44c8a6988]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503233, 'reachable_time': 33912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253183, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.077 225859 DEBUG oslo_concurrency.lockutils [None req-b2246b1c-96ec-4ece-a914-b82c394b5339 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-0b83ac2a-727a-4db9-91f2-69f939deeb69" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.969s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:40.077 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8aff4e-d663-4f15-a247-25274c4fad5f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503245, 'tstamp': 503245}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253184, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503248, 'tstamp': 503248}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253184, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:40 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:40Z|00216|binding|INFO|Releasing lport 8cff17f5-b792-4e6f-8f1e-6c48322af961 from this chassis (sb_readonly=0)
Jan 20 09:40:40 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:40Z|00217|binding|INFO|Releasing lport 583df905-1d9f-49c1-b209-4b7fad1599f6 from this chassis (sb_readonly=0)
Jan 20 09:40:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:40.079 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.081 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:40.082 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:40.082 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:40.083 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:40.083 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.089 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.147 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.301 225859 DEBUG nova.compute.manager [req-48374b3d-1394-4b47-9446-d588c4bacf94 req-5766119b-4130-4a62-abb1-ca73edb8064e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-0b83ac2a-727a-4db9-91f2-69f939deeb69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.301 225859 DEBUG oslo_concurrency.lockutils [req-48374b3d-1394-4b47-9446-d588c4bacf94 req-5766119b-4130-4a62-abb1-ca73edb8064e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.302 225859 DEBUG oslo_concurrency.lockutils [req-48374b3d-1394-4b47-9446-d588c4bacf94 req-5766119b-4130-4a62-abb1-ca73edb8064e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.302 225859 DEBUG oslo_concurrency.lockutils [req-48374b3d-1394-4b47-9446-d588c4bacf94 req-5766119b-4130-4a62-abb1-ca73edb8064e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.302 225859 DEBUG nova.compute.manager [req-48374b3d-1394-4b47-9446-d588c4bacf94 req-5766119b-4130-4a62-abb1-ca73edb8064e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-plugged-0b83ac2a-727a-4db9-91f2-69f939deeb69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.302 225859 WARNING nova.compute.manager [req-48374b3d-1394-4b47-9446-d588c4bacf94 req-5766119b-4130-4a62-abb1-ca73edb8064e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-plugged-0b83ac2a-727a-4db9-91f2-69f939deeb69 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.862 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:40 np0005588919 nova_compute[225855]: 2026-01-20 14:40:40.863 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:40:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:41.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:41 np0005588919 nova_compute[225855]: 2026-01-20 14:40:41.375 225859 DEBUG nova.network.neutron [req-29ab351e-d7f1-45a4-be9d-33e98aaab4ff req-46720557-2704-4b58-bf50-c27aa2c9e14d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updated VIF entry in instance network info cache for port 0b83ac2a-727a-4db9-91f2-69f939deeb69. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:40:41 np0005588919 nova_compute[225855]: 2026-01-20 14:40:41.376 225859 DEBUG nova.network.neutron [req-29ab351e-d7f1-45a4-be9d-33e98aaab4ff req-46720557-2704-4b58-bf50-c27aa2c9e14d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:41 np0005588919 nova_compute[225855]: 2026-01-20 14:40:41.393 225859 DEBUG oslo_concurrency.lockutils [req-29ab351e-d7f1-45a4-be9d-33e98aaab4ff req-46720557-2704-4b58-bf50-c27aa2c9e14d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:41 np0005588919 nova_compute[225855]: 2026-01-20 14:40:41.982 225859 DEBUG oslo_concurrency.lockutils [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-db46acd4-809b-4127-ad48-870ae429b4d6" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:41 np0005588919 nova_compute[225855]: 2026-01-20 14:40:41.983 225859 DEBUG oslo_concurrency.lockutils [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-db46acd4-809b-4127-ad48-870ae429b4d6" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:41 np0005588919 nova_compute[225855]: 2026-01-20 14:40:41.999 225859 DEBUG nova.objects.instance [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'flavor' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:42.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.016 225859 DEBUG nova.virt.libvirt.vif [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.017 225859 DEBUG nova.network.os_vif_util [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.017 225859 DEBUG nova.network.os_vif_util [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.020 225859 DEBUG nova.virt.libvirt.guest [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:df:88"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb46acd4-80"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.022 225859 DEBUG nova.virt.libvirt.guest [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:df:88"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb46acd4-80"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.025 225859 DEBUG nova.virt.libvirt.driver [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Attempting to detach device tapdb46acd4-80 from instance d5c2df9d-748f-4df2-9392-b45741975f65 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.025 225859 DEBUG nova.virt.libvirt.guest [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] detach device xml: <interface type="ethernet">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:24:df:88"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <target dev="tapdb46acd4-80"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: </interface>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.033 225859 DEBUG nova.virt.libvirt.guest [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:df:88"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb46acd4-80"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.037 225859 DEBUG nova.virt.libvirt.guest [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:24:df:88"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb46acd4-80"/></interface>not found in domain: <domain type='kvm' id='29'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <name>instance-00000042</name>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <uuid>d5c2df9d-748f-4df2-9392-b45741975f65</uuid>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:40:40</nova:creationTime>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="db46acd4-809b-4127-ad48-870ae429b4d6">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="5fd7b3ad-0cf1-4294-b552-6141c8ee85bd">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="0b83ac2a-727a-4db9-91f2-69f939deeb69">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <memory unit='KiB'>131072</memory>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <resource>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <partition>/machine</partition>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </resource>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <sysinfo type='smbios'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='serial'>d5c2df9d-748f-4df2-9392-b45741975f65</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='uuid'>d5c2df9d-748f-4df2-9392-b45741975f65</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <boot dev='hd'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <smbios mode='sysinfo'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <vmcoreinfo state='on'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <model fallback='forbid'>Nehalem</model>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <feature policy='require' name='x2apic'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <feature policy='require' name='hypervisor'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <feature policy='require' name='vme'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <clock offset='utc'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <timer name='hpet' present='no'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <on_reboot>restart</on_reboot>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <on_crash>destroy</on_crash>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <disk type='network' device='disk'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/d5c2df9d-748f-4df2-9392-b45741975f65_disk' index='2'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='vda' bus='virtio'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='virtio-disk0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <disk type='network' device='cdrom'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/d5c2df9d-748f-4df2-9392-b45741975f65_disk.config' index='1'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='sda' bus='sata'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <readonly/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='sata0-0-0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pcie.0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='1' port='0x10'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='2' port='0x11'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='3' port='0x12'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.3'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='4' port='0x13'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.4'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='5' port='0x14'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.5'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='6' port='0x15'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.6'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='7' port='0x16'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.7'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='8' port='0x17'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.8'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='9' port='0x18'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.9'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='10' port='0x19'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.10'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='11' port='0x1a'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.11'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='12' port='0x1b'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.12'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='13' port='0x1c'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.13'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='14' port='0x1d'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.14'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='15' port='0x1e'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.15'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='16' port='0x1f'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.16'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='17' port='0x20'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.17'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='18' port='0x21'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.18'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='19' port='0x22'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.19'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='20' port='0x23'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.20'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='21' port='0x24'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.21'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='22' port='0x25'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.22'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='23' port='0x26'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.23'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='24' port='0x27'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.24'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='25' port='0x28'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.25'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-pci-bridge'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.26'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='usb'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='sata' index='0'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='ide'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:3b:35:f2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='tapb48170b0-71'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='net0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:24:df:88'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='tapdb46acd4-80'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='net1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:d2:51:f5'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='tap5fd7b3ad-0c'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='net2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:e7:0e:b2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='tap0b83ac2a-72'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='net3'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <serial type='pty'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/console.log' append='off'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target type='isa-serial' port='0'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <model name='isa-serial'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      </target>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <console type='pty' tty='/dev/pts/0'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/console.log' append='off'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target type='serial' port='0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </console>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <input type='tablet' bus='usb'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='input0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <input type='mouse' bus='ps2'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='input1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <input type='keyboard' bus='ps2'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='input2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <listen type='address' address='::0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <audio id='1' type='none'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='video0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <watchdog model='itco' action='reset'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='watchdog0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </watchdog>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <memballoon model='virtio'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <stats period='10'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='balloon0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <rng model='virtio'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='rng0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <label>system_u:system_r:svirt_t:s0:c277,c849</label>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c277,c849</imagelabel>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <label>+107:+107</label>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <imagelabel>+107:+107</imagelabel>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.039 225859 INFO nova.virt.libvirt.driver [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully detached device tapdb46acd4-80 from instance d5c2df9d-748f-4df2-9392-b45741975f65 from the persistent domain config.#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.039 225859 DEBUG nova.virt.libvirt.driver [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] (1/8): Attempting to detach device tapdb46acd4-80 with device alias net1 from instance d5c2df9d-748f-4df2-9392-b45741975f65 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.040 225859 DEBUG nova.virt.libvirt.guest [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] detach device xml: <interface type="ethernet">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:24:df:88"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <target dev="tapdb46acd4-80"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: </interface>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 20 09:40:42 np0005588919 kernel: tapdb46acd4-80 (unregistering): left promiscuous mode
Jan 20 09:40:42 np0005588919 NetworkManager[49104]: <info>  [1768920042.1373] device (tapdb46acd4-80): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.148 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:40:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:42Z|00218|binding|INFO|Releasing lport db46acd4-809b-4127-ad48-870ae429b4d6 from this chassis (sb_readonly=0)
Jan 20 09:40:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:42Z|00219|binding|INFO|Setting lport db46acd4-809b-4127-ad48-870ae429b4d6 down in Southbound
Jan 20 09:40:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:42Z|00220|binding|INFO|Removing iface tapdb46acd4-80 ovn-installed in OVS
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.150 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.154 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:df:88 10.100.0.12'], port_security=['fa:16:3e:24:df:88 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd5c2df9d-748f-4df2-9392-b45741975f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52cb9fb4-4318-4f53-9b5a-002d95792517', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=db46acd4-809b-4127-ad48-870ae429b4d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.155 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768920042.1547081, d5c2df9d-748f-4df2-9392-b45741975f65 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.155 140354 INFO neutron.agent.ovn.metadata.agent [-] Port db46acd4-809b-4127-ad48-870ae429b4d6 in datapath fc21b99b-4e34-422c-be05-0a440009dac4 unbound from our chassis
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.156 225859 DEBUG nova.virt.libvirt.driver [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Start waiting for the detach event from libvirt for device tapdb46acd4-80 with device alias net1 for instance d5c2df9d-748f-4df2-9392-b45741975f65 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.156 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.156 225859 DEBUG nova.virt.libvirt.guest [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:df:88"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb46acd4-80"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.159 225859 DEBUG nova.virt.libvirt.guest [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:24:df:88"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb46acd4-80"/></interface>not found in domain: <domain type='kvm' id='29'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <name>instance-00000042</name>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <uuid>d5c2df9d-748f-4df2-9392-b45741975f65</uuid>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:40:40</nova:creationTime>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="db46acd4-809b-4127-ad48-870ae429b4d6">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="5fd7b3ad-0cf1-4294-b552-6141c8ee85bd">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="0b83ac2a-727a-4db9-91f2-69f939deeb69">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <memory unit='KiB'>131072</memory>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <resource>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <partition>/machine</partition>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </resource>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <sysinfo type='smbios'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='serial'>d5c2df9d-748f-4df2-9392-b45741975f65</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='uuid'>d5c2df9d-748f-4df2-9392-b45741975f65</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <boot dev='hd'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <smbios mode='sysinfo'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <vmcoreinfo state='on'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <model fallback='forbid'>Nehalem</model>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <feature policy='require' name='x2apic'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <feature policy='require' name='hypervisor'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <feature policy='require' name='vme'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <clock offset='utc'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <timer name='hpet' present='no'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <on_reboot>restart</on_reboot>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <on_crash>destroy</on_crash>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <disk type='network' device='disk'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/d5c2df9d-748f-4df2-9392-b45741975f65_disk' index='2'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='vda' bus='virtio'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='virtio-disk0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <disk type='network' device='cdrom'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/d5c2df9d-748f-4df2-9392-b45741975f65_disk.config' index='1'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='sda' bus='sata'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <readonly/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='sata0-0-0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pcie.0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='1' port='0x10'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='2' port='0x11'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='3' port='0x12'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.3'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='4' port='0x13'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.4'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='5' port='0x14'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.5'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='6' port='0x15'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.6'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='7' port='0x16'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.7'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='8' port='0x17'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.8'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='9' port='0x18'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.9'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='10' port='0x19'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.10'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='11' port='0x1a'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.11'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='12' port='0x1b'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.12'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='13' port='0x1c'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.13'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='14' port='0x1d'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.14'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='15' port='0x1e'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.15'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='16' port='0x1f'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.16'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='17' port='0x20'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.17'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='18' port='0x21'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.18'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='19' port='0x22'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.19'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='20' port='0x23'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.20'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='21' port='0x24'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.21'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='22' port='0x25'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.22'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='23' port='0x26'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.23'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='24' port='0x27'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.24'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target chassis='25' port='0x28'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.25'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model name='pcie-pci-bridge'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='pci.26'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='usb'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <controller type='sata' index='0'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='ide'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:3b:35:f2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='tapb48170b0-71'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='net0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:d2:51:f5'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='tap5fd7b3ad-0c'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='net2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:e7:0e:b2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target dev='tap0b83ac2a-72'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='net3'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <serial type='pty'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/console.log' append='off'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target type='isa-serial' port='0'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:        <model name='isa-serial'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      </target>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <console type='pty' tty='/dev/pts/0'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/console.log' append='off'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <target type='serial' port='0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </console>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <input type='tablet' bus='usb'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='input0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <input type='mouse' bus='ps2'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='input1'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <input type='keyboard' bus='ps2'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='input2'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <listen type='address' address='::0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <audio id='1' type='none'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='video0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <watchdog model='itco' action='reset'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='watchdog0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </watchdog>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <memballoon model='virtio'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <stats period='10'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='balloon0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <rng model='virtio'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <alias name='rng0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <label>system_u:system_r:svirt_t:s0:c277,c849</label>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c277,c849</imagelabel>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <label>+107:+107</label>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <imagelabel>+107:+107</imagelabel>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.159 225859 INFO nova.virt.libvirt.driver [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully detached device tapdb46acd4-80 from instance d5c2df9d-748f-4df2-9392-b45741975f65 from the live domain config.#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.160 225859 DEBUG nova.virt.libvirt.vif [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.160 225859 DEBUG nova.network.os_vif_util [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.161 225859 DEBUG nova.network.os_vif_util [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.161 225859 DEBUG os_vif [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.163 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.163 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb46acd4-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.169 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[72172048-83ef-4567-b69b-90aac239c4e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.194 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[06cf3e1e-638f-404c-99e3-31f116ef312b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.197 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[616db67d-c1c6-4609-a14c-8b63376f74b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.220 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.222 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.223 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.223 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a61b2160-7d30-43c7-9eed-8764b4bde2f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.225 225859 INFO os_vif [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80')#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.226 225859 DEBUG nova.virt.libvirt.guest [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:40:42</nova:creationTime>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="5fd7b3ad-0cf1-4294-b552-6141c8ee85bd">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    <nova:port uuid="0b83ac2a-727a-4db9-91f2-69f939deeb69">
Jan 20 09:40:42 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:42 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:40:42 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.240 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc58e68-296e-4057-868b-ad5e3a2a1aa6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503233, 'reachable_time': 33912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253195, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.257 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[38b10a69-8859-4bf3-8d0c-657437599968]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503245, 'tstamp': 503245}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253196, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503248, 'tstamp': 503248}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253196, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.258 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.259 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.260 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.261 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.261 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.261 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:42.261 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.431 225859 DEBUG nova.compute.manager [req-6f5a0e58-8463-4cf2-8953-12046cbdd9a3 req-11ad0333-3d8a-444d-8987-4e7bcc210705 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-0b83ac2a-727a-4db9-91f2-69f939deeb69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.431 225859 DEBUG oslo_concurrency.lockutils [req-6f5a0e58-8463-4cf2-8953-12046cbdd9a3 req-11ad0333-3d8a-444d-8987-4e7bcc210705 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.431 225859 DEBUG oslo_concurrency.lockutils [req-6f5a0e58-8463-4cf2-8953-12046cbdd9a3 req-11ad0333-3d8a-444d-8987-4e7bcc210705 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.432 225859 DEBUG oslo_concurrency.lockutils [req-6f5a0e58-8463-4cf2-8953-12046cbdd9a3 req-11ad0333-3d8a-444d-8987-4e7bcc210705 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.432 225859 DEBUG nova.compute.manager [req-6f5a0e58-8463-4cf2-8953-12046cbdd9a3 req-11ad0333-3d8a-444d-8987-4e7bcc210705 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-plugged-0b83ac2a-727a-4db9-91f2-69f939deeb69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.432 225859 WARNING nova.compute.manager [req-6f5a0e58-8463-4cf2-8953-12046cbdd9a3 req-11ad0333-3d8a-444d-8987-4e7bcc210705 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-plugged-0b83ac2a-727a-4db9-91f2-69f939deeb69 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.510 225859 DEBUG nova.compute.manager [req-e3e8713f-0164-48d9-9883-bedfb70c8bf4 req-f4548433-ef47-464c-a31c-fa2e79369e4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-unplugged-db46acd4-809b-4127-ad48-870ae429b4d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.510 225859 DEBUG oslo_concurrency.lockutils [req-e3e8713f-0164-48d9-9883-bedfb70c8bf4 req-f4548433-ef47-464c-a31c-fa2e79369e4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.510 225859 DEBUG oslo_concurrency.lockutils [req-e3e8713f-0164-48d9-9883-bedfb70c8bf4 req-f4548433-ef47-464c-a31c-fa2e79369e4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.511 225859 DEBUG oslo_concurrency.lockutils [req-e3e8713f-0164-48d9-9883-bedfb70c8bf4 req-f4548433-ef47-464c-a31c-fa2e79369e4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.511 225859 DEBUG nova.compute.manager [req-e3e8713f-0164-48d9-9883-bedfb70c8bf4 req-f4548433-ef47-464c-a31c-fa2e79369e4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-unplugged-db46acd4-809b-4127-ad48-870ae429b4d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.511 225859 WARNING nova.compute.manager [req-e3e8713f-0164-48d9-9883-bedfb70c8bf4 req-f4548433-ef47-464c-a31c-fa2e79369e4f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-unplugged-db46acd4-809b-4127-ad48-870ae429b4d6 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:40:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:42Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e7:0e:b2 10.100.0.4
Jan 20 09:40:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:42Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e7:0e:b2 10.100.0.4
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.973 225859 DEBUG oslo_concurrency.lockutils [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.974 225859 DEBUG oslo_concurrency.lockutils [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:40:42 np0005588919 nova_compute[225855]: 2026-01-20 14:40:42.974 225859 DEBUG nova.network.neutron [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:40:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:43.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:44.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:44Z|00221|binding|INFO|Releasing lport 8cff17f5-b792-4e6f-8f1e-6c48322af961 from this chassis (sb_readonly=0)
Jan 20 09:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:44Z|00222|binding|INFO|Releasing lport 583df905-1d9f-49c1-b209-4b7fad1599f6 from this chassis (sb_readonly=0)
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.087 140354 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 1972e093-3e75-4ae7-adfa-a5698a2d1439 with type ""#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.088 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:0e:b2 10.100.0.4'], port_security=['fa:16:3e:e7:0e:b2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-2013995474', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd5c2df9d-748f-4df2-9392-b45741975f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-2013995474', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52cb9fb4-4318-4f53-9b5a-002d95792517', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=0b83ac2a-727a-4db9-91f2-69f939deeb69) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.090 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 0b83ac2a-727a-4db9-91f2-69f939deeb69 in datapath fc21b99b-4e34-422c-be05-0a440009dac4 unbound from our chassis#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.091 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:44Z|00223|binding|INFO|Removing iface tap0b83ac2a-72 ovn-installed in OVS
Jan 20 09:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:44Z|00224|binding|INFO|Removing lport 0b83ac2a-727a-4db9-91f2-69f939deeb69 ovn-installed in OVS
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.097 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.106 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[318cf7f3-021f-4d55-aa38-5f2b9a0e4e43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.122 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.134 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.136 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[74733991-44b3-435e-8a91-efa6f207ca0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.140 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[48e1118c-8adc-4231-af6c-0107db2eab32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.167 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[67420da7-9e2e-4559-8644-0bbc2a0b5285]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.183 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fd6f6c82-eaa8-4980-bbdf-629c817ecc90]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503233, 'reachable_time': 33912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253203, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.203 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1719610c-416d-4538-b496-be32681ad7d6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503245, 'tstamp': 503245}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253204, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503248, 'tstamp': 503248}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253204, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.205 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.206 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.208 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.208 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.208 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.209 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.209 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.359 225859 INFO nova.network.neutron [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Port db46acd4-809b-4127-ad48-870ae429b4d6 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.495 225859 DEBUG oslo_concurrency.lockutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.495 225859 DEBUG oslo_concurrency.lockutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.496 225859 DEBUG oslo_concurrency.lockutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.496 225859 DEBUG oslo_concurrency.lockutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.496 225859 DEBUG oslo_concurrency.lockutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.497 225859 INFO nova.compute.manager [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Terminating instance#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.498 225859 DEBUG nova.compute.manager [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:40:44 np0005588919 kernel: tapb48170b0-71 (unregistering): left promiscuous mode
Jan 20 09:40:44 np0005588919 NetworkManager[49104]: <info>  [1768920044.6108] device (tapb48170b0-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:44Z|00225|binding|INFO|Releasing lport b48170b0-717d-48f0-8172-742a4a8596e9 from this chassis (sb_readonly=0)
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.618 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:44Z|00226|binding|INFO|Setting lport b48170b0-717d-48f0-8172-742a4a8596e9 down in Southbound
Jan 20 09:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:44Z|00227|binding|INFO|Removing iface tapb48170b0-71 ovn-installed in OVS
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.621 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.627 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:35:f2 10.100.0.13'], port_security=['fa:16:3e:3b:35:f2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'd5c2df9d-748f-4df2-9392-b45741975f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b03ad0a9-4e4a-464d-b7d2-84d77d6554bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.174'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=b48170b0-717d-48f0-8172-742a4a8596e9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.629 140354 INFO neutron.agent.ovn.metadata.agent [-] Port b48170b0-717d-48f0-8172-742a4a8596e9 in datapath fc21b99b-4e34-422c-be05-0a440009dac4 unbound from our chassis#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.630 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.634 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 kernel: tap5fd7b3ad-0c (unregistering): left promiscuous mode
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.663 225859 DEBUG nova.compute.manager [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-db46acd4-809b-4127-ad48-870ae429b4d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.663 225859 DEBUG oslo_concurrency.lockutils [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.664 225859 DEBUG oslo_concurrency.lockutils [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.664 225859 DEBUG oslo_concurrency.lockutils [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.664 225859 DEBUG nova.compute.manager [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-plugged-db46acd4-809b-4127-ad48-870ae429b4d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.664 225859 WARNING nova.compute.manager [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-plugged-db46acd4-809b-4127-ad48-870ae429b4d6 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.664 225859 DEBUG nova.compute.manager [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-deleted-db46acd4-809b-4127-ad48-870ae429b4d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.665 225859 INFO nova.compute.manager [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Neutron deleted interface db46acd4-809b-4127-ad48-870ae429b4d6; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.665 225859 DEBUG nova.network.neutron [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:44 np0005588919 NetworkManager[49104]: <info>  [1768920044.6721] device (tap5fd7b3ad-0c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.676 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[13750d84-a2d8-4312-832b-a46abc3623ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:44Z|00228|binding|INFO|Releasing lport 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd from this chassis (sb_readonly=0)
Jan 20 09:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:44Z|00229|binding|INFO|Setting lport 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd down in Southbound
Jan 20 09:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:44Z|00230|binding|INFO|Removing iface tap5fd7b3ad-0c ovn-installed in OVS
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.681 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.682 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.686 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:51:f5 10.100.0.11'], port_security=['fa:16:3e:d2:51:f5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd5c2df9d-748f-4df2-9392-b45741975f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52cb9fb4-4318-4f53-9b5a-002d95792517', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=5fd7b3ad-0cf1-4294-b552-6141c8ee85bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:44 np0005588919 kernel: tap0b83ac2a-72 (unregistering): left promiscuous mode
Jan 20 09:40:44 np0005588919 NetworkManager[49104]: <info>  [1768920044.7161] device (tap0b83ac2a-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.718 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.726 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.726 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[875c32c9-fabf-492d-8d08-e2a364f3f5e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.730 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[cbd593e8-b040-4ef7-84be-3b2823eeaf08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.757 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[1090b53d-8bed-4b93-b55a-6bb99baeb940]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000042.scope: Deactivated successfully.
Jan 20 09:40:44 np0005588919 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000042.scope: Consumed 14.696s CPU time.
Jan 20 09:40:44 np0005588919 systemd-machined[194361]: Machine qemu-29-instance-00000042 terminated.
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.774 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd5c159-a6e1-41d9-86ae-89d2b880aa9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503233, 'reachable_time': 33912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253226, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.788 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[15688527-2c39-469a-9d57-e3c603bd582d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503245, 'tstamp': 503245}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253227, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503248, 'tstamp': 503248}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253227, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.789 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.791 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.800 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.801 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.801 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.802 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.802 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.803 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 5fd7b3ad-0cf1-4294-b552-6141c8ee85bd in datapath fc21b99b-4e34-422c-be05-0a440009dac4 unbound from our chassis#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.804 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc21b99b-4e34-422c-be05-0a440009dac4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.805 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2b127179-89a5-43f5-a74a-3adf294b6282]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.806 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 namespace which is not needed anymore#033[00m
Jan 20 09:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:44.858 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.858 225859 DEBUG nova.compute.manager [req-be15511e-516b-4047-a674-2f1bbd3b13fb req-c48c10b9-9ba9-480b-86b6-d29f87b7542d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-unplugged-b48170b0-717d-48f0-8172-742a4a8596e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.859 225859 DEBUG oslo_concurrency.lockutils [req-be15511e-516b-4047-a674-2f1bbd3b13fb req-c48c10b9-9ba9-480b-86b6-d29f87b7542d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.859 225859 DEBUG oslo_concurrency.lockutils [req-be15511e-516b-4047-a674-2f1bbd3b13fb req-c48c10b9-9ba9-480b-86b6-d29f87b7542d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.859 225859 DEBUG oslo_concurrency.lockutils [req-be15511e-516b-4047-a674-2f1bbd3b13fb req-c48c10b9-9ba9-480b-86b6-d29f87b7542d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.860 225859 DEBUG nova.compute.manager [req-be15511e-516b-4047-a674-2f1bbd3b13fb req-c48c10b9-9ba9-480b-86b6-d29f87b7542d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-unplugged-b48170b0-717d-48f0-8172-742a4a8596e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.860 225859 DEBUG nova.compute.manager [req-be15511e-516b-4047-a674-2f1bbd3b13fb req-c48c10b9-9ba9-480b-86b6-d29f87b7542d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-unplugged-b48170b0-717d-48f0-8172-742a4a8596e9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.906 225859 DEBUG nova.objects.instance [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lazy-loading 'system_metadata' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:44 np0005588919 NetworkManager[49104]: <info>  [1768920044.9279] manager: (tap5fd7b3ad-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Jan 20 09:40:44 np0005588919 NetworkManager[49104]: <info>  [1768920044.9432] manager: (tap0b83ac2a-72): new Tun device (/org/freedesktop/NetworkManager/Devices/103)
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.945 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.952 225859 DEBUG nova.objects.instance [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lazy-loading 'flavor' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.955 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.964 225859 INFO nova.virt.libvirt.driver [-] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Instance destroyed successfully.#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.965 225859 DEBUG nova.objects.instance [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'resources' on Instance uuid d5c2df9d-748f-4df2-9392-b45741975f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:44 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[252245]: [NOTICE]   (252249) : haproxy version is 2.8.14-c23fe91
Jan 20 09:40:44 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[252245]: [NOTICE]   (252249) : path to executable is /usr/sbin/haproxy
Jan 20 09:40:44 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[252245]: [WARNING]  (252249) : Exiting Master process...
Jan 20 09:40:44 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[252245]: [ALERT]    (252249) : Current worker (252251) exited with code 143 (Terminated)
Jan 20 09:40:44 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[252245]: [WARNING]  (252249) : All workers exited. Exiting... (0)
Jan 20 09:40:44 np0005588919 systemd[1]: libpod-ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48.scope: Deactivated successfully.
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.972 225859 DEBUG nova.virt.libvirt.vif [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:40:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.972 225859 DEBUG nova.network.os_vif_util [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converting VIF {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.973 225859 DEBUG nova.network.os_vif_util [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.975 225859 DEBUG nova.virt.libvirt.vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.975 225859 DEBUG nova.network.os_vif_util [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.976 225859 DEBUG nova.network.os_vif_util [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3b:35:f2,bridge_name='br-int',has_traffic_filtering=True,id=b48170b0-717d-48f0-8172-742a4a8596e9,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb48170b0-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:44 np0005588919 podman[253246]: 2026-01-20 14:40:44.977273238 +0000 UTC m=+0.066478414 container died ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.977 225859 DEBUG os_vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:35:f2,bridge_name='br-int',has_traffic_filtering=True,id=b48170b0-717d-48f0-8172-742a4a8596e9,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb48170b0-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.979 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.979 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb48170b0-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.981 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.982 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.984 225859 DEBUG nova.virt.libvirt.guest [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:df:88"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb46acd4-80"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.988 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.992 225859 INFO os_vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:35:f2,bridge_name='br-int',has_traffic_filtering=True,id=b48170b0-717d-48f0-8172-742a4a8596e9,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb48170b0-71')#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.993 225859 DEBUG nova.virt.libvirt.vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.993 225859 DEBUG nova.network.os_vif_util [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.994 225859 DEBUG nova.network.os_vif_util [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.994 225859 DEBUG os_vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:44 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.996 225859 DEBUG nova.virt.libvirt.guest [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:24:df:88"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb46acd4-80"/></interface>not found in domain: <domain type='kvm'>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <name>instance-00000042</name>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <uuid>d5c2df9d-748f-4df2-9392-b45741975f65</uuid>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:39:50</nova:creationTime>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:40:44 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:        <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:        <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:        <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:40:44 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <memory unit='KiB'>131072</memory>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <sysinfo type='smbios'>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <entry name='serial'>d5c2df9d-748f-4df2-9392-b45741975f65</entry>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <entry name='uuid'>d5c2df9d-748f-4df2-9392-b45741975f65</entry>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <boot dev='hd'/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <smbios mode='sysinfo'/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <vmcoreinfo state='on'/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <cpu mode='custom' match='exact' check='partial'>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <model fallback='allow'>Nehalem</model>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <clock offset='utc'>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:    <timer name='hpet' present='no'/>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <on_reboot>restart</on_reboot>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <on_crash>destroy</on_crash>
Jan 20 09:40:44 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <disk type='network' device='disk'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/d5c2df9d-748f-4df2-9392-b45741975f65_disk'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target dev='vda' bus='virtio'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <disk type='network' device='cdrom'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/d5c2df9d-748f-4df2-9392-b45741975f65_disk.config'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target dev='sda' bus='sata'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <readonly/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='0' model='pcie-root'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='1' port='0x10'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='2' port='0x11'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='3' port='0x12'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='4' port='0x13'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='5' port='0x14'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='6' port='0x15'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='7' port='0x16'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='8' port='0x17'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='9' port='0x18'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='10' port='0x19'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='11' port='0x1a'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='12' port='0x1b'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='13' port='0x1c'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='14' port='0x1d'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='15' port='0x1e'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='16' port='0x1f'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='17' port='0x20'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='18' port='0x21'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='19' port='0x22'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='20' port='0x23'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='21' port='0x24'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='22' port='0x25'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='23' port='0x26'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='24' port='0x27'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='25' port='0x28'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-pci-bridge'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='sata' index='0'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:3b:35:f2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target dev='tapb48170b0-71'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:d2:51:f5'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target dev='tap5fd7b3ad-0c'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:e7:0e:b2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target dev='tap0b83ac2a-72'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <serial type='pty'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/console.log' append='off'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target type='isa-serial' port='0'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <model name='isa-serial'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </target>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <console type='pty'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/console.log' append='off'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target type='serial' port='0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </console>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <input type='tablet' bus='usb'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <input type='mouse' bus='ps2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <input type='keyboard' bus='ps2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <graphics type='vnc' port='-1' autoport='yes' listen='::0'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <listen type='address' address='::0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <audio id='1' type='none'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <watchdog model='itco' action='reset'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <memballoon model='virtio'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <stats period='10'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <rng model='virtio'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:40:45 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:40:45 np0005588919 nova_compute[225855]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.997 225859 WARNING nova.virt.libvirt.driver [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Detaching interface fa:16:3e:24:df:88 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapdb46acd4-80' not found.#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.998 225859 DEBUG nova.virt.libvirt.vif [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:40:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.998 225859 DEBUG nova.network.os_vif_util [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converting VIF {"id": "db46acd4-809b-4127-ad48-870ae429b4d6", "address": "fa:16:3e:24:df:88", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb46acd4-80", "ovs_interfaceid": "db46acd4-809b-4127-ad48-870ae429b4d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.998 225859 DEBUG nova.network.os_vif_util [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:44.999 225859 DEBUG os_vif [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.000 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.000 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb46acd4-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.000 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.001 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.001 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb46acd4-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.002 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.002 225859 INFO os_vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80')#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.003 225859 DEBUG nova.virt.libvirt.vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.003 225859 DEBUG nova.network.os_vif_util [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.003 225859 DEBUG nova.network.os_vif_util [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d2:51:f5,bridge_name='br-int',has_traffic_filtering=True,id=5fd7b3ad-0cf1-4294-b552-6141c8ee85bd,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5fd7b3ad-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.003 225859 DEBUG os_vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:51:f5,bridge_name='br-int',has_traffic_filtering=True,id=5fd7b3ad-0cf1-4294-b552-6141c8ee85bd,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5fd7b3ad-0c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.004 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.005 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fd7b3ad-0c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.005 225859 INFO os_vif [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:df:88,bridge_name='br-int',has_traffic_filtering=True,id=db46acd4-809b-4127-ad48-870ae429b4d6,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb46acd4-80')#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.006 225859 DEBUG nova.virt.libvirt.guest [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:40:45</nova:creationTime>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:port uuid="5fd7b3ad-0cf1-4294-b552-6141c8ee85bd">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:port uuid="0b83ac2a-727a-4db9-91f2-69f939deeb69">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:40:45 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:40:45 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.007 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.010 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.011 225859 DEBUG nova.compute.manager [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-deleted-0b83ac2a-727a-4db9-91f2-69f939deeb69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.011 225859 INFO nova.compute.manager [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Neutron deleted interface 0b83ac2a-727a-4db9-91f2-69f939deeb69; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.012 225859 DEBUG nova.network.neutron [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:45 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48-userdata-shm.mount: Deactivated successfully.
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.013 225859 INFO os_vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:51:f5,bridge_name='br-int',has_traffic_filtering=True,id=5fd7b3ad-0cf1-4294-b552-6141c8ee85bd,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5fd7b3ad-0c')#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.014 225859 DEBUG nova.virt.libvirt.vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:39:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.014 225859 DEBUG nova.network.os_vif_util [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.015 225859 DEBUG nova.network.os_vif_util [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.015 225859 DEBUG os_vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:45 np0005588919 systemd[1]: var-lib-containers-storage-overlay-1daf9cb422d92ee82699c7df2fb191ba24031863f4d9fd4cf2bcb41f474f37a6-merged.mount: Deactivated successfully.
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.016 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.016 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b83ac2a-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.019 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.020 225859 INFO os_vif [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72')#033[00m
Jan 20 09:40:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:45.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.040 225859 DEBUG nova.virt.libvirt.vif [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:40:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.041 225859 DEBUG nova.network.os_vif_util [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converting VIF {"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.041 225859 DEBUG nova.network.os_vif_util [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.044 225859 DEBUG nova.virt.libvirt.guest [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:e7:0e:b2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0b83ac2a-72"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.047 225859 DEBUG nova.virt.libvirt.driver [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Attempting to detach device tap0b83ac2a-72 from instance d5c2df9d-748f-4df2-9392-b45741975f65 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.047 225859 DEBUG nova.virt.libvirt.guest [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] detach device xml: <interface type="ethernet">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:e7:0e:b2"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <target dev="tap0b83ac2a-72"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]: </interface>
Jan 20 09:40:45 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:40:45 np0005588919 podman[253246]: 2026-01-20 14:40:45.053823316 +0000 UTC m=+0.143028492 container cleanup ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:40:45 np0005588919 systemd[1]: libpod-conmon-ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48.scope: Deactivated successfully.
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.063 225859 DEBUG nova.virt.libvirt.guest [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:e7:0e:b2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0b83ac2a-72"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.067 225859 DEBUG nova.virt.libvirt.guest [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:e7:0e:b2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0b83ac2a-72"/></interface>not found in domain: <domain type='kvm'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <name>instance-00000042</name>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <uuid>d5c2df9d-748f-4df2-9392-b45741975f65</uuid>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:40:45</nova:creationTime>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <nova:port uuid="5fd7b3ad-0cf1-4294-b552-6141c8ee85bd">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <nova:port uuid="0b83ac2a-727a-4db9-91f2-69f939deeb69">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <memory unit='KiB'>131072</memory>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <sysinfo type='smbios'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <entry name='serial'>d5c2df9d-748f-4df2-9392-b45741975f65</entry>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <entry name='uuid'>d5c2df9d-748f-4df2-9392-b45741975f65</entry>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <boot dev='hd'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <smbios mode='sysinfo'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <vmcoreinfo state='on'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <cpu mode='custom' match='exact' check='partial'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <model fallback='allow'>Nehalem</model>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <clock offset='utc'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <timer name='hpet' present='no'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <on_reboot>restart</on_reboot>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <on_crash>destroy</on_crash>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <disk type='network' device='disk'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/d5c2df9d-748f-4df2-9392-b45741975f65_disk'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target dev='vda' bus='virtio'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <disk type='network' device='cdrom'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/d5c2df9d-748f-4df2-9392-b45741975f65_disk.config'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target dev='sda' bus='sata'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <readonly/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='0' model='pcie-root'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='1' port='0x10'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='2' port='0x11'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='3' port='0x12'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='4' port='0x13'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='5' port='0x14'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='6' port='0x15'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='7' port='0x16'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='8' port='0x17'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='9' port='0x18'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='10' port='0x19'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='11' port='0x1a'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='12' port='0x1b'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='13' port='0x1c'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='14' port='0x1d'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='15' port='0x1e'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='16' port='0x1f'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='17' port='0x20'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='18' port='0x21'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='19' port='0x22'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='20' port='0x23'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='21' port='0x24'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='22' port='0x25'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='23' port='0x26'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='24' port='0x27'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target chassis='25' port='0x28'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model name='pcie-pci-bridge'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <controller type='sata' index='0'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:3b:35:f2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target dev='tapb48170b0-71'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:d2:51:f5'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target dev='tap5fd7b3ad-0c'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <serial type='pty'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/console.log' append='off'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target type='isa-serial' port='0'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:        <model name='isa-serial'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      </target>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <console type='pty'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65/console.log' append='off'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <target type='serial' port='0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </console>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <input type='tablet' bus='usb'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <input type='mouse' bus='ps2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <input type='keyboard' bus='ps2'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <graphics type='vnc' port='-1' autoport='yes' listen='::0'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <listen type='address' address='::0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <audio id='1' type='none'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <watchdog model='itco' action='reset'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <memballoon model='virtio'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <stats period='10'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <rng model='virtio'>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:40:45 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:40:45 np0005588919 nova_compute[225855]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.067 225859 INFO nova.virt.libvirt.driver [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Successfully detached device tap0b83ac2a-72 from instance d5c2df9d-748f-4df2-9392-b45741975f65 from the persistent domain config.#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.068 225859 DEBUG nova.virt.libvirt.vif [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:39:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-2088149366',display_name='tempest-AttachInterfacesTestJSON-server-2088149366',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-2088149366',id=66,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZwLcyXdmuPk9iAZlMOAxeFM3EdHKE0x5nJT3i2GTbVf6EkhYVj3hEmoeSwYo6iZrNjT6w/g2TndK4CzLIvGDWLEyKfIPgg2vbEtoL1oIxCHYN2ytrctbkHi1netydaRQ==',key_name='tempest-keypair-230916378',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:39:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-odubqy1o',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:40:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=d5c2df9d-748f-4df2-9392-b45741975f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.068 225859 DEBUG nova.network.os_vif_util [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converting VIF {"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.069 225859 DEBUG nova.network.os_vif_util [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.069 225859 DEBUG os_vif [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.071 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.071 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b83ac2a-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.071 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.074 225859 INFO os_vif [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:0e:b2,bridge_name='br-int',has_traffic_filtering=True,id=0b83ac2a-727a-4db9-91f2-69f939deeb69,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0b83ac2a-72')#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.074 225859 DEBUG nova.virt.libvirt.guest [req-7399bf5a-a499-4702-9d57-c40b71169064 req-d8aeda9a-d9d9-43f9-a60a-73779b0052ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:name>tempest-AttachInterfacesTestJSON-server-2088149366</nova:name>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:40:45</nova:creationTime>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:port uuid="b48170b0-717d-48f0-8172-742a4a8596e9">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    <nova:port uuid="5fd7b3ad-0cf1-4294-b552-6141c8ee85bd">
Jan 20 09:40:45 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:40:45 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:40:45 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:40:45 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.091 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588919 podman[253331]: 2026-01-20 14:40:45.121948196 +0000 UTC m=+0.043286117 container remove ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:40:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:45.127 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8ec95672-f09f-41f6-8e4d-fdf15d813388]: (4, ('Tue Jan 20 02:40:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 (ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48)\nad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48\nTue Jan 20 02:40:45 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 (ad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48)\nad0796275d2b7bc69c4b451ca13c4323d2608039a8c7168dc31ea54a35ab1b48\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:45.129 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ad67f2ff-3e87-447c-970e-daec036d7f8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:45.130 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.132 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588919 kernel: tapfc21b99b-40: left promiscuous mode
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.144 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:45.146 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[03a168af-53f3-46e7-8061-9c10f4a88c71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:45.161 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[243a3683-dd8c-4876-a374-25edb5085c4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:45.162 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a3228c96-507a-4ae0-8372-d88d6220775b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:45.175 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[43009149-b2cf-476a-bed4-7f191148194c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503226, 'reachable_time': 16842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253346, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:45 np0005588919 systemd[1]: run-netns-ovnmeta\x2dfc21b99b\x2d4e34\x2d422c\x2dbe05\x2d0a440009dac4.mount: Deactivated successfully.
Jan 20 09:40:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:45.180 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:40:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:45.180 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c487bf-c31c-4f1a-87d8-680f0e0e1f21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.601 225859 INFO nova.virt.libvirt.driver [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Deleting instance files /var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65_del#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.602 225859 INFO nova.virt.libvirt.driver [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Deletion of /var/lib/nova/instances/d5c2df9d-748f-4df2-9392-b45741975f65_del complete#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.678 225859 INFO nova.compute.manager [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Took 1.18 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.678 225859 DEBUG oslo.service.loopingcall [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.679 225859 DEBUG nova.compute.manager [-] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:40:45 np0005588919 nova_compute[225855]: 2026-01-20 14:40:45.679 225859 DEBUG nova.network.neutron [-] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:40:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:46.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.817 225859 DEBUG nova.compute.manager [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-unplugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.818 225859 DEBUG oslo_concurrency.lockutils [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.818 225859 DEBUG oslo_concurrency.lockutils [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.818 225859 DEBUG oslo_concurrency.lockutils [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.819 225859 DEBUG nova.compute.manager [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-unplugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.819 225859 DEBUG nova.compute.manager [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-unplugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.819 225859 DEBUG nova.compute.manager [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.820 225859 DEBUG oslo_concurrency.lockutils [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.820 225859 DEBUG oslo_concurrency.lockutils [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.821 225859 DEBUG oslo_concurrency.lockutils [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.821 225859 DEBUG nova.compute.manager [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-plugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:46 np0005588919 nova_compute[225855]: 2026-01-20 14:40:46.821 225859 WARNING nova.compute.manager [req-12e19205-4bc9-469f-a9ce-97f55419cab9 req-bf857b2b-91fb-4110-a3b1-4c04c58c55b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-plugged-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:40:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:47.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.055 225859 DEBUG nova.compute.manager [req-ce278299-790c-4085-84d7-681da752ac54 req-a5491507-0d5f-468b-917b-f4c7c62f5808 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-plugged-b48170b0-717d-48f0-8172-742a4a8596e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.055 225859 DEBUG oslo_concurrency.lockutils [req-ce278299-790c-4085-84d7-681da752ac54 req-a5491507-0d5f-468b-917b-f4c7c62f5808 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.055 225859 DEBUG oslo_concurrency.lockutils [req-ce278299-790c-4085-84d7-681da752ac54 req-a5491507-0d5f-468b-917b-f4c7c62f5808 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.056 225859 DEBUG oslo_concurrency.lockutils [req-ce278299-790c-4085-84d7-681da752ac54 req-a5491507-0d5f-468b-917b-f4c7c62f5808 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.056 225859 DEBUG nova.compute.manager [req-ce278299-790c-4085-84d7-681da752ac54 req-a5491507-0d5f-468b-917b-f4c7c62f5808 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] No waiting events found dispatching network-vif-plugged-b48170b0-717d-48f0-8172-742a4a8596e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.056 225859 WARNING nova.compute.manager [req-ce278299-790c-4085-84d7-681da752ac54 req-a5491507-0d5f-468b-917b-f4c7c62f5808 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received unexpected event network-vif-plugged-b48170b0-717d-48f0-8172-742a4a8596e9 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.126 225859 DEBUG nova.network.neutron [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [{"id": "b48170b0-717d-48f0-8172-742a4a8596e9", "address": "fa:16:3e:3b:35:f2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb48170b0-71", "ovs_interfaceid": "b48170b0-717d-48f0-8172-742a4a8596e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "address": "fa:16:3e:d2:51:f5", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5fd7b3ad-0c", "ovs_interfaceid": "5fd7b3ad-0cf1-4294-b552-6141c8ee85bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "address": "fa:16:3e:e7:0e:b2", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b83ac2a-72", "ovs_interfaceid": "0b83ac2a-727a-4db9-91f2-69f939deeb69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.159 225859 DEBUG oslo_concurrency.lockutils [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-d5c2df9d-748f-4df2-9392-b45741975f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.183 225859 DEBUG oslo_concurrency.lockutils [None req-72b0302c-c7bb-4648-801b-9d4e27b907e8 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-d5c2df9d-748f-4df2-9392-b45741975f65-db46acd4-809b-4127-ad48-870ae429b4d6" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.881 225859 DEBUG nova.network.neutron [-] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.900 225859 INFO nova.compute.manager [-] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Took 2.22 seconds to deallocate network for instance.#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.945 225859 DEBUG oslo_concurrency.lockutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:47 np0005588919 nova_compute[225855]: 2026-01-20 14:40:47.945 225859 DEBUG oslo_concurrency.lockutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:48.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:48 np0005588919 nova_compute[225855]: 2026-01-20 14:40:48.019 225859 DEBUG oslo_concurrency.processutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:40:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:40:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2116215996' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:40:48 np0005588919 nova_compute[225855]: 2026-01-20 14:40:48.453 225859 DEBUG oslo_concurrency.processutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:40:48 np0005588919 nova_compute[225855]: 2026-01-20 14:40:48.458 225859 DEBUG nova.compute.provider_tree [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:40:48 np0005588919 nova_compute[225855]: 2026-01-20 14:40:48.475 225859 DEBUG nova.scheduler.client.report [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:40:48 np0005588919 nova_compute[225855]: 2026-01-20 14:40:48.496 225859 DEBUG oslo_concurrency.lockutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:48 np0005588919 nova_compute[225855]: 2026-01-20 14:40:48.526 225859 INFO nova.scheduler.client.report [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Deleted allocations for instance d5c2df9d-748f-4df2-9392-b45741975f65#033[00m
Jan 20 09:40:48 np0005588919 nova_compute[225855]: 2026-01-20 14:40:48.589 225859 DEBUG oslo_concurrency.lockutils [None req-632ac997-4d58-4786-bcf9-7342e361eb4b c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "d5c2df9d-748f-4df2-9392-b45741975f65" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:49.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:49 np0005588919 nova_compute[225855]: 2026-01-20 14:40:49.330 225859 DEBUG nova.compute.manager [req-d292c7d6-0178-4d96-acc3-33b91add103f req-b867e9a1-97bf-4bb4-a5bf-35f088040c66 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-deleted-b48170b0-717d-48f0-8172-742a4a8596e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:49 np0005588919 nova_compute[225855]: 2026-01-20 14:40:49.330 225859 DEBUG nova.compute.manager [req-d292c7d6-0178-4d96-acc3-33b91add103f req-b867e9a1-97bf-4bb4-a5bf-35f088040c66 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Received event network-vif-deleted-5fd7b3ad-0cf1-4294-b552-6141c8ee85bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:40:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:50.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:50 np0005588919 nova_compute[225855]: 2026-01-20 14:40:50.020 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:50 np0005588919 nova_compute[225855]: 2026-01-20 14:40:50.095 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:51.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:51 np0005588919 podman[253424]: 2026-01-20 14:40:51.08536675 +0000 UTC m=+0.131516826 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:40:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:52.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:53.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:53 np0005588919 nova_compute[225855]: 2026-01-20 14:40:53.321 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:54.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:55 np0005588919 nova_compute[225855]: 2026-01-20 14:40:55.023 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:55.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:55 np0005588919 nova_compute[225855]: 2026-01-20 14:40:55.096 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:55 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:55Z|00231|binding|INFO|Releasing lport 8cff17f5-b792-4e6f-8f1e-6c48322af961 from this chassis (sb_readonly=0)
Jan 20 09:40:55 np0005588919 nova_compute[225855]: 2026-01-20 14:40:55.180 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:56.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:40:56 np0005588919 nova_compute[225855]: 2026-01-20 14:40:56.333 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:40:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:57.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:40:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:40:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:40:58.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.372 225859 DEBUG oslo_concurrency.lockutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Acquiring lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.373 225859 DEBUG oslo_concurrency.lockutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.373 225859 DEBUG oslo_concurrency.lockutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Acquiring lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.373 225859 DEBUG oslo_concurrency.lockutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.374 225859 DEBUG oslo_concurrency.lockutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.375 225859 INFO nova.compute.manager [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Terminating instance#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.376 225859 DEBUG nova.compute.manager [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:40:58 np0005588919 kernel: tap19a89daa-77 (unregistering): left promiscuous mode
Jan 20 09:40:58 np0005588919 NetworkManager[49104]: <info>  [1768920058.4257] device (tap19a89daa-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00232|binding|INFO|Releasing lport 19a89daa-770c-4c3f-970c-a9a462503b06 from this chassis (sb_readonly=0)
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00233|binding|INFO|Setting lport 19a89daa-770c-4c3f-970c-a9a462503b06 down in Southbound
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00234|binding|INFO|Removing iface tap19a89daa-77 ovn-installed in OVS
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.434 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.436 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.441 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:c6:66 10.100.0.12'], port_security=['fa:16:3e:7c:c6:66 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a96ccadd-ac1d-4040-8bcc-bebb460ee233', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1002188b-c6a7-4b59-9326-3a1a837a00fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b50ce2f25e8943e28ddf8bf69c721e75', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b713641f-ba5a-4dd0-8917-baab1ca61007', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01ad0ee9-ef23-4072-99f7-406bf8559611, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=19a89daa-770c-4c3f-970c-a9a462503b06) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.442 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 19a89daa-770c-4c3f-970c-a9a462503b06 in datapath 1002188b-c6a7-4b59-9326-3a1a837a00fd unbound from our chassis#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.445 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1002188b-c6a7-4b59-9326-3a1a837a00fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.446 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[42ea4ee2-2136-4154-a590-922b5a6f3430]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.447 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd namespace which is not needed anymore#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.458 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000044.scope: Deactivated successfully.
Jan 20 09:40:58 np0005588919 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000044.scope: Consumed 14.189s CPU time.
Jan 20 09:40:58 np0005588919 systemd-machined[194361]: Machine qemu-30-instance-00000044 terminated.
Jan 20 09:40:58 np0005588919 neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd[252832]: [NOTICE]   (252836) : haproxy version is 2.8.14-c23fe91
Jan 20 09:40:58 np0005588919 neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd[252832]: [NOTICE]   (252836) : path to executable is /usr/sbin/haproxy
Jan 20 09:40:58 np0005588919 neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd[252832]: [WARNING]  (252836) : Exiting Master process...
Jan 20 09:40:58 np0005588919 neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd[252832]: [WARNING]  (252836) : Exiting Master process...
Jan 20 09:40:58 np0005588919 neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd[252832]: [ALERT]    (252836) : Current worker (252838) exited with code 143 (Terminated)
Jan 20 09:40:58 np0005588919 neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd[252832]: [WARNING]  (252836) : All workers exited. Exiting... (0)
Jan 20 09:40:58 np0005588919 systemd[1]: libpod-38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b.scope: Deactivated successfully.
Jan 20 09:40:58 np0005588919 podman[253480]: 2026-01-20 14:40:58.570435262 +0000 UTC m=+0.045292004 container died 38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 09:40:58 np0005588919 kernel: tap19a89daa-77: entered promiscuous mode
Jan 20 09:40:58 np0005588919 NetworkManager[49104]: <info>  [1768920058.5968] manager: (tap19a89daa-77): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00235|binding|INFO|Claiming lport 19a89daa-770c-4c3f-970c-a9a462503b06 for this chassis.
Jan 20 09:40:58 np0005588919 kernel: tap19a89daa-77 (unregistering): left promiscuous mode
Jan 20 09:40:58 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b-userdata-shm.mount: Deactivated successfully.
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.599 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00236|binding|INFO|19a89daa-770c-4c3f-970c-a9a462503b06: Claiming fa:16:3e:7c:c6:66 10.100.0.12
Jan 20 09:40:58 np0005588919 systemd[1]: var-lib-containers-storage-overlay-fc585d326ff1df1a671eac1b099fad631006b99f0da45b11e11943fba074a4f1-merged.mount: Deactivated successfully.
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.607 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:c6:66 10.100.0.12'], port_security=['fa:16:3e:7c:c6:66 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a96ccadd-ac1d-4040-8bcc-bebb460ee233', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1002188b-c6a7-4b59-9326-3a1a837a00fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b50ce2f25e8943e28ddf8bf69c721e75', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b713641f-ba5a-4dd0-8917-baab1ca61007', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01ad0ee9-ef23-4072-99f7-406bf8559611, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=19a89daa-770c-4c3f-970c-a9a462503b06) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:58 np0005588919 podman[253480]: 2026-01-20 14:40:58.614557521 +0000 UTC m=+0.089414253 container cleanup 38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00237|binding|INFO|Setting lport 19a89daa-770c-4c3f-970c-a9a462503b06 ovn-installed in OVS
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00238|binding|INFO|Setting lport 19a89daa-770c-4c3f-970c-a9a462503b06 up in Southbound
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00239|binding|INFO|Releasing lport 19a89daa-770c-4c3f-970c-a9a462503b06 from this chassis (sb_readonly=1)
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.622 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.624 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 systemd[1]: libpod-conmon-38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b.scope: Deactivated successfully.
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00240|if_status|INFO|Dropped 2 log messages in last 563 seconds (most recently, 563 seconds ago) due to excessive rate
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00241|if_status|INFO|Not setting lport 19a89daa-770c-4c3f-970c-a9a462503b06 down as sb is readonly
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00242|binding|INFO|Removing iface tap19a89daa-77 ovn-installed in OVS
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.627 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00243|binding|INFO|Releasing lport 19a89daa-770c-4c3f-970c-a9a462503b06 from this chassis (sb_readonly=0)
Jan 20 09:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:40:58Z|00244|binding|INFO|Setting lport 19a89daa-770c-4c3f-970c-a9a462503b06 down in Southbound
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.631 225859 INFO nova.virt.libvirt.driver [-] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Instance destroyed successfully.#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.632 225859 DEBUG nova.objects.instance [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lazy-loading 'resources' on Instance uuid a96ccadd-ac1d-4040-8bcc-bebb460ee233 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.637 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:c6:66 10.100.0.12'], port_security=['fa:16:3e:7c:c6:66 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a96ccadd-ac1d-4040-8bcc-bebb460ee233', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1002188b-c6a7-4b59-9326-3a1a837a00fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b50ce2f25e8943e28ddf8bf69c721e75', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b713641f-ba5a-4dd0-8917-baab1ca61007', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01ad0ee9-ef23-4072-99f7-406bf8559611, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=19a89daa-770c-4c3f-970c-a9a462503b06) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.639 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.651 225859 DEBUG nova.virt.libvirt.vif [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:40:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=68,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbOWirsAXlTIoevt4kXzpqBpapeg8X6KpUPmzDXnXlw7wqoLKHnmHfUIYL+FmHPJoWs+SV643EEJY+tqAkcrZlCPnWit4UcMgPhE0LGoYJ6xDnxZGwNzSj5VV503kGh5A==',key_name='tempest-keypair-1553012660',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:40:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b50ce2f25e8943e28ddf8bf69c721e75',ramdisk_id='',reservation_id='r-w26et9o4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_in
put_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestFqdnHostnames-626473092',owner_user_name='tempest-ServersTestFqdnHostnames-626473092-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:40:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='16c05e1ac16f428bab6b36346856235e',uuid=a96ccadd-ac1d-4040-8bcc-bebb460ee233,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.651 225859 DEBUG nova.network.os_vif_util [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Converting VIF {"id": "19a89daa-770c-4c3f-970c-a9a462503b06", "address": "fa:16:3e:7c:c6:66", "network": {"id": "1002188b-c6a7-4b59-9326-3a1a837a00fd", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1606456759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b50ce2f25e8943e28ddf8bf69c721e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19a89daa-77", "ovs_interfaceid": "19a89daa-770c-4c3f-970c-a9a462503b06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.652 225859 DEBUG nova.network.os_vif_util [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:c6:66,bridge_name='br-int',has_traffic_filtering=True,id=19a89daa-770c-4c3f-970c-a9a462503b06,network=Network(1002188b-c6a7-4b59-9326-3a1a837a00fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a89daa-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.653 225859 DEBUG os_vif [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:c6:66,bridge_name='br-int',has_traffic_filtering=True,id=19a89daa-770c-4c3f-970c-a9a462503b06,network=Network(1002188b-c6a7-4b59-9326-3a1a837a00fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a89daa-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.655 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.655 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19a89daa-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.658 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.661 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.663 225859 INFO os_vif [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:c6:66,bridge_name='br-int',has_traffic_filtering=True,id=19a89daa-770c-4c3f-970c-a9a462503b06,network=Network(1002188b-c6a7-4b59-9326-3a1a837a00fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap19a89daa-77')#033[00m
Jan 20 09:40:58 np0005588919 podman[253518]: 2026-01-20 14:40:58.688428774 +0000 UTC m=+0.047292891 container remove 38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.694 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[328d8f43-1ec1-4f71-bcf1-26c25031cf0d]: (4, ('Tue Jan 20 02:40:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd (38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b)\n38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b\nTue Jan 20 02:40:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd (38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b)\n38da31d8f51b431b660e3b2370ac55baf90389fb1943ecc974f1f1032615fe8b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.697 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7f47ca17-3a06-4e59-b9a1-bd8d89bea4de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.698 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1002188b-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.700 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 kernel: tap1002188b-c0: left promiscuous mode
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.714 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 nova_compute[225855]: 2026-01-20 14:40:58.716 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.719 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[03f9dd7c-5b72-437a-9b0c-3154ba9d8345]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.735 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[52014f26-3a88-49f2-8131-fec56d35c989]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.736 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[527a58e4-14e0-43fe-a14d-5b9c34fe93e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.752 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cb8513b0-0b13-4f3b-8cc6-0c9e4f0f642f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506133, 'reachable_time': 28256, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253552, 'error': None, 'target': 'ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.755 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1002188b-c6a7-4b59-9326-3a1a837a00fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.755 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8bf944-4b43-4e35-9f7f-4dd28580bf9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:58 np0005588919 systemd[1]: run-netns-ovnmeta\x2d1002188b\x2dc6a7\x2d4b59\x2d9326\x2d3a1a837a00fd.mount: Deactivated successfully.
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.756 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 19a89daa-770c-4c3f-970c-a9a462503b06 in datapath 1002188b-c6a7-4b59-9326-3a1a837a00fd unbound from our chassis#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.757 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1002188b-c6a7-4b59-9326-3a1a837a00fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.758 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ce70a652-52df-4ced-b4b8-1fdb54f3e32b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.759 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 19a89daa-770c-4c3f-970c-a9a462503b06 in datapath 1002188b-c6a7-4b59-9326-3a1a837a00fd unbound from our chassis#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.760 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1002188b-c6a7-4b59-9326-3a1a837a00fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:40:58.760 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed4767a-6f60-4cae-be1c-0ab63b2808e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:40:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:40:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:40:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:40:59.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:40:59 np0005588919 nova_compute[225855]: 2026-01-20 14:40:59.594 225859 INFO nova.virt.libvirt.driver [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Deleting instance files /var/lib/nova/instances/a96ccadd-ac1d-4040-8bcc-bebb460ee233_del#033[00m
Jan 20 09:40:59 np0005588919 nova_compute[225855]: 2026-01-20 14:40:59.595 225859 INFO nova.virt.libvirt.driver [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Deletion of /var/lib/nova/instances/a96ccadd-ac1d-4040-8bcc-bebb460ee233_del complete#033[00m
Jan 20 09:40:59 np0005588919 nova_compute[225855]: 2026-01-20 14:40:59.671 225859 INFO nova.compute.manager [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Took 1.30 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:40:59 np0005588919 nova_compute[225855]: 2026-01-20 14:40:59.672 225859 DEBUG oslo.service.loopingcall [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:40:59 np0005588919 nova_compute[225855]: 2026-01-20 14:40:59.673 225859 DEBUG nova.compute.manager [-] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:40:59 np0005588919 nova_compute[225855]: 2026-01-20 14:40:59.673 225859 DEBUG nova.network.neutron [-] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:40:59 np0005588919 nova_compute[225855]: 2026-01-20 14:40:59.961 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920044.959884, d5c2df9d-748f-4df2-9392-b45741975f65 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:40:59 np0005588919 nova_compute[225855]: 2026-01-20 14:40:59.961 225859 INFO nova.compute.manager [-] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:40:59 np0005588919 nova_compute[225855]: 2026-01-20 14:40:59.982 225859 DEBUG nova.compute.manager [None req-58decafb-bca1-4b05-8ad7-5b2b8002b273 - - - - - -] [instance: d5c2df9d-748f-4df2-9392-b45741975f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:00.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:00 np0005588919 nova_compute[225855]: 2026-01-20 14:41:00.098 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:00 np0005588919 podman[253555]: 2026-01-20 14:41:00.997578212 +0000 UTC m=+0.049273667 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 20 09:41:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:01.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.352 225859 DEBUG nova.network.neutron [-] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.372 225859 INFO nova.compute.manager [-] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Took 1.70 seconds to deallocate network for instance.#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.411 225859 DEBUG oslo_concurrency.lockutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.412 225859 DEBUG oslo_concurrency.lockutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.419 225859 DEBUG nova.compute.manager [req-8848f8c5-268b-40f6-8157-e55460de6431 req-631e4395-def5-4278-a6a4-121d4c2de87e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Received event network-vif-deleted-19a89daa-770c-4c3f-970c-a9a462503b06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.465 225859 DEBUG oslo_concurrency.processutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.903 225859 DEBUG nova.compute.manager [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Received event network-vif-unplugged-19a89daa-770c-4c3f-970c-a9a462503b06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.903 225859 DEBUG oslo_concurrency.lockutils [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.904 225859 DEBUG oslo_concurrency.lockutils [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.904 225859 DEBUG oslo_concurrency.lockutils [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.904 225859 DEBUG nova.compute.manager [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] No waiting events found dispatching network-vif-unplugged-19a89daa-770c-4c3f-970c-a9a462503b06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.904 225859 WARNING nova.compute.manager [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Received unexpected event network-vif-unplugged-19a89daa-770c-4c3f-970c-a9a462503b06 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.905 225859 DEBUG nova.compute.manager [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Received event network-vif-plugged-19a89daa-770c-4c3f-970c-a9a462503b06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.905 225859 DEBUG oslo_concurrency.lockutils [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.905 225859 DEBUG oslo_concurrency.lockutils [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.905 225859 DEBUG oslo_concurrency.lockutils [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.906 225859 DEBUG nova.compute.manager [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] No waiting events found dispatching network-vif-plugged-19a89daa-770c-4c3f-970c-a9a462503b06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:01 np0005588919 nova_compute[225855]: 2026-01-20 14:41:01.906 225859 WARNING nova.compute.manager [req-ea29a9fe-b5eb-4759-8019-065d6deed0e3 req-de6e0c84-ad56-4ed0-a140-8f7b60b4983d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Received unexpected event network-vif-plugged-19a89daa-770c-4c3f-970c-a9a462503b06 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:41:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:41:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3514088597' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:41:02 np0005588919 nova_compute[225855]: 2026-01-20 14:41:02.004 225859 DEBUG oslo_concurrency.processutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.539s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:02 np0005588919 nova_compute[225855]: 2026-01-20 14:41:02.014 225859 DEBUG nova.compute.provider_tree [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:41:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:02.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:02 np0005588919 nova_compute[225855]: 2026-01-20 14:41:02.030 225859 DEBUG nova.scheduler.client.report [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:41:02 np0005588919 nova_compute[225855]: 2026-01-20 14:41:02.056 225859 DEBUG oslo_concurrency.lockutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:02 np0005588919 nova_compute[225855]: 2026-01-20 14:41:02.086 225859 INFO nova.scheduler.client.report [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Deleted allocations for instance a96ccadd-ac1d-4040-8bcc-bebb460ee233#033[00m
Jan 20 09:41:02 np0005588919 nova_compute[225855]: 2026-01-20 14:41:02.160 225859 DEBUG oslo_concurrency.lockutils [None req-61fb8fa8-a6ad-49b5-8d79-1cf27e5804e2 16c05e1ac16f428bab6b36346856235e b50ce2f25e8943e28ddf8bf69c721e75 - - default default] Lock "a96ccadd-ac1d-4040-8bcc-bebb460ee233" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:03.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:03 np0005588919 nova_compute[225855]: 2026-01-20 14:41:03.659 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:04.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:04 np0005588919 nova_compute[225855]: 2026-01-20 14:41:04.935 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:05.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:05 np0005588919 nova_compute[225855]: 2026-01-20 14:41:05.100 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:06.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:07.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:08.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:08 np0005588919 nova_compute[225855]: 2026-01-20 14:41:08.645 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:08 np0005588919 nova_compute[225855]: 2026-01-20 14:41:08.662 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:08 np0005588919 nova_compute[225855]: 2026-01-20 14:41:08.846 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:09.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:10.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:10 np0005588919 nova_compute[225855]: 2026-01-20 14:41:10.103 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:11.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:11 np0005588919 nova_compute[225855]: 2026-01-20 14:41:11.991 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "10349dde-fb60-48ba-bc7b-42180c5eb49e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:11 np0005588919 nova_compute[225855]: 2026-01-20 14:41:11.992 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.009 225859 DEBUG nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:41:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:12.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.081 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.082 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.090 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.090 225859 INFO nova.compute.claims [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.312 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:41:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/228072690' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.837 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.842 225859 DEBUG nova.compute.provider_tree [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.857 225859 DEBUG nova.scheduler.client.report [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.880 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.798s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.881 225859 DEBUG nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.924 225859 DEBUG nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.924 225859 DEBUG nova.network.neutron [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.947 225859 INFO nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:41:12 np0005588919 nova_compute[225855]: 2026-01-20 14:41:12.963 225859 DEBUG nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:41:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:13.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.122 225859 DEBUG nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.124 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.124 225859 INFO nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Creating image(s)#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.148 225859 DEBUG nova.storage.rbd_utils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.175 225859 DEBUG nova.storage.rbd_utils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.200 225859 DEBUG nova.storage.rbd_utils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.204 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.270 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.271 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.271 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.272 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.296 225859 DEBUG nova.storage.rbd_utils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.300 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.384 225859 DEBUG nova.policy [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a9fb458d27434495a77a94827b6097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3f93fd4b2154dda9f38e62334904303', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:41:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:41:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3266538832' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:41:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:41:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3266538832' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.627 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920058.624851, a96ccadd-ac1d-4040-8bcc-bebb460ee233 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.628 225859 INFO nova.compute.manager [-] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.653 225859 DEBUG nova.compute.manager [None req-d96f56bf-bd82-4598-ab03-693ba0e99081 - - - - - -] [instance: a96ccadd-ac1d-4040-8bcc-bebb460ee233] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:13 np0005588919 nova_compute[225855]: 2026-01-20 14:41:13.664 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:14.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:14 np0005588919 nova_compute[225855]: 2026-01-20 14:41:14.490 225859 DEBUG nova.network.neutron [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Successfully created port: 607e59a4-2a6b-424a-9413-be318079781e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:41:14 np0005588919 nova_compute[225855]: 2026-01-20 14:41:14.571 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.272s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:14 np0005588919 nova_compute[225855]: 2026-01-20 14:41:14.643 225859 DEBUG nova.storage.rbd_utils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] resizing rbd image 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:41:14 np0005588919 nova_compute[225855]: 2026-01-20 14:41:14.752 225859 DEBUG nova.objects.instance [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'migration_context' on Instance uuid 10349dde-fb60-48ba-bc7b-42180c5eb49e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:41:14 np0005588919 nova_compute[225855]: 2026-01-20 14:41:14.768 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:41:14 np0005588919 nova_compute[225855]: 2026-01-20 14:41:14.768 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Ensure instance console log exists: /var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:41:14 np0005588919 nova_compute[225855]: 2026-01-20 14:41:14.769 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:14 np0005588919 nova_compute[225855]: 2026-01-20 14:41:14.769 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:14 np0005588919 nova_compute[225855]: 2026-01-20 14:41:14.769 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:15.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:15 np0005588919 nova_compute[225855]: 2026-01-20 14:41:15.104 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:15 np0005588919 nova_compute[225855]: 2026-01-20 14:41:15.387 225859 DEBUG nova.network.neutron [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Successfully updated port: 607e59a4-2a6b-424a-9413-be318079781e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:41:15 np0005588919 nova_compute[225855]: 2026-01-20 14:41:15.408 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:41:15 np0005588919 nova_compute[225855]: 2026-01-20 14:41:15.409 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:41:15 np0005588919 nova_compute[225855]: 2026-01-20 14:41:15.409 225859 DEBUG nova.network.neutron [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:41:15 np0005588919 nova_compute[225855]: 2026-01-20 14:41:15.482 225859 DEBUG nova.compute.manager [req-754c8f91-b200-4e56-bc54-423c28a7d0e6 req-17fd8461-a98b-4b1e-ae13-5f4b6853dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-changed-607e59a4-2a6b-424a-9413-be318079781e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:15 np0005588919 nova_compute[225855]: 2026-01-20 14:41:15.483 225859 DEBUG nova.compute.manager [req-754c8f91-b200-4e56-bc54-423c28a7d0e6 req-17fd8461-a98b-4b1e-ae13-5f4b6853dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Refreshing instance network info cache due to event network-changed-607e59a4-2a6b-424a-9413-be318079781e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:41:15 np0005588919 nova_compute[225855]: 2026-01-20 14:41:15.483 225859 DEBUG oslo_concurrency.lockutils [req-754c8f91-b200-4e56-bc54-423c28a7d0e6 req-17fd8461-a98b-4b1e-ae13-5f4b6853dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:41:15 np0005588919 nova_compute[225855]: 2026-01-20 14:41:15.581 225859 DEBUG nova.network.neutron [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:41:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:16.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:16.401 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:16.402 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:16.402 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.859 225859 DEBUG nova.network.neutron [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updating instance_info_cache with network_info: [{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.973 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.973 225859 DEBUG nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Instance network_info: |[{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.974 225859 DEBUG oslo_concurrency.lockutils [req-754c8f91-b200-4e56-bc54-423c28a7d0e6 req-17fd8461-a98b-4b1e-ae13-5f4b6853dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.974 225859 DEBUG nova.network.neutron [req-754c8f91-b200-4e56-bc54-423c28a7d0e6 req-17fd8461-a98b-4b1e-ae13-5f4b6853dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Refreshing network info cache for port 607e59a4-2a6b-424a-9413-be318079781e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.976 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Start _get_guest_xml network_info=[{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.981 225859 WARNING nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.987 225859 DEBUG nova.virt.libvirt.host [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.988 225859 DEBUG nova.virt.libvirt.host [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.994 225859 DEBUG nova.virt.libvirt.host [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.995 225859 DEBUG nova.virt.libvirt.host [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.996 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.996 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.996 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.997 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.997 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.997 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.997 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.997 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.998 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.998 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.998 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:41:16 np0005588919 nova_compute[225855]: 2026-01-20 14:41:16.998 225859 DEBUG nova.virt.hardware [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:41:17 np0005588919 nova_compute[225855]: 2026-01-20 14:41:17.000 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:17.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:41:17 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3958666232' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:41:17 np0005588919 nova_compute[225855]: 2026-01-20 14:41:17.517 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:17 np0005588919 nova_compute[225855]: 2026-01-20 14:41:17.558 225859 DEBUG nova.storage.rbd_utils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:17 np0005588919 nova_compute[225855]: 2026-01-20 14:41:17.564 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:18.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:41:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3232206658' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.214 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.650s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.216 225859 DEBUG nova.virt.libvirt.vif [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-689127716',display_name='tempest-tempest.common.compute-instance-689127716',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-689127716',id=71,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDk9SkW5N7MhrGaZslG18EJ7xoBof9PQa4upjUw+XxfbO5rNOjJYMJtJMRGPfgbl1pwAZZD7LHjNNMRFKVo+T+C8Rnr+HXWsPYQmvPGwjjZ++NXvRdqES1LIbRDiwaFMJQ==',key_name='tempest-keypair-1970360297',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-2p3ovedc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:41:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=10349dde-fb60-48ba-bc7b-42180c5eb49e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.217 225859 DEBUG nova.network.os_vif_util [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.218 225859 DEBUG nova.network.os_vif_util [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:13:21,bridge_name='br-int',has_traffic_filtering=True,id=607e59a4-2a6b-424a-9413-be318079781e,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap607e59a4-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.220 225859 DEBUG nova.objects.instance [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'pci_devices' on Instance uuid 10349dde-fb60-48ba-bc7b-42180c5eb49e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.255 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <uuid>10349dde-fb60-48ba-bc7b-42180c5eb49e</uuid>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <name>instance-00000047</name>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <nova:name>tempest-tempest.common.compute-instance-689127716</nova:name>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:41:16</nova:creationTime>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <nova:port uuid="607e59a4-2a6b-424a-9413-be318079781e">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <entry name="serial">10349dde-fb60-48ba-bc7b-42180c5eb49e</entry>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <entry name="uuid">10349dde-fb60-48ba-bc7b-42180c5eb49e</entry>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/10349dde-fb60-48ba-bc7b-42180c5eb49e_disk">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/10349dde-fb60-48ba-bc7b-42180c5eb49e_disk.config">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:a3:13:21"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <target dev="tap607e59a4-2a"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/console.log" append="off"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:41:18 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:41:18 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:41:18 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:41:18 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.257 225859 DEBUG nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Preparing to wait for external event network-vif-plugged-607e59a4-2a6b-424a-9413-be318079781e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.258 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.258 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.259 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.259 225859 DEBUG nova.virt.libvirt.vif [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-689127716',display_name='tempest-tempest.common.compute-instance-689127716',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-689127716',id=71,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDk9SkW5N7MhrGaZslG18EJ7xoBof9PQa4upjUw+XxfbO5rNOjJYMJtJMRGPfgbl1pwAZZD7LHjNNMRFKVo+T+C8Rnr+HXWsPYQmvPGwjjZ++NXvRdqES1LIbRDiwaFMJQ==',key_name='tempest-keypair-1970360297',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-2p3ovedc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:41:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=10349dde-fb60-48ba-bc7b-42180c5eb49e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.260 225859 DEBUG nova.network.os_vif_util [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.261 225859 DEBUG nova.network.os_vif_util [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:13:21,bridge_name='br-int',has_traffic_filtering=True,id=607e59a4-2a6b-424a-9413-be318079781e,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap607e59a4-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.261 225859 DEBUG os_vif [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:13:21,bridge_name='br-int',has_traffic_filtering=True,id=607e59a4-2a6b-424a-9413-be318079781e,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap607e59a4-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.262 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.262 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.263 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.266 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.267 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap607e59a4-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.267 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap607e59a4-2a, col_values=(('external_ids', {'iface-id': '607e59a4-2a6b-424a-9413-be318079781e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:13:21', 'vm-uuid': '10349dde-fb60-48ba-bc7b-42180c5eb49e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.269 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:18 np0005588919 NetworkManager[49104]: <info>  [1768920078.2711] manager: (tap607e59a4-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.272 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.280 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.282 225859 INFO os_vif [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:13:21,bridge_name='br-int',has_traffic_filtering=True,id=607e59a4-2a6b-424a-9413-be318079781e,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap607e59a4-2a')#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.361 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.361 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.361 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:a3:13:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.362 225859 INFO nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Using config drive#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.388 225859 DEBUG nova.storage.rbd_utils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.847 225859 INFO nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Creating config drive at /var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/disk.config#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.853 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1l_ef0yz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.917 225859 DEBUG nova.network.neutron [req-754c8f91-b200-4e56-bc54-423c28a7d0e6 req-17fd8461-a98b-4b1e-ae13-5f4b6853dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updated VIF entry in instance network info cache for port 607e59a4-2a6b-424a-9413-be318079781e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.920 225859 DEBUG nova.network.neutron [req-754c8f91-b200-4e56-bc54-423c28a7d0e6 req-17fd8461-a98b-4b1e-ae13-5f4b6853dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updating instance_info_cache with network_info: [{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.936 225859 DEBUG oslo_concurrency.lockutils [req-754c8f91-b200-4e56-bc54-423c28a7d0e6 req-17fd8461-a98b-4b1e-ae13-5f4b6853dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:41:18 np0005588919 nova_compute[225855]: 2026-01-20 14:41:18.992 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1l_ef0yz" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:19.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:19 np0005588919 nova_compute[225855]: 2026-01-20 14:41:19.305 225859 DEBUG nova.storage.rbd_utils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] rbd image 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:41:19 np0005588919 nova_compute[225855]: 2026-01-20 14:41:19.311 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/disk.config 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:41:19 np0005588919 nova_compute[225855]: 2026-01-20 14:41:19.963 225859 DEBUG oslo_concurrency.processutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/disk.config 10349dde-fb60-48ba-bc7b-42180c5eb49e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.652s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:41:19 np0005588919 nova_compute[225855]: 2026-01-20 14:41:19.964 225859 INFO nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Deleting local config drive /var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/disk.config because it was imported into RBD.#033[00m
Jan 20 09:41:20 np0005588919 kernel: tap607e59a4-2a: entered promiscuous mode
Jan 20 09:41:20 np0005588919 NetworkManager[49104]: <info>  [1768920080.0250] manager: (tap607e59a4-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.030 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:41:20Z|00245|binding|INFO|Claiming lport 607e59a4-2a6b-424a-9413-be318079781e for this chassis.
Jan 20 09:41:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:41:20Z|00246|binding|INFO|607e59a4-2a6b-424a-9413-be318079781e: Claiming fa:16:3e:a3:13:21 10.100.0.11
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.036 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.047 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 NetworkManager[49104]: <info>  [1768920080.0476] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Jan 20 09:41:20 np0005588919 NetworkManager[49104]: <info>  [1768920080.0482] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.051 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:13:21 10.100.0.11'], port_security=['fa:16:3e:a3:13:21 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '10349dde-fb60-48ba-bc7b-42180c5eb49e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '2', 'neutron:security_group_ids': '52b08fd6-6aa8-4470-b89c-ece04e1c959e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=607e59a4-2a6b-424a-9413-be318079781e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:41:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.052 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 607e59a4-2a6b-424a-9413-be318079781e in datapath fc21b99b-4e34-422c-be05-0a440009dac4 bound to our chassis#033[00m
Jan 20 09:41:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:20.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.053 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:41:20 np0005588919 systemd-udevd[253982]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:41:20 np0005588919 systemd-machined[194361]: New machine qemu-31-instance-00000047.
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.063 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ffa59615-95ad-4046-86cf-aae7f2ced5de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.064 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfc21b99b-41 in ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.065 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfc21b99b-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.066 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9e906596-dec2-423a-a420-ebd7fa0534f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.066 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ced6eb-47a3-4569-bfa2-3ed6f26ccf7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 NetworkManager[49104]: <info>  [1768920080.0700] device (tap607e59a4-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:41:20 np0005588919 NetworkManager[49104]: <info>  [1768920080.0708] device (tap607e59a4-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.078 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[483ce132-d68b-44c3-9cda-63a286f3bb32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 systemd[1]: Started Virtual Machine qemu-31-instance-00000047.
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.093 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b92f68ed-6aa4-4b5b-b4c1-e5375f178abf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.121 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[274a6c53-cbec-47fe-80b1-36bbe2530dca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 NetworkManager[49104]: <info>  [1768920080.1316] manager: (tapfc21b99b-40): new Veth device (/org/freedesktop/NetworkManager/Devices/109)
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.132 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[40493834-b92d-4d2c-b234-cc149328c0d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.162 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd447ba-c0f7-4b8d-95b1-49780b5ca043]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.165 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[06e04867-5d63-42b7-8198-d3b0a5a9ca06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 NetworkManager[49104]: <info>  [1768920080.1859] device (tapfc21b99b-40): carrier: link connected
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.191 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3410494c-def0-4e0a-b52d-27560d28cea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.206 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1ab85a6a-ab19-4b36-86e3-3ce6c1f841a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511911, 'reachable_time': 34363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254015, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.217 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0eecda25-8917-4782-b694-a71b63fbf87e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:5bd2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511911, 'tstamp': 511911}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254016, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.230 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed596e8-26ef-41fc-aa56-368ab3d1c7e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511911, 'reachable_time': 34363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254017, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.247 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.250 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.254 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[88dda050-147b-4a67-be37-72987a024285]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.271 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:41:20Z|00247|binding|INFO|Setting lport 607e59a4-2a6b-424a-9413-be318079781e ovn-installed in OVS
Jan 20 09:41:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:41:20Z|00248|binding|INFO|Setting lport 607e59a4-2a6b-424a-9413-be318079781e up in Southbound
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.283 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.301 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bc998151-a542-4cc8-ae54-01ac2f74bc68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.302 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.303 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.303 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.305 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 NetworkManager[49104]: <info>  [1768920080.3060] manager: (tapfc21b99b-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Jan 20 09:41:20 np0005588919 kernel: tapfc21b99b-40: entered promiscuous mode
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.307 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.309 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.310 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:41:20Z|00249|binding|INFO|Releasing lport 583df905-1d9f-49c1-b209-4b7fad1599f6 from this chassis (sb_readonly=0)
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.310 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.311 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fc21b99b-4e34-422c-be05-0a440009dac4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fc21b99b-4e34-422c-be05-0a440009dac4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.312 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[06e93ae3-fed4-4eb1-9258-c5d93eee8db5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.312 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-fc21b99b-4e34-422c-be05-0a440009dac4
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/fc21b99b-4e34-422c-be05-0a440009dac4.pid.haproxy
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID fc21b99b-4e34-422c-be05-0a440009dac4
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:41:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:20.314 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'env', 'PROCESS_TAG=haproxy-fc21b99b-4e34-422c-be05-0a440009dac4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fc21b99b-4e34-422c-be05-0a440009dac4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.323 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.682 225859 DEBUG nova.compute.manager [req-97f26a88-04a6-4d15-bd2c-927530895d71 req-e771d66a-d08e-42f1-b6b8-c785335a8e35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-vif-plugged-607e59a4-2a6b-424a-9413-be318079781e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.683 225859 DEBUG oslo_concurrency.lockutils [req-97f26a88-04a6-4d15-bd2c-927530895d71 req-e771d66a-d08e-42f1-b6b8-c785335a8e35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.683 225859 DEBUG oslo_concurrency.lockutils [req-97f26a88-04a6-4d15-bd2c-927530895d71 req-e771d66a-d08e-42f1-b6b8-c785335a8e35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.683 225859 DEBUG oslo_concurrency.lockutils [req-97f26a88-04a6-4d15-bd2c-927530895d71 req-e771d66a-d08e-42f1-b6b8-c785335a8e35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:20 np0005588919 nova_compute[225855]: 2026-01-20 14:41:20.683 225859 DEBUG nova.compute.manager [req-97f26a88-04a6-4d15-bd2c-927530895d71 req-e771d66a-d08e-42f1-b6b8-c785335a8e35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Processing event network-vif-plugged-607e59a4-2a6b-424a-9413-be318079781e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:41:20 np0005588919 podman[254065]: 2026-01-20 14:41:20.63913122 +0000 UTC m=+0.022329573 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:41:20 np0005588919 podman[254065]: 2026-01-20 14:41:20.787185833 +0000 UTC m=+0.170384186 container create 2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:41:20 np0005588919 systemd[1]: Started libpod-conmon-2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906.scope.
Jan 20 09:41:20 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:41:20 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b359ef85c164591e81dbb650ad14fe7eb3bd9cd95784fcc030b2336836287bd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:41:20 np0005588919 podman[254065]: 2026-01-20 14:41:20.892356892 +0000 UTC m=+0.275555255 container init 2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:41:20 np0005588919 podman[254065]: 2026-01-20 14:41:20.898856586 +0000 UTC m=+0.282054919 container start 2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:41:20 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[254080]: [NOTICE]   (254084) : New worker (254086) forked
Jan 20 09:41:20 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[254080]: [NOTICE]   (254084) : Loading success.
Jan 20 09:41:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:21.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:21 np0005588919 nova_compute[225855]: 2026-01-20 14:41:21.767 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:22.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:22 np0005588919 podman[254105]: 2026-01-20 14:41:22.059625251 +0000 UTC m=+0.102798942 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.326 225859 DEBUG nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.326 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920082.3255358, 10349dde-fb60-48ba-bc7b-42180c5eb49e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.327 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] VM Started (Lifecycle Event)#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.330 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.333 225859 INFO nova.virt.libvirt.driver [-] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Instance spawned successfully.#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.334 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.438 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.439 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.439 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.440 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.440 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.441 225859 DEBUG nova.virt.libvirt.driver [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.546 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.550 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.603 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.604 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920082.326506, 10349dde-fb60-48ba-bc7b-42180c5eb49e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.604 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.613 225859 INFO nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Took 9.49 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.614 225859 DEBUG nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.665 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.668 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920082.3298385, 10349dde-fb60-48ba-bc7b-42180c5eb49e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.668 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.700 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.703 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.715 225859 INFO nova.compute.manager [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Took 10.67 seconds to build instance.#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.783 225859 DEBUG nova.compute.manager [req-20cbb18e-a79c-4c08-9e97-6ba51a96e498 req-1d29f054-4d6a-4ccf-ad09-a43db0bef1a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-vif-plugged-607e59a4-2a6b-424a-9413-be318079781e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.783 225859 DEBUG oslo_concurrency.lockutils [req-20cbb18e-a79c-4c08-9e97-6ba51a96e498 req-1d29f054-4d6a-4ccf-ad09-a43db0bef1a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.784 225859 DEBUG oslo_concurrency.lockutils [req-20cbb18e-a79c-4c08-9e97-6ba51a96e498 req-1d29f054-4d6a-4ccf-ad09-a43db0bef1a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.784 225859 DEBUG oslo_concurrency.lockutils [req-20cbb18e-a79c-4c08-9e97-6ba51a96e498 req-1d29f054-4d6a-4ccf-ad09-a43db0bef1a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.784 225859 DEBUG nova.compute.manager [req-20cbb18e-a79c-4c08-9e97-6ba51a96e498 req-1d29f054-4d6a-4ccf-ad09-a43db0bef1a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] No waiting events found dispatching network-vif-plugged-607e59a4-2a6b-424a-9413-be318079781e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.784 225859 WARNING nova.compute.manager [req-20cbb18e-a79c-4c08-9e97-6ba51a96e498 req-1d29f054-4d6a-4ccf-ad09-a43db0bef1a6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received unexpected event network-vif-plugged-607e59a4-2a6b-424a-9413-be318079781e for instance with vm_state active and task_state None.#033[00m
Jan 20 09:41:22 np0005588919 nova_compute[225855]: 2026-01-20 14:41:22.837 225859 DEBUG oslo_concurrency.lockutils [None req-66e94ac8-2e6a-4837-b9f0-dea06fd86b45 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:41:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:41:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:23.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:41:23 np0005588919 nova_compute[225855]: 2026-01-20 14:41:23.269 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:24.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:24 np0005588919 nova_compute[225855]: 2026-01-20 14:41:24.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:25 np0005588919 ovn_controller[130490]: 2026-01-20T14:41:25Z|00250|binding|INFO|Releasing lport 583df905-1d9f-49c1-b209-4b7fad1599f6 from this chassis (sb_readonly=0)
Jan 20 09:41:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:25.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:25 np0005588919 nova_compute[225855]: 2026-01-20 14:41:25.160 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:25 np0005588919 nova_compute[225855]: 2026-01-20 14:41:25.252 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:26.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:26 np0005588919 nova_compute[225855]: 2026-01-20 14:41:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:27.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:27 np0005588919 nova_compute[225855]: 2026-01-20 14:41:27.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:27 np0005588919 nova_compute[225855]: 2026-01-20 14:41:27.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 09:41:27 np0005588919 nova_compute[225855]: 2026-01-20 14:41:27.510 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 20 09:41:27 np0005588919 nova_compute[225855]: 2026-01-20 14:41:27.510 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:27 np0005588919 nova_compute[225855]: 2026-01-20 14:41:27.511 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 20 09:41:27 np0005588919 nova_compute[225855]: 2026-01-20 14:41:27.641 225859 DEBUG nova.compute.manager [req-e82df75d-adc9-42ff-8051-3eb4157c27e1 req-35cdfc53-66e4-41a2-811c-12b17c10ca97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-changed-607e59a4-2a6b-424a-9413-be318079781e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:41:27 np0005588919 nova_compute[225855]: 2026-01-20 14:41:27.642 225859 DEBUG nova.compute.manager [req-e82df75d-adc9-42ff-8051-3eb4157c27e1 req-35cdfc53-66e4-41a2-811c-12b17c10ca97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Refreshing instance network info cache due to event network-changed-607e59a4-2a6b-424a-9413-be318079781e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 09:41:27 np0005588919 nova_compute[225855]: 2026-01-20 14:41:27.642 225859 DEBUG oslo_concurrency.lockutils [req-e82df75d-adc9-42ff-8051-3eb4157c27e1 req-35cdfc53-66e4-41a2-811c-12b17c10ca97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 09:41:27 np0005588919 nova_compute[225855]: 2026-01-20 14:41:27.643 225859 DEBUG oslo_concurrency.lockutils [req-e82df75d-adc9-42ff-8051-3eb4157c27e1 req-35cdfc53-66e4-41a2-811c-12b17c10ca97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 09:41:27 np0005588919 nova_compute[225855]: 2026-01-20 14:41:27.643 225859 DEBUG nova.network.neutron [req-e82df75d-adc9-42ff-8051-3eb4157c27e1 req-35cdfc53-66e4-41a2-811c-12b17c10ca97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Refreshing network info cache for port 607e59a4-2a6b-424a-9413-be318079781e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 09:41:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:28.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:28 np0005588919 nova_compute[225855]: 2026-01-20 14:41:28.271 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:28 np0005588919 nova_compute[225855]: 2026-01-20 14:41:28.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:29.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:29 np0005588919 nova_compute[225855]: 2026-01-20 14:41:29.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:30.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:30 np0005588919 nova_compute[225855]: 2026-01-20 14:41:30.254 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:31.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:31 np0005588919 nova_compute[225855]: 2026-01-20 14:41:31.164 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:41:31 np0005588919 nova_compute[225855]: 2026-01-20 14:41:31.165 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:41:31 np0005588919 nova_compute[225855]: 2026-01-20 14:41:31.165 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:41:31 np0005588919 nova_compute[225855]: 2026-01-20 14:41:31.165 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 20 09:41:31 np0005588919 nova_compute[225855]: 2026-01-20 14:41:31.166 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:41:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:41:31 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2992877323' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:41:31 np0005588919 nova_compute[225855]: 2026-01-20 14:41:31.632 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:41:31 np0005588919 podman[254226]: 2026-01-20 14:41:31.740672433 +0000 UTC m=+0.057040427 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 20 09:41:31 np0005588919 nova_compute[225855]: 2026-01-20 14:41:31.967 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000047 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:41:31 np0005588919 nova_compute[225855]: 2026-01-20 14:41:31.968 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000047 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 09:41:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:32.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:32 np0005588919 nova_compute[225855]: 2026-01-20 14:41:32.153 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 09:41:32 np0005588919 nova_compute[225855]: 2026-01-20 14:41:32.155 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4339MB free_disk=20.900901794433594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 20 09:41:32 np0005588919 nova_compute[225855]: 2026-01-20 14:41:32.155 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:41:32 np0005588919 nova_compute[225855]: 2026-01-20 14:41:32.156 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:41:32 np0005588919 nova_compute[225855]: 2026-01-20 14:41:32.306 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 10349dde-fb60-48ba-bc7b-42180c5eb49e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 20 09:41:32 np0005588919 nova_compute[225855]: 2026-01-20 14:41:32.307 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 20 09:41:32 np0005588919 nova_compute[225855]: 2026-01-20 14:41:32.307 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 20 09:41:32 np0005588919 nova_compute[225855]: 2026-01-20 14:41:32.359 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:41:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:41:32 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3706431175' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:41:32 np0005588919 nova_compute[225855]: 2026-01-20 14:41:32.815 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:41:32 np0005588919 nova_compute[225855]: 2026-01-20 14:41:32.821 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:41:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:41:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:33.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:41:33 np0005588919 nova_compute[225855]: 2026-01-20 14:41:33.273 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:33 np0005588919 nova_compute[225855]: 2026-01-20 14:41:33.788 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:41:33 np0005588919 nova_compute[225855]: 2026-01-20 14:41:33.823 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 09:41:33 np0005588919 nova_compute[225855]: 2026-01-20 14:41:33.824 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:41:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:34.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:34 np0005588919 nova_compute[225855]: 2026-01-20 14:41:34.825 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:34 np0005588919 nova_compute[225855]: 2026-01-20 14:41:34.825 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:34 np0005588919 nova_compute[225855]: 2026-01-20 14:41:34.826 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:41:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:35.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:35 np0005588919 nova_compute[225855]: 2026-01-20 14:41:35.256 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:41:35Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a3:13:21 10.100.0.11
Jan 20 09:41:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:41:35Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a3:13:21 10.100.0.11
Jan 20 09:41:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:36.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:41:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:37.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:41:37 np0005588919 nova_compute[225855]: 2026-01-20 14:41:37.664 225859 DEBUG nova.network.neutron [req-e82df75d-adc9-42ff-8051-3eb4157c27e1 req-35cdfc53-66e4-41a2-811c-12b17c10ca97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updated VIF entry in instance network info cache for port 607e59a4-2a6b-424a-9413-be318079781e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 09:41:37 np0005588919 nova_compute[225855]: 2026-01-20 14:41:37.664 225859 DEBUG nova.network.neutron [req-e82df75d-adc9-42ff-8051-3eb4157c27e1 req-35cdfc53-66e4-41a2-811c-12b17c10ca97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updating instance_info_cache with network_info: [{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 09:41:37 np0005588919 nova_compute[225855]: 2026-01-20 14:41:37.693 225859 DEBUG oslo_concurrency.lockutils [req-e82df75d-adc9-42ff-8051-3eb4157c27e1 req-35cdfc53-66e4-41a2-811c-12b17c10ca97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 09:41:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:38.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:38 np0005588919 nova_compute[225855]: 2026-01-20 14:41:38.275 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:41:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:39.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:40 np0005588919 podman[254443]: 2026-01-20 14:41:40.017400582 +0000 UTC m=+0.526744210 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 20 09:41:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:41:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:40.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:41:40 np0005588919 podman[254443]: 2026-01-20 14:41:40.148391062 +0000 UTC m=+0.657734680 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef)
Jan 20 09:41:40 np0005588919 nova_compute[225855]: 2026-01-20 14:41:40.260 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:40 np0005588919 podman[254597]: 2026-01-20 14:41:40.752215902 +0000 UTC m=+0.061509223 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:41:40 np0005588919 podman[254597]: 2026-01-20 14:41:40.787342597 +0000 UTC m=+0.096635868 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:41:40 np0005588919 podman[254664]: 2026-01-20 14:41:40.975883597 +0000 UTC m=+0.052506228 container exec e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, version=2.2.4, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, vcs-type=git, description=keepalived for Ceph, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., distribution-scope=public)
Jan 20 09:41:40 np0005588919 podman[254664]: 2026-01-20 14:41:40.988208606 +0000 UTC m=+0.064831217 container exec_died e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, vcs-type=git, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, release=1793, distribution-scope=public, description=keepalived for Ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., name=keepalived)
Jan 20 09:41:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:41.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:41 np0005588919 nova_compute[225855]: 2026-01-20 14:41:41.991 225859 DEBUG nova.compute.manager [req-91a6eb4a-2834-4ef0-8f1b-ac1af5dd43f0 req-0ec7a7ad-eecc-4259-8aec-3b4ac2c938ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-changed-607e59a4-2a6b-424a-9413-be318079781e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:41:41 np0005588919 nova_compute[225855]: 2026-01-20 14:41:41.992 225859 DEBUG nova.compute.manager [req-91a6eb4a-2834-4ef0-8f1b-ac1af5dd43f0 req-0ec7a7ad-eecc-4259-8aec-3b4ac2c938ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Refreshing instance network info cache due to event network-changed-607e59a4-2a6b-424a-9413-be318079781e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:41:41 np0005588919 nova_compute[225855]: 2026-01-20 14:41:41.993 225859 DEBUG oslo_concurrency.lockutils [req-91a6eb4a-2834-4ef0-8f1b-ac1af5dd43f0 req-0ec7a7ad-eecc-4259-8aec-3b4ac2c938ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:41:41 np0005588919 nova_compute[225855]: 2026-01-20 14:41:41.993 225859 DEBUG oslo_concurrency.lockutils [req-91a6eb4a-2834-4ef0-8f1b-ac1af5dd43f0 req-0ec7a7ad-eecc-4259-8aec-3b4ac2c938ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:41:41 np0005588919 nova_compute[225855]: 2026-01-20 14:41:41.994 225859 DEBUG nova.network.neutron [req-91a6eb4a-2834-4ef0-8f1b-ac1af5dd43f0 req-0ec7a7ad-eecc-4259-8aec-3b4ac2c938ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Refreshing network info cache for port 607e59a4-2a6b-424a-9413-be318079781e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:41:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:42.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:41:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:41:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:43.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:43 np0005588919 nova_compute[225855]: 2026-01-20 14:41:43.277 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:44.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:45.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:45 np0005588919 nova_compute[225855]: 2026-01-20 14:41:45.264 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:46.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:47.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:47 np0005588919 nova_compute[225855]: 2026-01-20 14:41:47.449 225859 DEBUG nova.network.neutron [req-91a6eb4a-2834-4ef0-8f1b-ac1af5dd43f0 req-0ec7a7ad-eecc-4259-8aec-3b4ac2c938ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updated VIF entry in instance network info cache for port 607e59a4-2a6b-424a-9413-be318079781e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:41:47 np0005588919 nova_compute[225855]: 2026-01-20 14:41:47.450 225859 DEBUG nova.network.neutron [req-91a6eb4a-2834-4ef0-8f1b-ac1af5dd43f0 req-0ec7a7ad-eecc-4259-8aec-3b4ac2c938ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updating instance_info_cache with network_info: [{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:41:47 np0005588919 nova_compute[225855]: 2026-01-20 14:41:47.785 225859 DEBUG oslo_concurrency.lockutils [req-91a6eb4a-2834-4ef0-8f1b-ac1af5dd43f0 req-0ec7a7ad-eecc-4259-8aec-3b4ac2c938ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:41:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:48.001 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:41:48 np0005588919 nova_compute[225855]: 2026-01-20 14:41:48.002 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:48.003 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:41:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:48.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:48 np0005588919 nova_compute[225855]: 2026-01-20 14:41:48.279 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:41:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:49.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:41:50.005 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:41:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:50.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:50 np0005588919 nova_compute[225855]: 2026-01-20 14:41:50.266 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:51.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:41:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:52.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:41:53 np0005588919 podman[254935]: 2026-01-20 14:41:53.075797756 +0000 UTC m=+0.117522049 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:41:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:53.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:53 np0005588919 nova_compute[225855]: 2026-01-20 14:41:53.281 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:54.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:55.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:55 np0005588919 nova_compute[225855]: 2026-01-20 14:41:55.268 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:56.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:41:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:41:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:57.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:41:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:41:58.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:41:58 np0005588919 nova_compute[225855]: 2026-01-20 14:41:58.284 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:41:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:41:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:41:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:41:59.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:00.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:00 np0005588919 nova_compute[225855]: 2026-01-20 14:42:00.272 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:01.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:02 np0005588919 podman[254966]: 2026-01-20 14:42:02.037109411 +0000 UTC m=+0.074156451 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 09:42:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:02.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:03.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:03 np0005588919 nova_compute[225855]: 2026-01-20 14:42:03.286 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:04.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:05.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:05 np0005588919 nova_compute[225855]: 2026-01-20 14:42:05.274 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:06.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:07.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.729708) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127729759, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2175, "num_deletes": 251, "total_data_size": 5092944, "memory_usage": 5161184, "flush_reason": "Manual Compaction"}
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127757144, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 3307047, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36648, "largest_seqno": 38818, "table_properties": {"data_size": 3298227, "index_size": 5378, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18936, "raw_average_key_size": 20, "raw_value_size": 3280465, "raw_average_value_size": 3554, "num_data_blocks": 234, "num_entries": 923, "num_filter_entries": 923, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768919948, "oldest_key_time": 1768919948, "file_creation_time": 1768920127, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 27490 microseconds, and 7139 cpu microseconds.
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.757196) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 3307047 bytes OK
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.757218) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.760654) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.760670) EVENT_LOG_v1 {"time_micros": 1768920127760665, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.760691) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 5083216, prev total WAL file size 5083927, number of live WAL files 2.
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.762051) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(3229KB)], [69(8242KB)]
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127762122, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 11747401, "oldest_snapshot_seqno": -1}
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6460 keys, 9846164 bytes, temperature: kUnknown
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127840272, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 9846164, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9803177, "index_size": 25725, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16197, "raw_key_size": 164948, "raw_average_key_size": 25, "raw_value_size": 9687475, "raw_average_value_size": 1499, "num_data_blocks": 1030, "num_entries": 6460, "num_filter_entries": 6460, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768920127, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.840581) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 9846164 bytes
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.841777) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.2 rd, 125.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 8.0 +0.0 blob) out(9.4 +0.0 blob), read-write-amplify(6.5) write-amplify(3.0) OK, records in: 6979, records dropped: 519 output_compression: NoCompression
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.841800) EVENT_LOG_v1 {"time_micros": 1768920127841789, "job": 42, "event": "compaction_finished", "compaction_time_micros": 78217, "compaction_time_cpu_micros": 28555, "output_level": 6, "num_output_files": 1, "total_output_size": 9846164, "num_input_records": 6979, "num_output_records": 6460, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127842891, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920127844922, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.761833) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.844967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.844971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.844973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.844975) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:07 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:07.844977) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:42:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:08.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:42:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:08Z|00251|binding|INFO|Releasing lport 583df905-1d9f-49c1-b209-4b7fad1599f6 from this chassis (sb_readonly=0)
Jan 20 09:42:08 np0005588919 nova_compute[225855]: 2026-01-20 14:42:08.265 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:08 np0005588919 nova_compute[225855]: 2026-01-20 14:42:08.287 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:42:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:09.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:42:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:10.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:10 np0005588919 nova_compute[225855]: 2026-01-20 14:42:10.276 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:11.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:12.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:12 np0005588919 nova_compute[225855]: 2026-01-20 14:42:12.425 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "504acd93-cd55-496e-a85f-30e811f827d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:12 np0005588919 nova_compute[225855]: 2026-01-20 14:42:12.426 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:12 np0005588919 nova_compute[225855]: 2026-01-20 14:42:12.466 225859 DEBUG nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:42:12 np0005588919 nova_compute[225855]: 2026-01-20 14:42:12.606 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:12 np0005588919 nova_compute[225855]: 2026-01-20 14:42:12.607 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:12 np0005588919 nova_compute[225855]: 2026-01-20 14:42:12.616 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:42:12 np0005588919 nova_compute[225855]: 2026-01-20 14:42:12.616 225859 INFO nova.compute.claims [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:42:12 np0005588919 nova_compute[225855]: 2026-01-20 14:42:12.869 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:13.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.158929) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133159164, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 310, "num_deletes": 255, "total_data_size": 122037, "memory_usage": 128576, "flush_reason": "Manual Compaction"}
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133162424, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 79974, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38823, "largest_seqno": 39128, "table_properties": {"data_size": 78047, "index_size": 155, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4925, "raw_average_key_size": 17, "raw_value_size": 74148, "raw_average_value_size": 264, "num_data_blocks": 7, "num_entries": 280, "num_filter_entries": 280, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920127, "oldest_key_time": 1768920127, "file_creation_time": 1768920133, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 3648 microseconds, and 1252 cpu microseconds.
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.162587) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 79974 bytes OK
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.162665) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.164417) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.164432) EVENT_LOG_v1 {"time_micros": 1768920133164427, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.164453) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 119792, prev total WAL file size 119792, number of live WAL files 2.
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.165420) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303131' seq:72057594037927935, type:22 .. '6C6F676D0031323632' seq:0, type:0; will stop at (end)
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(78KB)], [72(9615KB)]
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133165581, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 9926138, "oldest_snapshot_seqno": -1}
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6222 keys, 9792709 bytes, temperature: kUnknown
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133249768, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 9792709, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9750793, "index_size": 25230, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15621, "raw_key_size": 160936, "raw_average_key_size": 25, "raw_value_size": 9638771, "raw_average_value_size": 1549, "num_data_blocks": 1005, "num_entries": 6222, "num_filter_entries": 6222, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768920133, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.250105) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 9792709 bytes
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.256947) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 117.8 rd, 116.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 9.4 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(246.6) write-amplify(122.4) OK, records in: 6740, records dropped: 518 output_compression: NoCompression
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.256998) EVENT_LOG_v1 {"time_micros": 1768920133256980, "job": 44, "event": "compaction_finished", "compaction_time_micros": 84294, "compaction_time_cpu_micros": 27824, "output_level": 6, "num_output_files": 1, "total_output_size": 9792709, "num_input_records": 6740, "num_output_records": 6222, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133257472, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920133259552, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.165342) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.259681) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.259689) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.259695) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.259699) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:42:13.259702) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.294 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2810974689' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.392 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.398 225859 DEBUG nova.compute.provider_tree [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.432 225859 DEBUG nova.scheduler.client.report [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.476 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.869s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.477 225859 DEBUG nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.534 225859 DEBUG nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.534 225859 DEBUG nova.network.neutron [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.568 225859 INFO nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.606 225859 DEBUG nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.726 225859 DEBUG nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.728 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.729 225859 INFO nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Creating image(s)#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.766 225859 DEBUG nova.storage.rbd_utils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 504acd93-cd55-496e-a85f-30e811f827d4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.804 225859 DEBUG nova.storage.rbd_utils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 504acd93-cd55-496e-a85f-30e811f827d4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.840 225859 DEBUG nova.storage.rbd_utils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 504acd93-cd55-496e-a85f-30e811f827d4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.845 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.885 225859 DEBUG nova.policy [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff99fc8eda0640928c6e82981dacb266', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4b95747114ab4043b93a260387199c91', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.932 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.933 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.934 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.934 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.964 225859 DEBUG nova.storage.rbd_utils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 504acd93-cd55-496e-a85f-30e811f827d4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:13 np0005588919 nova_compute[225855]: 2026-01-20 14:42:13.968 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c 504acd93-cd55-496e-a85f-30e811f827d4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:42:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:14.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:42:14 np0005588919 nova_compute[225855]: 2026-01-20 14:42:14.270 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c 504acd93-cd55-496e-a85f-30e811f827d4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.302s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:14 np0005588919 nova_compute[225855]: 2026-01-20 14:42:14.352 225859 DEBUG nova.storage.rbd_utils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] resizing rbd image 504acd93-cd55-496e-a85f-30e811f827d4_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:42:14 np0005588919 nova_compute[225855]: 2026-01-20 14:42:14.464 225859 DEBUG nova.objects.instance [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lazy-loading 'migration_context' on Instance uuid 504acd93-cd55-496e-a85f-30e811f827d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:14 np0005588919 nova_compute[225855]: 2026-01-20 14:42:14.508 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:42:14 np0005588919 nova_compute[225855]: 2026-01-20 14:42:14.508 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Ensure instance console log exists: /var/lib/nova/instances/504acd93-cd55-496e-a85f-30e811f827d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:42:14 np0005588919 nova_compute[225855]: 2026-01-20 14:42:14.509 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:14 np0005588919 nova_compute[225855]: 2026-01-20 14:42:14.509 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:14 np0005588919 nova_compute[225855]: 2026-01-20 14:42:14.509 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:15 np0005588919 nova_compute[225855]: 2026-01-20 14:42:15.081 225859 DEBUG nova.network.neutron [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Successfully created port: 349b1d10-0b06-4025-80fd-4861bd487a43 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:42:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:15.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:15 np0005588919 nova_compute[225855]: 2026-01-20 14:42:15.278 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:16.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:16.403 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:16.403 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:16.404 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:17.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:17 np0005588919 nova_compute[225855]: 2026-01-20 14:42:17.365 225859 DEBUG nova.network.neutron [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Successfully updated port: 349b1d10-0b06-4025-80fd-4861bd487a43 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:42:17 np0005588919 nova_compute[225855]: 2026-01-20 14:42:17.410 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "refresh_cache-504acd93-cd55-496e-a85f-30e811f827d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:42:17 np0005588919 nova_compute[225855]: 2026-01-20 14:42:17.410 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquired lock "refresh_cache-504acd93-cd55-496e-a85f-30e811f827d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:17 np0005588919 nova_compute[225855]: 2026-01-20 14:42:17.411 225859 DEBUG nova.network.neutron [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:42:17 np0005588919 nova_compute[225855]: 2026-01-20 14:42:17.716 225859 DEBUG nova.network.neutron [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:42:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3967705155' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:18.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:18 np0005588919 nova_compute[225855]: 2026-01-20 14:42:18.297 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:19.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:19 np0005588919 nova_compute[225855]: 2026-01-20 14:42:19.578 225859 DEBUG nova.compute.manager [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-changed-607e59a4-2a6b-424a-9413-be318079781e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:19 np0005588919 nova_compute[225855]: 2026-01-20 14:42:19.579 225859 DEBUG nova.compute.manager [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Refreshing instance network info cache due to event network-changed-607e59a4-2a6b-424a-9413-be318079781e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:42:19 np0005588919 nova_compute[225855]: 2026-01-20 14:42:19.579 225859 DEBUG oslo_concurrency.lockutils [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:42:19 np0005588919 nova_compute[225855]: 2026-01-20 14:42:19.580 225859 DEBUG oslo_concurrency.lockutils [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:19 np0005588919 nova_compute[225855]: 2026-01-20 14:42:19.580 225859 DEBUG nova.network.neutron [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Refreshing network info cache for port 607e59a4-2a6b-424a-9413-be318079781e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:42:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:20.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:20 np0005588919 nova_compute[225855]: 2026-01-20 14:42:20.210 225859 DEBUG oslo_concurrency.lockutils [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "interface-10349dde-fb60-48ba-bc7b-42180c5eb49e-e2648ead-7162-4661-94e1-755faa8f1fd1" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:20 np0005588919 nova_compute[225855]: 2026-01-20 14:42:20.211 225859 DEBUG oslo_concurrency.lockutils [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-10349dde-fb60-48ba-bc7b-42180c5eb49e-e2648ead-7162-4661-94e1-755faa8f1fd1" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:20 np0005588919 nova_compute[225855]: 2026-01-20 14:42:20.212 225859 DEBUG nova.objects.instance [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'flavor' on Instance uuid 10349dde-fb60-48ba-bc7b-42180c5eb49e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:20 np0005588919 nova_compute[225855]: 2026-01-20 14:42:20.280 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:21.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.481 225859 DEBUG nova.network.neutron [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Updating instance_info_cache with network_info: [{"id": "349b1d10-0b06-4025-80fd-4861bd487a43", "address": "fa:16:3e:dd:f6:26", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap349b1d10-0b", "ovs_interfaceid": "349b1d10-0b06-4025-80fd-4861bd487a43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.533 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Releasing lock "refresh_cache-504acd93-cd55-496e-a85f-30e811f827d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.533 225859 DEBUG nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Instance network_info: |[{"id": "349b1d10-0b06-4025-80fd-4861bd487a43", "address": "fa:16:3e:dd:f6:26", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap349b1d10-0b", "ovs_interfaceid": "349b1d10-0b06-4025-80fd-4861bd487a43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.535 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Start _get_guest_xml network_info=[{"id": "349b1d10-0b06-4025-80fd-4861bd487a43", "address": "fa:16:3e:dd:f6:26", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap349b1d10-0b", "ovs_interfaceid": "349b1d10-0b06-4025-80fd-4861bd487a43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '26699514-f465-4b50-98b7-36f2cfc6a308'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.539 225859 WARNING nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.545 225859 DEBUG nova.virt.libvirt.host [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.545 225859 DEBUG nova.virt.libvirt.host [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.550 225859 DEBUG nova.virt.libvirt.host [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.551 225859 DEBUG nova.virt.libvirt.host [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.552 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.553 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.553 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.553 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.554 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.554 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.555 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.555 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.555 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.556 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.556 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.556 225859 DEBUG nova.virt.hardware [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.560 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.616 225859 DEBUG nova.objects.instance [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'pci_requests' on Instance uuid 10349dde-fb60-48ba-bc7b-42180c5eb49e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:21 np0005588919 nova_compute[225855]: 2026-01-20 14:42:21.668 225859 DEBUG nova.network.neutron [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:42:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:42:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3110567127' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:42:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:22.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:22 np0005588919 nova_compute[225855]: 2026-01-20 14:42:22.296 225859 DEBUG nova.policy [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a9fb458d27434495a77a94827b6097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3f93fd4b2154dda9f38e62334904303', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:42:22 np0005588919 nova_compute[225855]: 2026-01-20 14:42:22.656 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:22 np0005588919 nova_compute[225855]: 2026-01-20 14:42:22.695 225859 DEBUG nova.storage.rbd_utils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 504acd93-cd55-496e-a85f-30e811f827d4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:22 np0005588919 nova_compute[225855]: 2026-01-20 14:42:22.700 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:23.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:42:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2974363094' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.195 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.199 225859 DEBUG nova.virt.libvirt.vif [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:42:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1822690739',display_name='tempest-ListServerFiltersTestJSON-instance-1822690739',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1822690739',id=75,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b95747114ab4043b93a260387199c91',ramdisk_id='',reservation_id='r-c4vxbkd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-2126845308',owner_user_name='tempest-ListServerFiltersTestJSON-2126845308-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:13Z,user_data=None,user_id='ff99fc8eda0640928c6e82981dacb266',uuid=504acd93-cd55-496e-a85f-30e811f827d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "349b1d10-0b06-4025-80fd-4861bd487a43", "address": "fa:16:3e:dd:f6:26", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap349b1d10-0b", "ovs_interfaceid": "349b1d10-0b06-4025-80fd-4861bd487a43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.200 225859 DEBUG nova.network.os_vif_util [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converting VIF {"id": "349b1d10-0b06-4025-80fd-4861bd487a43", "address": "fa:16:3e:dd:f6:26", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap349b1d10-0b", "ovs_interfaceid": "349b1d10-0b06-4025-80fd-4861bd487a43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.202 225859 DEBUG nova.network.os_vif_util [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:f6:26,bridge_name='br-int',has_traffic_filtering=True,id=349b1d10-0b06-4025-80fd-4861bd487a43,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap349b1d10-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.205 225859 DEBUG nova.objects.instance [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lazy-loading 'pci_devices' on Instance uuid 504acd93-cd55-496e-a85f-30e811f827d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.225 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <uuid>504acd93-cd55-496e-a85f-30e811f827d4</uuid>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <name>instance-0000004b</name>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1822690739</nova:name>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:42:21</nova:creationTime>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <nova:user uuid="ff99fc8eda0640928c6e82981dacb266">tempest-ListServerFiltersTestJSON-2126845308-project-member</nova:user>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <nova:project uuid="4b95747114ab4043b93a260387199c91">tempest-ListServerFiltersTestJSON-2126845308</nova:project>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="26699514-f465-4b50-98b7-36f2cfc6a308"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <nova:port uuid="349b1d10-0b06-4025-80fd-4861bd487a43">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <entry name="serial">504acd93-cd55-496e-a85f-30e811f827d4</entry>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <entry name="uuid">504acd93-cd55-496e-a85f-30e811f827d4</entry>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/504acd93-cd55-496e-a85f-30e811f827d4_disk">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/504acd93-cd55-496e-a85f-30e811f827d4_disk.config">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:dd:f6:26"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <target dev="tap349b1d10-0b"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/504acd93-cd55-496e-a85f-30e811f827d4/console.log" append="off"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:42:23 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:42:23 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:42:23 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:42:23 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.226 225859 DEBUG nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Preparing to wait for external event network-vif-plugged-349b1d10-0b06-4025-80fd-4861bd487a43 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.227 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "504acd93-cd55-496e-a85f-30e811f827d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.227 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.228 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.228 225859 DEBUG nova.virt.libvirt.vif [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:42:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1822690739',display_name='tempest-ListServerFiltersTestJSON-instance-1822690739',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1822690739',id=75,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b95747114ab4043b93a260387199c91',ramdisk_id='',reservation_id='r-c4vxbkd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-2126845308',owner_user_name='tempest-ListServerFiltersTestJSON-2126845308-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:13Z,user_data=None,user_id='ff99fc8eda0640928c6e82981dacb266',uuid=504acd93-cd55-496e-a85f-30e811f827d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "349b1d10-0b06-4025-80fd-4861bd487a43", "address": "fa:16:3e:dd:f6:26", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap349b1d10-0b", "ovs_interfaceid": "349b1d10-0b06-4025-80fd-4861bd487a43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.229 225859 DEBUG nova.network.os_vif_util [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converting VIF {"id": "349b1d10-0b06-4025-80fd-4861bd487a43", "address": "fa:16:3e:dd:f6:26", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap349b1d10-0b", "ovs_interfaceid": "349b1d10-0b06-4025-80fd-4861bd487a43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.230 225859 DEBUG nova.network.os_vif_util [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:f6:26,bridge_name='br-int',has_traffic_filtering=True,id=349b1d10-0b06-4025-80fd-4861bd487a43,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap349b1d10-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.230 225859 DEBUG os_vif [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:f6:26,bridge_name='br-int',has_traffic_filtering=True,id=349b1d10-0b06-4025-80fd-4861bd487a43,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap349b1d10-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.231 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.231 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.232 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.236 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.236 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap349b1d10-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.237 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap349b1d10-0b, col_values=(('external_ids', {'iface-id': '349b1d10-0b06-4025-80fd-4861bd487a43', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:f6:26', 'vm-uuid': '504acd93-cd55-496e-a85f-30e811f827d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.266 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:23 np0005588919 NetworkManager[49104]: <info>  [1768920143.2686] manager: (tap349b1d10-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.268 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.277 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.278 225859 INFO os_vif [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:f6:26,bridge_name='br-int',has_traffic_filtering=True,id=349b1d10-0b06-4025-80fd-4861bd487a43,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap349b1d10-0b')#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.351 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.352 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.352 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] No VIF found with MAC fa:16:3e:dd:f6:26, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.353 225859 INFO nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Using config drive#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.389 225859 DEBUG nova.storage.rbd_utils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 504acd93-cd55-496e-a85f-30e811f827d4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.394 225859 DEBUG nova.network.neutron [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Successfully updated port: e2648ead-7162-4661-94e1-755faa8f1fd1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.430 225859 DEBUG oslo_concurrency.lockutils [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.583 225859 DEBUG nova.compute.manager [req-928ae490-427b-4f76-924a-e4181bf9d70a req-c68095fc-99a6-4de7-939c-cba34d8736e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-changed-e2648ead-7162-4661-94e1-755faa8f1fd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.584 225859 DEBUG nova.compute.manager [req-928ae490-427b-4f76-924a-e4181bf9d70a req-c68095fc-99a6-4de7-939c-cba34d8736e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Refreshing instance network info cache due to event network-changed-e2648ead-7162-4661-94e1-755faa8f1fd1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.584 225859 DEBUG oslo_concurrency.lockutils [req-928ae490-427b-4f76-924a-e4181bf9d70a req-c68095fc-99a6-4de7-939c-cba34d8736e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.686 225859 DEBUG nova.network.neutron [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updated VIF entry in instance network info cache for port 607e59a4-2a6b-424a-9413-be318079781e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.686 225859 DEBUG nova.network.neutron [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updating instance_info_cache with network_info: [{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.722 225859 DEBUG oslo_concurrency.lockutils [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.723 225859 DEBUG nova.compute.manager [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Received event network-changed-349b1d10-0b06-4025-80fd-4861bd487a43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.723 225859 DEBUG nova.compute.manager [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Refreshing instance network info cache due to event network-changed-349b1d10-0b06-4025-80fd-4861bd487a43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.724 225859 DEBUG oslo_concurrency.lockutils [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-504acd93-cd55-496e-a85f-30e811f827d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.724 225859 DEBUG oslo_concurrency.lockutils [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-504acd93-cd55-496e-a85f-30e811f827d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.724 225859 DEBUG nova.network.neutron [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Refreshing network info cache for port 349b1d10-0b06-4025-80fd-4861bd487a43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.726 225859 DEBUG oslo_concurrency.lockutils [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:23 np0005588919 nova_compute[225855]: 2026-01-20 14:42:23.726 225859 DEBUG nova.network.neutron [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:42:24 np0005588919 podman[255319]: 2026-01-20 14:42:24.0927341 +0000 UTC m=+0.127362299 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 09:42:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:24.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.458 225859 INFO nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Creating config drive at /var/lib/nova/instances/504acd93-cd55-496e-a85f-30e811f827d4/disk.config#033[00m
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.471 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/504acd93-cd55-496e-a85f-30e811f827d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpewld6c31 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.561 225859 WARNING nova.network.neutron [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] fc21b99b-4e34-422c-be05-0a440009dac4 already exists in list: networks containing: ['fc21b99b-4e34-422c-be05-0a440009dac4']. ignoring it#033[00m
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.615 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/504acd93-cd55-496e-a85f-30e811f827d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpewld6c31" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.637 225859 DEBUG nova.storage.rbd_utils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] rbd image 504acd93-cd55-496e-a85f-30e811f827d4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.640 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/504acd93-cd55-496e-a85f-30e811f827d4/disk.config 504acd93-cd55-496e-a85f-30e811f827d4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.822 225859 DEBUG oslo_concurrency.processutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/504acd93-cd55-496e-a85f-30e811f827d4/disk.config 504acd93-cd55-496e-a85f-30e811f827d4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.823 225859 INFO nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Deleting local config drive /var/lib/nova/instances/504acd93-cd55-496e-a85f-30e811f827d4/disk.config because it was imported into RBD.#033[00m
Jan 20 09:42:24 np0005588919 NetworkManager[49104]: <info>  [1768920144.9144] manager: (tap349b1d10-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Jan 20 09:42:24 np0005588919 kernel: tap349b1d10-0b: entered promiscuous mode
Jan 20 09:42:24 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:24Z|00252|binding|INFO|Claiming lport 349b1d10-0b06-4025-80fd-4861bd487a43 for this chassis.
Jan 20 09:42:24 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:24Z|00253|binding|INFO|349b1d10-0b06-4025-80fd-4861bd487a43: Claiming fa:16:3e:dd:f6:26 10.100.0.12
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.919 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:24.942 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:f6:26 10.100.0.12'], port_security=['fa:16:3e:dd:f6:26 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '504acd93-cd55-496e-a85f-30e811f827d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b95747114ab4043b93a260387199c91', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f18b0222-78a5-4c37-8065-772dbe5c63e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80e2aa5b-ecb8-4e93-992f-baaef718dd34, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=349b1d10-0b06-4025-80fd-4861bd487a43) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:24.944 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 349b1d10-0b06-4025-80fd-4861bd487a43 in datapath b36e9cab-12c6-4a09-9aab-ef2679d875ba bound to our chassis#033[00m
Jan 20 09:42:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:24.947 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b36e9cab-12c6-4a09-9aab-ef2679d875ba#033[00m
Jan 20 09:42:24 np0005588919 systemd-udevd[255397]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:42:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:24.962 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[44cfa2d4-bfce-49f9-8cd0-207943d2c51d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:24 np0005588919 systemd-machined[194361]: New machine qemu-32-instance-0000004b.
Jan 20 09:42:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:24.963 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb36e9cab-11 in ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:42:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:24.965 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb36e9cab-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:42:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:24.965 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[41710b3a-3124-40df-abbb-8b9317aba798]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:24.967 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b82929e9-21b0-4eb2-bbba-2607c0392987]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:24 np0005588919 NetworkManager[49104]: <info>  [1768920144.9709] device (tap349b1d10-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:42:24 np0005588919 NetworkManager[49104]: <info>  [1768920144.9727] device (tap349b1d10-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:42:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:24.981 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3409dc-2daf-4b41-812c-484910523533]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:24 np0005588919 systemd[1]: Started Virtual Machine qemu-32-instance-0000004b.
Jan 20 09:42:24 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:24Z|00254|binding|INFO|Setting lport 349b1d10-0b06-4025-80fd-4861bd487a43 ovn-installed in OVS
Jan 20 09:42:24 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:24Z|00255|binding|INFO|Setting lport 349b1d10-0b06-4025-80fd-4861bd487a43 up in Southbound
Jan 20 09:42:24 np0005588919 nova_compute[225855]: 2026-01-20 14:42:24.997 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.012 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[95c358e1-4236-4f1e-a76a-8da666ef1d24]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.047 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[953adeab-517d-41a3-8f4a-3dcd5654624f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.052 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5eca8228-8af4-4843-a94a-aa224381f014]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 NetworkManager[49104]: <info>  [1768920145.0552] manager: (tapb36e9cab-10): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.089 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ed53ac-c29d-453c-b5a9-b0f8edc0269d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.092 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[72e7d6d7-31d5-48d8-b051-c05d105ae2b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 NetworkManager[49104]: <info>  [1768920145.1139] device (tapb36e9cab-10): carrier: link connected
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.119 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[7426154a-f993-427c-b1b5-57c2c90371fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.136 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[42f0fdc4-7273-48c1-9010-9444864ef779]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb36e9cab-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:c2:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518404, 'reachable_time': 39895, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255431, 'error': None, 'target': 'ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.150 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[68fdc8ed-6940-4f9e-96e3-f004dc7cfe2e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe08:c252'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 518404, 'tstamp': 518404}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255432, 'error': None, 'target': 'ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:25.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.165 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9bc7c049-5b7f-4f3f-80ad-8aa677265c65]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb36e9cab-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:c2:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518404, 'reachable_time': 39895, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255433, 'error': None, 'target': 'ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.196 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a30f05df-6ca1-41a9-8b4a-595983bfaf2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.244 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4166bb-5436-4523-8bc1-533ddeae11c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.245 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb36e9cab-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.245 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.246 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb36e9cab-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:25 np0005588919 kernel: tapb36e9cab-10: entered promiscuous mode
Jan 20 09:42:25 np0005588919 NetworkManager[49104]: <info>  [1768920145.2480] manager: (tapb36e9cab-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.247 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.249 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.251 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb36e9cab-10, col_values=(('external_ids', {'iface-id': '5dcae274-b8f4-440a-a3eb-5c1a5a044346'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:25 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:25Z|00256|binding|INFO|Releasing lport 5dcae274-b8f4-440a-a3eb-5c1a5a044346 from this chassis (sb_readonly=0)
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.252 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.253 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.253 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b36e9cab-12c6-4a09-9aab-ef2679d875ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b36e9cab-12c6-4a09-9aab-ef2679d875ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.254 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e4090cd3-7e84-4fad-aad4-3a961c2be2dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.255 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-b36e9cab-12c6-4a09-9aab-ef2679d875ba
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/b36e9cab-12c6-4a09-9aab-ef2679d875ba.pid.haproxy
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID b36e9cab-12c6-4a09-9aab-ef2679d875ba
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:42:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:25.255 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'env', 'PROCESS_TAG=haproxy-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b36e9cab-12c6-4a09-9aab-ef2679d875ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.264 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.281 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.587 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920145.5864213, 504acd93-cd55-496e-a85f-30e811f827d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.587 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] VM Started (Lifecycle Event)#033[00m
Jan 20 09:42:25 np0005588919 podman[255508]: 2026-01-20 14:42:25.622121924 +0000 UTC m=+0.046303762 container create a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.630 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.634 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920145.5874617, 504acd93-cd55-496e-a85f-30e811f827d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.635 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.661 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.663 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:42:25 np0005588919 systemd[1]: Started libpod-conmon-a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805.scope.
Jan 20 09:42:25 np0005588919 nova_compute[225855]: 2026-01-20 14:42:25.693 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:42:25 np0005588919 podman[255508]: 2026-01-20 14:42:25.59868322 +0000 UTC m=+0.022865078 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:42:25 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:42:25 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd3ad48162a1b4dec3cf75f42139906849c8a6a6a6b10f13149b76909d80e15f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:42:25 np0005588919 podman[255508]: 2026-01-20 14:42:25.711693951 +0000 UTC m=+0.135875819 container init a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:42:25 np0005588919 podman[255508]: 2026-01-20 14:42:25.716903169 +0000 UTC m=+0.141084997 container start a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:42:25 np0005588919 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[255521]: [NOTICE]   (255528) : New worker (255530) forked
Jan 20 09:42:25 np0005588919 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[255521]: [NOTICE]   (255528) : Loading success.
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.001 225859 DEBUG nova.compute.manager [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Received event network-vif-plugged-349b1d10-0b06-4025-80fd-4861bd487a43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.002 225859 DEBUG oslo_concurrency.lockutils [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "504acd93-cd55-496e-a85f-30e811f827d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.002 225859 DEBUG oslo_concurrency.lockutils [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.003 225859 DEBUG oslo_concurrency.lockutils [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.003 225859 DEBUG nova.compute.manager [req-8ef92bfb-02d7-498f-8926-44bb7d094b03 req-78a51f7d-ef54-4f32-9f98-3b92d1220ed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Processing event network-vif-plugged-349b1d10-0b06-4025-80fd-4861bd487a43 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.004 225859 DEBUG nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.007 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920146.0074406, 504acd93-cd55-496e-a85f-30e811f827d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.007 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.009 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.013 225859 INFO nova.virt.libvirt.driver [-] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Instance spawned successfully.#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.014 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.030 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.040 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.044 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.045 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.046 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.046 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.047 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.047 225859 DEBUG nova.virt.libvirt.driver [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.084 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.141 225859 INFO nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Took 12.41 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.141 225859 DEBUG nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:26.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.246 225859 INFO nova.compute.manager [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Took 13.68 seconds to build instance.#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.277 225859 DEBUG oslo_concurrency.lockutils [None req-1e7cd38a-601c-4a17-b056-dc991ac616b5 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.851s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:26 np0005588919 nova_compute[225855]: 2026-01-20 14:42:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:27.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:27 np0005588919 nova_compute[225855]: 2026-01-20 14:42:27.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:27 np0005588919 nova_compute[225855]: 2026-01-20 14:42:27.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:42:27 np0005588919 nova_compute[225855]: 2026-01-20 14:42:27.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:42:27 np0005588919 nova_compute[225855]: 2026-01-20 14:42:27.751 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:42:28 np0005588919 nova_compute[225855]: 2026-01-20 14:42:28.021 225859 DEBUG nova.network.neutron [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Updated VIF entry in instance network info cache for port 349b1d10-0b06-4025-80fd-4861bd487a43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:42:28 np0005588919 nova_compute[225855]: 2026-01-20 14:42:28.021 225859 DEBUG nova.network.neutron [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Updating instance_info_cache with network_info: [{"id": "349b1d10-0b06-4025-80fd-4861bd487a43", "address": "fa:16:3e:dd:f6:26", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap349b1d10-0b", "ovs_interfaceid": "349b1d10-0b06-4025-80fd-4861bd487a43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:28 np0005588919 nova_compute[225855]: 2026-01-20 14:42:28.041 225859 DEBUG oslo_concurrency.lockutils [req-c34e0831-1ec5-43e2-b737-614d371dffee req-b9091455-90df-4eab-a890-3071ac3f3f9a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-504acd93-cd55-496e-a85f-30e811f827d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:28.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:28 np0005588919 nova_compute[225855]: 2026-01-20 14:42:28.317 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:29 np0005588919 nova_compute[225855]: 2026-01-20 14:42:29.154 225859 DEBUG nova.compute.manager [req-f86e1fb5-998e-4941-9910-6e1e0daf8400 req-2ef13c09-587b-49be-b889-adc74128c5e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Received event network-vif-plugged-349b1d10-0b06-4025-80fd-4861bd487a43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:29 np0005588919 nova_compute[225855]: 2026-01-20 14:42:29.154 225859 DEBUG oslo_concurrency.lockutils [req-f86e1fb5-998e-4941-9910-6e1e0daf8400 req-2ef13c09-587b-49be-b889-adc74128c5e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "504acd93-cd55-496e-a85f-30e811f827d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:29 np0005588919 nova_compute[225855]: 2026-01-20 14:42:29.154 225859 DEBUG oslo_concurrency.lockutils [req-f86e1fb5-998e-4941-9910-6e1e0daf8400 req-2ef13c09-587b-49be-b889-adc74128c5e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:29 np0005588919 nova_compute[225855]: 2026-01-20 14:42:29.155 225859 DEBUG oslo_concurrency.lockutils [req-f86e1fb5-998e-4941-9910-6e1e0daf8400 req-2ef13c09-587b-49be-b889-adc74128c5e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:29 np0005588919 nova_compute[225855]: 2026-01-20 14:42:29.155 225859 DEBUG nova.compute.manager [req-f86e1fb5-998e-4941-9910-6e1e0daf8400 req-2ef13c09-587b-49be-b889-adc74128c5e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] No waiting events found dispatching network-vif-plugged-349b1d10-0b06-4025-80fd-4861bd487a43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:29 np0005588919 nova_compute[225855]: 2026-01-20 14:42:29.155 225859 WARNING nova.compute.manager [req-f86e1fb5-998e-4941-9910-6e1e0daf8400 req-2ef13c09-587b-49be-b889-adc74128c5e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Received unexpected event network-vif-plugged-349b1d10-0b06-4025-80fd-4861bd487a43 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:42:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:29.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.004 225859 DEBUG nova.network.neutron [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updating instance_info_cache with network_info: [{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.033 225859 DEBUG oslo_concurrency.lockutils [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.035 225859 DEBUG oslo_concurrency.lockutils [req-928ae490-427b-4f76-924a-e4181bf9d70a req-c68095fc-99a6-4de7-939c-cba34d8736e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.036 225859 DEBUG nova.network.neutron [req-928ae490-427b-4f76-924a-e4181bf9d70a req-c68095fc-99a6-4de7-939c-cba34d8736e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Refreshing network info cache for port e2648ead-7162-4661-94e1-755faa8f1fd1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.040 225859 DEBUG nova.virt.libvirt.vif [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-689127716',display_name='tempest-tempest.common.compute-instance-689127716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-689127716',id=71,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDk9SkW5N7MhrGaZslG18EJ7xoBof9PQa4upjUw+XxfbO5rNOjJYMJtJMRGPfgbl1pwAZZD7LHjNNMRFKVo+T+C8Rnr+HXWsPYQmvPGwjjZ++NXvRdqES1LIbRDiwaFMJQ==',key_name='tempest-keypair-1970360297',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-2p3ovedc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=10349dde-fb60-48ba-bc7b-42180c5eb49e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.041 225859 DEBUG nova.network.os_vif_util [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.042 225859 DEBUG nova.network.os_vif_util [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.043 225859 DEBUG os_vif [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.044 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.045 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.046 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.051 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.051 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape2648ead-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.053 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape2648ead-71, col_values=(('external_ids', {'iface-id': 'e2648ead-7162-4661-94e1-755faa8f1fd1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:35:48', 'vm-uuid': '10349dde-fb60-48ba-bc7b-42180c5eb49e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:30 np0005588919 NetworkManager[49104]: <info>  [1768920150.0583] manager: (tape2648ead-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.064 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.069 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.070 225859 INFO os_vif [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71')#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.071 225859 DEBUG nova.virt.libvirt.vif [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-689127716',display_name='tempest-tempest.common.compute-instance-689127716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-689127716',id=71,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDk9SkW5N7MhrGaZslG18EJ7xoBof9PQa4upjUw+XxfbO5rNOjJYMJtJMRGPfgbl1pwAZZD7LHjNNMRFKVo+T+C8Rnr+HXWsPYQmvPGwjjZ++NXvRdqES1LIbRDiwaFMJQ==',key_name='tempest-keypair-1970360297',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-2p3ovedc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=10349dde-fb60-48ba-bc7b-42180c5eb49e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.072 225859 DEBUG nova.network.os_vif_util [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.073 225859 DEBUG nova.network.os_vif_util [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.076 225859 DEBUG nova.virt.libvirt.guest [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] attach device xml: <interface type="ethernet">
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:80:35:48"/>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <target dev="tape2648ead-71"/>
Jan 20 09:42:30 np0005588919 nova_compute[225855]: </interface>
Jan 20 09:42:30 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:42:30 np0005588919 kernel: tape2648ead-71: entered promiscuous mode
Jan 20 09:42:30 np0005588919 NetworkManager[49104]: <info>  [1768920150.0917] manager: (tape2648ead-71): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Jan 20 09:42:30 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:30Z|00257|binding|INFO|Claiming lport e2648ead-7162-4661-94e1-755faa8f1fd1 for this chassis.
Jan 20 09:42:30 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:30Z|00258|binding|INFO|e2648ead-7162-4661-94e1-755faa8f1fd1: Claiming fa:16:3e:80:35:48 10.100.0.6
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.097 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.112 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:35:48 10.100.0.6'], port_security=['fa:16:3e:80:35:48 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-381806613', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '10349dde-fb60-48ba-bc7b-42180c5eb49e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-381806613', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '7', 'neutron:security_group_ids': '52cb9fb4-4318-4f53-9b5a-002d95792517', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=e2648ead-7162-4661-94e1-755faa8f1fd1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.115 140354 INFO neutron.agent.ovn.metadata.agent [-] Port e2648ead-7162-4661-94e1-755faa8f1fd1 in datapath fc21b99b-4e34-422c-be05-0a440009dac4 bound to our chassis#033[00m
Jan 20 09:42:30 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:30Z|00259|binding|INFO|Setting lport e2648ead-7162-4661-94e1-755faa8f1fd1 ovn-installed in OVS
Jan 20 09:42:30 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:30Z|00260|binding|INFO|Setting lport e2648ead-7162-4661-94e1-755faa8f1fd1 up in Southbound
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.117 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.120 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.123 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:30 np0005588919 systemd-udevd[255597]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:42:30 np0005588919 NetworkManager[49104]: <info>  [1768920150.1435] device (tape2648ead-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:42:30 np0005588919 NetworkManager[49104]: <info>  [1768920150.1440] device (tape2648ead-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.134 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6255aac0-d890-42e8-b904-c1eafb0847ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:30.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.175 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[926dc833-8f57-4f9f-a219-dbf1f772291d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.180 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[22b3c6c9-a675-445b-8b75-64d333231303]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.214 225859 DEBUG nova.virt.libvirt.driver [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.214 225859 DEBUG nova.virt.libvirt.driver [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.214 225859 DEBUG nova.virt.libvirt.driver [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:a3:13:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.215 225859 DEBUG nova.virt.libvirt.driver [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] No VIF found with MAC fa:16:3e:80:35:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.217 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[44046935-565c-4277-a31f-d27d3dfa574e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.243 225859 DEBUG nova.virt.libvirt.guest [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <nova:name>tempest-tempest.common.compute-instance-689127716</nova:name>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:42:30</nova:creationTime>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    <nova:port uuid="607e59a4-2a6b-424a-9413-be318079781e">
Jan 20 09:42:30 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    <nova:port uuid="e2648ead-7162-4661-94e1-755faa8f1fd1">
Jan 20 09:42:30 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:42:30 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:42:30 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:42:30 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.244 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4d8cd11d-33a8-41d6-b964-e3461f97c96c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511911, 'reachable_time': 34363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255605, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.263 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[47e5bd7c-66f6-4c8c-81e6-0bdac684dccd]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511920, 'tstamp': 511920}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255606, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511922, 'tstamp': 511922}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255606, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.274 225859 DEBUG oslo_concurrency.lockutils [None req-812cb0b8-b728-4175-8739-e1b0cc33e188 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-10349dde-fb60-48ba-bc7b-42180c5eb49e-e2648ead-7162-4661-94e1-755faa8f1fd1" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 10.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.276 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.278 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.280 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.280 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.280 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:30.281 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.282 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.283 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.898 225859 DEBUG nova.compute.manager [req-5f88eef7-6460-4dc5-bd7b-46d5ecf31bb2 req-d2fb87aa-e2cb-49e6-9ace-5f878a893196 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-vif-plugged-e2648ead-7162-4661-94e1-755faa8f1fd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.899 225859 DEBUG oslo_concurrency.lockutils [req-5f88eef7-6460-4dc5-bd7b-46d5ecf31bb2 req-d2fb87aa-e2cb-49e6-9ace-5f878a893196 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.899 225859 DEBUG oslo_concurrency.lockutils [req-5f88eef7-6460-4dc5-bd7b-46d5ecf31bb2 req-d2fb87aa-e2cb-49e6-9ace-5f878a893196 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.899 225859 DEBUG oslo_concurrency.lockutils [req-5f88eef7-6460-4dc5-bd7b-46d5ecf31bb2 req-d2fb87aa-e2cb-49e6-9ace-5f878a893196 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.899 225859 DEBUG nova.compute.manager [req-5f88eef7-6460-4dc5-bd7b-46d5ecf31bb2 req-d2fb87aa-e2cb-49e6-9ace-5f878a893196 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] No waiting events found dispatching network-vif-plugged-e2648ead-7162-4661-94e1-755faa8f1fd1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:30 np0005588919 nova_compute[225855]: 2026-01-20 14:42:30.900 225859 WARNING nova.compute.manager [req-5f88eef7-6460-4dc5-bd7b-46d5ecf31bb2 req-d2fb87aa-e2cb-49e6-9ace-5f878a893196 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received unexpected event network-vif-plugged-e2648ead-7162-4661-94e1-755faa8f1fd1 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.023 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:42:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:31.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:42:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.715 225859 DEBUG oslo_concurrency.lockutils [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "interface-10349dde-fb60-48ba-bc7b-42180c5eb49e-e2648ead-7162-4661-94e1-755faa8f1fd1" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.716 225859 DEBUG oslo_concurrency.lockutils [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-10349dde-fb60-48ba-bc7b-42180c5eb49e-e2648ead-7162-4661-94e1-755faa8f1fd1" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.747 225859 DEBUG nova.objects.instance [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'flavor' on Instance uuid 10349dde-fb60-48ba-bc7b-42180c5eb49e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.820 225859 DEBUG nova.virt.libvirt.vif [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-689127716',display_name='tempest-tempest.common.compute-instance-689127716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-689127716',id=71,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDk9SkW5N7MhrGaZslG18EJ7xoBof9PQa4upjUw+XxfbO5rNOjJYMJtJMRGPfgbl1pwAZZD7LHjNNMRFKVo+T+C8Rnr+HXWsPYQmvPGwjjZ++NXvRdqES1LIbRDiwaFMJQ==',key_name='tempest-keypair-1970360297',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-2p3ovedc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=10349dde-fb60-48ba-bc7b-42180c5eb49e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.821 225859 DEBUG nova.network.os_vif_util [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.821 225859 DEBUG nova.network.os_vif_util [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.824 225859 DEBUG nova.virt.libvirt.guest [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:80:35:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape2648ead-71"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.826 225859 DEBUG nova.virt.libvirt.guest [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:80:35:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape2648ead-71"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.828 225859 DEBUG nova.virt.libvirt.driver [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Attempting to detach device tape2648ead-71 from instance 10349dde-fb60-48ba-bc7b-42180c5eb49e from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.828 225859 DEBUG nova.virt.libvirt.guest [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] detach device xml: <interface type="ethernet">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:80:35:48"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <target dev="tape2648ead-71"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: </interface>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.833 225859 DEBUG nova.virt.libvirt.guest [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:80:35:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape2648ead-71"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.836 225859 DEBUG nova.virt.libvirt.guest [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:80:35:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape2648ead-71"/></interface>not found in domain: <domain type='kvm' id='31'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <name>instance-00000047</name>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <uuid>10349dde-fb60-48ba-bc7b-42180c5eb49e</uuid>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:name>tempest-tempest.common.compute-instance-689127716</nova:name>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:42:30</nova:creationTime>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:port uuid="607e59a4-2a6b-424a-9413-be318079781e">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:port uuid="e2648ead-7162-4661-94e1-755faa8f1fd1">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <memory unit='KiB'>131072</memory>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <resource>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <partition>/machine</partition>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </resource>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <sysinfo type='smbios'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='serial'>10349dde-fb60-48ba-bc7b-42180c5eb49e</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='uuid'>10349dde-fb60-48ba-bc7b-42180c5eb49e</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <boot dev='hd'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <smbios mode='sysinfo'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <vmcoreinfo state='on'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <model fallback='forbid'>Nehalem</model>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <feature policy='require' name='x2apic'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <feature policy='require' name='hypervisor'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <feature policy='require' name='vme'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <clock offset='utc'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <timer name='hpet' present='no'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <on_reboot>restart</on_reboot>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <on_crash>destroy</on_crash>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <disk type='network' device='disk'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/10349dde-fb60-48ba-bc7b-42180c5eb49e_disk' index='2'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target dev='vda' bus='virtio'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='virtio-disk0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <disk type='network' device='cdrom'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/10349dde-fb60-48ba-bc7b-42180c5eb49e_disk.config' index='1'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target dev='sda' bus='sata'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <readonly/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='sata0-0-0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pcie.0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='1' port='0x10'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='2' port='0x11'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='3' port='0x12'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.3'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='4' port='0x13'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.4'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='5' port='0x14'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.5'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='6' port='0x15'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.6'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='7' port='0x16'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.7'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='8' port='0x17'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.8'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='9' port='0x18'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.9'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='10' port='0x19'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.10'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='11' port='0x1a'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.11'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='12' port='0x1b'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.12'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='13' port='0x1c'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.13'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='14' port='0x1d'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.14'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='15' port='0x1e'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.15'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='16' port='0x1f'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.16'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='17' port='0x20'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.17'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='18' port='0x21'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.18'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='19' port='0x22'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.19'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='20' port='0x23'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.20'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='21' port='0x24'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.21'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='22' port='0x25'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.22'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='23' port='0x26'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.23'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='24' port='0x27'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.24'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='25' port='0x28'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.25'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-pci-bridge'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.26'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='usb'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='sata' index='0'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='ide'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:a3:13:21'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target dev='tap607e59a4-2a'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='net0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:80:35:48'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target dev='tape2648ead-71'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='net1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <serial type='pty'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/console.log' append='off'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target type='isa-serial' port='0'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <model name='isa-serial'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      </target>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <console type='pty' tty='/dev/pts/0'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/console.log' append='off'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target type='serial' port='0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </console>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <input type='tablet' bus='usb'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='input0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <input type='mouse' bus='ps2'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='input1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <input type='keyboard' bus='ps2'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='input2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <listen type='address' address='::0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <audio id='1' type='none'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='video0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <watchdog model='itco' action='reset'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='watchdog0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </watchdog>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <memballoon model='virtio'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <stats period='10'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='balloon0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <rng model='virtio'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='rng0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <label>system_u:system_r:svirt_t:s0:c240,c925</label>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c240,c925</imagelabel>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <label>+107:+107</label>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <imagelabel>+107:+107</imagelabel>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.838 225859 INFO nova.virt.libvirt.driver [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully detached device tape2648ead-71 from instance 10349dde-fb60-48ba-bc7b-42180c5eb49e from the persistent domain config.
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.838 225859 DEBUG nova.virt.libvirt.driver [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] (1/8): Attempting to detach device tape2648ead-71 with device alias net1 from instance 10349dde-fb60-48ba-bc7b-42180c5eb49e from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.838 225859 DEBUG nova.virt.libvirt.guest [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] detach device xml: <interface type="ethernet">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:80:35:48"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <target dev="tape2648ead-71"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: </interface>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:42:31 np0005588919 kernel: tape2648ead-71 (unregistering): left promiscuous mode
Jan 20 09:42:31 np0005588919 NetworkManager[49104]: <info>  [1768920151.8968] device (tape2648ead-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.902 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768920151.9018826, 10349dde-fb60-48ba-bc7b-42180c5eb49e => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.904 225859 DEBUG nova.virt.libvirt.driver [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Start waiting for the detach event from libvirt for device tape2648ead-71 with device alias net1 for instance 10349dde-fb60-48ba-bc7b-42180c5eb49e _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.905 225859 DEBUG nova.virt.libvirt.guest [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:80:35:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape2648ead-71"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.906 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:31Z|00261|binding|INFO|Releasing lport e2648ead-7162-4661-94e1-755faa8f1fd1 from this chassis (sb_readonly=0)
Jan 20 09:42:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:31Z|00262|binding|INFO|Setting lport e2648ead-7162-4661-94e1-755faa8f1fd1 down in Southbound
Jan 20 09:42:31 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:31Z|00263|binding|INFO|Removing iface tape2648ead-71 ovn-installed in OVS
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.909 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.910 225859 DEBUG nova.virt.libvirt.guest [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:80:35:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape2648ead-71"/></interface>not found in domain: <domain type='kvm' id='31'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <name>instance-00000047</name>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <uuid>10349dde-fb60-48ba-bc7b-42180c5eb49e</uuid>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:name>tempest-tempest.common.compute-instance-689127716</nova:name>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:42:30</nova:creationTime>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:port uuid="607e59a4-2a6b-424a-9413-be318079781e">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:port uuid="e2648ead-7162-4661-94e1-755faa8f1fd1">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <memory unit='KiB'>131072</memory>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <vcpu placement='static'>1</vcpu>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <resource>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <partition>/machine</partition>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </resource>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <sysinfo type='smbios'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='manufacturer'>RDO</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='serial'>10349dde-fb60-48ba-bc7b-42180c5eb49e</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='uuid'>10349dde-fb60-48ba-bc7b-42180c5eb49e</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <entry name='family'>Virtual Machine</entry>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <boot dev='hd'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <smbios mode='sysinfo'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <vmcoreinfo state='on'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <model fallback='forbid'>Nehalem</model>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <feature policy='require' name='x2apic'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <feature policy='require' name='hypervisor'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <feature policy='require' name='vme'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <clock offset='utc'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <timer name='hpet' present='no'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <on_poweroff>destroy</on_poweroff>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <on_reboot>restart</on_reboot>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <on_crash>destroy</on_crash>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <disk type='network' device='disk'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/10349dde-fb60-48ba-bc7b-42180c5eb49e_disk' index='2'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target dev='vda' bus='virtio'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='virtio-disk0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <disk type='network' device='cdrom'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/10349dde-fb60-48ba-bc7b-42180c5eb49e_disk.config' index='1'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target dev='sda' bus='sata'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <readonly/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='sata0-0-0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pcie.0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='1' port='0x10'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='2' port='0x11'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='3' port='0x12'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.3'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='4' port='0x13'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.4'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='5' port='0x14'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.5'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='6' port='0x15'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.6'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='7' port='0x16'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.7'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='8' port='0x17'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.8'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='9' port='0x18'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.9'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='10' port='0x19'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.10'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='11' port='0x1a'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.11'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='12' port='0x1b'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.12'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='13' port='0x1c'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.13'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='14' port='0x1d'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.14'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='15' port='0x1e'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.15'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='16' port='0x1f'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.16'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='17' port='0x20'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.17'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='18' port='0x21'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.18'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='19' port='0x22'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.19'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='20' port='0x23'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.20'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='21' port='0x24'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.21'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='22' port='0x25'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.22'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='23' port='0x26'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.23'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='24' port='0x27'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.24'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target chassis='25' port='0x28'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.25'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model name='pcie-pci-bridge'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='pci.26'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='usb'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <controller type='sata' index='0'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='ide'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </controller>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:a3:13:21'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target dev='tap607e59a4-2a'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='net0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <serial type='pty'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/console.log' append='off'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target type='isa-serial' port='0'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:        <model name='isa-serial'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      </target>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <console type='pty' tty='/dev/pts/0'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e/console.log' append='off'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <target type='serial' port='0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </console>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <input type='tablet' bus='usb'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='input0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='usb' bus='0' port='1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <input type='mouse' bus='ps2'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='input1'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <input type='keyboard' bus='ps2'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='input2'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </input>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <listen type='address' address='::0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <audio id='1' type='none'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='video0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <watchdog model='itco' action='reset'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='watchdog0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </watchdog>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <memballoon model='virtio'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <stats period='10'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='balloon0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <rng model='virtio'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <backend model='random'>/dev/urandom</backend>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <alias name='rng0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <label>system_u:system_r:svirt_t:s0:c240,c925</label>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c240,c925</imagelabel>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <label>+107:+107</label>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <imagelabel>+107:+107</imagelabel>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.910 225859 INFO nova.virt.libvirt.driver [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully detached device tape2648ead-71 from instance 10349dde-fb60-48ba-bc7b-42180c5eb49e from the live domain config.#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.911 225859 DEBUG nova.virt.libvirt.vif [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-689127716',display_name='tempest-tempest.common.compute-instance-689127716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-689127716',id=71,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDk9SkW5N7MhrGaZslG18EJ7xoBof9PQa4upjUw+XxfbO5rNOjJYMJtJMRGPfgbl1pwAZZD7LHjNNMRFKVo+T+C8Rnr+HXWsPYQmvPGwjjZ++NXvRdqES1LIbRDiwaFMJQ==',key_name='tempest-keypair-1970360297',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-2p3ovedc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=10349dde-fb60-48ba-bc7b-42180c5eb49e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.911 225859 DEBUG nova.network.os_vif_util [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.911 225859 DEBUG nova.network.os_vif_util [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.912 225859 DEBUG os_vif [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.916 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.916 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape2648ead-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.917 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.920 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:31.919 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:35:48 10.100.0.6'], port_security=['fa:16:3e:80:35:48 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-381806613', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '10349dde-fb60-48ba-bc7b-42180c5eb49e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-381806613', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '9', 'neutron:security_group_ids': '52cb9fb4-4318-4f53-9b5a-002d95792517', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=e2648ead-7162-4661-94e1-755faa8f1fd1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:31.921 140354 INFO neutron.agent.ovn.metadata.agent [-] Port e2648ead-7162-4661-94e1-755faa8f1fd1 in datapath fc21b99b-4e34-422c-be05-0a440009dac4 unbound from our chassis#033[00m
Jan 20 09:42:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:31.922 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc21b99b-4e34-422c-be05-0a440009dac4#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.930 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.932 225859 INFO os_vif [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71')#033[00m
Jan 20 09:42:31 np0005588919 nova_compute[225855]: 2026-01-20 14:42:31.933 225859 DEBUG nova.virt.libvirt.guest [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:name>tempest-tempest.common.compute-instance-689127716</nova:name>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 14:42:31</nova:creationTime>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:user uuid="c8a9fb458d27434495a77a94827b6097">tempest-AttachInterfacesTestJSON-305746947-project-member</nova:user>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:project uuid="e3f93fd4b2154dda9f38e62334904303">tempest-AttachInterfacesTestJSON-305746947</nova:project>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    <nova:port uuid="607e59a4-2a6b-424a-9413-be318079781e">
Jan 20 09:42:31 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 09:42:31 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 09:42:31 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 09:42:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:31.944 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1238868d-31b4-4764-9555-fd2b18450735]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:31.974 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ad259a30-3f36-402b-a920-471b12d47ae8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:31.976 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe1bf07-fe76-4103-beda-07468e9659cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:32.002 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[dff4e8fe-277a-4920-8db0-21a56709d069]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:32.017 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[51a69b6a-e7f3-46a1-a52e-b8ac8997533e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc21b99b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:5b:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511911, 'reachable_time': 34363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255617, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:32.032 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d384e570-e3a1-4038-b6d6-5a2d1506620c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511920, 'tstamp': 511920}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255618, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfc21b99b-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511922, 'tstamp': 511922}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255618, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:32.033 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:32 np0005588919 nova_compute[225855]: 2026-01-20 14:42:32.035 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:32 np0005588919 nova_compute[225855]: 2026-01-20 14:42:32.036 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:32.036 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc21b99b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:32.036 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:32.037 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc21b99b-40, col_values=(('external_ids', {'iface-id': '583df905-1d9f-49c1-b209-4b7fad1599f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:32.037 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:32.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:33 np0005588919 podman[255619]: 2026-01-20 14:42:33.021806385 +0000 UTC m=+0.060126354 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.106 225859 DEBUG nova.compute.manager [req-0e2feea7-4ae8-409d-ac26-f1bf884f8daf req-f5ef4ac8-cbd1-4279-b4c0-f40aa6d8704e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-vif-plugged-e2648ead-7162-4661-94e1-755faa8f1fd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.106 225859 DEBUG oslo_concurrency.lockutils [req-0e2feea7-4ae8-409d-ac26-f1bf884f8daf req-f5ef4ac8-cbd1-4279-b4c0-f40aa6d8704e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.106 225859 DEBUG oslo_concurrency.lockutils [req-0e2feea7-4ae8-409d-ac26-f1bf884f8daf req-f5ef4ac8-cbd1-4279-b4c0-f40aa6d8704e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.107 225859 DEBUG oslo_concurrency.lockutils [req-0e2feea7-4ae8-409d-ac26-f1bf884f8daf req-f5ef4ac8-cbd1-4279-b4c0-f40aa6d8704e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.107 225859 DEBUG nova.compute.manager [req-0e2feea7-4ae8-409d-ac26-f1bf884f8daf req-f5ef4ac8-cbd1-4279-b4c0-f40aa6d8704e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] No waiting events found dispatching network-vif-plugged-e2648ead-7162-4661-94e1-755faa8f1fd1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.107 225859 WARNING nova.compute.manager [req-0e2feea7-4ae8-409d-ac26-f1bf884f8daf req-f5ef4ac8-cbd1-4279-b4c0-f40aa6d8704e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received unexpected event network-vif-plugged-e2648ead-7162-4661-94e1-755faa8f1fd1 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:42:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:33.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.554 225859 DEBUG nova.network.neutron [req-928ae490-427b-4f76-924a-e4181bf9d70a req-c68095fc-99a6-4de7-939c-cba34d8736e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updated VIF entry in instance network info cache for port e2648ead-7162-4661-94e1-755faa8f1fd1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.554 225859 DEBUG nova.network.neutron [req-928ae490-427b-4f76-924a-e4181bf9d70a req-c68095fc-99a6-4de7-939c-cba34d8736e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updating instance_info_cache with network_info: [{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.643 225859 DEBUG oslo_concurrency.lockutils [req-928ae490-427b-4f76-924a-e4181bf9d70a req-c68095fc-99a6-4de7-939c-cba34d8736e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.644 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.644 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:42:33 np0005588919 nova_compute[225855]: 2026-01-20 14:42:33.644 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 10349dde-fb60-48ba-bc7b-42180c5eb49e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:34.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:34 np0005588919 nova_compute[225855]: 2026-01-20 14:42:34.604 225859 DEBUG oslo_concurrency.lockutils [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:42:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:35.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.289 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.315 225859 DEBUG oslo_concurrency.lockutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "10349dde-fb60-48ba-bc7b-42180c5eb49e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.316 225859 DEBUG oslo_concurrency.lockutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.316 225859 DEBUG oslo_concurrency.lockutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.316 225859 DEBUG oslo_concurrency.lockutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.317 225859 DEBUG oslo_concurrency.lockutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.318 225859 INFO nova.compute.manager [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Terminating instance#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.319 225859 DEBUG nova.compute.manager [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:42:35 np0005588919 kernel: tap607e59a4-2a (unregistering): left promiscuous mode
Jan 20 09:42:35 np0005588919 NetworkManager[49104]: <info>  [1768920155.4351] device (tap607e59a4-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.450 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:35Z|00264|binding|INFO|Releasing lport 607e59a4-2a6b-424a-9413-be318079781e from this chassis (sb_readonly=0)
Jan 20 09:42:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:35Z|00265|binding|INFO|Setting lport 607e59a4-2a6b-424a-9413-be318079781e down in Southbound
Jan 20 09:42:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:35Z|00266|binding|INFO|Removing iface tap607e59a4-2a ovn-installed in OVS
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.453 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.467 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:13:21 10.100.0.11'], port_security=['fa:16:3e:a3:13:21 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '10349dde-fb60-48ba-bc7b-42180c5eb49e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc21b99b-4e34-422c-be05-0a440009dac4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3f93fd4b2154dda9f38e62334904303', 'neutron:revision_number': '4', 'neutron:security_group_ids': '52b08fd6-6aa8-4470-b89c-ece04e1c959e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.203'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7af6b6bc-3cbd-48be-9f10-23ec011e0426, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=607e59a4-2a6b-424a-9413-be318079781e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.468 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 607e59a4-2a6b-424a-9413-be318079781e in datapath fc21b99b-4e34-422c-be05-0a440009dac4 unbound from our chassis#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.471 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc21b99b-4e34-422c-be05-0a440009dac4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.472 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ab1c0c1e-1af2-45a9-8e3a-c9b190c50528]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.472 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 namespace which is not needed anymore#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.474 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588919 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000047.scope: Deactivated successfully.
Jan 20 09:42:35 np0005588919 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000047.scope: Consumed 15.809s CPU time.
Jan 20 09:42:35 np0005588919 systemd-machined[194361]: Machine qemu-31-instance-00000047 terminated.
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.555 225859 INFO nova.virt.libvirt.driver [-] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Instance destroyed successfully.#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.556 225859 DEBUG nova.objects.instance [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lazy-loading 'resources' on Instance uuid 10349dde-fb60-48ba-bc7b-42180c5eb49e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.577 225859 DEBUG nova.virt.libvirt.vif [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-689127716',display_name='tempest-tempest.common.compute-instance-689127716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-689127716',id=71,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDk9SkW5N7MhrGaZslG18EJ7xoBof9PQa4upjUw+XxfbO5rNOjJYMJtJMRGPfgbl1pwAZZD7LHjNNMRFKVo+T+C8Rnr+HXWsPYQmvPGwjjZ++NXvRdqES1LIbRDiwaFMJQ==',key_name='tempest-keypair-1970360297',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-2p3ovedc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=10349dde-fb60-48ba-bc7b-42180c5eb49e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.578 225859 DEBUG nova.network.os_vif_util [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.579 225859 DEBUG nova.network.os_vif_util [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a3:13:21,bridge_name='br-int',has_traffic_filtering=True,id=607e59a4-2a6b-424a-9413-be318079781e,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap607e59a4-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.579 225859 DEBUG os_vif [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:13:21,bridge_name='br-int',has_traffic_filtering=True,id=607e59a4-2a6b-424a-9413-be318079781e,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap607e59a4-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.582 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.583 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap607e59a4-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.584 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.585 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.587 225859 INFO os_vif [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:13:21,bridge_name='br-int',has_traffic_filtering=True,id=607e59a4-2a6b-424a-9413-be318079781e,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap607e59a4-2a')#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.588 225859 DEBUG nova.virt.libvirt.vif [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:41:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-689127716',display_name='tempest-tempest.common.compute-instance-689127716',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-689127716',id=71,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDk9SkW5N7MhrGaZslG18EJ7xoBof9PQa4upjUw+XxfbO5rNOjJYMJtJMRGPfgbl1pwAZZD7LHjNNMRFKVo+T+C8Rnr+HXWsPYQmvPGwjjZ++NXvRdqES1LIbRDiwaFMJQ==',key_name='tempest-keypair-1970360297',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:41:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3f93fd4b2154dda9f38e62334904303',ramdisk_id='',reservation_id='r-2p3ovedc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-305746947',owner_user_name='tempest-AttachInterfacesTestJSON-305746947-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:41:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c8a9fb458d27434495a77a94827b6097',uuid=10349dde-fb60-48ba-bc7b-42180c5eb49e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.588 225859 DEBUG nova.network.os_vif_util [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converting VIF {"id": "e2648ead-7162-4661-94e1-755faa8f1fd1", "address": "fa:16:3e:80:35:48", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2648ead-71", "ovs_interfaceid": "e2648ead-7162-4661-94e1-755faa8f1fd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.589 225859 DEBUG nova.network.os_vif_util [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.590 225859 DEBUG os_vif [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.591 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.591 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape2648ead-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.591 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.594 225859 INFO os_vif [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:35:48,bridge_name='br-int',has_traffic_filtering=True,id=e2648ead-7162-4661-94e1-755faa8f1fd1,network=Network(fc21b99b-4e34-422c-be05-0a440009dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape2648ead-71')#033[00m
Jan 20 09:42:35 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[254080]: [NOTICE]   (254084) : haproxy version is 2.8.14-c23fe91
Jan 20 09:42:35 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[254080]: [NOTICE]   (254084) : path to executable is /usr/sbin/haproxy
Jan 20 09:42:35 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[254080]: [ALERT]    (254084) : Current worker (254086) exited with code 143 (Terminated)
Jan 20 09:42:35 np0005588919 neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4[254080]: [WARNING]  (254084) : All workers exited. Exiting... (0)
Jan 20 09:42:35 np0005588919 systemd[1]: libpod-2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906.scope: Deactivated successfully.
Jan 20 09:42:35 np0005588919 podman[255671]: 2026-01-20 14:42:35.630447165 +0000 UTC m=+0.055489932 container died 2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 09:42:35 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906-userdata-shm.mount: Deactivated successfully.
Jan 20 09:42:35 np0005588919 systemd[1]: var-lib-containers-storage-overlay-6b359ef85c164591e81dbb650ad14fe7eb3bd9cd95784fcc030b2336836287bd-merged.mount: Deactivated successfully.
Jan 20 09:42:35 np0005588919 podman[255671]: 2026-01-20 14:42:35.67474211 +0000 UTC m=+0.099784887 container cleanup 2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 09:42:35 np0005588919 systemd[1]: libpod-conmon-2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906.scope: Deactivated successfully.
Jan 20 09:42:35 np0005588919 podman[255721]: 2026-01-20 14:42:35.736940551 +0000 UTC m=+0.039828489 container remove 2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.742 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e9bf0ae6-ebbd-4ee7-ab38-ba5ea2896033]: (4, ('Tue Jan 20 02:42:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 (2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906)\n2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906\nTue Jan 20 02:42:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 (2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906)\n2a35028076fb1ea020c7e21b3e0194e0459a116bbf5b4723a5ca26a751673906\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.744 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e92cd75a-5f82-443a-b23c-11039b74ae5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.745 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc21b99b-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.747 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588919 kernel: tapfc21b99b-40: left promiscuous mode
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.752 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[78523d57-c043-4e4c-9161-fa85018cfeff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:35 np0005588919 nova_compute[225855]: 2026-01-20 14:42:35.767 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.769 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a07d00af-d815-4da2-919b-b33e77f5bc71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.770 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[842d9032-911c-4366-ba44-623a68498d63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.786 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2c256a-27e6-4762-867d-25552b1086e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511904, 'reachable_time': 44918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255736, 'error': None, 'target': 'ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:35 np0005588919 systemd[1]: run-netns-ovnmeta\x2dfc21b99b\x2d4e34\x2d422c\x2dbe05\x2d0a440009dac4.mount: Deactivated successfully.
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.788 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fc21b99b-4e34-422c-be05-0a440009dac4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:42:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:35.789 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0f0817-782a-4216-8ab5-7e79c3e238d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:36.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:36 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #47. Immutable memtables: 4.
Jan 20 09:42:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.421 225859 INFO nova.virt.libvirt.driver [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Deleting instance files /var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e_del#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.422 225859 INFO nova.virt.libvirt.driver [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Deletion of /var/lib/nova/instances/10349dde-fb60-48ba-bc7b-42180c5eb49e_del complete#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.519 225859 INFO nova.compute.manager [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Took 1.20 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.519 225859 DEBUG oslo.service.loopingcall [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.520 225859 DEBUG nova.compute.manager [-] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.520 225859 DEBUG nova.network.neutron [-] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.548 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updating instance_info_cache with network_info: [{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.765 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.766 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.767 225859 DEBUG oslo_concurrency.lockutils [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquired lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.768 225859 DEBUG nova.network.neutron [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.770 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.771 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.771 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.772 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.772 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.773 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.807 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.808 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.808 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.808 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:42:36 np0005588919 nova_compute[225855]: 2026-01-20 14:42:36.809 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:37.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2052746815' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.277 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.357 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000004b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.357 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000004b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.430 225859 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-vif-unplugged-607e59a4-2a6b-424a-9413-be318079781e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.430 225859 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.431 225859 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.431 225859 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.431 225859 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] No waiting events found dispatching network-vif-unplugged-607e59a4-2a6b-424a-9413-be318079781e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.431 225859 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-vif-unplugged-607e59a4-2a6b-424a-9413-be318079781e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.432 225859 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-vif-plugged-607e59a4-2a6b-424a-9413-be318079781e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.432 225859 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.432 225859 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.433 225859 DEBUG oslo_concurrency.lockutils [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.433 225859 DEBUG nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] No waiting events found dispatching network-vif-plugged-607e59a4-2a6b-424a-9413-be318079781e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.433 225859 WARNING nova.compute.manager [req-a3662387-f8e9-4fa3-89e9-929adcf31f49 req-b55b30d8-0d99-42dd-8a67-643e4351944b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received unexpected event network-vif-plugged-607e59a4-2a6b-424a-9413-be318079781e for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:42:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:37.515 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.515 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:37.516 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.555 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.557 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4310MB free_disk=20.81363296508789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.557 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.557 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.738 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 10349dde-fb60-48ba-bc7b-42180c5eb49e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.738 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 504acd93-cd55-496e-a85f-30e811f827d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.739 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.739 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:42:37 np0005588919 nova_compute[225855]: 2026-01-20 14:42:37.842 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:38.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2781861380' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:38 np0005588919 nova_compute[225855]: 2026-01-20 14:42:38.285 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:38 np0005588919 nova_compute[225855]: 2026-01-20 14:42:38.290 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:42:38 np0005588919 nova_compute[225855]: 2026-01-20 14:42:38.305 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:42:38 np0005588919 nova_compute[225855]: 2026-01-20 14:42:38.349 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:42:38 np0005588919 nova_compute[225855]: 2026-01-20 14:42:38.350 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.792s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:39.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:39 np0005588919 nova_compute[225855]: 2026-01-20 14:42:39.554 225859 DEBUG nova.network.neutron [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updating instance_info_cache with network_info: [{"id": "607e59a4-2a6b-424a-9413-be318079781e", "address": "fa:16:3e:a3:13:21", "network": {"id": "fc21b99b-4e34-422c-be05-0a440009dac4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-808285772-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3f93fd4b2154dda9f38e62334904303", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap607e59a4-2a", "ovs_interfaceid": "607e59a4-2a6b-424a-9413-be318079781e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:39 np0005588919 nova_compute[225855]: 2026-01-20 14:42:39.584 225859 DEBUG oslo_concurrency.lockutils [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Releasing lock "refresh_cache-10349dde-fb60-48ba-bc7b-42180c5eb49e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:39 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:39Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dd:f6:26 10.100.0.12
Jan 20 09:42:39 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:39Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:f6:26 10.100.0.12
Jan 20 09:42:39 np0005588919 nova_compute[225855]: 2026-01-20 14:42:39.615 225859 DEBUG oslo_concurrency.lockutils [None req-c5c1a956-f98e-4e92-8ffb-c6a46dd06190 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "interface-10349dde-fb60-48ba-bc7b-42180c5eb49e-e2648ead-7162-4661-94e1-755faa8f1fd1" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 7.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.007 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:40.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.170 225859 DEBUG nova.network.neutron [-] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.217 225859 INFO nova.compute.manager [-] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Took 3.70 seconds to deallocate network for instance.#033[00m
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.289 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.315 225859 DEBUG oslo_concurrency.lockutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.316 225859 DEBUG oslo_concurrency.lockutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.423 225859 DEBUG oslo_concurrency.processutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.585 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:40 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2242138520' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.884 225859 DEBUG oslo_concurrency.processutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.889 225859 DEBUG nova.compute.provider_tree [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.930 225859 DEBUG nova.scheduler.client.report [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:42:40 np0005588919 nova_compute[225855]: 2026-01-20 14:42:40.960 225859 DEBUG oslo_concurrency.lockutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:41 np0005588919 nova_compute[225855]: 2026-01-20 14:42:41.000 225859 INFO nova.scheduler.client.report [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Deleted allocations for instance 10349dde-fb60-48ba-bc7b-42180c5eb49e#033[00m
Jan 20 09:42:41 np0005588919 nova_compute[225855]: 2026-01-20 14:42:41.140 225859 DEBUG oslo_concurrency.lockutils [None req-2e03531a-2dfa-41ad-89c5-2d64e98b64c5 c8a9fb458d27434495a77a94827b6097 e3f93fd4b2154dda9f38e62334904303 - - default default] Lock "10349dde-fb60-48ba-bc7b-42180c5eb49e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.824s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:41.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:42.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:42 np0005588919 nova_compute[225855]: 2026-01-20 14:42:42.302 225859 DEBUG nova.compute.manager [req-a4c423ee-3814-4a0b-a36d-809c06a65fdc req-092a1cbe-9970-4a76-8f50-f9bc63f8cd52 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Received event network-vif-deleted-607e59a4-2a6b-424a-9413-be318079781e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:43.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:44.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:44 np0005588919 nova_compute[225855]: 2026-01-20 14:42:44.345 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:44 np0005588919 nova_compute[225855]: 2026-01-20 14:42:44.345 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:44.519 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:42:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:45.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:42:45 np0005588919 nova_compute[225855]: 2026-01-20 14:42:45.291 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:45 np0005588919 nova_compute[225855]: 2026-01-20 14:42:45.377 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:45 np0005588919 nova_compute[225855]: 2026-01-20 14:42:45.378 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:45 np0005588919 nova_compute[225855]: 2026-01-20 14:42:45.398 225859 DEBUG nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:42:45 np0005588919 nova_compute[225855]: 2026-01-20 14:42:45.468 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:45 np0005588919 nova_compute[225855]: 2026-01-20 14:42:45.469 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:45 np0005588919 nova_compute[225855]: 2026-01-20 14:42:45.475 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:42:45 np0005588919 nova_compute[225855]: 2026-01-20 14:42:45.475 225859 INFO nova.compute.claims [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:42:45 np0005588919 nova_compute[225855]: 2026-01-20 14:42:45.587 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:45 np0005588919 nova_compute[225855]: 2026-01-20 14:42:45.611 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:46 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/265236336' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.113 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.117 225859 DEBUG nova.compute.provider_tree [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.143 225859 DEBUG nova.scheduler.client.report [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:42:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:46.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.199 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.200 225859 DEBUG nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.266 225859 DEBUG nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.266 225859 DEBUG nova.network.neutron [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.290 225859 INFO nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.318 225859 DEBUG nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:42:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.506 225859 DEBUG nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.508 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.508 225859 INFO nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Creating image(s)#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.531 225859 DEBUG nova.storage.rbd_utils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.557 225859 DEBUG nova.storage.rbd_utils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.583 225859 DEBUG nova.storage.rbd_utils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.587 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.635 225859 DEBUG nova.policy [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aa2e7857e85f483eb0d162e2ee8c2e2c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a3e022a35f604df2bbc885e498b1e206', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.650 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.651 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.652 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.652 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.674 225859 DEBUG nova.storage.rbd_utils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:46 np0005588919 nova_compute[225855]: 2026-01-20 14:42:46.677 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:47 np0005588919 nova_compute[225855]: 2026-01-20 14:42:47.006 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.329s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:47 np0005588919 nova_compute[225855]: 2026-01-20 14:42:47.073 225859 DEBUG nova.storage.rbd_utils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] resizing rbd image 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:42:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:42:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:47.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:42:47 np0005588919 nova_compute[225855]: 2026-01-20 14:42:47.216 225859 DEBUG nova.objects.instance [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lazy-loading 'migration_context' on Instance uuid 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:47 np0005588919 nova_compute[225855]: 2026-01-20 14:42:47.231 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:42:47 np0005588919 nova_compute[225855]: 2026-01-20 14:42:47.231 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Ensure instance console log exists: /var/lib/nova/instances/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:42:47 np0005588919 nova_compute[225855]: 2026-01-20 14:42:47.232 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:47 np0005588919 nova_compute[225855]: 2026-01-20 14:42:47.232 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:47 np0005588919 nova_compute[225855]: 2026-01-20 14:42:47.232 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:48.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:48 np0005588919 nova_compute[225855]: 2026-01-20 14:42:48.513 225859 DEBUG nova.network.neutron [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Successfully created port: d70a594c-be8a-461a-93b0-7416d3587e74 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:42:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:49.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:42:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:42:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:42:49 np0005588919 nova_compute[225855]: 2026-01-20 14:42:49.880 225859 DEBUG nova.network.neutron [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Successfully updated port: d70a594c-be8a-461a-93b0-7416d3587e74 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:42:49 np0005588919 nova_compute[225855]: 2026-01-20 14:42:49.911 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "refresh_cache-69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:42:49 np0005588919 nova_compute[225855]: 2026-01-20 14:42:49.912 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquired lock "refresh_cache-69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:49 np0005588919 nova_compute[225855]: 2026-01-20 14:42:49.912 225859 DEBUG nova.network.neutron [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:42:50 np0005588919 nova_compute[225855]: 2026-01-20 14:42:50.092 225859 DEBUG nova.compute.manager [req-e73b78ea-1aad-4382-b311-09a10979085d req-b13176a9-0f93-4c8e-92b8-f64fd3bff8f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Received event network-changed-d70a594c-be8a-461a-93b0-7416d3587e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:50 np0005588919 nova_compute[225855]: 2026-01-20 14:42:50.093 225859 DEBUG nova.compute.manager [req-e73b78ea-1aad-4382-b311-09a10979085d req-b13176a9-0f93-4c8e-92b8-f64fd3bff8f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Refreshing instance network info cache due to event network-changed-d70a594c-be8a-461a-93b0-7416d3587e74. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:42:50 np0005588919 nova_compute[225855]: 2026-01-20 14:42:50.093 225859 DEBUG oslo_concurrency.lockutils [req-e73b78ea-1aad-4382-b311-09a10979085d req-b13176a9-0f93-4c8e-92b8-f64fd3bff8f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:42:50 np0005588919 nova_compute[225855]: 2026-01-20 14:42:50.168 225859 DEBUG nova.network.neutron [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:42:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:50.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:50 np0005588919 nova_compute[225855]: 2026-01-20 14:42:50.294 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:50 np0005588919 nova_compute[225855]: 2026-01-20 14:42:50.552 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920155.5507116, 10349dde-fb60-48ba-bc7b-42180c5eb49e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:50 np0005588919 nova_compute[225855]: 2026-01-20 14:42:50.553 225859 INFO nova.compute.manager [-] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:42:50 np0005588919 nova_compute[225855]: 2026-01-20 14:42:50.586 225859 DEBUG nova.compute.manager [None req-34b55a62-f842-43c3-805d-228af6a25266 - - - - - -] [instance: 10349dde-fb60-48ba-bc7b-42180c5eb49e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:50 np0005588919 nova_compute[225855]: 2026-01-20 14:42:50.588 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:51.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.664 225859 DEBUG nova.network.neutron [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Updating instance_info_cache with network_info: [{"id": "d70a594c-be8a-461a-93b0-7416d3587e74", "address": "fa:16:3e:11:3a:34", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd70a594c-be", "ovs_interfaceid": "d70a594c-be8a-461a-93b0-7416d3587e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.743 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Releasing lock "refresh_cache-69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.743 225859 DEBUG nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Instance network_info: |[{"id": "d70a594c-be8a-461a-93b0-7416d3587e74", "address": "fa:16:3e:11:3a:34", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd70a594c-be", "ovs_interfaceid": "d70a594c-be8a-461a-93b0-7416d3587e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.744 225859 DEBUG oslo_concurrency.lockutils [req-e73b78ea-1aad-4382-b311-09a10979085d req-b13176a9-0f93-4c8e-92b8-f64fd3bff8f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.744 225859 DEBUG nova.network.neutron [req-e73b78ea-1aad-4382-b311-09a10979085d req-b13176a9-0f93-4c8e-92b8-f64fd3bff8f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Refreshing network info cache for port d70a594c-be8a-461a-93b0-7416d3587e74 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.750 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Start _get_guest_xml network_info=[{"id": "d70a594c-be8a-461a-93b0-7416d3587e74", "address": "fa:16:3e:11:3a:34", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd70a594c-be", "ovs_interfaceid": "d70a594c-be8a-461a-93b0-7416d3587e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.757 225859 WARNING nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.765 225859 DEBUG nova.virt.libvirt.host [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.766 225859 DEBUG nova.virt.libvirt.host [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.770 225859 DEBUG nova.virt.libvirt.host [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.771 225859 DEBUG nova.virt.libvirt.host [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.774 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.774 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.775 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.775 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.775 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.775 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.776 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.776 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.776 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.776 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.777 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.777 225859 DEBUG nova.virt.hardware [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:42:51 np0005588919 nova_compute[225855]: 2026-01-20 14:42:51.780 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:52.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:42:52 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3459419744' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.219 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.254 225859 DEBUG nova.storage.rbd_utils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.259 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:42:52 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1767237631' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.701 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.702 225859 DEBUG nova.virt.libvirt.vif [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:42:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1574727229',display_name='tempest-tempest.common.compute-instance-1574727229-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1574727229-1',id=77,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a3e022a35f604df2bbc885e498b1e206',ramdisk_id='',reservation_id='r-6khz0z7o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-164394330',owner_user_name='tempest-MultipleCreateTestJSON-164394330-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:46Z,user_data=None,user_id='aa2e7857e85f483eb0d162e2ee8c2e2c',uuid=69cc4cf9-dfe3-44cb-b811-0300b5ccd66b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d70a594c-be8a-461a-93b0-7416d3587e74", "address": "fa:16:3e:11:3a:34", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd70a594c-be", "ovs_interfaceid": "d70a594c-be8a-461a-93b0-7416d3587e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.703 225859 DEBUG nova.network.os_vif_util [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converting VIF {"id": "d70a594c-be8a-461a-93b0-7416d3587e74", "address": "fa:16:3e:11:3a:34", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd70a594c-be", "ovs_interfaceid": "d70a594c-be8a-461a-93b0-7416d3587e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.704 225859 DEBUG nova.network.os_vif_util [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:3a:34,bridge_name='br-int',has_traffic_filtering=True,id=d70a594c-be8a-461a-93b0-7416d3587e74,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd70a594c-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.705 225859 DEBUG nova.objects.instance [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lazy-loading 'pci_devices' on Instance uuid 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.726 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <uuid>69cc4cf9-dfe3-44cb-b811-0300b5ccd66b</uuid>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <name>instance-0000004d</name>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <nova:name>tempest-tempest.common.compute-instance-1574727229-1</nova:name>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:42:51</nova:creationTime>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <nova:user uuid="aa2e7857e85f483eb0d162e2ee8c2e2c">tempest-MultipleCreateTestJSON-164394330-project-member</nova:user>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <nova:project uuid="a3e022a35f604df2bbc885e498b1e206">tempest-MultipleCreateTestJSON-164394330</nova:project>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <nova:port uuid="d70a594c-be8a-461a-93b0-7416d3587e74">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <entry name="serial">69cc4cf9-dfe3-44cb-b811-0300b5ccd66b</entry>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <entry name="uuid">69cc4cf9-dfe3-44cb-b811-0300b5ccd66b</entry>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk.config">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:11:3a:34"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <target dev="tapd70a594c-be"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b/console.log" append="off"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:42:52 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:42:52 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:42:52 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:42:52 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.727 225859 DEBUG nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Preparing to wait for external event network-vif-plugged-d70a594c-be8a-461a-93b0-7416d3587e74 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.727 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.728 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.728 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.728 225859 DEBUG nova.virt.libvirt.vif [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:42:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1574727229',display_name='tempest-tempest.common.compute-instance-1574727229-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1574727229-1',id=77,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a3e022a35f604df2bbc885e498b1e206',ramdisk_id='',reservation_id='r-6khz0z7o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-164394330',owner_user_name='tempest-MultipleCreateTestJSON-164394330-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:42:46Z,user_data=None,user_id='aa2e7857e85f483eb0d162e2ee8c2e2c',uuid=69cc4cf9-dfe3-44cb-b811-0300b5ccd66b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d70a594c-be8a-461a-93b0-7416d3587e74", "address": "fa:16:3e:11:3a:34", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd70a594c-be", "ovs_interfaceid": "d70a594c-be8a-461a-93b0-7416d3587e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.729 225859 DEBUG nova.network.os_vif_util [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converting VIF {"id": "d70a594c-be8a-461a-93b0-7416d3587e74", "address": "fa:16:3e:11:3a:34", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd70a594c-be", "ovs_interfaceid": "d70a594c-be8a-461a-93b0-7416d3587e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.729 225859 DEBUG nova.network.os_vif_util [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:3a:34,bridge_name='br-int',has_traffic_filtering=True,id=d70a594c-be8a-461a-93b0-7416d3587e74,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd70a594c-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.730 225859 DEBUG os_vif [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:3a:34,bridge_name='br-int',has_traffic_filtering=True,id=d70a594c-be8a-461a-93b0-7416d3587e74,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd70a594c-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.730 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.731 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.731 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.733 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.733 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd70a594c-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.734 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd70a594c-be, col_values=(('external_ids', {'iface-id': 'd70a594c-be8a-461a-93b0-7416d3587e74', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:3a:34', 'vm-uuid': '69cc4cf9-dfe3-44cb-b811-0300b5ccd66b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.736 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:52 np0005588919 NetworkManager[49104]: <info>  [1768920172.7371] manager: (tapd70a594c-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.739 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.743 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.743 225859 INFO os_vif [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:3a:34,bridge_name='br-int',has_traffic_filtering=True,id=d70a594c-be8a-461a-93b0-7416d3587e74,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd70a594c-be')#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.811 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.811 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.811 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] No VIF found with MAC fa:16:3e:11:3a:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.812 225859 INFO nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Using config drive#033[00m
Jan 20 09:42:52 np0005588919 nova_compute[225855]: 2026-01-20 14:42:52.836 225859 DEBUG nova.storage.rbd_utils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:53.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:53 np0005588919 nova_compute[225855]: 2026-01-20 14:42:53.374 225859 INFO nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Creating config drive at /var/lib/nova/instances/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b/disk.config#033[00m
Jan 20 09:42:53 np0005588919 nova_compute[225855]: 2026-01-20 14:42:53.383 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph9cudons execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:53 np0005588919 nova_compute[225855]: 2026-01-20 14:42:53.524 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph9cudons" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:53 np0005588919 nova_compute[225855]: 2026-01-20 14:42:53.553 225859 DEBUG nova.storage.rbd_utils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:42:53 np0005588919 nova_compute[225855]: 2026-01-20 14:42:53.557 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b/disk.config 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:53 np0005588919 nova_compute[225855]: 2026-01-20 14:42:53.721 225859 DEBUG oslo_concurrency.processutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b/disk.config 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:53 np0005588919 nova_compute[225855]: 2026-01-20 14:42:53.722 225859 INFO nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Deleting local config drive /var/lib/nova/instances/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b/disk.config because it was imported into RBD.#033[00m
Jan 20 09:42:53 np0005588919 kernel: tapd70a594c-be: entered promiscuous mode
Jan 20 09:42:53 np0005588919 NetworkManager[49104]: <info>  [1768920173.7778] manager: (tapd70a594c-be): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Jan 20 09:42:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:53Z|00267|binding|INFO|Claiming lport d70a594c-be8a-461a-93b0-7416d3587e74 for this chassis.
Jan 20 09:42:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:53Z|00268|binding|INFO|d70a594c-be8a-461a-93b0-7416d3587e74: Claiming fa:16:3e:11:3a:34 10.100.0.12
Jan 20 09:42:53 np0005588919 nova_compute[225855]: 2026-01-20 14:42:53.777 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:53Z|00269|binding|INFO|Setting lport d70a594c-be8a-461a-93b0-7416d3587e74 ovn-installed in OVS
Jan 20 09:42:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:53Z|00270|binding|INFO|Setting lport d70a594c-be8a-461a-93b0-7416d3587e74 up in Southbound
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.809 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:3a:34 10.100.0.12'], port_security=['fa:16:3e:11:3a:34 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '69cc4cf9-dfe3-44cb-b811-0300b5ccd66b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a3e022a35f604df2bbc885e498b1e206', 'neutron:revision_number': '2', 'neutron:security_group_ids': '885819b7-5060-4b73-ad54-3f31f821195c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89e607b1-9e39-47f0-8180-8aaef3a2a0e9, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=d70a594c-be8a-461a-93b0-7416d3587e74) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:53 np0005588919 nova_compute[225855]: 2026-01-20 14:42:53.810 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.810 140354 INFO neutron.agent.ovn.metadata.agent [-] Port d70a594c-be8a-461a-93b0-7416d3587e74 in datapath 3e260ad9-fcf1-432b-b71b-b943d4249b65 bound to our chassis#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.811 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3e260ad9-fcf1-432b-b71b-b943d4249b65#033[00m
Jan 20 09:42:53 np0005588919 systemd-udevd[256314]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.825 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[14b97277-42b2-4425-8fd8-7f68f58c5b0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.827 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3e260ad9-f1 in ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.828 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3e260ad9-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.828 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5c66162b-8d24-45a9-97eb-7942c898e52e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.829 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[16b9b360-b5ce-45c5-ba24-3fda2203e267]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 NetworkManager[49104]: <info>  [1768920173.8328] device (tapd70a594c-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:42:53 np0005588919 systemd-machined[194361]: New machine qemu-33-instance-0000004d.
Jan 20 09:42:53 np0005588919 NetworkManager[49104]: <info>  [1768920173.8347] device (tapd70a594c-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.839 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd10506-013d-4665-b01d-a328023ee07c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 systemd[1]: Started Virtual Machine qemu-33-instance-0000004d.
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.852 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[14703686-db6a-49d7-94ae-105369abaffd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.881 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d9eb439e-8097-4cba-bba1-30b600c8ef54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.887 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9e3dec57-dc88-4df9-9144-ed85dfb19d00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 systemd-udevd[256321]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:42:53 np0005588919 NetworkManager[49104]: <info>  [1768920173.8885] manager: (tap3e260ad9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.926 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[74aba923-4f67-4ed4-9174-e97e8baa8d45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.928 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad2c12b-dded-408e-a2c7-69aa17e28223]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 NetworkManager[49104]: <info>  [1768920173.9524] device (tap3e260ad9-f0): carrier: link connected
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.964 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[cb57dc79-0948-49c4-8bd1-84446caeb547]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.981 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4c4712d4-40a7-4e45-932e-6d0d0559af39]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e260ad9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:13:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521287, 'reachable_time': 19914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256350, 'error': None, 'target': 'ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:53.997 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[19f78edd-c3fd-42db-af6a-79a4701d146d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:134a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 521287, 'tstamp': 521287}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256351, 'error': None, 'target': 'ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.014 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b4cfa5b3-d0c2-452f-adc6-fb40159b2146]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e260ad9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:13:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521287, 'reachable_time': 19914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256352, 'error': None, 'target': 'ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.054 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[601e4da3-a3f6-4a6e-a167-d359a70e5112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.120 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1801b3cc-a888-4cb6-8570-20745b887865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.121 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e260ad9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.121 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.121 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e260ad9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:54 np0005588919 kernel: tap3e260ad9-f0: entered promiscuous mode
Jan 20 09:42:54 np0005588919 NetworkManager[49104]: <info>  [1768920174.1241] manager: (tap3e260ad9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.123 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.124 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.125 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3e260ad9-f0, col_values=(('external_ids', {'iface-id': '2b7c295d-f074-4cfb-aca0-08946126ddbc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.126 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:54 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:54Z|00271|binding|INFO|Releasing lport 2b7c295d-f074-4cfb-aca0-08946126ddbc from this chassis (sb_readonly=0)
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.140 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.140 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.141 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e260ad9-fcf1-432b-b71b-b943d4249b65.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e260ad9-fcf1-432b-b71b-b943d4249b65.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.142 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dc692e22-5f5d-4e0a-a934-2a1d23cbf78d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.142 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-3e260ad9-fcf1-432b-b71b-b943d4249b65
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/3e260ad9-fcf1-432b-b71b-b943d4249b65.pid.haproxy
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 3e260ad9-fcf1-432b-b71b-b943d4249b65
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:42:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:54.143 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'env', 'PROCESS_TAG=haproxy-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3e260ad9-fcf1-432b-b71b-b943d4249b65.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:42:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:54.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:54 np0005588919 podman[256384]: 2026-01-20 14:42:54.480469763 +0000 UTC m=+0.029389723 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.713 225859 DEBUG nova.compute.manager [req-68531ff8-9d58-423b-8498-14cd354b4f74 req-6000777a-97d7-4509-b5a3-0682e61fe1e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Received event network-vif-plugged-d70a594c-be8a-461a-93b0-7416d3587e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.714 225859 DEBUG oslo_concurrency.lockutils [req-68531ff8-9d58-423b-8498-14cd354b4f74 req-6000777a-97d7-4509-b5a3-0682e61fe1e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.714 225859 DEBUG oslo_concurrency.lockutils [req-68531ff8-9d58-423b-8498-14cd354b4f74 req-6000777a-97d7-4509-b5a3-0682e61fe1e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.715 225859 DEBUG oslo_concurrency.lockutils [req-68531ff8-9d58-423b-8498-14cd354b4f74 req-6000777a-97d7-4509-b5a3-0682e61fe1e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.715 225859 DEBUG nova.compute.manager [req-68531ff8-9d58-423b-8498-14cd354b4f74 req-6000777a-97d7-4509-b5a3-0682e61fe1e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Processing event network-vif-plugged-d70a594c-be8a-461a-93b0-7416d3587e74 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.922 225859 DEBUG nova.network.neutron [req-e73b78ea-1aad-4382-b311-09a10979085d req-b13176a9-0f93-4c8e-92b8-f64fd3bff8f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Updated VIF entry in instance network info cache for port d70a594c-be8a-461a-93b0-7416d3587e74. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:42:54 np0005588919 nova_compute[225855]: 2026-01-20 14:42:54.923 225859 DEBUG nova.network.neutron [req-e73b78ea-1aad-4382-b311-09a10979085d req-b13176a9-0f93-4c8e-92b8-f64fd3bff8f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Updating instance_info_cache with network_info: [{"id": "d70a594c-be8a-461a-93b0-7416d3587e74", "address": "fa:16:3e:11:3a:34", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd70a594c-be", "ovs_interfaceid": "d70a594c-be8a-461a-93b0-7416d3587e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:55.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.296 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.332 225859 DEBUG oslo_concurrency.lockutils [req-e73b78ea-1aad-4382-b311-09a10979085d req-b13176a9-0f93-4c8e-92b8-f64fd3bff8f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.400 225859 DEBUG nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.401 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920175.3996596, 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.402 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] VM Started (Lifecycle Event)#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.407 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.411 225859 INFO nova.virt.libvirt.driver [-] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Instance spawned successfully.#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.412 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.447 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.451 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.451 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.452 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.453 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.454 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.455 225859 DEBUG nova.virt.libvirt.driver [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.466 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.507 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.507 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920175.4019752, 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.507 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.539 225859 INFO nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Took 9.03 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.540 225859 DEBUG nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.545 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.553 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920175.4051468, 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.554 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.600 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.604 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.627 225859 INFO nova.compute.manager [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Took 10.17 seconds to build instance.#033[00m
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.642 225859 DEBUG oslo_concurrency.lockutils [None req-70aa2a94-0c30-4957-9bb8-59360dd40b1d aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:55 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:55Z|00272|binding|INFO|Releasing lport 5dcae274-b8f4-440a-a3eb-5c1a5a044346 from this chassis (sb_readonly=0)
Jan 20 09:42:55 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:55Z|00273|binding|INFO|Releasing lport 2b7c295d-f074-4cfb-aca0-08946126ddbc from this chassis (sb_readonly=0)
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.700 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:55 np0005588919 podman[256384]: 2026-01-20 14:42:55.71475754 +0000 UTC m=+1.263677490 container create 6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:42:55 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:55Z|00274|binding|INFO|Releasing lport 5dcae274-b8f4-440a-a3eb-5c1a5a044346 from this chassis (sb_readonly=0)
Jan 20 09:42:55 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:55Z|00275|binding|INFO|Releasing lport 2b7c295d-f074-4cfb-aca0-08946126ddbc from this chassis (sb_readonly=0)
Jan 20 09:42:55 np0005588919 nova_compute[225855]: 2026-01-20 14:42:55.939 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:56 np0005588919 systemd[1]: Started libpod-conmon-6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16.scope.
Jan 20 09:42:56 np0005588919 podman[256415]: 2026-01-20 14:42:56.038238651 +0000 UTC m=+1.081009086 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:42:56 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:42:56 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e211a072461adf5d122a2cd3877c75e44066b196736bc398b06e937a9610cce8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:42:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:56.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:56 np0005588919 podman[256384]: 2026-01-20 14:42:56.217084956 +0000 UTC m=+1.766004956 container init 6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:42:56 np0005588919 podman[256384]: 2026-01-20 14:42:56.223086716 +0000 UTC m=+1.772006676 container start 6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 09:42:56 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[256494]: [NOTICE]   (256523) : New worker (256525) forked
Jan 20 09:42:56 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[256494]: [NOTICE]   (256523) : Loading success.
Jan 20 09:42:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:42:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:42:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:42:56 np0005588919 nova_compute[225855]: 2026-01-20 14:42:56.968 225859 DEBUG nova.compute.manager [req-3ba1085e-8d8e-4e0f-a008-7593ee3e319d req-edb24761-d2f4-45e6-9e38-8d2b0ebf4438 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Received event network-vif-plugged-d70a594c-be8a-461a-93b0-7416d3587e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:56 np0005588919 nova_compute[225855]: 2026-01-20 14:42:56.969 225859 DEBUG oslo_concurrency.lockutils [req-3ba1085e-8d8e-4e0f-a008-7593ee3e319d req-edb24761-d2f4-45e6-9e38-8d2b0ebf4438 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:56 np0005588919 nova_compute[225855]: 2026-01-20 14:42:56.969 225859 DEBUG oslo_concurrency.lockutils [req-3ba1085e-8d8e-4e0f-a008-7593ee3e319d req-edb24761-d2f4-45e6-9e38-8d2b0ebf4438 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:56 np0005588919 nova_compute[225855]: 2026-01-20 14:42:56.970 225859 DEBUG oslo_concurrency.lockutils [req-3ba1085e-8d8e-4e0f-a008-7593ee3e319d req-edb24761-d2f4-45e6-9e38-8d2b0ebf4438 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:56 np0005588919 nova_compute[225855]: 2026-01-20 14:42:56.970 225859 DEBUG nova.compute.manager [req-3ba1085e-8d8e-4e0f-a008-7593ee3e319d req-edb24761-d2f4-45e6-9e38-8d2b0ebf4438 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] No waiting events found dispatching network-vif-plugged-d70a594c-be8a-461a-93b0-7416d3587e74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:56 np0005588919 nova_compute[225855]: 2026-01-20 14:42:56.971 225859 WARNING nova.compute.manager [req-3ba1085e-8d8e-4e0f-a008-7593ee3e319d req-edb24761-d2f4-45e6-9e38-8d2b0ebf4438 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Received unexpected event network-vif-plugged-d70a594c-be8a-461a-93b0-7416d3587e74 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.031 225859 DEBUG oslo_concurrency.lockutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.032 225859 DEBUG oslo_concurrency.lockutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.033 225859 DEBUG oslo_concurrency.lockutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.033 225859 DEBUG oslo_concurrency.lockutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.034 225859 DEBUG oslo_concurrency.lockutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.036 225859 INFO nova.compute.manager [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Terminating instance#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.037 225859 DEBUG nova.compute.manager [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:42:57 np0005588919 kernel: tapd70a594c-be (unregistering): left promiscuous mode
Jan 20 09:42:57 np0005588919 NetworkManager[49104]: <info>  [1768920177.0832] device (tapd70a594c-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:42:57 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:57Z|00276|binding|INFO|Releasing lport d70a594c-be8a-461a-93b0-7416d3587e74 from this chassis (sb_readonly=0)
Jan 20 09:42:57 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:57Z|00277|binding|INFO|Setting lport d70a594c-be8a-461a-93b0-7416d3587e74 down in Southbound
Jan 20 09:42:57 np0005588919 ovn_controller[130490]: 2026-01-20T14:42:57Z|00278|binding|INFO|Removing iface tapd70a594c-be ovn-installed in OVS
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.144 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.152 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:3a:34 10.100.0.12'], port_security=['fa:16:3e:11:3a:34 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '69cc4cf9-dfe3-44cb-b811-0300b5ccd66b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a3e022a35f604df2bbc885e498b1e206', 'neutron:revision_number': '4', 'neutron:security_group_ids': '885819b7-5060-4b73-ad54-3f31f821195c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89e607b1-9e39-47f0-8180-8aaef3a2a0e9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=d70a594c-be8a-461a-93b0-7416d3587e74) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.154 140354 INFO neutron.agent.ovn.metadata.agent [-] Port d70a594c-be8a-461a-93b0-7416d3587e74 in datapath 3e260ad9-fcf1-432b-b71b-b943d4249b65 unbound from our chassis#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.155 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e260ad9-fcf1-432b-b71b-b943d4249b65, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.156 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f6c8b8-9a3e-4173-ac41-b5aeac4a1093]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.156 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65 namespace which is not needed anymore#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.159 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:57 np0005588919 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Jan 20 09:42:57 np0005588919 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000004d.scope: Consumed 2.677s CPU time.
Jan 20 09:42:57 np0005588919 systemd-machined[194361]: Machine qemu-33-instance-0000004d terminated.
Jan 20 09:42:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:57.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:57 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[256494]: [NOTICE]   (256523) : haproxy version is 2.8.14-c23fe91
Jan 20 09:42:57 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[256494]: [NOTICE]   (256523) : path to executable is /usr/sbin/haproxy
Jan 20 09:42:57 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[256494]: [WARNING]  (256523) : Exiting Master process...
Jan 20 09:42:57 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[256494]: [ALERT]    (256523) : Current worker (256525) exited with code 143 (Terminated)
Jan 20 09:42:57 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[256494]: [WARNING]  (256523) : All workers exited. Exiting... (0)
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.278 225859 INFO nova.virt.libvirt.driver [-] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Instance destroyed successfully.#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.278 225859 DEBUG nova.objects.instance [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lazy-loading 'resources' on Instance uuid 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:42:57 np0005588919 systemd[1]: libpod-6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16.scope: Deactivated successfully.
Jan 20 09:42:57 np0005588919 podman[256557]: 2026-01-20 14:42:57.292494664 +0000 UTC m=+0.053965290 container died 6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.295 225859 DEBUG nova.virt.libvirt.vif [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:42:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1574727229',display_name='tempest-tempest.common.compute-instance-1574727229-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1574727229-1',id=77,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:42:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a3e022a35f604df2bbc885e498b1e206',ramdisk_id='',reservation_id='r-6khz0z7o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-164394330',owner_user_name='tempest-MultipleCreateTestJSON-164394330-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:42:55Z,user_data=None,user_id='aa2e7857e85f483eb0d162e2ee8c2e2c',uuid=69cc4cf9-dfe3-44cb-b811-0300b5ccd66b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d70a594c-be8a-461a-93b0-7416d3587e74", "address": "fa:16:3e:11:3a:34", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd70a594c-be", "ovs_interfaceid": "d70a594c-be8a-461a-93b0-7416d3587e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.295 225859 DEBUG nova.network.os_vif_util [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converting VIF {"id": "d70a594c-be8a-461a-93b0-7416d3587e74", "address": "fa:16:3e:11:3a:34", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd70a594c-be", "ovs_interfaceid": "d70a594c-be8a-461a-93b0-7416d3587e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.296 225859 DEBUG nova.network.os_vif_util [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:3a:34,bridge_name='br-int',has_traffic_filtering=True,id=d70a594c-be8a-461a-93b0-7416d3587e74,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd70a594c-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.296 225859 DEBUG os_vif [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:3a:34,bridge_name='br-int',has_traffic_filtering=True,id=d70a594c-be8a-461a-93b0-7416d3587e74,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd70a594c-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.298 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.298 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd70a594c-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.300 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.302 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.304 225859 INFO os_vif [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:3a:34,bridge_name='br-int',has_traffic_filtering=True,id=d70a594c-be8a-461a-93b0-7416d3587e74,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd70a594c-be')#033[00m
Jan 20 09:42:57 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16-userdata-shm.mount: Deactivated successfully.
Jan 20 09:42:57 np0005588919 systemd[1]: var-lib-containers-storage-overlay-e211a072461adf5d122a2cd3877c75e44066b196736bc398b06e937a9610cce8-merged.mount: Deactivated successfully.
Jan 20 09:42:57 np0005588919 podman[256557]: 2026-01-20 14:42:57.336174121 +0000 UTC m=+0.097644737 container cleanup 6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:42:57 np0005588919 systemd[1]: libpod-conmon-6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16.scope: Deactivated successfully.
Jan 20 09:42:57 np0005588919 podman[256616]: 2026-01-20 14:42:57.405138284 +0000 UTC m=+0.045695165 container remove 6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.410 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9a5254-c8e1-418f-bd1c-9c0b9ad406ae]: (4, ('Tue Jan 20 02:42:57 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65 (6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16)\n6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16\nTue Jan 20 02:42:57 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65 (6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16)\n6f8eed69f823a3fdf11e8147104e3b8cd6fb1499003504a66e3d765dff92ee16\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.412 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[023c7a34-5e94-41c5-8068-67b4214c75cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.413 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e260ad9-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.415 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:57 np0005588919 kernel: tap3e260ad9-f0: left promiscuous mode
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.435 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.437 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e017f52d-0660-4300-84bc-9a9c4952f928]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.453 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a09092cf-6893-469d-9245-ebfe217324ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.454 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1a9276-debe-4fd2-8e03-37c0336579f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.478 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c689a2-e9ce-460b-9567-571f8f6b3b12]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521280, 'reachable_time': 43298, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256631, 'error': None, 'target': 'ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.482 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:42:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:42:57.482 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[ac09506a-aaa5-4357-8d6e-9595cf0de74d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:42:57 np0005588919 systemd[1]: run-netns-ovnmeta\x2d3e260ad9\x2dfcf1\x2d432b\x2db71b\x2db943d4249b65.mount: Deactivated successfully.
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.693 225859 INFO nova.virt.libvirt.driver [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Deleting instance files /var/lib/nova/instances/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_del#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.694 225859 INFO nova.virt.libvirt.driver [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Deletion of /var/lib/nova/instances/69cc4cf9-dfe3-44cb-b811-0300b5ccd66b_del complete#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.752 225859 INFO nova.compute.manager [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Took 0.71 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.753 225859 DEBUG oslo.service.loopingcall [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.754 225859 DEBUG nova.compute.manager [-] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:42:57 np0005588919 nova_compute[225855]: 2026-01-20 14:42:57.754 225859 DEBUG nova.network.neutron [-] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:42:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:42:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:42:58.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:42:58 np0005588919 nova_compute[225855]: 2026-01-20 14:42:58.571 225859 DEBUG nova.network.neutron [-] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:42:58 np0005588919 nova_compute[225855]: 2026-01-20 14:42:58.588 225859 INFO nova.compute.manager [-] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Took 0.83 seconds to deallocate network for instance.#033[00m
Jan 20 09:42:58 np0005588919 nova_compute[225855]: 2026-01-20 14:42:58.708 225859 DEBUG oslo_concurrency.lockutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:58 np0005588919 nova_compute[225855]: 2026-01-20 14:42:58.708 225859 DEBUG oslo_concurrency.lockutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:58 np0005588919 nova_compute[225855]: 2026-01-20 14:42:58.842 225859 DEBUG oslo_concurrency.processutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.160 225859 DEBUG nova.compute.manager [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Received event network-vif-unplugged-d70a594c-be8a-461a-93b0-7416d3587e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.161 225859 DEBUG oslo_concurrency.lockutils [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.161 225859 DEBUG oslo_concurrency.lockutils [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.162 225859 DEBUG oslo_concurrency.lockutils [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.162 225859 DEBUG nova.compute.manager [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] No waiting events found dispatching network-vif-unplugged-d70a594c-be8a-461a-93b0-7416d3587e74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.162 225859 WARNING nova.compute.manager [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Received unexpected event network-vif-unplugged-d70a594c-be8a-461a-93b0-7416d3587e74 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.162 225859 DEBUG nova.compute.manager [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Received event network-vif-plugged-d70a594c-be8a-461a-93b0-7416d3587e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.162 225859 DEBUG oslo_concurrency.lockutils [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.163 225859 DEBUG oslo_concurrency.lockutils [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.163 225859 DEBUG oslo_concurrency.lockutils [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.164 225859 DEBUG nova.compute.manager [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] No waiting events found dispatching network-vif-plugged-d70a594c-be8a-461a-93b0-7416d3587e74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.164 225859 WARNING nova.compute.manager [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Received unexpected event network-vif-plugged-d70a594c-be8a-461a-93b0-7416d3587e74 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.164 225859 DEBUG nova.compute.manager [req-c4ea4a7e-04f4-415c-bc6b-5c7d7ee41ceb req-7777d7b8-1c9c-4215-bf62-3f042b8dc1d8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Received event network-vif-deleted-d70a594c-be8a-461a-93b0-7416d3587e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:42:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:42:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:42:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:42:59.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:42:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:42:59 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4040533622' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.277 225859 DEBUG oslo_concurrency.processutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.282 225859 DEBUG nova.compute.provider_tree [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.296 225859 DEBUG nova.scheduler.client.report [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.326 225859 DEBUG oslo_concurrency.lockutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.391 225859 INFO nova.scheduler.client.report [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Deleted allocations for instance 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b#033[00m
Jan 20 09:42:59 np0005588919 nova_compute[225855]: 2026-01-20 14:42:59.442 225859 DEBUG oslo_concurrency.lockutils [None req-d75a9290-3b10-49e0-a981-9cf010d1562f aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "69cc4cf9-dfe3-44cb-b811-0300b5ccd66b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:00.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:00 np0005588919 nova_compute[225855]: 2026-01-20 14:43:00.298 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:01.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:02.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:02 np0005588919 nova_compute[225855]: 2026-01-20 14:43:02.346 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:43:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:03.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:04 np0005588919 podman[256659]: 2026-01-20 14:43:04.035799706 +0000 UTC m=+0.073383379 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:43:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:04.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:04 np0005588919 nova_compute[225855]: 2026-01-20 14:43:04.548 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:04 np0005588919 nova_compute[225855]: 2026-01-20 14:43:04.549 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:04 np0005588919 nova_compute[225855]: 2026-01-20 14:43:04.566 225859 DEBUG nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 09:43:04 np0005588919 nova_compute[225855]: 2026-01-20 14:43:04.668 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:04 np0005588919 nova_compute[225855]: 2026-01-20 14:43:04.669 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:04 np0005588919 nova_compute[225855]: 2026-01-20 14:43:04.676 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 09:43:04 np0005588919 nova_compute[225855]: 2026-01-20 14:43:04.676 225859 INFO nova.compute.claims [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Claim successful on node compute-1.ctlplane.example.com
Jan 20 09:43:04 np0005588919 nova_compute[225855]: 2026-01-20 14:43:04.835 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:43:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:05.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:05 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2608020822' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.268 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.273 225859 DEBUG nova.compute.provider_tree [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.296 225859 DEBUG nova.scheduler.client.report [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.299 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.338 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.338 225859 DEBUG nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.412 225859 DEBUG nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.412 225859 DEBUG nova.network.neutron [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.434 225859 INFO nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.460 225859 DEBUG nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.610 225859 DEBUG nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.612 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.612 225859 INFO nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Creating image(s)
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.654 225859 DEBUG nova.storage.rbd_utils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.682 225859 DEBUG nova.storage.rbd_utils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.706 225859 DEBUG nova.storage.rbd_utils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.710 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.746 225859 DEBUG nova.policy [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aa2e7857e85f483eb0d162e2ee8c2e2c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a3e022a35f604df2bbc885e498b1e206', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.799 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.799 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.800 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.800 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.823 225859 DEBUG nova.storage.rbd_utils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 09:43:05 np0005588919 nova_compute[225855]: 2026-01-20 14:43:05.827 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:43:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:06.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:06 np0005588919 nova_compute[225855]: 2026-01-20 14:43:06.284 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:43:06 np0005588919 nova_compute[225855]: 2026-01-20 14:43:06.347 225859 DEBUG nova.storage.rbd_utils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] resizing rbd image cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 09:43:06 np0005588919 nova_compute[225855]: 2026-01-20 14:43:06.438 225859 DEBUG nova.objects.instance [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lazy-loading 'migration_context' on Instance uuid cd7f8cdc-1467-4d67-b60b-dd2ee8707b09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:43:06 np0005588919 nova_compute[225855]: 2026-01-20 14:43:06.454 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 09:43:06 np0005588919 nova_compute[225855]: 2026-01-20 14:43:06.454 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Ensure instance console log exists: /var/lib/nova/instances/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 09:43:06 np0005588919 nova_compute[225855]: 2026-01-20 14:43:06.454 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:06 np0005588919 nova_compute[225855]: 2026-01-20 14:43:06.455 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:06 np0005588919 nova_compute[225855]: 2026-01-20 14:43:06.455 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:07.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:07 np0005588919 nova_compute[225855]: 2026-01-20 14:43:07.383 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:43:07 np0005588919 nova_compute[225855]: 2026-01-20 14:43:07.603 225859 DEBUG nova.network.neutron [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Successfully created port: 221681ce-86ed-410e-8ca2-52951142fede _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 09:43:07 np0005588919 nova_compute[225855]: 2026-01-20 14:43:07.933 225859 DEBUG oslo_concurrency.lockutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "504acd93-cd55-496e-a85f-30e811f827d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:07 np0005588919 nova_compute[225855]: 2026-01-20 14:43:07.933 225859 DEBUG oslo_concurrency.lockutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:07 np0005588919 nova_compute[225855]: 2026-01-20 14:43:07.934 225859 DEBUG oslo_concurrency.lockutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "504acd93-cd55-496e-a85f-30e811f827d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:43:07 np0005588919 nova_compute[225855]: 2026-01-20 14:43:07.934 225859 DEBUG oslo_concurrency.lockutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:43:07 np0005588919 nova_compute[225855]: 2026-01-20 14:43:07.935 225859 DEBUG oslo_concurrency.lockutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:43:07 np0005588919 nova_compute[225855]: 2026-01-20 14:43:07.937 225859 INFO nova.compute.manager [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Terminating instance
Jan 20 09:43:07 np0005588919 nova_compute[225855]: 2026-01-20 14:43:07.939 225859 DEBUG nova.compute.manager [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 09:43:07 np0005588919 kernel: tap349b1d10-0b (unregistering): left promiscuous mode
Jan 20 09:43:07 np0005588919 NetworkManager[49104]: <info>  [1768920187.9905] device (tap349b1d10-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:07.999 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:43:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:08Z|00279|binding|INFO|Releasing lport 349b1d10-0b06-4025-80fd-4861bd487a43 from this chassis (sb_readonly=0)
Jan 20 09:43:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:08Z|00280|binding|INFO|Setting lport 349b1d10-0b06-4025-80fd-4861bd487a43 down in Southbound
Jan 20 09:43:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:08Z|00281|binding|INFO|Removing iface tap349b1d10-0b ovn-installed in OVS
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.005 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:f6:26 10.100.0.12'], port_security=['fa:16:3e:dd:f6:26 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '504acd93-cd55-496e-a85f-30e811f827d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b95747114ab4043b93a260387199c91', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f18b0222-78a5-4c37-8065-772dbe5c63e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80e2aa5b-ecb8-4e93-992f-baaef718dd34, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=349b1d10-0b06-4025-80fd-4861bd487a43) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.006 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 349b1d10-0b06-4025-80fd-4861bd487a43 in datapath b36e9cab-12c6-4a09-9aab-ef2679d875ba unbound from our chassis
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.008 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b36e9cab-12c6-4a09-9aab-ef2679d875ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.009 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[200f21cd-d85c-4224-9ec4-8294749854df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.009 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba namespace which is not needed anymore
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.022 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:43:08 np0005588919 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Jan 20 09:43:08 np0005588919 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004b.scope: Consumed 14.487s CPU time.
Jan 20 09:43:08 np0005588919 systemd-machined[194361]: Machine qemu-32-instance-0000004b terminated.
Jan 20 09:43:08 np0005588919 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[255521]: [NOTICE]   (255528) : haproxy version is 2.8.14-c23fe91
Jan 20 09:43:08 np0005588919 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[255521]: [NOTICE]   (255528) : path to executable is /usr/sbin/haproxy
Jan 20 09:43:08 np0005588919 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[255521]: [WARNING]  (255528) : Exiting Master process...
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.167 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:08 np0005588919 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[255521]: [ALERT]    (255528) : Current worker (255530) exited with code 143 (Terminated)
Jan 20 09:43:08 np0005588919 neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba[255521]: [WARNING]  (255528) : All workers exited. Exiting... (0)
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.173 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:08 np0005588919 systemd[1]: libpod-a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805.scope: Deactivated successfully.
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.178 225859 INFO nova.virt.libvirt.driver [-] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Instance destroyed successfully.#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.178 225859 DEBUG nova.objects.instance [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lazy-loading 'resources' on Instance uuid 504acd93-cd55-496e-a85f-30e811f827d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:08 np0005588919 podman[256943]: 2026-01-20 14:43:08.179527393 +0000 UTC m=+0.055816551 container died a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.194 225859 DEBUG nova.virt.libvirt.vif [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:42:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1822690739',display_name='tempest-ListServerFiltersTestJSON-instance-1822690739',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1822690739',id=75,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:42:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b95747114ab4043b93a260387199c91',ramdisk_id='',reservation_id='r-c4vxbkd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-2126845308',owner_user_name='tempest-ListServerFiltersTestJSON-2126845308-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:42:26Z,user_data=None,user_id='ff99fc8eda0640928c6e82981dacb266',uuid=504acd93-cd55-496e-a85f-30e811f827d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "349b1d10-0b06-4025-80fd-4861bd487a43", "address": "fa:16:3e:dd:f6:26", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap349b1d10-0b", "ovs_interfaceid": "349b1d10-0b06-4025-80fd-4861bd487a43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.195 225859 DEBUG nova.network.os_vif_util [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converting VIF {"id": "349b1d10-0b06-4025-80fd-4861bd487a43", "address": "fa:16:3e:dd:f6:26", "network": {"id": "b36e9cab-12c6-4a09-9aab-ef2679d875ba", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-432532406-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b95747114ab4043b93a260387199c91", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap349b1d10-0b", "ovs_interfaceid": "349b1d10-0b06-4025-80fd-4861bd487a43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.196 225859 DEBUG nova.network.os_vif_util [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:f6:26,bridge_name='br-int',has_traffic_filtering=True,id=349b1d10-0b06-4025-80fd-4861bd487a43,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap349b1d10-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.196 225859 DEBUG os_vif [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:f6:26,bridge_name='br-int',has_traffic_filtering=True,id=349b1d10-0b06-4025-80fd-4861bd487a43,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap349b1d10-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:43:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:08.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.199 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.199 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap349b1d10-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.201 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.203 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:08 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805-userdata-shm.mount: Deactivated successfully.
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.206 225859 INFO os_vif [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:f6:26,bridge_name='br-int',has_traffic_filtering=True,id=349b1d10-0b06-4025-80fd-4861bd487a43,network=Network(b36e9cab-12c6-4a09-9aab-ef2679d875ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap349b1d10-0b')#033[00m
Jan 20 09:43:08 np0005588919 systemd[1]: var-lib-containers-storage-overlay-bd3ad48162a1b4dec3cf75f42139906849c8a6a6a6b10f13149b76909d80e15f-merged.mount: Deactivated successfully.
Jan 20 09:43:08 np0005588919 podman[256943]: 2026-01-20 14:43:08.220674989 +0000 UTC m=+0.096964147 container cleanup a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:43:08 np0005588919 systemd[1]: libpod-conmon-a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805.scope: Deactivated successfully.
Jan 20 09:43:08 np0005588919 podman[256999]: 2026-01-20 14:43:08.288016636 +0000 UTC m=+0.044546593 container remove a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.293 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ddbe5786-6f48-4e9d-ac70-44ef7f7a3a6d]: (4, ('Tue Jan 20 02:43:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba (a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805)\na0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805\nTue Jan 20 02:43:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba (a0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805)\na0f03681d47f02379595b6b5e8bf0975841c981497c7672805924261893db805\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.295 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3902b087-f641-44ef-a25e-06478f1e8d0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.296 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb36e9cab-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:08 np0005588919 kernel: tapb36e9cab-10: left promiscuous mode
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.298 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.311 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.313 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4db99a-a64f-46ab-8393-d41f3dff63c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.331 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8ffea9b2-2f6d-4378-827b-0113f7f0b432]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.333 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[08c46e92-006f-4db2-b84f-e1a10269830a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.348 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bb352566-0b2f-4b93-a379-f818e2b2d1f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518397, 'reachable_time': 22824, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257017, 'error': None, 'target': 'ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.351 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b36e9cab-12c6-4a09-9aab-ef2679d875ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:43:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:08.351 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[be58979c-eb33-44cf-bb16-ca19937b45e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:08 np0005588919 systemd[1]: run-netns-ovnmeta\x2db36e9cab\x2d12c6\x2d4a09\x2d9aab\x2def2679d875ba.mount: Deactivated successfully.
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.382 225859 DEBUG nova.compute.manager [req-fca3f18b-19d4-4e59-b9cf-6920f7d0ff38 req-0f32bb6b-8d4f-4096-8657-f6d072b798d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Received event network-vif-unplugged-349b1d10-0b06-4025-80fd-4861bd487a43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.383 225859 DEBUG oslo_concurrency.lockutils [req-fca3f18b-19d4-4e59-b9cf-6920f7d0ff38 req-0f32bb6b-8d4f-4096-8657-f6d072b798d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "504acd93-cd55-496e-a85f-30e811f827d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.383 225859 DEBUG oslo_concurrency.lockutils [req-fca3f18b-19d4-4e59-b9cf-6920f7d0ff38 req-0f32bb6b-8d4f-4096-8657-f6d072b798d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.383 225859 DEBUG oslo_concurrency.lockutils [req-fca3f18b-19d4-4e59-b9cf-6920f7d0ff38 req-0f32bb6b-8d4f-4096-8657-f6d072b798d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.384 225859 DEBUG nova.compute.manager [req-fca3f18b-19d4-4e59-b9cf-6920f7d0ff38 req-0f32bb6b-8d4f-4096-8657-f6d072b798d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] No waiting events found dispatching network-vif-unplugged-349b1d10-0b06-4025-80fd-4861bd487a43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.384 225859 DEBUG nova.compute.manager [req-fca3f18b-19d4-4e59-b9cf-6920f7d0ff38 req-0f32bb6b-8d4f-4096-8657-f6d072b798d1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Received event network-vif-unplugged-349b1d10-0b06-4025-80fd-4861bd487a43 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.587 225859 INFO nova.virt.libvirt.driver [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Deleting instance files /var/lib/nova/instances/504acd93-cd55-496e-a85f-30e811f827d4_del#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.589 225859 INFO nova.virt.libvirt.driver [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Deletion of /var/lib/nova/instances/504acd93-cd55-496e-a85f-30e811f827d4_del complete#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.654 225859 INFO nova.compute.manager [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Took 0.71 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.655 225859 DEBUG oslo.service.loopingcall [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.655 225859 DEBUG nova.compute.manager [-] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:43:08 np0005588919 nova_compute[225855]: 2026-01-20 14:43:08.656 225859 DEBUG nova.network.neutron [-] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:43:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:09.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:09 np0005588919 nova_compute[225855]: 2026-01-20 14:43:09.548 225859 DEBUG nova.network.neutron [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Successfully updated port: 221681ce-86ed-410e-8ca2-52951142fede _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:43:09 np0005588919 nova_compute[225855]: 2026-01-20 14:43:09.568 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "refresh_cache-cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:43:09 np0005588919 nova_compute[225855]: 2026-01-20 14:43:09.569 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquired lock "refresh_cache-cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:43:09 np0005588919 nova_compute[225855]: 2026-01-20 14:43:09.569 225859 DEBUG nova.network.neutron [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:43:09 np0005588919 nova_compute[225855]: 2026-01-20 14:43:09.656 225859 DEBUG nova.network.neutron [-] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:09 np0005588919 nova_compute[225855]: 2026-01-20 14:43:09.685 225859 INFO nova.compute.manager [-] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Took 1.03 seconds to deallocate network for instance.#033[00m
Jan 20 09:43:09 np0005588919 nova_compute[225855]: 2026-01-20 14:43:09.737 225859 DEBUG oslo_concurrency.lockutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:09 np0005588919 nova_compute[225855]: 2026-01-20 14:43:09.738 225859 DEBUG oslo_concurrency.lockutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:09 np0005588919 nova_compute[225855]: 2026-01-20 14:43:09.799 225859 DEBUG oslo_concurrency.processutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:09 np0005588919 nova_compute[225855]: 2026-01-20 14:43:09.920 225859 DEBUG nova.network.neutron [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:43:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:10.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:10 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1201707774' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.231 225859 DEBUG oslo_concurrency.processutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.238 225859 DEBUG nova.compute.provider_tree [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.263 225859 DEBUG nova.scheduler.client.report [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.296 225859 DEBUG oslo_concurrency.lockutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.300 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.332 225859 INFO nova.scheduler.client.report [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Deleted allocations for instance 504acd93-cd55-496e-a85f-30e811f827d4#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.423 225859 DEBUG oslo_concurrency.lockutils [None req-31fee689-1eb0-4822-b907-59dd8bf5bae2 ff99fc8eda0640928c6e82981dacb266 4b95747114ab4043b93a260387199c91 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.490s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.500 225859 DEBUG nova.compute.manager [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Received event network-vif-plugged-349b1d10-0b06-4025-80fd-4861bd487a43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.500 225859 DEBUG oslo_concurrency.lockutils [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "504acd93-cd55-496e-a85f-30e811f827d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.501 225859 DEBUG oslo_concurrency.lockutils [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.501 225859 DEBUG oslo_concurrency.lockutils [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "504acd93-cd55-496e-a85f-30e811f827d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.501 225859 DEBUG nova.compute.manager [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] No waiting events found dispatching network-vif-plugged-349b1d10-0b06-4025-80fd-4861bd487a43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.502 225859 WARNING nova.compute.manager [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Received unexpected event network-vif-plugged-349b1d10-0b06-4025-80fd-4861bd487a43 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.502 225859 DEBUG nova.compute.manager [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Received event network-changed-221681ce-86ed-410e-8ca2-52951142fede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.502 225859 DEBUG nova.compute.manager [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Refreshing instance network info cache due to event network-changed-221681ce-86ed-410e-8ca2-52951142fede. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:43:10 np0005588919 nova_compute[225855]: 2026-01-20 14:43:10.502 225859 DEBUG oslo_concurrency.lockutils [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:43:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:11.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.548 225859 DEBUG nova.network.neutron [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Updating instance_info_cache with network_info: [{"id": "221681ce-86ed-410e-8ca2-52951142fede", "address": "fa:16:3e:eb:87:b2", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap221681ce-86", "ovs_interfaceid": "221681ce-86ed-410e-8ca2-52951142fede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.573 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Releasing lock "refresh_cache-cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.574 225859 DEBUG nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Instance network_info: |[{"id": "221681ce-86ed-410e-8ca2-52951142fede", "address": "fa:16:3e:eb:87:b2", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap221681ce-86", "ovs_interfaceid": "221681ce-86ed-410e-8ca2-52951142fede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.574 225859 DEBUG oslo_concurrency.lockutils [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.575 225859 DEBUG nova.network.neutron [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Refreshing network info cache for port 221681ce-86ed-410e-8ca2-52951142fede _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.578 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Start _get_guest_xml network_info=[{"id": "221681ce-86ed-410e-8ca2-52951142fede", "address": "fa:16:3e:eb:87:b2", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap221681ce-86", "ovs_interfaceid": "221681ce-86ed-410e-8ca2-52951142fede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.583 225859 WARNING nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.590 225859 DEBUG nova.virt.libvirt.host [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:43:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.591 225859 DEBUG nova.virt.libvirt.host [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.600 225859 DEBUG nova.virt.libvirt.host [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.600 225859 DEBUG nova.virt.libvirt.host [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.601 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.602 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.602 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.603 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.603 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.603 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.604 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.604 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.605 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.605 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.605 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.606 225859 DEBUG nova.virt.hardware [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.609 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:11 np0005588919 nova_compute[225855]: 2026-01-20 14:43:11.654 225859 DEBUG nova.compute.manager [req-41984bb9-b218-4f0e-b367-62ac0cc7320e req-420d6751-7652-4089-8c3a-1b03d978b978 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Received event network-vif-deleted-349b1d10-0b06-4025-80fd-4861bd487a43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:43:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3879070571' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:43:12 np0005588919 nova_compute[225855]: 2026-01-20 14:43:12.028 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:12 np0005588919 nova_compute[225855]: 2026-01-20 14:43:12.062 225859 DEBUG nova.storage.rbd_utils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:12 np0005588919 nova_compute[225855]: 2026-01-20 14:43:12.067 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:12.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:12 np0005588919 nova_compute[225855]: 2026-01-20 14:43:12.276 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920177.2757256, 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:12 np0005588919 nova_compute[225855]: 2026-01-20 14:43:12.278 225859 INFO nova.compute.manager [-] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:43:12 np0005588919 nova_compute[225855]: 2026-01-20 14:43:12.311 225859 DEBUG nova.compute.manager [None req-c97783c1-d633-4d0d-9b04-906461cf5274 - - - - - -] [instance: 69cc4cf9-dfe3-44cb-b811-0300b5ccd66b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:43:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1554951386' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.044 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.977s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.046 225859 DEBUG nova.virt.libvirt.vif [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-704404998',display_name='tempest-MultipleCreateTestJSON-server-704404998-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-704404998-1',id=80,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a3e022a35f604df2bbc885e498b1e206',ramdisk_id='',reservation_id='r-0enj9v31',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-164394330',owner_user_name='tempest-MultipleCreateT
estJSON-164394330-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:43:05Z,user_data=None,user_id='aa2e7857e85f483eb0d162e2ee8c2e2c',uuid=cd7f8cdc-1467-4d67-b60b-dd2ee8707b09,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "221681ce-86ed-410e-8ca2-52951142fede", "address": "fa:16:3e:eb:87:b2", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap221681ce-86", "ovs_interfaceid": "221681ce-86ed-410e-8ca2-52951142fede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.046 225859 DEBUG nova.network.os_vif_util [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converting VIF {"id": "221681ce-86ed-410e-8ca2-52951142fede", "address": "fa:16:3e:eb:87:b2", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap221681ce-86", "ovs_interfaceid": "221681ce-86ed-410e-8ca2-52951142fede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.047 225859 DEBUG nova.network.os_vif_util [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:87:b2,bridge_name='br-int',has_traffic_filtering=True,id=221681ce-86ed-410e-8ca2-52951142fede,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap221681ce-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.049 225859 DEBUG nova.objects.instance [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lazy-loading 'pci_devices' on Instance uuid cd7f8cdc-1467-4d67-b60b-dd2ee8707b09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.063 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <uuid>cd7f8cdc-1467-4d67-b60b-dd2ee8707b09</uuid>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <name>instance-00000050</name>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <nova:name>tempest-MultipleCreateTestJSON-server-704404998-1</nova:name>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:43:11</nova:creationTime>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <nova:user uuid="aa2e7857e85f483eb0d162e2ee8c2e2c">tempest-MultipleCreateTestJSON-164394330-project-member</nova:user>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <nova:project uuid="a3e022a35f604df2bbc885e498b1e206">tempest-MultipleCreateTestJSON-164394330</nova:project>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <nova:port uuid="221681ce-86ed-410e-8ca2-52951142fede">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <entry name="serial">cd7f8cdc-1467-4d67-b60b-dd2ee8707b09</entry>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <entry name="uuid">cd7f8cdc-1467-4d67-b60b-dd2ee8707b09</entry>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk.config">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:eb:87:b2"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <target dev="tap221681ce-86"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09/console.log" append="off"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:43:13 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:43:13 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:43:13 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:43:13 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.064 225859 DEBUG nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Preparing to wait for external event network-vif-plugged-221681ce-86ed-410e-8ca2-52951142fede prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.064 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.065 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.065 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.066 225859 DEBUG nova.virt.libvirt.vif [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-704404998',display_name='tempest-MultipleCreateTestJSON-server-704404998-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-704404998-1',id=80,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a3e022a35f604df2bbc885e498b1e206',ramdisk_id='',reservation_id='r-0enj9v31',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-164394330',owner_user_name='tempest-MultipleCreateTestJSON-164394330-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:43:05Z,user_data=None,user_id='aa2e7857e85f483eb0d162e2ee8c2e2c',uuid=cd7f8cdc-1467-4d67-b60b-dd2ee8707b09,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "221681ce-86ed-410e-8ca2-52951142fede", "address": "fa:16:3e:eb:87:b2", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap221681ce-86", "ovs_interfaceid": "221681ce-86ed-410e-8ca2-52951142fede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.067 225859 DEBUG nova.network.os_vif_util [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converting VIF {"id": "221681ce-86ed-410e-8ca2-52951142fede", "address": "fa:16:3e:eb:87:b2", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap221681ce-86", "ovs_interfaceid": "221681ce-86ed-410e-8ca2-52951142fede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.068 225859 DEBUG nova.network.os_vif_util [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:87:b2,bridge_name='br-int',has_traffic_filtering=True,id=221681ce-86ed-410e-8ca2-52951142fede,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap221681ce-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.068 225859 DEBUG os_vif [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:87:b2,bridge_name='br-int',has_traffic_filtering=True,id=221681ce-86ed-410e-8ca2-52951142fede,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap221681ce-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.069 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.070 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.071 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.074 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.075 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap221681ce-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.075 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap221681ce-86, col_values=(('external_ids', {'iface-id': '221681ce-86ed-410e-8ca2-52951142fede', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:87:b2', 'vm-uuid': 'cd7f8cdc-1467-4d67-b60b-dd2ee8707b09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.077 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:13 np0005588919 NetworkManager[49104]: <info>  [1768920193.0781] manager: (tap221681ce-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.080 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.083 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.085 225859 INFO os_vif [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:87:b2,bridge_name='br-int',has_traffic_filtering=True,id=221681ce-86ed-410e-8ca2-52951142fede,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap221681ce-86')#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.134 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.134 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.135 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] No VIF found with MAC fa:16:3e:eb:87:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.135 225859 INFO nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Using config drive#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.163 225859 DEBUG nova.storage.rbd_utils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:43:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:13.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.626 225859 DEBUG nova.network.neutron [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Updated VIF entry in instance network info cache for port 221681ce-86ed-410e-8ca2-52951142fede. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.627 225859 DEBUG nova.network.neutron [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Updating instance_info_cache with network_info: [{"id": "221681ce-86ed-410e-8ca2-52951142fede", "address": "fa:16:3e:eb:87:b2", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap221681ce-86", "ovs_interfaceid": "221681ce-86ed-410e-8ca2-52951142fede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.659 225859 DEBUG oslo_concurrency.lockutils [req-d6d7e4f8-914b-4f70-af34-9910e0e37aaf req-840959d0-7393-4732-9287-c4c366e266a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.767 225859 INFO nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Creating config drive at /var/lib/nova/instances/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09/disk.config#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.772 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_2ciybvh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:13 np0005588919 nova_compute[225855]: 2026-01-20 14:43:13.918 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_2ciybvh" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.000 225859 DEBUG nova.storage.rbd_utils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] rbd image cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.004 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09/disk.config cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:14.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.273 225859 DEBUG oslo_concurrency.processutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09/disk.config cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.269s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.274 225859 INFO nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Deleting local config drive /var/lib/nova/instances/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09/disk.config because it was imported into RBD.#033[00m
Jan 20 09:43:14 np0005588919 kernel: tap221681ce-86: entered promiscuous mode
Jan 20 09:43:14 np0005588919 NetworkManager[49104]: <info>  [1768920194.3202] manager: (tap221681ce-86): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Jan 20 09:43:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:14Z|00282|binding|INFO|Claiming lport 221681ce-86ed-410e-8ca2-52951142fede for this chassis.
Jan 20 09:43:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:14Z|00283|binding|INFO|221681ce-86ed-410e-8ca2-52951142fede: Claiming fa:16:3e:eb:87:b2 10.100.0.10
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.320 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.332 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:87:b2 10.100.0.10'], port_security=['fa:16:3e:eb:87:b2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'cd7f8cdc-1467-4d67-b60b-dd2ee8707b09', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a3e022a35f604df2bbc885e498b1e206', 'neutron:revision_number': '2', 'neutron:security_group_ids': '885819b7-5060-4b73-ad54-3f31f821195c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89e607b1-9e39-47f0-8180-8aaef3a2a0e9, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=221681ce-86ed-410e-8ca2-52951142fede) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.333 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 221681ce-86ed-410e-8ca2-52951142fede in datapath 3e260ad9-fcf1-432b-b71b-b943d4249b65 bound to our chassis#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.334 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3e260ad9-fcf1-432b-b71b-b943d4249b65#033[00m
Jan 20 09:43:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:14Z|00284|binding|INFO|Setting lport 221681ce-86ed-410e-8ca2-52951142fede ovn-installed in OVS
Jan 20 09:43:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:14Z|00285|binding|INFO|Setting lport 221681ce-86ed-410e-8ca2-52951142fede up in Southbound
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.340 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.343 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.346 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4b261033-606a-4277-bfcd-6638aa951dbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.347 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3e260ad9-f1 in ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:43:14 np0005588919 systemd-udevd[257181]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.349 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3e260ad9-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.349 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8d3ac578-6f61-4560-b2d6-23251ed410d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.350 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1916203e-0401-40c8-8c17-503bbc51bf49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 systemd-machined[194361]: New machine qemu-34-instance-00000050.
Jan 20 09:43:14 np0005588919 NetworkManager[49104]: <info>  [1768920194.3601] device (tap221681ce-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:43:14 np0005588919 NetworkManager[49104]: <info>  [1768920194.3608] device (tap221681ce-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.362 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[977890a5-880e-45b2-bb9f-95d5598c8093]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 systemd[1]: Started Virtual Machine qemu-34-instance-00000050.
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.388 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d66a698a-ca8a-47f6-9a1f-dfef8facdba1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.428 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7cc9d9-1ded-4ca4-a096-1af2a3b7beaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.436 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d7abbb25-b1eb-4620-af91-475bf71ae319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 NetworkManager[49104]: <info>  [1768920194.4378] manager: (tap3e260ad9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Jan 20 09:43:14 np0005588919 systemd-udevd[257185]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.474 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[611fe7f2-8524-4944-9306-ba7480e06c67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.477 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[95c29bc0-4f5b-4155-827e-8e7e8da9a76e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 NetworkManager[49104]: <info>  [1768920194.5045] device (tap3e260ad9-f0): carrier: link connected
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.509 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[69b6d08c-db6d-40d5-b02b-bd1fa6fa15a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.524 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0b941170-f71f-4e82-8fb1-4f74dc9c5dec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e260ad9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:13:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523343, 'reachable_time': 40244, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257214, 'error': None, 'target': 'ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.536 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[021362b0-3b13-46b2-8bcf-3d9bd0c832dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:134a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523343, 'tstamp': 523343}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257215, 'error': None, 'target': 'ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.551 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c73fe823-de3d-4106-8785-e2786622c254]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e260ad9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:13:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523343, 'reachable_time': 40244, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 257216, 'error': None, 'target': 'ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.582 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[08dea712-17fe-44f4-bc33-9b5cfe6b8776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.640 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[286fbc30-7881-4459-83e1-601e2d48c0e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.642 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e260ad9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.642 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.643 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e260ad9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:14 np0005588919 NetworkManager[49104]: <info>  [1768920194.6452] manager: (tap3e260ad9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.644 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588919 kernel: tap3e260ad9-f0: entered promiscuous mode
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.649 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3e260ad9-f0, col_values=(('external_ids', {'iface-id': '2b7c295d-f074-4cfb-aca0-08946126ddbc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:14Z|00286|binding|INFO|Releasing lport 2b7c295d-f074-4cfb-aca0-08946126ddbc from this chassis (sb_readonly=0)
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.665 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.667 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.668 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e260ad9-fcf1-432b-b71b-b943d4249b65.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e260ad9-fcf1-432b-b71b-b943d4249b65.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.669 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[74af6530-6e4c-4c1a-b301-2e9465f4d6d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.670 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-3e260ad9-fcf1-432b-b71b-b943d4249b65
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/3e260ad9-fcf1-432b-b71b-b943d4249b65.pid.haproxy
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 3e260ad9-fcf1-432b-b71b-b943d4249b65
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:43:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:14.672 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'env', 'PROCESS_TAG=haproxy-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3e260ad9-fcf1-432b-b71b-b943d4249b65.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.704 225859 DEBUG nova.compute.manager [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Received event network-vif-plugged-221681ce-86ed-410e-8ca2-52951142fede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.704 225859 DEBUG oslo_concurrency.lockutils [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.704 225859 DEBUG oslo_concurrency.lockutils [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.705 225859 DEBUG oslo_concurrency.lockutils [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.705 225859 DEBUG nova.compute.manager [req-574f4a1f-765d-44fa-8b09-01c1c5be9a03 req-976d640a-57e5-4146-9da6-ecadb28276e3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Processing event network-vif-plugged-221681ce-86ed-410e-8ca2-52951142fede _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.907 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920194.9065466, cd7f8cdc-1467-4d67-b60b-dd2ee8707b09 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.907 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] VM Started (Lifecycle Event)#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.910 225859 DEBUG nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.914 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.917 225859 INFO nova.virt.libvirt.driver [-] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Instance spawned successfully.#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.918 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.931 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.936 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.940 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.941 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.941 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.942 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.942 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.943 225859 DEBUG nova.virt.libvirt.driver [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.953 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.954 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920194.9067376, cd7f8cdc-1467-4d67-b60b-dd2ee8707b09 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.954 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.973 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.977 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920194.9127798, cd7f8cdc-1467-4d67-b60b-dd2ee8707b09 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.977 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.995 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:14 np0005588919 nova_compute[225855]: 2026-01-20 14:43:14.998 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:43:15 np0005588919 nova_compute[225855]: 2026-01-20 14:43:15.004 225859 INFO nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Took 9.39 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:43:15 np0005588919 nova_compute[225855]: 2026-01-20 14:43:15.004 225859 DEBUG nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:15.028 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:15 np0005588919 nova_compute[225855]: 2026-01-20 14:43:15.028 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:15 np0005588919 podman[257288]: 2026-01-20 14:43:15.031818511 +0000 UTC m=+0.051844519 container create c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:43:15 np0005588919 nova_compute[225855]: 2026-01-20 14:43:15.033 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:43:15 np0005588919 nova_compute[225855]: 2026-01-20 14:43:15.070 225859 INFO nova.compute.manager [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Took 10.45 seconds to build instance.#033[00m
Jan 20 09:43:15 np0005588919 systemd[1]: Started libpod-conmon-c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5.scope.
Jan 20 09:43:15 np0005588919 nova_compute[225855]: 2026-01-20 14:43:15.085 225859 DEBUG oslo_concurrency.lockutils [None req-1d74c129-e009-471c-8543-d5e591c353cb aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:15 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:43:15 np0005588919 podman[257288]: 2026-01-20 14:43:15.003993153 +0000 UTC m=+0.024019181 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:43:15 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90a6cff7ad5f6f51949488ae9006978cd1a88ea356309dfcbb35378ee1a81275/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:43:15 np0005588919 podman[257288]: 2026-01-20 14:43:15.114249646 +0000 UTC m=+0.134275654 container init c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:43:15 np0005588919 podman[257288]: 2026-01-20 14:43:15.120629527 +0000 UTC m=+0.140655535 container start c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:43:15 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[257303]: [NOTICE]   (257307) : New worker (257309) forked
Jan 20 09:43:15 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[257303]: [NOTICE]   (257307) : Loading success.
Jan 20 09:43:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:15.197 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:43:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:15.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:15 np0005588919 nova_compute[225855]: 2026-01-20 14:43:15.302 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:43:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:16.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:43:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:16.404 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:16.404 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:16.405 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:16 np0005588919 nova_compute[225855]: 2026-01-20 14:43:16.857 225859 DEBUG nova.compute.manager [req-1fcc8e9a-5e24-4c69-8176-77db6ff7bba8 req-3d31ebe3-3855-4c23-95de-edbd17dc3bcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Received event network-vif-plugged-221681ce-86ed-410e-8ca2-52951142fede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:16 np0005588919 nova_compute[225855]: 2026-01-20 14:43:16.857 225859 DEBUG oslo_concurrency.lockutils [req-1fcc8e9a-5e24-4c69-8176-77db6ff7bba8 req-3d31ebe3-3855-4c23-95de-edbd17dc3bcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:16 np0005588919 nova_compute[225855]: 2026-01-20 14:43:16.858 225859 DEBUG oslo_concurrency.lockutils [req-1fcc8e9a-5e24-4c69-8176-77db6ff7bba8 req-3d31ebe3-3855-4c23-95de-edbd17dc3bcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:16 np0005588919 nova_compute[225855]: 2026-01-20 14:43:16.858 225859 DEBUG oslo_concurrency.lockutils [req-1fcc8e9a-5e24-4c69-8176-77db6ff7bba8 req-3d31ebe3-3855-4c23-95de-edbd17dc3bcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:16 np0005588919 nova_compute[225855]: 2026-01-20 14:43:16.858 225859 DEBUG nova.compute.manager [req-1fcc8e9a-5e24-4c69-8176-77db6ff7bba8 req-3d31ebe3-3855-4c23-95de-edbd17dc3bcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] No waiting events found dispatching network-vif-plugged-221681ce-86ed-410e-8ca2-52951142fede pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:16 np0005588919 nova_compute[225855]: 2026-01-20 14:43:16.859 225859 WARNING nova.compute.manager [req-1fcc8e9a-5e24-4c69-8176-77db6ff7bba8 req-3d31ebe3-3855-4c23-95de-edbd17dc3bcf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Received unexpected event network-vif-plugged-221681ce-86ed-410e-8ca2-52951142fede for instance with vm_state active and task_state None.#033[00m
Jan 20 09:43:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:17.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.078 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.199 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:18.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.258 225859 DEBUG oslo_concurrency.lockutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.258 225859 DEBUG oslo_concurrency.lockutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.258 225859 DEBUG oslo_concurrency.lockutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.259 225859 DEBUG oslo_concurrency.lockutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.259 225859 DEBUG oslo_concurrency.lockutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.260 225859 INFO nova.compute.manager [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Terminating instance#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.261 225859 DEBUG nova.compute.manager [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:43:18 np0005588919 kernel: tap221681ce-86 (unregistering): left promiscuous mode
Jan 20 09:43:18 np0005588919 NetworkManager[49104]: <info>  [1768920198.3024] device (tap221681ce-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.311 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:18 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:18Z|00287|binding|INFO|Releasing lport 221681ce-86ed-410e-8ca2-52951142fede from this chassis (sb_readonly=0)
Jan 20 09:43:18 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:18Z|00288|binding|INFO|Setting lport 221681ce-86ed-410e-8ca2-52951142fede down in Southbound
Jan 20 09:43:18 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:18Z|00289|binding|INFO|Removing iface tap221681ce-86 ovn-installed in OVS
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.314 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.322 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:87:b2 10.100.0.10'], port_security=['fa:16:3e:eb:87:b2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'cd7f8cdc-1467-4d67-b60b-dd2ee8707b09', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a3e022a35f604df2bbc885e498b1e206', 'neutron:revision_number': '4', 'neutron:security_group_ids': '885819b7-5060-4b73-ad54-3f31f821195c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89e607b1-9e39-47f0-8180-8aaef3a2a0e9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=221681ce-86ed-410e-8ca2-52951142fede) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.323 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 221681ce-86ed-410e-8ca2-52951142fede in datapath 3e260ad9-fcf1-432b-b71b-b943d4249b65 unbound from our chassis#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.325 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e260ad9-fcf1-432b-b71b-b943d4249b65, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.326 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1f103673-73e6-427e-b74f-20e0f7779f24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.327 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65 namespace which is not needed anymore#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.341 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:18 np0005588919 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000050.scope: Deactivated successfully.
Jan 20 09:43:18 np0005588919 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000050.scope: Consumed 3.860s CPU time.
Jan 20 09:43:18 np0005588919 systemd-machined[194361]: Machine qemu-34-instance-00000050 terminated.
Jan 20 09:43:18 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:18Z|00290|binding|INFO|Releasing lport 2b7c295d-f074-4cfb-aca0-08946126ddbc from this chassis (sb_readonly=0)
Jan 20 09:43:18 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[257303]: [NOTICE]   (257307) : haproxy version is 2.8.14-c23fe91
Jan 20 09:43:18 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[257303]: [NOTICE]   (257307) : path to executable is /usr/sbin/haproxy
Jan 20 09:43:18 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[257303]: [WARNING]  (257307) : Exiting Master process...
Jan 20 09:43:18 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[257303]: [ALERT]    (257307) : Current worker (257309) exited with code 143 (Terminated)
Jan 20 09:43:18 np0005588919 neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65[257303]: [WARNING]  (257307) : All workers exited. Exiting... (0)
Jan 20 09:43:18 np0005588919 systemd[1]: libpod-c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5.scope: Deactivated successfully.
Jan 20 09:43:18 np0005588919 conmon[257303]: conmon c6d47773a7b5e703ee98 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5.scope/container/memory.events
Jan 20 09:43:18 np0005588919 podman[257346]: 2026-01-20 14:43:18.474601746 +0000 UTC m=+0.050490411 container died c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:43:18 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5-userdata-shm.mount: Deactivated successfully.
Jan 20 09:43:18 np0005588919 systemd[1]: var-lib-containers-storage-overlay-90a6cff7ad5f6f51949488ae9006978cd1a88ea356309dfcbb35378ee1a81275-merged.mount: Deactivated successfully.
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.518 225859 INFO nova.virt.libvirt.driver [-] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Instance destroyed successfully.#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.520 225859 DEBUG nova.objects.instance [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lazy-loading 'resources' on Instance uuid cd7f8cdc-1467-4d67-b60b-dd2ee8707b09 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.527 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:18 np0005588919 podman[257346]: 2026-01-20 14:43:18.528645437 +0000 UTC m=+0.104534092 container cleanup c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.534 225859 DEBUG nova.virt.libvirt.vif [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-704404998',display_name='tempest-MultipleCreateTestJSON-server-704404998-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-704404998-1',id=80,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:43:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a3e022a35f604df2bbc885e498b1e206',ramdisk_id='',reservation_id='r-0enj9v31',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-164394330',owner_user_name='tempest-MultipleCreateTestJSON-164394330-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:43:15Z,user_data=None,user_id='aa2e7857e85f483eb0d162e2ee8c2e2c',uuid=cd7f8cdc-1467-4d67-b60b-dd2ee8707b09,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "221681ce-86ed-410e-8ca2-52951142fede", "address": "fa:16:3e:eb:87:b2", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap221681ce-86", "ovs_interfaceid": "221681ce-86ed-410e-8ca2-52951142fede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.535 225859 DEBUG nova.network.os_vif_util [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converting VIF {"id": "221681ce-86ed-410e-8ca2-52951142fede", "address": "fa:16:3e:eb:87:b2", "network": {"id": "3e260ad9-fcf1-432b-b71b-b943d4249b65", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1425882684-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a3e022a35f604df2bbc885e498b1e206", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap221681ce-86", "ovs_interfaceid": "221681ce-86ed-410e-8ca2-52951142fede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.536 225859 DEBUG nova.network.os_vif_util [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:87:b2,bridge_name='br-int',has_traffic_filtering=True,id=221681ce-86ed-410e-8ca2-52951142fede,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap221681ce-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.536 225859 DEBUG os_vif [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:87:b2,bridge_name='br-int',has_traffic_filtering=True,id=221681ce-86ed-410e-8ca2-52951142fede,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap221681ce-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.538 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.539 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap221681ce-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.540 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:18 np0005588919 systemd[1]: libpod-conmon-c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5.scope: Deactivated successfully.
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.543 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.545 225859 INFO os_vif [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:87:b2,bridge_name='br-int',has_traffic_filtering=True,id=221681ce-86ed-410e-8ca2-52951142fede,network=Network(3e260ad9-fcf1-432b-b71b-b943d4249b65),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap221681ce-86')#033[00m
Jan 20 09:43:18 np0005588919 podman[257381]: 2026-01-20 14:43:18.594631786 +0000 UTC m=+0.042300879 container remove c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.600 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[301afa8f-8001-41da-9f06-b3a59ba7841b]: (4, ('Tue Jan 20 02:43:18 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65 (c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5)\nc6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5\nTue Jan 20 02:43:18 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65 (c6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5)\nc6d47773a7b5e703ee98e9aa55358d1326e7e172882283f62037bdb5729618d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.602 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4786ad77-10d6-4465-bddc-f9348330772c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.602 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e260ad9-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.604 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:18 np0005588919 kernel: tap3e260ad9-f0: left promiscuous mode
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.616 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.618 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a13ecaed-ac1b-4038-bfbd-f0bda06de716]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.630 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e1705f-6428-4484-96e1-a8867e7cc0e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.632 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[29ccf00e-5fc9-4fca-961e-9ef6f8f78f7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.646 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[518ecdde-e626-4f5b-8fe2-8fe2da5613f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523334, 'reachable_time': 24440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257415, 'error': None, 'target': 'ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.648 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3e260ad9-fcf1-432b-b71b-b943d4249b65 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:43:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:18.649 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[830402b6-a4a1-48da-893f-7970c2d7ef87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:18 np0005588919 systemd[1]: run-netns-ovnmeta\x2d3e260ad9\x2dfcf1\x2d432b\x2db71b\x2db943d4249b65.mount: Deactivated successfully.
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.960 225859 INFO nova.virt.libvirt.driver [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Deleting instance files /var/lib/nova/instances/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_del#033[00m
Jan 20 09:43:18 np0005588919 nova_compute[225855]: 2026-01-20 14:43:18.961 225859 INFO nova.virt.libvirt.driver [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Deletion of /var/lib/nova/instances/cd7f8cdc-1467-4d67-b60b-dd2ee8707b09_del complete#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.032 225859 INFO nova.compute.manager [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Took 0.77 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.033 225859 DEBUG oslo.service.loopingcall [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.033 225859 DEBUG nova.compute.manager [-] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.034 225859 DEBUG nova.network.neutron [-] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.051 225859 DEBUG nova.compute.manager [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Received event network-vif-unplugged-221681ce-86ed-410e-8ca2-52951142fede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.051 225859 DEBUG oslo_concurrency.lockutils [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.051 225859 DEBUG oslo_concurrency.lockutils [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.052 225859 DEBUG oslo_concurrency.lockutils [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.052 225859 DEBUG nova.compute.manager [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] No waiting events found dispatching network-vif-unplugged-221681ce-86ed-410e-8ca2-52951142fede pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.053 225859 DEBUG nova.compute.manager [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Received event network-vif-unplugged-221681ce-86ed-410e-8ca2-52951142fede for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.053 225859 DEBUG nova.compute.manager [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Received event network-vif-plugged-221681ce-86ed-410e-8ca2-52951142fede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.053 225859 DEBUG oslo_concurrency.lockutils [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.054 225859 DEBUG oslo_concurrency.lockutils [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.054 225859 DEBUG oslo_concurrency.lockutils [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.055 225859 DEBUG nova.compute.manager [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] No waiting events found dispatching network-vif-plugged-221681ce-86ed-410e-8ca2-52951142fede pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:19 np0005588919 nova_compute[225855]: 2026-01-20 14:43:19.055 225859 WARNING nova.compute.manager [req-cc60b2e4-1f82-4aff-9198-1641d2dd8cf1 req-d5bcb852-3b53-48bf-a499-168961abc425 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Received unexpected event network-vif-plugged-221681ce-86ed-410e-8ca2-52951142fede for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:43:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:19.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:20.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.258 225859 DEBUG nova.network.neutron [-] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.298 225859 INFO nova.compute.manager [-] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Took 1.26 seconds to deallocate network for instance.#033[00m
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.304 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.378 225859 DEBUG oslo_concurrency.lockutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.379 225859 DEBUG oslo_concurrency.lockutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.429 225859 DEBUG oslo_concurrency.processutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.574 225859 DEBUG nova.compute.manager [req-d4c73e9b-d5c8-4879-8e08-5e71b00c48db req-b691813b-509a-4744-9402-04fc9a362da8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Received event network-vif-deleted-221681ce-86ed-410e-8ca2-52951142fede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4067799863' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.870 225859 DEBUG oslo_concurrency.processutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.876 225859 DEBUG nova.compute.provider_tree [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.891 225859 DEBUG nova.scheduler.client.report [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.913 225859 DEBUG oslo_concurrency.lockutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:20 np0005588919 nova_compute[225855]: 2026-01-20 14:43:20.941 225859 INFO nova.scheduler.client.report [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Deleted allocations for instance cd7f8cdc-1467-4d67-b60b-dd2ee8707b09#033[00m
Jan 20 09:43:21 np0005588919 nova_compute[225855]: 2026-01-20 14:43:21.026 225859 DEBUG oslo_concurrency.lockutils [None req-e46dc073-839f-4cc0-bb74-4d40bbb9f890 aa2e7857e85f483eb0d162e2ee8c2e2c a3e022a35f604df2bbc885e498b1e206 - - default default] Lock "cd7f8cdc-1467-4d67-b60b-dd2ee8707b09" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:21.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:22.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:23 np0005588919 nova_compute[225855]: 2026-01-20 14:43:23.176 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920188.1748006, 504acd93-cd55-496e-a85f-30e811f827d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:23 np0005588919 nova_compute[225855]: 2026-01-20 14:43:23.176 225859 INFO nova.compute.manager [-] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:43:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:23.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:23 np0005588919 nova_compute[225855]: 2026-01-20 14:43:23.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:23 np0005588919 nova_compute[225855]: 2026-01-20 14:43:23.368 225859 DEBUG nova.compute.manager [None req-fc1013a5-077e-4589-bfd4-38e59c3fc6ee - - - - - -] [instance: 504acd93-cd55-496e-a85f-30e811f827d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:23 np0005588919 nova_compute[225855]: 2026-01-20 14:43:23.579 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:24.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:25.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:25 np0005588919 nova_compute[225855]: 2026-01-20 14:43:25.306 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:26.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:26 np0005588919 podman[257467]: 2026-01-20 14:43:26.68365389 +0000 UTC m=+0.116277225 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 20 09:43:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:27.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:27 np0005588919 nova_compute[225855]: 2026-01-20 14:43:27.372 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:27 np0005588919 ceph-mgr[82135]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 09:43:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:28.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:28 np0005588919 nova_compute[225855]: 2026-01-20 14:43:28.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:28 np0005588919 nova_compute[225855]: 2026-01-20 14:43:28.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:43:28 np0005588919 nova_compute[225855]: 2026-01-20 14:43:28.358 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:43:28 np0005588919 nova_compute[225855]: 2026-01-20 14:43:28.358 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:28 np0005588919 nova_compute[225855]: 2026-01-20 14:43:28.581 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:29.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:43:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:30.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:43:30 np0005588919 nova_compute[225855]: 2026-01-20 14:43:30.308 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:30 np0005588919 nova_compute[225855]: 2026-01-20 14:43:30.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:31.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:31 np0005588919 nova_compute[225855]: 2026-01-20 14:43:31.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:31 np0005588919 nova_compute[225855]: 2026-01-20 14:43:31.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:43:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:32.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.362 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.363 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.363 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.363 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.364 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:32 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/543489734' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.800 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.967 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.969 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4550MB free_disk=20.967357635498047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.969 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:32 np0005588919 nova_compute[225855]: 2026-01-20 14:43:32.970 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.067 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.067 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:43:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:33.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.515 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920198.514241, cd7f8cdc-1467-4d67-b60b-dd2ee8707b09 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.516 225859 INFO nova.compute.manager [-] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.553 225859 DEBUG nova.compute.manager [None req-24947fce-01c0-43f6-8ce9-6d0a85715a0a - - - - - -] [instance: cd7f8cdc-1467-4d67-b60b-dd2ee8707b09] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.583 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.632 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.712 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.713 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.931 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:43:33 np0005588919 nova_compute[225855]: 2026-01-20 14:43:33.982 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:43:34 np0005588919 nova_compute[225855]: 2026-01-20 14:43:34.036 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:34.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:43:34 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/332974038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:43:34 np0005588919 nova_compute[225855]: 2026-01-20 14:43:34.863 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.827s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:34 np0005588919 nova_compute[225855]: 2026-01-20 14:43:34.870 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:43:34 np0005588919 nova_compute[225855]: 2026-01-20 14:43:34.885 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:43:34 np0005588919 nova_compute[225855]: 2026-01-20 14:43:34.923 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:43:34 np0005588919 nova_compute[225855]: 2026-01-20 14:43:34.923 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.953s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:35 np0005588919 podman[257567]: 2026-01-20 14:43:35.009659695 +0000 UTC m=+0.047733002 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:43:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:35.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:35 np0005588919 nova_compute[225855]: 2026-01-20 14:43:35.310 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:35 np0005588919 nova_compute[225855]: 2026-01-20 14:43:35.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:35 np0005588919 nova_compute[225855]: 2026-01-20 14:43:35.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:35 np0005588919 nova_compute[225855]: 2026-01-20 14:43:35.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:35 np0005588919 nova_compute[225855]: 2026-01-20 14:43:35.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:43:35 np0005588919 nova_compute[225855]: 2026-01-20 14:43:35.381 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:43:35 np0005588919 nova_compute[225855]: 2026-01-20 14:43:35.381 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:43:35 np0005588919 nova_compute[225855]: 2026-01-20 14:43:35.382 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:43:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:36.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:37.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:37 np0005588919 nova_compute[225855]: 2026-01-20 14:43:37.382 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:37 np0005588919 nova_compute[225855]: 2026-01-20 14:43:37.383 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:37 np0005588919 nova_compute[225855]: 2026-01-20 14:43:37.404 225859 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:43:37 np0005588919 nova_compute[225855]: 2026-01-20 14:43:37.579 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:37 np0005588919 nova_compute[225855]: 2026-01-20 14:43:37.579 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:37 np0005588919 nova_compute[225855]: 2026-01-20 14:43:37.585 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:43:37 np0005588919 nova_compute[225855]: 2026-01-20 14:43:37.586 225859 INFO nova.compute.claims [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:43:37 np0005588919 nova_compute[225855]: 2026-01-20 14:43:37.838 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:38.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.297 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.304 225859 DEBUG nova.compute.provider_tree [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.342 225859 DEBUG nova.scheduler.client.report [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.394 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.815s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.395 225859 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.523 225859 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.524 225859 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.556 225859 INFO nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.586 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.603 225859 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.800 225859 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.802 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.803 225859 INFO nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Creating image(s)#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.831 225859 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.861 225859 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.900 225859 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.906 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.987 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.988 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.989 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:38 np0005588919 nova_compute[225855]: 2026-01-20 14:43:38.989 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.021 225859 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.026 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.094 225859 DEBUG nova.policy [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2975742546164cad937d13671d17108a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '28a523cfe06042ff96554913a78e1e3a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:43:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:39.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.290 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.265s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.362 225859 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] resizing rbd image d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.464 225859 DEBUG nova.objects.instance [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lazy-loading 'migration_context' on Instance uuid d6ec5fce-44f2-4c13-b908-c45d7a919b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.483 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.484 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Ensure instance console log exists: /var/lib/nova/instances/d6ec5fce-44f2-4c13-b908-c45d7a919b34/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.484 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.485 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:39 np0005588919 nova_compute[225855]: 2026-01-20 14:43:39.485 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:40.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:40 np0005588919 nova_compute[225855]: 2026-01-20 14:43:40.315 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:41.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:42.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:42 np0005588919 nova_compute[225855]: 2026-01-20 14:43:42.875 225859 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Successfully created port: 69c1a502-414e-4ca7-9aec-488bbb6170b2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:43:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:43.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:43 np0005588919 nova_compute[225855]: 2026-01-20 14:43:43.589 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:44.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:45.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:45 np0005588919 nova_compute[225855]: 2026-01-20 14:43:45.317 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:45 np0005588919 nova_compute[225855]: 2026-01-20 14:43:45.405 225859 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Successfully updated port: 69c1a502-414e-4ca7-9aec-488bbb6170b2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:43:45 np0005588919 nova_compute[225855]: 2026-01-20 14:43:45.433 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "refresh_cache-d6ec5fce-44f2-4c13-b908-c45d7a919b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:43:45 np0005588919 nova_compute[225855]: 2026-01-20 14:43:45.433 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquired lock "refresh_cache-d6ec5fce-44f2-4c13-b908-c45d7a919b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:43:45 np0005588919 nova_compute[225855]: 2026-01-20 14:43:45.434 225859 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:43:45 np0005588919 nova_compute[225855]: 2026-01-20 14:43:45.881 225859 DEBUG nova.compute.manager [req-fd821a2b-dde3-404f-ae5f-005f57bcf5e6 req-c9a1bcdb-f1c7-4d63-b82e-af793ac06f1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Received event network-changed-69c1a502-414e-4ca7-9aec-488bbb6170b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:45 np0005588919 nova_compute[225855]: 2026-01-20 14:43:45.882 225859 DEBUG nova.compute.manager [req-fd821a2b-dde3-404f-ae5f-005f57bcf5e6 req-c9a1bcdb-f1c7-4d63-b82e-af793ac06f1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Refreshing instance network info cache due to event network-changed-69c1a502-414e-4ca7-9aec-488bbb6170b2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:43:45 np0005588919 nova_compute[225855]: 2026-01-20 14:43:45.882 225859 DEBUG oslo_concurrency.lockutils [req-fd821a2b-dde3-404f-ae5f-005f57bcf5e6 req-c9a1bcdb-f1c7-4d63-b82e-af793ac06f1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d6ec5fce-44f2-4c13-b908-c45d7a919b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:43:45 np0005588919 nova_compute[225855]: 2026-01-20 14:43:45.962 225859 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:43:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:43:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:46.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:43:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:47.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.867 225859 DEBUG nova.network.neutron [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Updating instance_info_cache with network_info: [{"id": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "address": "fa:16:3e:18:6f:89", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69c1a502-41", "ovs_interfaceid": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.895 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Releasing lock "refresh_cache-d6ec5fce-44f2-4c13-b908-c45d7a919b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.896 225859 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Instance network_info: |[{"id": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "address": "fa:16:3e:18:6f:89", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69c1a502-41", "ovs_interfaceid": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.897 225859 DEBUG oslo_concurrency.lockutils [req-fd821a2b-dde3-404f-ae5f-005f57bcf5e6 req-c9a1bcdb-f1c7-4d63-b82e-af793ac06f1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d6ec5fce-44f2-4c13-b908-c45d7a919b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.897 225859 DEBUG nova.network.neutron [req-fd821a2b-dde3-404f-ae5f-005f57bcf5e6 req-c9a1bcdb-f1c7-4d63-b82e-af793ac06f1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Refreshing network info cache for port 69c1a502-414e-4ca7-9aec-488bbb6170b2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.902 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Start _get_guest_xml network_info=[{"id": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "address": "fa:16:3e:18:6f:89", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69c1a502-41", "ovs_interfaceid": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.906 225859 WARNING nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.911 225859 DEBUG nova.virt.libvirt.host [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.912 225859 DEBUG nova.virt.libvirt.host [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.915 225859 DEBUG nova.virt.libvirt.host [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.915 225859 DEBUG nova.virt.libvirt.host [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.917 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.918 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.919 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.919 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.920 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.920 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.921 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.921 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.922 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.922 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.923 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.923 225859 DEBUG nova.virt.hardware [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:43:47 np0005588919 nova_compute[225855]: 2026-01-20 14:43:47.928 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:48.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:43:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2437304830' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:43:48 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.440 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:48 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.469 225859 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:48 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.474 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:48 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.591 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:43:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3926692694' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:43:48 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.958 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:48 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.960 225859 DEBUG nova.virt.libvirt.vif [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:43:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1818827013',display_name='tempest-ListServersNegativeTestJSON-server-1818827013-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1818827013-2',id=84,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='28a523cfe06042ff96554913a78e1e3a',ramdisk_id='',reservation_id='r-s8qopxwt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1080060493',owner_user_name=
'tempest-ListServersNegativeTestJSON-1080060493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:43:38Z,user_data=None,user_id='2975742546164cad937d13671d17108a',uuid=d6ec5fce-44f2-4c13-b908-c45d7a919b34,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "address": "fa:16:3e:18:6f:89", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69c1a502-41", "ovs_interfaceid": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:43:48 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.961 225859 DEBUG nova.network.os_vif_util [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converting VIF {"id": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "address": "fa:16:3e:18:6f:89", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69c1a502-41", "ovs_interfaceid": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:48 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.962 225859 DEBUG nova.network.os_vif_util [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:6f:89,bridge_name='br-int',has_traffic_filtering=True,id=69c1a502-414e-4ca7-9aec-488bbb6170b2,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69c1a502-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:48 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.963 225859 DEBUG nova.objects.instance [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lazy-loading 'pci_devices' on Instance uuid d6ec5fce-44f2-4c13-b908-c45d7a919b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:43:48 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.996 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <uuid>d6ec5fce-44f2-4c13-b908-c45d7a919b34</uuid>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <name>instance-00000054</name>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <nova:name>tempest-ListServersNegativeTestJSON-server-1818827013-2</nova:name>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:43:47</nova:creationTime>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <nova:user uuid="2975742546164cad937d13671d17108a">tempest-ListServersNegativeTestJSON-1080060493-project-member</nova:user>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <nova:project uuid="28a523cfe06042ff96554913a78e1e3a">tempest-ListServersNegativeTestJSON-1080060493</nova:project>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <nova:port uuid="69c1a502-414e-4ca7-9aec-488bbb6170b2">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <entry name="serial">d6ec5fce-44f2-4c13-b908-c45d7a919b34</entry>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <entry name="uuid">d6ec5fce-44f2-4c13-b908-c45d7a919b34</entry>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk.config">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:18:6f:89"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:43:48 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:      <target dev="tap69c1a502-41"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:43:49 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/d6ec5fce-44f2-4c13-b908-c45d7a919b34/console.log" append="off"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:43:49 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:43:49 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:43:49 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:43:49 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:43:49 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.997 225859 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Preparing to wait for external event network-vif-plugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.998 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.998 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.998 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.999 225859 DEBUG nova.virt.libvirt.vif [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:43:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1818827013',display_name='tempest-ListServersNegativeTestJSON-server-1818827013-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1818827013-2',id=84,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='28a523cfe06042ff96554913a78e1e3a',ramdisk_id='',reservation_id='r-s8qopxwt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1080060493',owner_
user_name='tempest-ListServersNegativeTestJSON-1080060493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:43:38Z,user_data=None,user_id='2975742546164cad937d13671d17108a',uuid=d6ec5fce-44f2-4c13-b908-c45d7a919b34,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "address": "fa:16:3e:18:6f:89", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69c1a502-41", "ovs_interfaceid": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:48.999 225859 DEBUG nova.network.os_vif_util [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converting VIF {"id": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "address": "fa:16:3e:18:6f:89", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69c1a502-41", "ovs_interfaceid": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.000 225859 DEBUG nova.network.os_vif_util [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:6f:89,bridge_name='br-int',has_traffic_filtering=True,id=69c1a502-414e-4ca7-9aec-488bbb6170b2,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69c1a502-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.000 225859 DEBUG os_vif [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:6f:89,bridge_name='br-int',has_traffic_filtering=True,id=69c1a502-414e-4ca7-9aec-488bbb6170b2,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69c1a502-41') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.001 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.001 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.001 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.005 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.005 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69c1a502-41, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.006 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69c1a502-41, col_values=(('external_ids', {'iface-id': '69c1a502-414e-4ca7-9aec-488bbb6170b2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:18:6f:89', 'vm-uuid': 'd6ec5fce-44f2-4c13-b908-c45d7a919b34'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.007 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:49 np0005588919 NetworkManager[49104]: <info>  [1768920229.0086] manager: (tap69c1a502-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/125)
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.010 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.014 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.015 225859 INFO os_vif [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:6f:89,bridge_name='br-int',has_traffic_filtering=True,id=69c1a502-414e-4ca7-9aec-488bbb6170b2,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69c1a502-41')#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.137 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.138 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.138 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] No VIF found with MAC fa:16:3e:18:6f:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.138 225859 INFO nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Using config drive#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.161 225859 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:49.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.881 225859 INFO nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Creating config drive at /var/lib/nova/instances/d6ec5fce-44f2-4c13-b908-c45d7a919b34/disk.config#033[00m
Jan 20 09:43:49 np0005588919 nova_compute[225855]: 2026-01-20 14:43:49.885 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d6ec5fce-44f2-4c13-b908-c45d7a919b34/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph3e9ufv0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.018 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d6ec5fce-44f2-4c13-b908-c45d7a919b34/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph3e9ufv0" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.050 225859 DEBUG nova.storage.rbd_utils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] rbd image d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.054 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d6ec5fce-44f2-4c13-b908-c45d7a919b34/disk.config d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.241 225859 DEBUG oslo_concurrency.processutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d6ec5fce-44f2-4c13-b908-c45d7a919b34/disk.config d6ec5fce-44f2-4c13-b908-c45d7a919b34_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.242 225859 INFO nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Deleting local config drive /var/lib/nova/instances/d6ec5fce-44f2-4c13-b908-c45d7a919b34/disk.config because it was imported into RBD.#033[00m
Jan 20 09:43:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:50.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:50 np0005588919 kernel: tap69c1a502-41: entered promiscuous mode
Jan 20 09:43:50 np0005588919 NetworkManager[49104]: <info>  [1768920230.2968] manager: (tap69c1a502-41): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.298 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:50Z|00291|binding|INFO|Claiming lport 69c1a502-414e-4ca7-9aec-488bbb6170b2 for this chassis.
Jan 20 09:43:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:50Z|00292|binding|INFO|69c1a502-414e-4ca7-9aec-488bbb6170b2: Claiming fa:16:3e:18:6f:89 10.100.0.10
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.301 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.304 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.319 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:6f:89 10.100.0.10'], port_security=['fa:16:3e:18:6f:89 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd6ec5fce-44f2-4c13-b908-c45d7a919b34', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53d0b281-776f-4682-8aaf-098e1d364008', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28a523cfe06042ff96554913a78e1e3a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1879c269-0854-40a3-8eb9-b61f97d38545', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1afefec-2060-4dfb-acbb-1ce14c3a663c, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=69c1a502-414e-4ca7-9aec-488bbb6170b2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.320 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 69c1a502-414e-4ca7-9aec-488bbb6170b2 in datapath 53d0b281-776f-4682-8aaf-098e1d364008 bound to our chassis#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.322 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 53d0b281-776f-4682-8aaf-098e1d364008#033[00m
Jan 20 09:43:50 np0005588919 systemd-udevd[257965]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.332 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6fa2f397-31d3-4423-b7ca-36604ceb07ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.333 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap53d0b281-71 in ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.335 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap53d0b281-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.335 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[47b0d722-1977-499e-87e2-67934a58fe0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.336 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3825b9-9fb0-4f7a-bac5-9a3b1852ad87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 systemd-machined[194361]: New machine qemu-35-instance-00000054.
Jan 20 09:43:50 np0005588919 NetworkManager[49104]: <info>  [1768920230.3387] device (tap69c1a502-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:43:50 np0005588919 NetworkManager[49104]: <info>  [1768920230.3393] device (tap69c1a502-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.346 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[67421eb7-c009-4442-992b-c5a3f717f187]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 systemd[1]: Started Virtual Machine qemu-35-instance-00000054.
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.371 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a0586ac9-b515-42f4-8d80-cb32f88db466]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.374 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:50Z|00293|binding|INFO|Setting lport 69c1a502-414e-4ca7-9aec-488bbb6170b2 ovn-installed in OVS
Jan 20 09:43:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:50Z|00294|binding|INFO|Setting lport 69c1a502-414e-4ca7-9aec-488bbb6170b2 up in Southbound
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.381 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.400 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[af4c8832-24bd-4a9a-8ef7-cffee8933624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.404 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cd924935-f41d-45dc-aebc-f94f56ef5fc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 NetworkManager[49104]: <info>  [1768920230.4061] manager: (tap53d0b281-70): new Veth device (/org/freedesktop/NetworkManager/Devices/127)
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.434 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[42fe7609-6a4c-4fc0-a512-99d1c417a584]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.437 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[05c5c49b-0c0c-48ab-a93b-e7dfd85c8c85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 NetworkManager[49104]: <info>  [1768920230.4573] device (tap53d0b281-70): carrier: link connected
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.462 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[63f18615-131a-41fa-bb7b-ac79fead64ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.479 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dbf33232-2a7b-4c50-bf69-aa308c312688]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap53d0b281-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:be:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526938, 'reachable_time': 33076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258000, 'error': None, 'target': 'ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.494 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4425525b-3a32-4538-bcf3-fd6b0eab10cd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:befe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526938, 'tstamp': 526938}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258001, 'error': None, 'target': 'ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.510 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0f0a20-1a82-4ae8-8643-a885fc48d301]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap53d0b281-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:be:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526938, 'reachable_time': 33076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 258002, 'error': None, 'target': 'ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.536 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[23cc263e-12b8-43a1-8416-2cd513ceaed9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.592 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d143ca0e-1e7c-49e1-99f5-3d6556721135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.593 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53d0b281-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.593 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.593 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap53d0b281-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:50 np0005588919 kernel: tap53d0b281-70: entered promiscuous mode
Jan 20 09:43:50 np0005588919 NetworkManager[49104]: <info>  [1768920230.5961] manager: (tap53d0b281-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.595 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.597 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap53d0b281-70, col_values=(('external_ids', {'iface-id': '2ea34810-4753-414f-ae43-b7b379fc432c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:43:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:43:50Z|00295|binding|INFO|Releasing lport 2ea34810-4753-414f-ae43-b7b379fc432c from this chassis (sb_readonly=0)
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.600 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/53d0b281-776f-4682-8aaf-098e1d364008.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/53d0b281-776f-4682-8aaf-098e1d364008.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.600 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.600 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[94a727af-fcc8-4d4a-910c-22db41319686]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.601 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-53d0b281-776f-4682-8aaf-098e1d364008
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/53d0b281-776f-4682-8aaf-098e1d364008.pid.haproxy
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 53d0b281-776f-4682-8aaf-098e1d364008
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:43:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:43:50.602 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008', 'env', 'PROCESS_TAG=haproxy-53d0b281-776f-4682-8aaf-098e1d364008', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/53d0b281-776f-4682-8aaf-098e1d364008.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.614 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.728 225859 DEBUG nova.compute.manager [req-32e2c550-f1f1-4bd4-99cd-c920efb2609d req-1b5073b9-ee96-466f-9e4a-7d0c87398e76 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Received event network-vif-plugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.729 225859 DEBUG oslo_concurrency.lockutils [req-32e2c550-f1f1-4bd4-99cd-c920efb2609d req-1b5073b9-ee96-466f-9e4a-7d0c87398e76 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.729 225859 DEBUG oslo_concurrency.lockutils [req-32e2c550-f1f1-4bd4-99cd-c920efb2609d req-1b5073b9-ee96-466f-9e4a-7d0c87398e76 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.729 225859 DEBUG oslo_concurrency.lockutils [req-32e2c550-f1f1-4bd4-99cd-c920efb2609d req-1b5073b9-ee96-466f-9e4a-7d0c87398e76 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.730 225859 DEBUG nova.compute.manager [req-32e2c550-f1f1-4bd4-99cd-c920efb2609d req-1b5073b9-ee96-466f-9e4a-7d0c87398e76 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Processing event network-vif-plugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.753 225859 DEBUG nova.network.neutron [req-fd821a2b-dde3-404f-ae5f-005f57bcf5e6 req-c9a1bcdb-f1c7-4d63-b82e-af793ac06f1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Updated VIF entry in instance network info cache for port 69c1a502-414e-4ca7-9aec-488bbb6170b2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.753 225859 DEBUG nova.network.neutron [req-fd821a2b-dde3-404f-ae5f-005f57bcf5e6 req-c9a1bcdb-f1c7-4d63-b82e-af793ac06f1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Updating instance_info_cache with network_info: [{"id": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "address": "fa:16:3e:18:6f:89", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69c1a502-41", "ovs_interfaceid": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.783 225859 DEBUG oslo_concurrency.lockutils [req-fd821a2b-dde3-404f-ae5f-005f57bcf5e6 req-c9a1bcdb-f1c7-4d63-b82e-af793ac06f1a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d6ec5fce-44f2-4c13-b908-c45d7a919b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:43:50 np0005588919 podman[258073]: 2026-01-20 14:43:50.978365094 +0000 UTC m=+0.052153948 container create 5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.983 225859 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.984 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920230.982973, d6ec5fce-44f2-4c13-b908-c45d7a919b34 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.984 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] VM Started (Lifecycle Event)#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.987 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.991 225859 INFO nova.virt.libvirt.driver [-] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Instance spawned successfully.#033[00m
Jan 20 09:43:50 np0005588919 nova_compute[225855]: 2026-01-20 14:43:50.991 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.009 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.015 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:43:51 np0005588919 systemd[1]: Started libpod-conmon-5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc.scope.
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.020 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.020 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.021 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.021 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.021 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.022 225859 DEBUG nova.virt.libvirt.driver [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:43:51 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.047 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.048 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920230.983072, d6ec5fce-44f2-4c13-b908-c45d7a919b34 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.048 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:43:51 np0005588919 podman[258073]: 2026-01-20 14:43:50.953184601 +0000 UTC m=+0.026973465 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:43:51 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6f7c08597faeeb0c37af41a23da75f840d40edad83ad99c2c657cb81c506427/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:43:51 np0005588919 podman[258073]: 2026-01-20 14:43:51.066431328 +0000 UTC m=+0.140220182 container init 5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 09:43:51 np0005588919 podman[258073]: 2026-01-20 14:43:51.072678875 +0000 UTC m=+0.146467719 container start 5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.076 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.080 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920230.9865382, d6ec5fce-44f2-4c13-b908-c45d7a919b34 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.080 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.096 225859 INFO nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Took 12.30 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.096 225859 DEBUG nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:51 np0005588919 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[258090]: [NOTICE]   (258094) : New worker (258096) forked
Jan 20 09:43:51 np0005588919 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[258090]: [NOTICE]   (258094) : Loading success.
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.122 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.126 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.151 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.167 225859 INFO nova.compute.manager [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Took 13.66 seconds to build instance.#033[00m
Jan 20 09:43:51 np0005588919 nova_compute[225855]: 2026-01-20 14:43:51.189 225859 DEBUG oslo_concurrency.lockutils [None req-8c56c911-dd71-4ce2-be61-541eba597933 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:51.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:52.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:52 np0005588919 nova_compute[225855]: 2026-01-20 14:43:52.891 225859 DEBUG nova.compute.manager [req-1bfb8fcc-8dc2-4bc7-abd3-b75f14c363f4 req-bdd16724-d118-45c3-bcc5-83a9c832b3d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Received event network-vif-plugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:43:52 np0005588919 nova_compute[225855]: 2026-01-20 14:43:52.891 225859 DEBUG oslo_concurrency.lockutils [req-1bfb8fcc-8dc2-4bc7-abd3-b75f14c363f4 req-bdd16724-d118-45c3-bcc5-83a9c832b3d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:43:52 np0005588919 nova_compute[225855]: 2026-01-20 14:43:52.892 225859 DEBUG oslo_concurrency.lockutils [req-1bfb8fcc-8dc2-4bc7-abd3-b75f14c363f4 req-bdd16724-d118-45c3-bcc5-83a9c832b3d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:43:52 np0005588919 nova_compute[225855]: 2026-01-20 14:43:52.892 225859 DEBUG oslo_concurrency.lockutils [req-1bfb8fcc-8dc2-4bc7-abd3-b75f14c363f4 req-bdd16724-d118-45c3-bcc5-83a9c832b3d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:43:52 np0005588919 nova_compute[225855]: 2026-01-20 14:43:52.892 225859 DEBUG nova.compute.manager [req-1bfb8fcc-8dc2-4bc7-abd3-b75f14c363f4 req-bdd16724-d118-45c3-bcc5-83a9c832b3d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] No waiting events found dispatching network-vif-plugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:43:52 np0005588919 nova_compute[225855]: 2026-01-20 14:43:52.892 225859 WARNING nova.compute.manager [req-1bfb8fcc-8dc2-4bc7-abd3-b75f14c363f4 req-bdd16724-d118-45c3-bcc5-83a9c832b3d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Received unexpected event network-vif-plugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:43:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:53.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:54 np0005588919 nova_compute[225855]: 2026-01-20 14:43:54.008 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:43:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:54.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:43:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:55.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:43:55 np0005588919 nova_compute[225855]: 2026-01-20 14:43:55.376 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.004000113s ======
Jan 20 09:43:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:56.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000113s
Jan 20 09:43:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:43:57 np0005588919 podman[258227]: 2026-01-20 14:43:57.037977192 +0000 UTC m=+0.087743256 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 09:43:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:43:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:57.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:43:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:43:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:43:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:43:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:43:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:43:58.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:43:59 np0005588919 nova_compute[225855]: 2026-01-20 14:43:59.010 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:43:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:43:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:43:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:43:59.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:00.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:00 np0005588919 nova_compute[225855]: 2026-01-20 14:44:00.378 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.013 225859 DEBUG oslo_concurrency.lockutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.014 225859 DEBUG oslo_concurrency.lockutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.016 225859 DEBUG oslo_concurrency.lockutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.017 225859 DEBUG oslo_concurrency.lockutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.017 225859 DEBUG oslo_concurrency.lockutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.018 225859 INFO nova.compute.manager [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Terminating instance#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.019 225859 DEBUG nova.compute.manager [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:44:01 np0005588919 kernel: tap69c1a502-41 (unregistering): left promiscuous mode
Jan 20 09:44:01 np0005588919 NetworkManager[49104]: <info>  [1768920241.0617] device (tap69c1a502-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:44:01 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:01Z|00296|binding|INFO|Releasing lport 69c1a502-414e-4ca7-9aec-488bbb6170b2 from this chassis (sb_readonly=0)
Jan 20 09:44:01 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:01Z|00297|binding|INFO|Setting lport 69c1a502-414e-4ca7-9aec-488bbb6170b2 down in Southbound
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.069 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:01Z|00298|binding|INFO|Removing iface tap69c1a502-41 ovn-installed in OVS
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.077 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:6f:89 10.100.0.10'], port_security=['fa:16:3e:18:6f:89 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd6ec5fce-44f2-4c13-b908-c45d7a919b34', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53d0b281-776f-4682-8aaf-098e1d364008', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28a523cfe06042ff96554913a78e1e3a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1879c269-0854-40a3-8eb9-b61f97d38545', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1afefec-2060-4dfb-acbb-1ce14c3a663c, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=69c1a502-414e-4ca7-9aec-488bbb6170b2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.078 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 69c1a502-414e-4ca7-9aec-488bbb6170b2 in datapath 53d0b281-776f-4682-8aaf-098e1d364008 unbound from our chassis#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.079 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 53d0b281-776f-4682-8aaf-098e1d364008, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.080 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[31fea395-25b9-40f5-b0d0-98be95dc350f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.081 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008 namespace which is not needed anymore#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.098 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588919 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000054.scope: Deactivated successfully.
Jan 20 09:44:01 np0005588919 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000054.scope: Consumed 10.931s CPU time.
Jan 20 09:44:01 np0005588919 systemd-machined[194361]: Machine qemu-35-instance-00000054 terminated.
Jan 20 09:44:01 np0005588919 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[258090]: [NOTICE]   (258094) : haproxy version is 2.8.14-c23fe91
Jan 20 09:44:01 np0005588919 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[258090]: [NOTICE]   (258094) : path to executable is /usr/sbin/haproxy
Jan 20 09:44:01 np0005588919 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[258090]: [WARNING]  (258094) : Exiting Master process...
Jan 20 09:44:01 np0005588919 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[258090]: [ALERT]    (258094) : Current worker (258096) exited with code 143 (Terminated)
Jan 20 09:44:01 np0005588919 neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008[258090]: [WARNING]  (258094) : All workers exited. Exiting... (0)
Jan 20 09:44:01 np0005588919 systemd[1]: libpod-5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc.scope: Deactivated successfully.
Jan 20 09:44:01 np0005588919 conmon[258090]: conmon 5ede8f98a3dd4d68b1eb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc.scope/container/memory.events
Jan 20 09:44:01 np0005588919 podman[258292]: 2026-01-20 14:44:01.211113762 +0000 UTC m=+0.040428836 container died 5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 09:44:01 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc-userdata-shm.mount: Deactivated successfully.
Jan 20 09:44:01 np0005588919 systemd[1]: var-lib-containers-storage-overlay-a6f7c08597faeeb0c37af41a23da75f840d40edad83ad99c2c657cb81c506427-merged.mount: Deactivated successfully.
Jan 20 09:44:01 np0005588919 podman[258292]: 2026-01-20 14:44:01.254407868 +0000 UTC m=+0.083722942 container cleanup 5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.258 225859 INFO nova.virt.libvirt.driver [-] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Instance destroyed successfully.#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.258 225859 DEBUG nova.objects.instance [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lazy-loading 'resources' on Instance uuid d6ec5fce-44f2-4c13-b908-c45d7a919b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:01.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:01 np0005588919 systemd[1]: libpod-conmon-5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc.scope: Deactivated successfully.
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.278 225859 DEBUG nova.virt.libvirt.vif [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:43:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1818827013',display_name='tempest-ListServersNegativeTestJSON-server-1818827013-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1818827013-2',id=84,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-20T14:43:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='28a523cfe06042ff96554913a78e1e3a',ramdisk_id='',reservation_id='r-s8qopxwt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1080060493',owner_user_name='tempest-ListServersNegativeTestJSON-1080060493-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:43:51Z,user_data=None,user_id='2975742546164cad937d13671d17108a',uuid=d6ec5fce-44f2-4c13-b908-c45d7a919b34,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "address": "fa:16:3e:18:6f:89", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69c1a502-41", "ovs_interfaceid": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.280 225859 DEBUG nova.network.os_vif_util [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converting VIF {"id": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "address": "fa:16:3e:18:6f:89", "network": {"id": "53d0b281-776f-4682-8aaf-098e1d364008", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1516883251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "28a523cfe06042ff96554913a78e1e3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69c1a502-41", "ovs_interfaceid": "69c1a502-414e-4ca7-9aec-488bbb6170b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.280 225859 DEBUG nova.network.os_vif_util [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:6f:89,bridge_name='br-int',has_traffic_filtering=True,id=69c1a502-414e-4ca7-9aec-488bbb6170b2,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69c1a502-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.281 225859 DEBUG os_vif [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:6f:89,bridge_name='br-int',has_traffic_filtering=True,id=69c1a502-414e-4ca7-9aec-488bbb6170b2,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69c1a502-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.284 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.285 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69c1a502-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.289 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.291 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.293 225859 INFO os_vif [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:6f:89,bridge_name='br-int',has_traffic_filtering=True,id=69c1a502-414e-4ca7-9aec-488bbb6170b2,network=Network(53d0b281-776f-4682-8aaf-098e1d364008),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69c1a502-41')#033[00m
Jan 20 09:44:01 np0005588919 podman[258331]: 2026-01-20 14:44:01.321043685 +0000 UTC m=+0.044545943 container remove 5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.329 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[34d9f01c-c9f7-4d6f-8969-14b3f41e9168]: (4, ('Tue Jan 20 02:44:01 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008 (5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc)\n5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc\nTue Jan 20 02:44:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008 (5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc)\n5ede8f98a3dd4d68b1eb28fd75045c72977b65a6797a05a54d3264bb3badb1cc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.330 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[735c2946-ce24-4ac3-aad8-7ce2a1a3b2f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.331 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap53d0b281-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.332 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588919 kernel: tap53d0b281-70: left promiscuous mode
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.334 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.337 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6fc09e1d-487c-4847-9557-189bc9426673]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.349 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.356 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ff8f269b-cb76-4684-8921-9cac753634cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.357 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ad6765ce-82f9-40e2-97d7-2663c8641ce4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.374 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff1135a-aae0-4242-b614-5d80c8284026]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526932, 'reachable_time': 39512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258366, 'error': None, 'target': 'ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:01 np0005588919 systemd[1]: run-netns-ovnmeta\x2d53d0b281\x2d776f\x2d4682\x2d8aaf\x2d098e1d364008.mount: Deactivated successfully.
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.379 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-53d0b281-776f-4682-8aaf-098e1d364008 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:44:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:01.379 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[4520dfc0-f3af-43db-857d-fbf393637f22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.611 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.613 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.648 225859 DEBUG nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.664 225859 INFO nova.virt.libvirt.driver [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Deleting instance files /var/lib/nova/instances/d6ec5fce-44f2-4c13-b908-c45d7a919b34_del#033[00m
Jan 20 09:44:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.665 225859 INFO nova.virt.libvirt.driver [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Deletion of /var/lib/nova/instances/d6ec5fce-44f2-4c13-b908-c45d7a919b34_del complete#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.793 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.793 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.800 225859 INFO nova.compute.manager [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.801 225859 DEBUG oslo.service.loopingcall [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.802 225859 DEBUG nova.compute.manager [-] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.802 225859 DEBUG nova.network.neutron [-] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.809 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:44:01 np0005588919 nova_compute[225855]: 2026-01-20 14:44:01.810 225859 INFO nova.compute.claims [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.019 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:02.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/14759032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.479 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.485 225859 DEBUG nova.compute.provider_tree [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.505 225859 DEBUG nova.scheduler.client.report [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.557 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.764s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.558 225859 DEBUG nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.626 225859 DEBUG nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.627 225859 DEBUG nova.network.neutron [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.653 225859 INFO nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.677 225859 DEBUG nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.876 225859 DEBUG nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.877 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.877 225859 INFO nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Creating image(s)#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.901 225859 DEBUG nova.storage.rbd_utils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.931 225859 DEBUG nova.storage.rbd_utils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.954 225859 DEBUG nova.storage.rbd_utils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.959 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:02 np0005588919 nova_compute[225855]: 2026-01-20 14:44:02.983 225859 DEBUG nova.policy [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '869086208e10436c9dc96c78bee9a85d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b683fcc0026242e28ba6d8fba638688e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.018 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.019 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.020 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.020 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.042 225859 DEBUG nova.storage.rbd_utils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.045 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:44:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:44:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:03.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.352 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.307s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.432 225859 DEBUG nova.storage.rbd_utils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] resizing rbd image 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.536 225859 DEBUG nova.objects.instance [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'migration_context' on Instance uuid 6586bc3e-3a94-4d22-8e8c-713a86a956fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.546 225859 DEBUG nova.compute.manager [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Received event network-vif-unplugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.547 225859 DEBUG oslo_concurrency.lockutils [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.547 225859 DEBUG oslo_concurrency.lockutils [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.547 225859 DEBUG oslo_concurrency.lockutils [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.548 225859 DEBUG nova.compute.manager [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] No waiting events found dispatching network-vif-unplugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.548 225859 DEBUG nova.compute.manager [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Received event network-vif-unplugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.548 225859 DEBUG nova.compute.manager [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Received event network-vif-plugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.549 225859 DEBUG oslo_concurrency.lockutils [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.549 225859 DEBUG oslo_concurrency.lockutils [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.550 225859 DEBUG oslo_concurrency.lockutils [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.550 225859 DEBUG nova.compute.manager [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] No waiting events found dispatching network-vif-plugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.550 225859 WARNING nova.compute.manager [req-a0cbf06a-07e3-46cf-9223-2a05eaa64b0f req-8f0650d1-749d-4c83-8edf-c124e6abdbb0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Received unexpected event network-vif-plugged-69c1a502-414e-4ca7-9aec-488bbb6170b2 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.564 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.564 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Ensure instance console log exists: /var/lib/nova/instances/6586bc3e-3a94-4d22-8e8c-713a86a956fb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.565 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.565 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.565 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.851 225859 DEBUG nova.network.neutron [-] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.877 225859 INFO nova.compute.manager [-] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Took 2.07 seconds to deallocate network for instance.#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.947 225859 DEBUG oslo_concurrency.lockutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:03 np0005588919 nova_compute[225855]: 2026-01-20 14:44:03.948 225859 DEBUG oslo_concurrency.lockutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:04 np0005588919 nova_compute[225855]: 2026-01-20 14:44:04.023 225859 DEBUG oslo_concurrency.processutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:04 np0005588919 nova_compute[225855]: 2026-01-20 14:44:04.199 225859 DEBUG nova.network.neutron [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Successfully created port: 2c289e6f-295e-44c3-948a-9a6901251890 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:44:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:04.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3605756288' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:04 np0005588919 nova_compute[225855]: 2026-01-20 14:44:04.490 225859 DEBUG oslo_concurrency.processutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:04 np0005588919 nova_compute[225855]: 2026-01-20 14:44:04.496 225859 DEBUG nova.compute.provider_tree [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:44:04 np0005588919 nova_compute[225855]: 2026-01-20 14:44:04.545 225859 DEBUG nova.scheduler.client.report [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:44:04 np0005588919 nova_compute[225855]: 2026-01-20 14:44:04.586 225859 DEBUG oslo_concurrency.lockutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:04 np0005588919 nova_compute[225855]: 2026-01-20 14:44:04.615 225859 INFO nova.scheduler.client.report [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Deleted allocations for instance d6ec5fce-44f2-4c13-b908-c45d7a919b34#033[00m
Jan 20 09:44:04 np0005588919 nova_compute[225855]: 2026-01-20 14:44:04.746 225859 DEBUG oslo_concurrency.lockutils [None req-e1bb863c-b070-41d6-9c26-36531271f9be 2975742546164cad937d13671d17108a 28a523cfe06042ff96554913a78e1e3a - - default default] Lock "d6ec5fce-44f2-4c13-b908-c45d7a919b34" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:05.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:05 np0005588919 nova_compute[225855]: 2026-01-20 14:44:05.380 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:05 np0005588919 nova_compute[225855]: 2026-01-20 14:44:05.562 225859 DEBUG nova.network.neutron [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Successfully updated port: 2c289e6f-295e-44c3-948a-9a6901251890 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:44:05 np0005588919 nova_compute[225855]: 2026-01-20 14:44:05.580 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:05 np0005588919 nova_compute[225855]: 2026-01-20 14:44:05.580 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquired lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:05 np0005588919 nova_compute[225855]: 2026-01-20 14:44:05.580 225859 DEBUG nova.network.neutron [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:44:05 np0005588919 nova_compute[225855]: 2026-01-20 14:44:05.693 225859 DEBUG nova.compute.manager [req-c48853ba-bfb6-4c05-9397-cd05484a477f req-0518c30a-3877-45fc-bdd6-c8376618099f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Received event network-vif-deleted-69c1a502-414e-4ca7-9aec-488bbb6170b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:05 np0005588919 nova_compute[225855]: 2026-01-20 14:44:05.710 225859 DEBUG nova.compute.manager [req-3e18d8ff-e118-4a17-aedd-174147fa8366 req-c71917d1-60b1-4698-8cf6-1d8e6757bf49 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received event network-changed-2c289e6f-295e-44c3-948a-9a6901251890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:05 np0005588919 nova_compute[225855]: 2026-01-20 14:44:05.710 225859 DEBUG nova.compute.manager [req-3e18d8ff-e118-4a17-aedd-174147fa8366 req-c71917d1-60b1-4698-8cf6-1d8e6757bf49 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Refreshing instance network info cache due to event network-changed-2c289e6f-295e-44c3-948a-9a6901251890. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:44:05 np0005588919 nova_compute[225855]: 2026-01-20 14:44:05.710 225859 DEBUG oslo_concurrency.lockutils [req-3e18d8ff-e118-4a17-aedd-174147fa8366 req-c71917d1-60b1-4698-8cf6-1d8e6757bf49 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:05 np0005588919 nova_compute[225855]: 2026-01-20 14:44:05.811 225859 DEBUG nova.network.neutron [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:44:06 np0005588919 podman[258631]: 2026-01-20 14:44:06.006586896 +0000 UTC m=+0.053131295 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:44:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:06.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:06 np0005588919 nova_compute[225855]: 2026-01-20 14:44:06.334 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.183 225859 DEBUG nova.network.neutron [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updating instance_info_cache with network_info: [{"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.217 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Releasing lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.217 225859 DEBUG nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Instance network_info: |[{"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.218 225859 DEBUG oslo_concurrency.lockutils [req-3e18d8ff-e118-4a17-aedd-174147fa8366 req-c71917d1-60b1-4698-8cf6-1d8e6757bf49 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.218 225859 DEBUG nova.network.neutron [req-3e18d8ff-e118-4a17-aedd-174147fa8366 req-c71917d1-60b1-4698-8cf6-1d8e6757bf49 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Refreshing network info cache for port 2c289e6f-295e-44c3-948a-9a6901251890 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.221 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Start _get_guest_xml network_info=[{"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.226 225859 WARNING nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.231 225859 DEBUG nova.virt.libvirt.host [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.232 225859 DEBUG nova.virt.libvirt.host [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.236 225859 DEBUG nova.virt.libvirt.host [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.237 225859 DEBUG nova.virt.libvirt.host [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.238 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.239 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.239 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.240 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.240 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.240 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.240 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.241 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.241 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.241 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.242 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.242 225859 DEBUG nova.virt.hardware [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
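
The hardware.py sequence above ("Build topologies ... Got 1 possible topologies ... Sorted desired topologies") enumerates every sockets/cores/threads split whose product equals the vCPU count, within the 65536 limits. A hedged sketch of that enumeration (it mirrors the idea in nova.virt.hardware, not its exact code):

```python
# For N vCPUs, emit every (sockets, cores, threads) triple with
# sockets * cores * threads == N, subject to per-dimension limits.
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % s:
            continue
        for c in range(1, min(vcpus // s, max_cores) + 1):
            if (vcpus // s) % c:
                continue
            t = vcpus // (s * c)
            if t <= max_threads:
                topos.append((s, c, t))
    return topos

print(possible_topologies(1))  # [(1, 1, 1)] -- matches "Got 1 possible topologies"
```

With the m1.nano flavor's single vCPU and no flavor/image preferences (the `0:0:0` lines), (1, 1, 1) is the only candidate, so it is trivially the sorted winner.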
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.246 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:07.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:07 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/847151595' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.679 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
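
The `ceph mon dump --format=json` calls above are how nova's RBD backend discovers monitor addresses before touching images. The command returns the monmap as JSON; a sketch of extracting monitor endpoints from it (the payload below is an illustrative, trimmed sample, not captured from this cluster):

```python
import json

# Illustrative monmap fragment in the shape "ceph mon dump --format=json"
# returns; fsid and address are placeholders, not this cluster's values.
monmap = json.loads('''
{"epoch": 3, "fsid": "00000000-0000-0000-0000-000000000000",
 "mons": [{"name": "compute-1", "rank": 2,
           "public_addr": "192.168.122.101:6789/0"}]}
''')

# Strip the trailing "/<nonce>" to get host:port endpoints.
addrs = [m['public_addr'].split('/')[0] for m in monmap['mons']]
print(addrs)  # ['192.168.122.101:6789']
```

Each dump took ~0.43s here, and the driver runs it twice per spawn (once per RBD connection check), which is visible as back-to-back mon_command audit entries on the peon monitor.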
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.705 225859 DEBUG nova.storage.rbd_utils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:07 np0005588919 nova_compute[225855]: 2026-01-20 14:44:07.709 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/674612077' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.139 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.141 225859 DEBUG nova.virt.libvirt.vif [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:44:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1533521351',display_name='tempest-ServerActionsTestOtherA-server-1533521351',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1533521351',id=87,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOC6AOV9rhSIyyfBXYGEFvhdWE5GLRuNfs0sPvnXoLHIstQY2OpqwhI35UcFW1s96uqz0+j9sMbdcq/NuNcfgrlnXkEz6j2iO7WUECWdPrQW34JB1FTWAvtA4R6RDoaZA==',key_name='tempest-keypair-1611966828',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-47bmn591',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:44:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=6586bc3e-3a94-4d22-8e8c-713a86a956fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.141 225859 DEBUG nova.network.os_vif_util [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.142 225859 DEBUG nova.network.os_vif_util [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:4c:e2,bridge_name='br-int',has_traffic_filtering=True,id=2c289e6f-295e-44c3-948a-9a6901251890,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c289e6f-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.144 225859 DEBUG nova.objects.instance [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_devices' on Instance uuid 6586bc3e-3a94-4d22-8e8c-713a86a956fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.166 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <uuid>6586bc3e-3a94-4d22-8e8c-713a86a956fb</uuid>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <name>instance-00000057</name>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestOtherA-server-1533521351</nova:name>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:44:07</nova:creationTime>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <nova:user uuid="869086208e10436c9dc96c78bee9a85d">tempest-ServerActionsTestOtherA-967087071-project-member</nova:user>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <nova:project uuid="b683fcc0026242e28ba6d8fba638688e">tempest-ServerActionsTestOtherA-967087071</nova:project>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <nova:port uuid="2c289e6f-295e-44c3-948a-9a6901251890">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <entry name="serial">6586bc3e-3a94-4d22-8e8c-713a86a956fb</entry>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <entry name="uuid">6586bc3e-3a94-4d22-8e8c-713a86a956fb</entry>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk.config">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:2f:4c:e2"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <target dev="tap2c289e6f-29"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/6586bc3e-3a94-4d22-8e8c-713a86a956fb/console.log" append="off"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:44:08 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:44:08 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:44:08 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:44:08 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.167 225859 DEBUG nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Preparing to wait for external event network-vif-plugged-2c289e6f-295e-44c3-948a-9a6901251890 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.168 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.168 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.168 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.169 225859 DEBUG nova.virt.libvirt.vif [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:44:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1533521351',display_name='tempest-ServerActionsTestOtherA-server-1533521351',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1533521351',id=87,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOC6AOV9rhSIyyfBXYGEFvhdWE5GLRuNfs0sPvnXoLHIstQY2OpqwhI35UcFW1s96uqz0+j9sMbdcq/NuNcfgrlnXkEz6j2iO7WUECWdPrQW34JB1FTWAvtA4R6RDoaZA==',key_name='tempest-keypair-1611966828',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-47bmn591',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:44:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=6586bc3e-3a94-4d22-8e8c-713a86a956fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.170 225859 DEBUG nova.network.os_vif_util [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.170 225859 DEBUG nova.network.os_vif_util [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:4c:e2,bridge_name='br-int',has_traffic_filtering=True,id=2c289e6f-295e-44c3-948a-9a6901251890,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c289e6f-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.171 225859 DEBUG os_vif [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:4c:e2,bridge_name='br-int',has_traffic_filtering=True,id=2c289e6f-295e-44c3-948a-9a6901251890,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c289e6f-29') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.171 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.172 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.173 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.176 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.176 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c289e6f-29, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.176 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2c289e6f-29, col_values=(('external_ids', {'iface-id': '2c289e6f-295e-44c3-948a-9a6901251890', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:4c:e2', 'vm-uuid': '6586bc3e-3a94-4d22-8e8c-713a86a956fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.178 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:08 np0005588919 NetworkManager[49104]: <info>  [1768920248.1797] manager: (tap2c289e6f-29): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.181 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.184 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.185 225859 INFO os_vif [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:4c:e2,bridge_name='br-int',has_traffic_filtering=True,id=2c289e6f-295e-44c3-948a-9a6901251890,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c289e6f-29')#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.250 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.250 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.250 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No VIF found with MAC fa:16:3e:2f:4c:e2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.251 225859 INFO nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Using config drive#033[00m
Jan 20 09:44:08 np0005588919 nova_compute[225855]: 2026-01-20 14:44:08.270 225859 DEBUG nova.storage.rbd_utils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:08.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:09.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:09 np0005588919 nova_compute[225855]: 2026-01-20 14:44:09.491 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:09 np0005588919 nova_compute[225855]: 2026-01-20 14:44:09.564 225859 INFO nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Creating config drive at /var/lib/nova/instances/6586bc3e-3a94-4d22-8e8c-713a86a956fb/disk.config#033[00m
Jan 20 09:44:09 np0005588919 nova_compute[225855]: 2026-01-20 14:44:09.572 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6586bc3e-3a94-4d22-8e8c-713a86a956fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu2terrax execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:09 np0005588919 nova_compute[225855]: 2026-01-20 14:44:09.700 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6586bc3e-3a94-4d22-8e8c-713a86a956fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu2terrax" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:09 np0005588919 nova_compute[225855]: 2026-01-20 14:44:09.726 225859 DEBUG nova.storage.rbd_utils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:09 np0005588919 nova_compute[225855]: 2026-01-20 14:44:09.730 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6586bc3e-3a94-4d22-8e8c-713a86a956fb/disk.config 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:09 np0005588919 nova_compute[225855]: 2026-01-20 14:44:09.960 225859 DEBUG nova.network.neutron [req-3e18d8ff-e118-4a17-aedd-174147fa8366 req-c71917d1-60b1-4698-8cf6-1d8e6757bf49 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updated VIF entry in instance network info cache for port 2c289e6f-295e-44c3-948a-9a6901251890. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:44:09 np0005588919 nova_compute[225855]: 2026-01-20 14:44:09.961 225859 DEBUG nova.network.neutron [req-3e18d8ff-e118-4a17-aedd-174147fa8366 req-c71917d1-60b1-4698-8cf6-1d8e6757bf49 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updating instance_info_cache with network_info: [{"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:09 np0005588919 nova_compute[225855]: 2026-01-20 14:44:09.980 225859 DEBUG oslo_concurrency.lockutils [req-3e18d8ff-e118-4a17-aedd-174147fa8366 req-c71917d1-60b1-4698-8cf6-1d8e6757bf49 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:10.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:10 np0005588919 nova_compute[225855]: 2026-01-20 14:44:10.382 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:11.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:11 np0005588919 nova_compute[225855]: 2026-01-20 14:44:11.690 225859 DEBUG oslo_concurrency.processutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6586bc3e-3a94-4d22-8e8c-713a86a956fb/disk.config 6586bc3e-3a94-4d22-8e8c-713a86a956fb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.960s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:11 np0005588919 nova_compute[225855]: 2026-01-20 14:44:11.691 225859 INFO nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Deleting local config drive /var/lib/nova/instances/6586bc3e-3a94-4d22-8e8c-713a86a956fb/disk.config because it was imported into RBD.#033[00m
Jan 20 09:44:11 np0005588919 kernel: tap2c289e6f-29: entered promiscuous mode
Jan 20 09:44:11 np0005588919 NetworkManager[49104]: <info>  [1768920251.7581] manager: (tap2c289e6f-29): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Jan 20 09:44:11 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:11Z|00299|binding|INFO|Claiming lport 2c289e6f-295e-44c3-948a-9a6901251890 for this chassis.
Jan 20 09:44:11 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:11Z|00300|binding|INFO|2c289e6f-295e-44c3-948a-9a6901251890: Claiming fa:16:3e:2f:4c:e2 10.100.0.9
Jan 20 09:44:11 np0005588919 nova_compute[225855]: 2026-01-20 14:44:11.760 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:11 np0005588919 nova_compute[225855]: 2026-01-20 14:44:11.824 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.825 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:4c:e2 10.100.0.9'], port_security=['fa:16:3e:2f:4c:e2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6586bc3e-3a94-4d22-8e8c-713a86a956fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7ceb05b5-53ff-444a-b0ef-2ba8294d585b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=2c289e6f-295e-44c3-948a-9a6901251890) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.827 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 2c289e6f-295e-44c3-948a-9a6901251890 in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 bound to our chassis#033[00m
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.828 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:44:11 np0005588919 systemd-udevd[258839]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.840 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3dbd08c7-b10c-4e08-837b-8a54abf77152]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.841 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa19e9d1a-81 in ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.843 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa19e9d1a-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.843 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[634b4beb-9707-4510-84c6-7a20de7c3570]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.844 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[60b168f9-17ca-449f-9cb7-d01f01e9fa30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:11 np0005588919 systemd-machined[194361]: New machine qemu-36-instance-00000057.
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.857 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[c4d21090-ccee-4890-a967-1503a13944a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:11 np0005588919 NetworkManager[49104]: <info>  [1768920251.8593] device (tap2c289e6f-29): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:44:11 np0005588919 NetworkManager[49104]: <info>  [1768920251.8601] device (tap2c289e6f-29): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:44:11 np0005588919 systemd[1]: Started Virtual Machine qemu-36-instance-00000057.
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.883 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0b761c1e-20dc-47f4-a613-bd469f6e1942]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:11 np0005588919 nova_compute[225855]: 2026-01-20 14:44:11.894 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:11 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:11Z|00301|binding|INFO|Setting lport 2c289e6f-295e-44c3-948a-9a6901251890 ovn-installed in OVS
Jan 20 09:44:11 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:11Z|00302|binding|INFO|Setting lport 2c289e6f-295e-44c3-948a-9a6901251890 up in Southbound
Jan 20 09:44:11 np0005588919 nova_compute[225855]: 2026-01-20 14:44:11.899 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.910 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b52ce737-57b3-430d-8928-26917f27cf9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.916 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[91e89558-f29f-4e14-94dd-2b036660b316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:11 np0005588919 NetworkManager[49104]: <info>  [1768920251.9175] manager: (tapa19e9d1a-80): new Veth device (/org/freedesktop/NetworkManager/Devices/131)
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.948 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e524ce8b-3406-470c-894a-aea5e8462d31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.951 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[0641c8e4-f6b1-4944-a4f9-721e4d2c53d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:11 np0005588919 NetworkManager[49104]: <info>  [1768920251.9790] device (tapa19e9d1a-80): carrier: link connected
Jan 20 09:44:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:11.984 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c6bda9f1-4d65-41c1-83a0-4c3d63c5eb96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.001 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f1f05f-dba8-413a-8afd-46d6039c86c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529090, 'reachable_time': 41331, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258873, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.015 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[be399fa3-d1d0-4d6b-a8f7-8a3d646bd0d6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:5313'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529090, 'tstamp': 529090}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 258874, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.031 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ff0b6fee-a76a-406a-b92c-6b2bfb703b40]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529090, 'reachable_time': 41331, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 258875, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.059 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d59a8cb4-9681-4ce2-9c2e-5c5541dce7a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.119 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e54e7b4f-a4cd-47a5-9b4b-93297dadacf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.120 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.120 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.121 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.122 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:12 np0005588919 NetworkManager[49104]: <info>  [1768920252.1236] manager: (tapa19e9d1a-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Jan 20 09:44:12 np0005588919 kernel: tapa19e9d1a-80: entered promiscuous mode
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.124 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:12Z|00303|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.140 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.142 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.142 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3244e050-4df3-4468-9102-7e74193866bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.143 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-a19e9d1a-864f-41ee-bdea-188e65973ea5
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/a19e9d1a-864f-41ee-bdea-188e65973ea5.pid.haproxy
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID a19e9d1a-864f-41ee-bdea-188e65973ea5
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:44:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:12.144 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'env', 'PROCESS_TAG=haproxy-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a19e9d1a-864f-41ee-bdea-188e65973ea5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:44:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:12.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.527 225859 DEBUG nova.compute.manager [req-9c4ec8cc-1b35-4a18-9bbe-da38e7223dd9 req-07ca47b1-b8d1-400c-9619-c0f6a4439590 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received event network-vif-plugged-2c289e6f-295e-44c3-948a-9a6901251890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.528 225859 DEBUG oslo_concurrency.lockutils [req-9c4ec8cc-1b35-4a18-9bbe-da38e7223dd9 req-07ca47b1-b8d1-400c-9619-c0f6a4439590 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.528 225859 DEBUG oslo_concurrency.lockutils [req-9c4ec8cc-1b35-4a18-9bbe-da38e7223dd9 req-07ca47b1-b8d1-400c-9619-c0f6a4439590 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.529 225859 DEBUG oslo_concurrency.lockutils [req-9c4ec8cc-1b35-4a18-9bbe-da38e7223dd9 req-07ca47b1-b8d1-400c-9619-c0f6a4439590 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.529 225859 DEBUG nova.compute.manager [req-9c4ec8cc-1b35-4a18-9bbe-da38e7223dd9 req-07ca47b1-b8d1-400c-9619-c0f6a4439590 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Processing event network-vif-plugged-2c289e6f-295e-44c3-948a-9a6901251890 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:44:12 np0005588919 podman[258925]: 2026-01-20 14:44:12.533678805 +0000 UTC m=+0.094321392 container create c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:44:12 np0005588919 podman[258925]: 2026-01-20 14:44:12.466256596 +0000 UTC m=+0.026899253 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:44:12 np0005588919 systemd[1]: Started libpod-conmon-c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8.scope.
Jan 20 09:44:12 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:44:12 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6713c84d9e47876707c1459896eab206f8324b8157aad0c912932cf9613e3a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:44:12 np0005588919 podman[258925]: 2026-01-20 14:44:12.63128043 +0000 UTC m=+0.191923057 container init c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.634 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920252.6341944, 6586bc3e-3a94-4d22-8e8c-713a86a956fb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.635 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] VM Started (Lifecycle Event)#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.636 225859 DEBUG nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:44:12 np0005588919 podman[258925]: 2026-01-20 14:44:12.642280791 +0000 UTC m=+0.202923378 container start c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.643 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.647 225859 INFO nova.virt.libvirt.driver [-] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Instance spawned successfully.#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.647 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.656 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.659 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:12 np0005588919 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258963]: [NOTICE]   (258968) : New worker (258970) forked
Jan 20 09:44:12 np0005588919 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258963]: [NOTICE]   (258968) : Loading success.
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.690 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.690 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920252.6343024, 6586bc3e-3a94-4d22-8e8c-713a86a956fb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.690 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.694 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.694 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.695 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.696 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.696 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.697 225859 DEBUG nova.virt.libvirt.driver [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.741 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.745 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920252.6387117, 6586bc3e-3a94-4d22-8e8c-713a86a956fb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.745 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.771 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.775 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.804 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.820 225859 INFO nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Took 9.94 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.820 225859 DEBUG nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.895 225859 INFO nova.compute.manager [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Took 11.15 seconds to build instance.#033[00m
Jan 20 09:44:12 np0005588919 nova_compute[225855]: 2026-01-20 14:44:12.914 225859 DEBUG oslo_concurrency.lockutils [None req-dc51a782-12d2-48be-9a64-fcce82be7137 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:13 np0005588919 nova_compute[225855]: 2026-01-20 14:44:13.179 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:13.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:44:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:14.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:44:14 np0005588919 nova_compute[225855]: 2026-01-20 14:44:14.674 225859 DEBUG nova.compute.manager [req-3f2c7451-045c-41b5-80f6-1f9035c07593 req-d951c8d7-557a-425f-828c-e4d29d10eb1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received event network-vif-plugged-2c289e6f-295e-44c3-948a-9a6901251890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:14 np0005588919 nova_compute[225855]: 2026-01-20 14:44:14.674 225859 DEBUG oslo_concurrency.lockutils [req-3f2c7451-045c-41b5-80f6-1f9035c07593 req-d951c8d7-557a-425f-828c-e4d29d10eb1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:14 np0005588919 nova_compute[225855]: 2026-01-20 14:44:14.675 225859 DEBUG oslo_concurrency.lockutils [req-3f2c7451-045c-41b5-80f6-1f9035c07593 req-d951c8d7-557a-425f-828c-e4d29d10eb1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:14 np0005588919 nova_compute[225855]: 2026-01-20 14:44:14.675 225859 DEBUG oslo_concurrency.lockutils [req-3f2c7451-045c-41b5-80f6-1f9035c07593 req-d951c8d7-557a-425f-828c-e4d29d10eb1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:14 np0005588919 nova_compute[225855]: 2026-01-20 14:44:14.675 225859 DEBUG nova.compute.manager [req-3f2c7451-045c-41b5-80f6-1f9035c07593 req-d951c8d7-557a-425f-828c-e4d29d10eb1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] No waiting events found dispatching network-vif-plugged-2c289e6f-295e-44c3-948a-9a6901251890 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:14 np0005588919 nova_compute[225855]: 2026-01-20 14:44:14.675 225859 WARNING nova.compute.manager [req-3f2c7451-045c-41b5-80f6-1f9035c07593 req-d951c8d7-557a-425f-828c-e4d29d10eb1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received unexpected event network-vif-plugged-2c289e6f-295e-44c3-948a-9a6901251890 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:44:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:44:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:15.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:44:15 np0005588919 nova_compute[225855]: 2026-01-20 14:44:15.385 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:16 np0005588919 nova_compute[225855]: 2026-01-20 14:44:16.257 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920241.2555792, d6ec5fce-44f2-4c13-b908-c45d7a919b34 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:16 np0005588919 nova_compute[225855]: 2026-01-20 14:44:16.257 225859 INFO nova.compute.manager [-] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:44:16 np0005588919 nova_compute[225855]: 2026-01-20 14:44:16.278 225859 DEBUG nova.compute.manager [None req-b4015f09-9d59-4aca-aaad-bcc6c5c30ff6 - - - - - -] [instance: d6ec5fce-44f2-4c13-b908-c45d7a919b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:16.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:16.405 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:16.406 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:16.407 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:16 np0005588919 nova_compute[225855]: 2026-01-20 14:44:16.614 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:16 np0005588919 NetworkManager[49104]: <info>  [1768920256.6153] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Jan 20 09:44:16 np0005588919 NetworkManager[49104]: <info>  [1768920256.6186] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 20 09:44:16 np0005588919 nova_compute[225855]: 2026-01-20 14:44:16.654 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:16 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:16Z|00304|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:44:16 np0005588919 nova_compute[225855]: 2026-01-20 14:44:16.663 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:16 np0005588919 nova_compute[225855]: 2026-01-20 14:44:16.950 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:16.951 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:16.952 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:44:17 np0005588919 nova_compute[225855]: 2026-01-20 14:44:17.052 225859 DEBUG nova.compute.manager [req-23231e28-fd47-4a6c-b281-2588441c8457 req-3ccdfe15-dd36-4f5f-9511-82b526feb4f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received event network-changed-2c289e6f-295e-44c3-948a-9a6901251890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:17 np0005588919 nova_compute[225855]: 2026-01-20 14:44:17.053 225859 DEBUG nova.compute.manager [req-23231e28-fd47-4a6c-b281-2588441c8457 req-3ccdfe15-dd36-4f5f-9511-82b526feb4f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Refreshing instance network info cache due to event network-changed-2c289e6f-295e-44c3-948a-9a6901251890. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:44:17 np0005588919 nova_compute[225855]: 2026-01-20 14:44:17.054 225859 DEBUG oslo_concurrency.lockutils [req-23231e28-fd47-4a6c-b281-2588441c8457 req-3ccdfe15-dd36-4f5f-9511-82b526feb4f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:17 np0005588919 nova_compute[225855]: 2026-01-20 14:44:17.054 225859 DEBUG oslo_concurrency.lockutils [req-23231e28-fd47-4a6c-b281-2588441c8457 req-3ccdfe15-dd36-4f5f-9511-82b526feb4f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:17 np0005588919 nova_compute[225855]: 2026-01-20 14:44:17.055 225859 DEBUG nova.network.neutron [req-23231e28-fd47-4a6c-b281-2588441c8457 req-3ccdfe15-dd36-4f5f-9511-82b526feb4f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Refreshing network info cache for port 2c289e6f-295e-44c3-948a-9a6901251890 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:44:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:17.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:18 np0005588919 nova_compute[225855]: 2026-01-20 14:44:18.181 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:18.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:19.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:19 np0005588919 nova_compute[225855]: 2026-01-20 14:44:19.304 225859 DEBUG nova.network.neutron [req-23231e28-fd47-4a6c-b281-2588441c8457 req-3ccdfe15-dd36-4f5f-9511-82b526feb4f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updated VIF entry in instance network info cache for port 2c289e6f-295e-44c3-948a-9a6901251890. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:44:19 np0005588919 nova_compute[225855]: 2026-01-20 14:44:19.305 225859 DEBUG nova.network.neutron [req-23231e28-fd47-4a6c-b281-2588441c8457 req-3ccdfe15-dd36-4f5f-9511-82b526feb4f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updating instance_info_cache with network_info: [{"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:19 np0005588919 nova_compute[225855]: 2026-01-20 14:44:19.339 225859 DEBUG oslo_concurrency.lockutils [req-23231e28-fd47-4a6c-b281-2588441c8457 req-3ccdfe15-dd36-4f5f-9511-82b526feb4f3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:20.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:20 np0005588919 nova_compute[225855]: 2026-01-20 14:44:20.387 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:20 np0005588919 nova_compute[225855]: 2026-01-20 14:44:20.612 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:20.955 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:21.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:21 np0005588919 nova_compute[225855]: 2026-01-20 14:44:21.324 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:22.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:23 np0005588919 nova_compute[225855]: 2026-01-20 14:44:23.184 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:23.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:24.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:24 np0005588919 nova_compute[225855]: 2026-01-20 14:44:24.593 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:24 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:24Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2f:4c:e2 10.100.0.9
Jan 20 09:44:24 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:24Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2f:4c:e2 10.100.0.9
Jan 20 09:44:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:25.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:25 np0005588919 nova_compute[225855]: 2026-01-20 14:44:25.389 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:26.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:26 np0005588919 nova_compute[225855]: 2026-01-20 14:44:26.989 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:27 np0005588919 podman[259036]: 2026-01-20 14:44:27.194897774 +0000 UTC m=+0.085174334 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 20 09:44:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:44:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:27.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:44:28 np0005588919 nova_compute[225855]: 2026-01-20 14:44:28.185 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:28.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:28 np0005588919 nova_compute[225855]: 2026-01-20 14:44:28.411 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:28 np0005588919 nova_compute[225855]: 2026-01-20 14:44:28.412 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:44:28 np0005588919 nova_compute[225855]: 2026-01-20 14:44:28.412 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:44:28 np0005588919 nova_compute[225855]: 2026-01-20 14:44:28.641 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:28 np0005588919 nova_compute[225855]: 2026-01-20 14:44:28.641 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:28 np0005588919 nova_compute[225855]: 2026-01-20 14:44:28.642 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:44:28 np0005588919 nova_compute[225855]: 2026-01-20 14:44:28.642 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6586bc3e-3a94-4d22-8e8c-713a86a956fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:29.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:29 np0005588919 nova_compute[225855]: 2026-01-20 14:44:29.973 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updating instance_info_cache with network_info: [{"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:29 np0005588919 nova_compute[225855]: 2026-01-20 14:44:29.987 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:29 np0005588919 nova_compute[225855]: 2026-01-20 14:44:29.987 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:44:29 np0005588919 nova_compute[225855]: 2026-01-20 14:44:29.988 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:29 np0005588919 nova_compute[225855]: 2026-01-20 14:44:29.988 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:30.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:30 np0005588919 nova_compute[225855]: 2026-01-20 14:44:30.391 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3024534450' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:31.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:31 np0005588919 nova_compute[225855]: 2026-01-20 14:44:31.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:31 np0005588919 nova_compute[225855]: 2026-01-20 14:44:31.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:31 np0005588919 nova_compute[225855]: 2026-01-20 14:44:31.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:44:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:32.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:33 np0005588919 nova_compute[225855]: 2026-01-20 14:44:33.187 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:33 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3327996647' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:33.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:34 np0005588919 nova_compute[225855]: 2026-01-20 14:44:34.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:34 np0005588919 nova_compute[225855]: 2026-01-20 14:44:34.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:34 np0005588919 nova_compute[225855]: 2026-01-20 14:44:34.363 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:34 np0005588919 nova_compute[225855]: 2026-01-20 14:44:34.364 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:34 np0005588919 nova_compute[225855]: 2026-01-20 14:44:34.364 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:34 np0005588919 nova_compute[225855]: 2026-01-20 14:44:34.365 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:44:34 np0005588919 nova_compute[225855]: 2026-01-20 14:44:34.365 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:34.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:34 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3129307005' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:34 np0005588919 nova_compute[225855]: 2026-01-20 14:44:34.862 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:34 np0005588919 nova_compute[225855]: 2026-01-20 14:44:34.989 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000057 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:44:34 np0005588919 nova_compute[225855]: 2026-01-20 14:44:34.990 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000057 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:44:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:35.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:35 np0005588919 nova_compute[225855]: 2026-01-20 14:44:35.434 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:35 np0005588919 nova_compute[225855]: 2026-01-20 14:44:35.458 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:44:35 np0005588919 nova_compute[225855]: 2026-01-20 14:44:35.459 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4342MB free_disk=20.915180206298828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:44:35 np0005588919 nova_compute[225855]: 2026-01-20 14:44:35.459 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:35 np0005588919 nova_compute[225855]: 2026-01-20 14:44:35.460 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:35 np0005588919 nova_compute[225855]: 2026-01-20 14:44:35.530 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 6586bc3e-3a94-4d22-8e8c-713a86a956fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:44:35 np0005588919 nova_compute[225855]: 2026-01-20 14:44:35.531 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:44:35 np0005588919 nova_compute[225855]: 2026-01-20 14:44:35.531 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:44:35 np0005588919 nova_compute[225855]: 2026-01-20 14:44:35.566 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2245713301' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:36 np0005588919 nova_compute[225855]: 2026-01-20 14:44:36.000 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:36 np0005588919 nova_compute[225855]: 2026-01-20 14:44:36.005 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:44:36 np0005588919 nova_compute[225855]: 2026-01-20 14:44:36.027 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:44:36 np0005588919 nova_compute[225855]: 2026-01-20 14:44:36.049 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:44:36 np0005588919 nova_compute[225855]: 2026-01-20 14:44:36.050 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:36.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:36 np0005588919 podman[259115]: 2026-01-20 14:44:36.478105558 +0000 UTC m=+0.053708622 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:44:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:37 np0005588919 nova_compute[225855]: 2026-01-20 14:44:37.044 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:37 np0005588919 nova_compute[225855]: 2026-01-20 14:44:37.044 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:37.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:38 np0005588919 nova_compute[225855]: 2026-01-20 14:44:38.190 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:38 np0005588919 nova_compute[225855]: 2026-01-20 14:44:38.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:44:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:38.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:38 np0005588919 nova_compute[225855]: 2026-01-20 14:44:38.954 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:38 np0005588919 nova_compute[225855]: 2026-01-20 14:44:38.954 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:38 np0005588919 nova_compute[225855]: 2026-01-20 14:44:38.969 225859 DEBUG nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.035 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.035 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.041 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.041 225859 INFO nova.compute.claims [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.226 225859 DEBUG oslo_concurrency.processutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:39.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:44:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4220284943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.669 225859 DEBUG oslo_concurrency.processutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.675 225859 DEBUG nova.compute.provider_tree [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.699 225859 DEBUG nova.scheduler.client.report [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.731 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.732 225859 DEBUG nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.828 225859 DEBUG nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.829 225859 DEBUG nova.network.neutron [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.861 225859 INFO nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.885 225859 DEBUG nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:44:39 np0005588919 nova_compute[225855]: 2026-01-20 14:44:39.939 225859 INFO nova.virt.block_device [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Booting with volume 2441d1fb-fc23-4a6d-b88d-4d82b035b65f at /dev/vda#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.086 225859 DEBUG os_brick.utils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.088 225859 DEBUG nova.policy [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9051b1fd0e0b40c2be07afc6da803903', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '144d821b8f624db687f0e009c5e06d8b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.087 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.099 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.100 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[4e156cc0-2bba-4f6d-ad2c-c3153abdb748]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.101 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.109 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.109 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[40c9403a-398a-40c0-8f5d-04c3a3a3673e]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.112 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.120 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.121 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[9e33cc23-1c23-4f53-ba70-d0893d6ba742]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.122 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[92ec4c17-b21c-4b93-9ed8-aaddd5afd883]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.122 225859 DEBUG oslo_concurrency.processutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.151 225859 DEBUG oslo_concurrency.processutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.155 225859 DEBUG os_brick.initiator.connectors.lightos [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.155 225859 DEBUG os_brick.initiator.connectors.lightos [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.156 225859 DEBUG os_brick.initiator.connectors.lightos [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.156 225859 DEBUG os_brick.utils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] <== get_connector_properties: return (69ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.157 225859 DEBUG nova.virt.block_device [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Updating existing volume attachment record: 6b0d250e-a1b5-4326-abb8-6df5e007e9a0 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:44:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:40.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:40 np0005588919 nova_compute[225855]: 2026-01-20 14:44:40.436 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:41 np0005588919 nova_compute[225855]: 2026-01-20 14:44:41.148 225859 DEBUG nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:44:41 np0005588919 nova_compute[225855]: 2026-01-20 14:44:41.149 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:44:41 np0005588919 nova_compute[225855]: 2026-01-20 14:44:41.150 225859 INFO nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Creating image(s)#033[00m
Jan 20 09:44:41 np0005588919 nova_compute[225855]: 2026-01-20 14:44:41.150 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:44:41 np0005588919 nova_compute[225855]: 2026-01-20 14:44:41.150 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Ensure instance console log exists: /var/lib/nova/instances/d236e7eb-2b7e-4031-b851-ae2790528213/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:44:41 np0005588919 nova_compute[225855]: 2026-01-20 14:44:41.151 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:41 np0005588919 nova_compute[225855]: 2026-01-20 14:44:41.151 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:41 np0005588919 nova_compute[225855]: 2026-01-20 14:44:41.151 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:41.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:41 np0005588919 nova_compute[225855]: 2026-01-20 14:44:41.572 225859 DEBUG nova.network.neutron [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Successfully created port: 019ea2f5-1721-42e7-9c77-4fc1599f8101 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:44:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:42.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:43 np0005588919 nova_compute[225855]: 2026-01-20 14:44:43.192 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:44:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:43.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:44:44 np0005588919 nova_compute[225855]: 2026-01-20 14:44:44.064 225859 DEBUG nova.network.neutron [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Successfully updated port: 019ea2f5-1721-42e7-9c77-4fc1599f8101 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:44:44 np0005588919 nova_compute[225855]: 2026-01-20 14:44:44.085 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Acquiring lock "refresh_cache-d236e7eb-2b7e-4031-b851-ae2790528213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:44 np0005588919 nova_compute[225855]: 2026-01-20 14:44:44.085 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Acquired lock "refresh_cache-d236e7eb-2b7e-4031-b851-ae2790528213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:44 np0005588919 nova_compute[225855]: 2026-01-20 14:44:44.085 225859 DEBUG nova.network.neutron [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:44:44 np0005588919 nova_compute[225855]: 2026-01-20 14:44:44.343 225859 DEBUG nova.network.neutron [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:44:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:44.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:45.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.438 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.500 225859 DEBUG nova.network.neutron [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Updating instance_info_cache with network_info: [{"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.527 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Releasing lock "refresh_cache-d236e7eb-2b7e-4031-b851-ae2790528213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.528 225859 DEBUG nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Instance network_info: |[{"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.530 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Start _get_guest_xml network_info=[{"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': True, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-2441d1fb-fc23-4a6d-b88d-4d82b035b65f', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '2441d1fb-fc23-4a6d-b88d-4d82b035b65f', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'd236e7eb-2b7e-4031-b851-ae2790528213', 'attached_at': '', 'detached_at': '', 'volume_id': '2441d1fb-fc23-4a6d-b88d-4d82b035b65f', 'serial': '2441d1fb-fc23-4a6d-b88d-4d82b035b65f'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '6b0d250e-a1b5-4326-abb8-6df5e007e9a0', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.535 225859 WARNING nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.539 225859 DEBUG nova.virt.libvirt.host [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.540 225859 DEBUG nova.virt.libvirt.host [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.543 225859 DEBUG nova.virt.libvirt.host [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.544 225859 DEBUG nova.virt.libvirt.host [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.544 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.545 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.545 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.545 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.545 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.546 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.546 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.546 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.546 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.546 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.546 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.547 225859 DEBUG nova.virt.hardware [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.576 225859 DEBUG nova.storage.rbd_utils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] rbd image d236e7eb-2b7e-4031-b851-ae2790528213_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:45 np0005588919 nova_compute[225855]: 2026-01-20 14:44:45.582 225859 DEBUG oslo_concurrency.processutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:44:46 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/504919309' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.026 225859 DEBUG oslo_concurrency.processutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.154 225859 DEBUG nova.virt.libvirt.vif [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestBootFromVolume-server-2113030518',display_name='tempest-ServersTestBootFromVolume-server-2113030518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestbootfromvolume-server-2113030518',id=91,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCQAnzboP+S2NNqULNJ590LALw1e9CwKJIrHyMoISyM6baLxtf4y84xsP0kgRy7bjF2fbaXhodzuoV+0+uj6MQE6N4Q+sHthmobL8XMJ7dekwWVSr0yZf1dgnshwlxyeDQ==',key_name='tempest-keypair-1996933376',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='144d821b8f624db687f0e009c5e06d8b',ramdisk_id='',reservation_id='r-55pxq2es',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServersTestBootFromVolume-628592216',owner_user_name='tempest-ServersTestBootFromVolume-628592216-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:44:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9051b1fd0e0b40c2be07afc6da803903',uuid=d236e7eb-2b7e-4031-b851-ae2790528213,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.154 225859 DEBUG nova.network.os_vif_util [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Converting VIF {"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.157 225859 DEBUG nova.network.os_vif_util [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:2c:78,bridge_name='br-int',has_traffic_filtering=True,id=019ea2f5-1721-42e7-9c77-4fc1599f8101,network=Network(eef55bf0-ad6a-4f88-adac-d746a869d579),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap019ea2f5-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.158 225859 DEBUG nova.objects.instance [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lazy-loading 'pci_devices' on Instance uuid d236e7eb-2b7e-4031-b851-ae2790528213 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.171 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <uuid>d236e7eb-2b7e-4031-b851-ae2790528213</uuid>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <name>instance-0000005b</name>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServersTestBootFromVolume-server-2113030518</nova:name>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:44:45</nova:creationTime>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <nova:user uuid="9051b1fd0e0b40c2be07afc6da803903">tempest-ServersTestBootFromVolume-628592216-project-member</nova:user>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <nova:project uuid="144d821b8f624db687f0e009c5e06d8b">tempest-ServersTestBootFromVolume-628592216</nova:project>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <nova:port uuid="019ea2f5-1721-42e7-9c77-4fc1599f8101">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <entry name="serial">d236e7eb-2b7e-4031-b851-ae2790528213</entry>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <entry name="uuid">d236e7eb-2b7e-4031-b851-ae2790528213</entry>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d236e7eb-2b7e-4031-b851-ae2790528213_disk.config">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-2441d1fb-fc23-4a6d-b88d-4d82b035b65f">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <serial>2441d1fb-fc23-4a6d-b88d-4d82b035b65f</serial>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:08:2c:78"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <target dev="tap019ea2f5-17"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/d236e7eb-2b7e-4031-b851-ae2790528213/console.log" append="off"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:44:46 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:44:46 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:44:46 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:44:46 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.171 225859 DEBUG nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Preparing to wait for external event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.171 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.172 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.172 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.172 225859 DEBUG nova.virt.libvirt.vif [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestBootFromVolume-server-2113030518',display_name='tempest-ServersTestBootFromVolume-server-2113030518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestbootfromvolume-server-2113030518',id=91,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCQAnzboP+S2NNqULNJ590LALw1e9CwKJIrHyMoISyM6baLxtf4y84xsP0kgRy7bjF2fbaXhodzuoV+0+uj6MQE6N4Q+sHthmobL8XMJ7dekwWVSr0yZf1dgnshwlxyeDQ==',key_name='tempest-keypair-1996933376',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='144d821b8f624db687f0e009c5e06d8b',ramdisk_id='',reservation_id='r-55pxq2es',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServersTestBootFromVolume-628592216',owner_user_name='tempest-ServersTestBootFromVolume-628592216-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:44:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9051b1fd0e0b40c2be07afc6da803903',uuid=d236e7eb-2b7e-4031-b851-ae2790528213,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.173 225859 DEBUG nova.network.os_vif_util [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Converting VIF {"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.173 225859 DEBUG nova.network.os_vif_util [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:2c:78,bridge_name='br-int',has_traffic_filtering=True,id=019ea2f5-1721-42e7-9c77-4fc1599f8101,network=Network(eef55bf0-ad6a-4f88-adac-d746a869d579),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap019ea2f5-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.173 225859 DEBUG os_vif [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:2c:78,bridge_name='br-int',has_traffic_filtering=True,id=019ea2f5-1721-42e7-9c77-4fc1599f8101,network=Network(eef55bf0-ad6a-4f88-adac-d746a869d579),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap019ea2f5-17') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.174 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.174 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.175 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.177 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.177 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap019ea2f5-17, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.178 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap019ea2f5-17, col_values=(('external_ids', {'iface-id': '019ea2f5-1721-42e7-9c77-4fc1599f8101', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:08:2c:78', 'vm-uuid': 'd236e7eb-2b7e-4031-b851-ae2790528213'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.218 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:46 np0005588919 NetworkManager[49104]: <info>  [1768920286.2204] manager: (tap019ea2f5-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.221 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.225 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.227 225859 INFO os_vif [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:2c:78,bridge_name='br-int',has_traffic_filtering=True,id=019ea2f5-1721-42e7-9c77-4fc1599f8101,network=Network(eef55bf0-ad6a-4f88-adac-d746a869d579),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap019ea2f5-17')#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.268 225859 DEBUG nova.compute.manager [req-67efbc5e-1efe-45e7-bc12-9e7d78b41365 req-965e8d8f-f43f-40b6-9665-38cc3d54deae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-changed-019ea2f5-1721-42e7-9c77-4fc1599f8101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.269 225859 DEBUG nova.compute.manager [req-67efbc5e-1efe-45e7-bc12-9e7d78b41365 req-965e8d8f-f43f-40b6-9665-38cc3d54deae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Refreshing instance network info cache due to event network-changed-019ea2f5-1721-42e7-9c77-4fc1599f8101. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.269 225859 DEBUG oslo_concurrency.lockutils [req-67efbc5e-1efe-45e7-bc12-9e7d78b41365 req-965e8d8f-f43f-40b6-9665-38cc3d54deae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d236e7eb-2b7e-4031-b851-ae2790528213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.269 225859 DEBUG oslo_concurrency.lockutils [req-67efbc5e-1efe-45e7-bc12-9e7d78b41365 req-965e8d8f-f43f-40b6-9665-38cc3d54deae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d236e7eb-2b7e-4031-b851-ae2790528213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.270 225859 DEBUG nova.network.neutron [req-67efbc5e-1efe-45e7-bc12-9e7d78b41365 req-965e8d8f-f43f-40b6-9665-38cc3d54deae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Refreshing network info cache for port 019ea2f5-1721-42e7-9c77-4fc1599f8101 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.278 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.278 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.279 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] No VIF found with MAC fa:16:3e:08:2c:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.279 225859 INFO nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Using config drive#033[00m
Jan 20 09:44:46 np0005588919 nova_compute[225855]: 2026-01-20 14:44:46.306 225859 DEBUG nova.storage.rbd_utils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] rbd image d236e7eb-2b7e-4031-b851-ae2790528213_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:46.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:47.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:48.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:48 np0005588919 nova_compute[225855]: 2026-01-20 14:44:48.554 225859 INFO nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Creating config drive at /var/lib/nova/instances/d236e7eb-2b7e-4031-b851-ae2790528213/disk.config#033[00m
Jan 20 09:44:48 np0005588919 nova_compute[225855]: 2026-01-20 14:44:48.560 225859 DEBUG oslo_concurrency.processutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d236e7eb-2b7e-4031-b851-ae2790528213/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9rdjtwtx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:48 np0005588919 nova_compute[225855]: 2026-01-20 14:44:48.696 225859 DEBUG oslo_concurrency.processutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d236e7eb-2b7e-4031-b851-ae2790528213/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9rdjtwtx" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:48 np0005588919 nova_compute[225855]: 2026-01-20 14:44:48.728 225859 DEBUG nova.storage.rbd_utils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] rbd image d236e7eb-2b7e-4031-b851-ae2790528213_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:44:48 np0005588919 nova_compute[225855]: 2026-01-20 14:44:48.732 225859 DEBUG oslo_concurrency.processutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d236e7eb-2b7e-4031-b851-ae2790528213/disk.config d236e7eb-2b7e-4031-b851-ae2790528213_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:44:48 np0005588919 nova_compute[225855]: 2026-01-20 14:44:48.901 225859 DEBUG oslo_concurrency.processutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d236e7eb-2b7e-4031-b851-ae2790528213/disk.config d236e7eb-2b7e-4031-b851-ae2790528213_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.169s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:44:48 np0005588919 nova_compute[225855]: 2026-01-20 14:44:48.903 225859 INFO nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Deleting local config drive /var/lib/nova/instances/d236e7eb-2b7e-4031-b851-ae2790528213/disk.config because it was imported into RBD.#033[00m
Jan 20 09:44:48 np0005588919 kernel: tap019ea2f5-17: entered promiscuous mode
Jan 20 09:44:48 np0005588919 NetworkManager[49104]: <info>  [1768920288.9610] manager: (tap019ea2f5-17): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Jan 20 09:44:48 np0005588919 nova_compute[225855]: 2026-01-20 14:44:48.962 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:48 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:48Z|00305|binding|INFO|Claiming lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 for this chassis.
Jan 20 09:44:48 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:48Z|00306|binding|INFO|019ea2f5-1721-42e7-9c77-4fc1599f8101: Claiming fa:16:3e:08:2c:78 10.100.0.10
Jan 20 09:44:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:48.969 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:2c:78 10.100.0.10'], port_security=['fa:16:3e:08:2c:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd236e7eb-2b7e-4031-b851-ae2790528213', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eef55bf0-ad6a-4f88-adac-d746a869d579', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '144d821b8f624db687f0e009c5e06d8b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4307ffb0-d161-4f09-96d3-b205cef524f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd338232-5bc6-43af-b249-74124e0134b1, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=019ea2f5-1721-42e7-9c77-4fc1599f8101) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:48.971 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 019ea2f5-1721-42e7-9c77-4fc1599f8101 in datapath eef55bf0-ad6a-4f88-adac-d746a869d579 bound to our chassis#033[00m
Jan 20 09:44:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:48.972 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eef55bf0-ad6a-4f88-adac-d746a869d579#033[00m
Jan 20 09:44:48 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:48Z|00307|binding|INFO|Setting lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 ovn-installed in OVS
Jan 20 09:44:48 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:48Z|00308|binding|INFO|Setting lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 up in Southbound
Jan 20 09:44:48 np0005588919 nova_compute[225855]: 2026-01-20 14:44:48.979 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:48 np0005588919 nova_compute[225855]: 2026-01-20 14:44:48.983 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:48.989 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2414cedf-0c5f-4c98-b208-9d5c41073bab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:48.990 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeef55bf0-a1 in ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:44:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:48.991 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeef55bf0-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:44:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:48.992 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1f2ff404-e733-4ae7-9a87-1324ca8be9fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:48.993 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eef08c78-f4f1-4cdc-afec-2e9ba01e1f01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 systemd-machined[194361]: New machine qemu-37-instance-0000005b.
Jan 20 09:44:49 np0005588919 systemd-udevd[259335]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.009 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[0824b572-9fe9-4c57-b099-54abc208b435]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 NetworkManager[49104]: <info>  [1768920289.0193] device (tap019ea2f5-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:44:49 np0005588919 NetworkManager[49104]: <info>  [1768920289.0207] device (tap019ea2f5-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:44:49 np0005588919 systemd[1]: Started Virtual Machine qemu-37-instance-0000005b.
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.022 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9423cf99-c019-4340-80c4-b3e34269cadc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.051 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f710b10f-4326-4270-bf7f-5e6a7d08c212]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 systemd-udevd[259338]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:44:49 np0005588919 NetworkManager[49104]: <info>  [1768920289.0576] manager: (tapeef55bf0-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/137)
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.056 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cc235f8c-89c9-4d75-9085-e28738848db9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.085 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b242a10a-a8cb-4af9-9a4d-214726344ba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.088 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3250e2-2120-41c2-ba19-907246190745]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 NetworkManager[49104]: <info>  [1768920289.1134] device (tapeef55bf0-a0): carrier: link connected
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.118 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[03abe4a0-dcc6-4ff6-83f0-23cc77f75936]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.137 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a53cb708-f4e8-457a-94af-05c28a49ce7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeef55bf0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:1e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532804, 'reachable_time': 22563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259366, 'error': None, 'target': 'ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.153 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c36c6bba-9294-4196-849a-6c5d2555f5ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:1e69'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532804, 'tstamp': 532804}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259367, 'error': None, 'target': 'ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.173 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[48201e73-b18e-4575-9e71-7436ad914b6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeef55bf0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:1e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532804, 'reachable_time': 22563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259368, 'error': None, 'target': 'ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.203 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[711947f5-1326-411b-b586-7911c77df1e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.267 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b91f042a-3b64-412e-bec4-f9c968b83178]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.269 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeef55bf0-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.269 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.270 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeef55bf0-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:49 np0005588919 kernel: tapeef55bf0-a0: entered promiscuous mode
Jan 20 09:44:49 np0005588919 NetworkManager[49104]: <info>  [1768920289.2723] manager: (tapeef55bf0-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.272 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.274 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.275 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeef55bf0-a0, col_values=(('external_ids', {'iface-id': '7520a61f-574f-4683-b47f-71e915a4dabe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.276 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:49Z|00309|binding|INFO|Releasing lport 7520a61f-574f-4683-b47f-71e915a4dabe from this chassis (sb_readonly=0)
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.279 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.279 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eef55bf0-ad6a-4f88-adac-d746a869d579.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eef55bf0-ad6a-4f88-adac-d746a869d579.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.280 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[55390bde-881a-4201-a973-2362c455c670]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.281 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-eef55bf0-ad6a-4f88-adac-d746a869d579
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/eef55bf0-ad6a-4f88-adac-d746a869d579.pid.haproxy
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID eef55bf0-ad6a-4f88-adac-d746a869d579
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:44:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:49.282 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579', 'env', 'PROCESS_TAG=haproxy-eef55bf0-ad6a-4f88-adac-d746a869d579', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eef55bf0-ad6a-4f88-adac-d746a869d579.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.300 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:49.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.323 225859 DEBUG nova.compute.manager [req-4d036d0e-36de-4cf3-ae70-b7da9992c98b req-61424c07-aead-4154-bb08-20d13036545c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.323 225859 DEBUG oslo_concurrency.lockutils [req-4d036d0e-36de-4cf3-ae70-b7da9992c98b req-61424c07-aead-4154-bb08-20d13036545c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.324 225859 DEBUG oslo_concurrency.lockutils [req-4d036d0e-36de-4cf3-ae70-b7da9992c98b req-61424c07-aead-4154-bb08-20d13036545c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.324 225859 DEBUG oslo_concurrency.lockutils [req-4d036d0e-36de-4cf3-ae70-b7da9992c98b req-61424c07-aead-4154-bb08-20d13036545c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.324 225859 DEBUG nova.compute.manager [req-4d036d0e-36de-4cf3-ae70-b7da9992c98b req-61424c07-aead-4154-bb08-20d13036545c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Processing event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.479 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920289.4788468, d236e7eb-2b7e-4031-b851-ae2790528213 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.480 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] VM Started (Lifecycle Event)#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.482 225859 DEBUG nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.486 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.491 225859 INFO nova.virt.libvirt.driver [-] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Instance spawned successfully.#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.491 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.496 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.500 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.503 225859 DEBUG nova.network.neutron [req-67efbc5e-1efe-45e7-bc12-9e7d78b41365 req-965e8d8f-f43f-40b6-9665-38cc3d54deae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Updated VIF entry in instance network info cache for port 019ea2f5-1721-42e7-9c77-4fc1599f8101. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.503 225859 DEBUG nova.network.neutron [req-67efbc5e-1efe-45e7-bc12-9e7d78b41365 req-965e8d8f-f43f-40b6-9665-38cc3d54deae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Updating instance_info_cache with network_info: [{"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.511 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.511 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.512 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.512 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.513 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.513 225859 DEBUG nova.virt.libvirt.driver [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.520 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.521 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920289.479793, d236e7eb-2b7e-4031-b851-ae2790528213 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.524 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.546 225859 DEBUG oslo_concurrency.lockutils [req-67efbc5e-1efe-45e7-bc12-9e7d78b41365 req-965e8d8f-f43f-40b6-9665-38cc3d54deae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d236e7eb-2b7e-4031-b851-ae2790528213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.549 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.552 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920289.4853854, d236e7eb-2b7e-4031-b851-ae2790528213 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.552 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.575 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.578 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.586 225859 INFO nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Took 8.44 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.587 225859 DEBUG nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.595 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.637 225859 INFO nova.compute.manager [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Took 10.62 seconds to build instance.#033[00m
Jan 20 09:44:49 np0005588919 nova_compute[225855]: 2026-01-20 14:44:49.658 225859 DEBUG oslo_concurrency.lockutils [None req-2a13799f-b3e4-48ea-b45d-a9254b040ace 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:49 np0005588919 podman[259441]: 2026-01-20 14:44:49.662533394 +0000 UTC m=+0.051120759 container create e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 09:44:49 np0005588919 systemd[1]: Started libpod-conmon-e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69.scope.
Jan 20 09:44:49 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:44:49 np0005588919 podman[259441]: 2026-01-20 14:44:49.63664248 +0000 UTC m=+0.025229865 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:44:49 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6c68af6525807938ca92d3080ecb780abe59857167dc3a5929ab0345a12e53c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:44:49 np0005588919 podman[259441]: 2026-01-20 14:44:49.745839123 +0000 UTC m=+0.134426498 container init e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:44:49 np0005588919 podman[259441]: 2026-01-20 14:44:49.751177304 +0000 UTC m=+0.139764669 container start e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 20 09:44:49 np0005588919 neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579[259456]: [NOTICE]   (259460) : New worker (259462) forked
Jan 20 09:44:49 np0005588919 neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579[259456]: [NOTICE]   (259460) : Loading success.
Jan 20 09:44:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:50.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:50 np0005588919 nova_compute[225855]: 2026-01-20 14:44:50.441 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:51 np0005588919 nova_compute[225855]: 2026-01-20 14:44:51.219 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:51.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:51 np0005588919 nova_compute[225855]: 2026-01-20 14:44:51.397 225859 DEBUG nova.compute.manager [req-8e5bd89f-3d5d-4529-a8fe-e41a5cdcef66 req-a65f3f1a-7082-4368-ab75-7de7d4d0a109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:51 np0005588919 nova_compute[225855]: 2026-01-20 14:44:51.398 225859 DEBUG oslo_concurrency.lockutils [req-8e5bd89f-3d5d-4529-a8fe-e41a5cdcef66 req-a65f3f1a-7082-4368-ab75-7de7d4d0a109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:44:51 np0005588919 nova_compute[225855]: 2026-01-20 14:44:51.398 225859 DEBUG oslo_concurrency.lockutils [req-8e5bd89f-3d5d-4529-a8fe-e41a5cdcef66 req-a65f3f1a-7082-4368-ab75-7de7d4d0a109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:44:51 np0005588919 nova_compute[225855]: 2026-01-20 14:44:51.398 225859 DEBUG oslo_concurrency.lockutils [req-8e5bd89f-3d5d-4529-a8fe-e41a5cdcef66 req-a65f3f1a-7082-4368-ab75-7de7d4d0a109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:44:51 np0005588919 nova_compute[225855]: 2026-01-20 14:44:51.399 225859 DEBUG nova.compute.manager [req-8e5bd89f-3d5d-4529-a8fe-e41a5cdcef66 req-a65f3f1a-7082-4368-ab75-7de7d4d0a109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] No waiting events found dispatching network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:44:51 np0005588919 nova_compute[225855]: 2026-01-20 14:44:51.399 225859 WARNING nova.compute.manager [req-8e5bd89f-3d5d-4529-a8fe-e41a5cdcef66 req-a65f3f1a-7082-4368-ab75-7de7d4d0a109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received unexpected event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:44:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:52.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:53Z|00310|binding|INFO|Releasing lport 7520a61f-574f-4683-b47f-71e915a4dabe from this chassis (sb_readonly=0)
Jan 20 09:44:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:44:53Z|00311|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:44:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:53.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:53 np0005588919 nova_compute[225855]: 2026-01-20 14:44:53.336 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:53 np0005588919 nova_compute[225855]: 2026-01-20 14:44:53.495 225859 DEBUG nova.compute.manager [req-73f25326-0c8b-49b2-b95e-4d925f81d3d3 req-43a2cb58-9fb1-4a95-9099-5159aaf8e54f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-changed-019ea2f5-1721-42e7-9c77-4fc1599f8101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:44:53 np0005588919 nova_compute[225855]: 2026-01-20 14:44:53.495 225859 DEBUG nova.compute.manager [req-73f25326-0c8b-49b2-b95e-4d925f81d3d3 req-43a2cb58-9fb1-4a95-9099-5159aaf8e54f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Refreshing instance network info cache due to event network-changed-019ea2f5-1721-42e7-9c77-4fc1599f8101. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:44:53 np0005588919 nova_compute[225855]: 2026-01-20 14:44:53.495 225859 DEBUG oslo_concurrency.lockutils [req-73f25326-0c8b-49b2-b95e-4d925f81d3d3 req-43a2cb58-9fb1-4a95-9099-5159aaf8e54f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d236e7eb-2b7e-4031-b851-ae2790528213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:44:53 np0005588919 nova_compute[225855]: 2026-01-20 14:44:53.495 225859 DEBUG oslo_concurrency.lockutils [req-73f25326-0c8b-49b2-b95e-4d925f81d3d3 req-43a2cb58-9fb1-4a95-9099-5159aaf8e54f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d236e7eb-2b7e-4031-b851-ae2790528213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:44:53 np0005588919 nova_compute[225855]: 2026-01-20 14:44:53.496 225859 DEBUG nova.network.neutron [req-73f25326-0c8b-49b2-b95e-4d925f81d3d3 req-43a2cb58-9fb1-4a95-9099-5159aaf8e54f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Refreshing network info cache for port 019ea2f5-1721-42e7-9c77-4fc1599f8101 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:44:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:54.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:55 np0005588919 nova_compute[225855]: 2026-01-20 14:44:55.240 225859 DEBUG nova.network.neutron [req-73f25326-0c8b-49b2-b95e-4d925f81d3d3 req-43a2cb58-9fb1-4a95-9099-5159aaf8e54f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Updated VIF entry in instance network info cache for port 019ea2f5-1721-42e7-9c77-4fc1599f8101. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:44:55 np0005588919 nova_compute[225855]: 2026-01-20 14:44:55.241 225859 DEBUG nova.network.neutron [req-73f25326-0c8b-49b2-b95e-4d925f81d3d3 req-43a2cb58-9fb1-4a95-9099-5159aaf8e54f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Updating instance_info_cache with network_info: [{"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:44:55 np0005588919 nova_compute[225855]: 2026-01-20 14:44:55.258 225859 DEBUG oslo_concurrency.lockutils [req-73f25326-0c8b-49b2-b95e-4d925f81d3d3 req-43a2cb58-9fb1-4a95-9099-5159aaf8e54f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d236e7eb-2b7e-4031-b851-ae2790528213" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:44:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:55.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:55.331 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:44:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:44:55.334 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:44:55 np0005588919 nova_compute[225855]: 2026-01-20 14:44:55.375 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:55 np0005588919 nova_compute[225855]: 2026-01-20 14:44:55.443 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:56 np0005588919 nova_compute[225855]: 2026-01-20 14:44:56.222 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:44:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:44:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:56.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:44:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:44:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:44:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:57.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:44:58 np0005588919 podman[259475]: 2026-01-20 14:44:58.039654676 +0000 UTC m=+0.073220365 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:44:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:44:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:44:58.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:44:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:44:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:44:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:44:59.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:45:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:00.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:00 np0005588919 nova_compute[225855]: 2026-01-20 14:45:00.444 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:01 np0005588919 nova_compute[225855]: 2026-01-20 14:45:01.225 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:01.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:02 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:02.336 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:02.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:02 np0005588919 nova_compute[225855]: 2026-01-20 14:45:02.555 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:03.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:04.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:45:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:45:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:45:04 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:04Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:08:2c:78 10.100.0.10
Jan 20 09:45:04 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:04Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:08:2c:78 10.100.0.10
Jan 20 09:45:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:05.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:05 np0005588919 nova_compute[225855]: 2026-01-20 14:45:05.447 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:05 np0005588919 nova_compute[225855]: 2026-01-20 14:45:05.996 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:06 np0005588919 nova_compute[225855]: 2026-01-20 14:45:06.227 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:06.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:07 np0005588919 podman[259635]: 2026-01-20 14:45:07.012961574 +0000 UTC m=+0.058197200 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:45:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:07.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:08.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:09.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:10.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:10 np0005588919 nova_compute[225855]: 2026-01-20 14:45:10.450 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:11 np0005588919 nova_compute[225855]: 2026-01-20 14:45:11.230 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:11.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:12.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:45:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:45:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:13.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:14.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:15.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:15 np0005588919 nova_compute[225855]: 2026-01-20 14:45:15.473 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:16 np0005588919 nova_compute[225855]: 2026-01-20 14:45:16.232 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:16.406 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:16.406 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:16.407 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:16.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:17.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:45:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:18.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:45:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:19.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:20.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:20 np0005588919 nova_compute[225855]: 2026-01-20 14:45:20.476 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:21 np0005588919 nova_compute[225855]: 2026-01-20 14:45:21.234 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:45:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:21.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:45:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:45:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:22.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:45:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:23.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:24.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:25.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:25 np0005588919 nova_compute[225855]: 2026-01-20 14:45:25.513 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.081778) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326081919, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 2337, "num_deletes": 251, "total_data_size": 5186721, "memory_usage": 5269544, "flush_reason": "Manual Compaction"}
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326117040, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 3397927, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39133, "largest_seqno": 41465, "table_properties": {"data_size": 3388811, "index_size": 5546, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20370, "raw_average_key_size": 20, "raw_value_size": 3370006, "raw_average_value_size": 3400, "num_data_blocks": 242, "num_entries": 991, "num_filter_entries": 991, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920134, "oldest_key_time": 1768920134, "file_creation_time": 1768920326, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 35318 microseconds, and 15907 cpu microseconds.
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.117085) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 3397927 bytes OK
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.117115) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.118506) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.118519) EVENT_LOG_v1 {"time_micros": 1768920326118515, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.118539) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 5176303, prev total WAL file size 5176303, number of live WAL files 2.
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.119858) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(3318KB)], [75(9563KB)]
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326119896, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 13190636, "oldest_snapshot_seqno": -1}
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6698 keys, 11262128 bytes, temperature: kUnknown
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326233943, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 11262128, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11216324, "index_size": 27964, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16773, "raw_key_size": 171844, "raw_average_key_size": 25, "raw_value_size": 11095333, "raw_average_value_size": 1656, "num_data_blocks": 1115, "num_entries": 6698, "num_filter_entries": 6698, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768920326, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.234246) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 11262128 bytes
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.235748) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 115.6 rd, 98.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 9.3 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(7.2) write-amplify(3.3) OK, records in: 7213, records dropped: 515 output_compression: NoCompression
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.235765) EVENT_LOG_v1 {"time_micros": 1768920326235757, "job": 46, "event": "compaction_finished", "compaction_time_micros": 114125, "compaction_time_cpu_micros": 25815, "output_level": 6, "num_output_files": 1, "total_output_size": 11262128, "num_input_records": 7213, "num_output_records": 6698, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:45:26 np0005588919 nova_compute[225855]: 2026-01-20 14:45:26.235 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326236508, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920326238251, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.119768) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.238362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.238368) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.238370) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.238372) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:45:26.238374) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:45:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:26.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:27.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:27 np0005588919 nova_compute[225855]: 2026-01-20 14:45:27.822 225859 DEBUG oslo_concurrency.lockutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:27 np0005588919 nova_compute[225855]: 2026-01-20 14:45:27.822 225859 DEBUG oslo_concurrency.lockutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:27 np0005588919 nova_compute[225855]: 2026-01-20 14:45:27.823 225859 DEBUG oslo_concurrency.lockutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:27 np0005588919 nova_compute[225855]: 2026-01-20 14:45:27.823 225859 DEBUG oslo_concurrency.lockutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:27 np0005588919 nova_compute[225855]: 2026-01-20 14:45:27.823 225859 DEBUG oslo_concurrency.lockutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:27 np0005588919 nova_compute[225855]: 2026-01-20 14:45:27.824 225859 INFO nova.compute.manager [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Terminating instance#033[00m
Jan 20 09:45:27 np0005588919 nova_compute[225855]: 2026-01-20 14:45:27.826 225859 DEBUG nova.compute.manager [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:45:27 np0005588919 kernel: tap019ea2f5-17 (unregistering): left promiscuous mode
Jan 20 09:45:27 np0005588919 NetworkManager[49104]: <info>  [1768920327.8827] device (tap019ea2f5-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:45:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:27Z|00312|binding|INFO|Releasing lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 from this chassis (sb_readonly=0)
Jan 20 09:45:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:27Z|00313|binding|INFO|Setting lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 down in Southbound
Jan 20 09:45:27 np0005588919 nova_compute[225855]: 2026-01-20 14:45:27.917 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:27Z|00314|binding|INFO|Removing iface tap019ea2f5-17 ovn-installed in OVS
Jan 20 09:45:27 np0005588919 nova_compute[225855]: 2026-01-20 14:45:27.919 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:27.924 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:2c:78 10.100.0.10'], port_security=['fa:16:3e:08:2c:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd236e7eb-2b7e-4031-b851-ae2790528213', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eef55bf0-ad6a-4f88-adac-d746a869d579', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '144d821b8f624db687f0e009c5e06d8b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4307ffb0-d161-4f09-96d3-b205cef524f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.208'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd338232-5bc6-43af-b249-74124e0134b1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=019ea2f5-1721-42e7-9c77-4fc1599f8101) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:45:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:27.925 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 019ea2f5-1721-42e7-9c77-4fc1599f8101 in datapath eef55bf0-ad6a-4f88-adac-d746a869d579 unbound from our chassis#033[00m
Jan 20 09:45:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:27.927 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eef55bf0-ad6a-4f88-adac-d746a869d579, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:45:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:27.928 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1532fcb6-4d42-4eaf-aa2e-3bbb83776119]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:27.929 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579 namespace which is not needed anymore#033[00m
Jan 20 09:45:27 np0005588919 nova_compute[225855]: 2026-01-20 14:45:27.933 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:27 np0005588919 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005b.scope: Deactivated successfully.
Jan 20 09:45:27 np0005588919 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005b.scope: Consumed 14.468s CPU time.
Jan 20 09:45:27 np0005588919 systemd-machined[194361]: Machine qemu-37-instance-0000005b terminated.
Jan 20 09:45:28 np0005588919 kernel: tap019ea2f5-17: entered promiscuous mode
Jan 20 09:45:28 np0005588919 kernel: tap019ea2f5-17 (unregistering): left promiscuous mode
Jan 20 09:45:28 np0005588919 NetworkManager[49104]: <info>  [1768920328.0473] manager: (tap019ea2f5-17): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Jan 20 09:45:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:28Z|00315|binding|INFO|Claiming lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 for this chassis.
Jan 20 09:45:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:28Z|00316|binding|INFO|019ea2f5-1721-42e7-9c77-4fc1599f8101: Claiming fa:16:3e:08:2c:78 10.100.0.10
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.048 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.057 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:2c:78 10.100.0.10'], port_security=['fa:16:3e:08:2c:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd236e7eb-2b7e-4031-b851-ae2790528213', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eef55bf0-ad6a-4f88-adac-d746a869d579', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '144d821b8f624db687f0e009c5e06d8b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4307ffb0-d161-4f09-96d3-b205cef524f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.208'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd338232-5bc6-43af-b249-74124e0134b1, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=019ea2f5-1721-42e7-9c77-4fc1599f8101) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:45:28 np0005588919 neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579[259456]: [NOTICE]   (259460) : haproxy version is 2.8.14-c23fe91
Jan 20 09:45:28 np0005588919 neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579[259456]: [NOTICE]   (259460) : path to executable is /usr/sbin/haproxy
Jan 20 09:45:28 np0005588919 neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579[259456]: [WARNING]  (259460) : Exiting Master process...
Jan 20 09:45:28 np0005588919 neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579[259456]: [ALERT]    (259460) : Current worker (259462) exited with code 143 (Terminated)
Jan 20 09:45:28 np0005588919 neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579[259456]: [WARNING]  (259460) : All workers exited. Exiting... (0)
Jan 20 09:45:28 np0005588919 systemd[1]: libpod-e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69.scope: Deactivated successfully.
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.065 225859 INFO nova.virt.libvirt.driver [-] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Instance destroyed successfully.#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.066 225859 DEBUG nova.objects.instance [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lazy-loading 'resources' on Instance uuid d236e7eb-2b7e-4031-b851-ae2790528213 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:28Z|00317|binding|INFO|Setting lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 ovn-installed in OVS
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.072 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:28Z|00318|binding|INFO|Setting lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 up in Southbound
Jan 20 09:45:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:28Z|00319|binding|INFO|Releasing lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 from this chassis (sb_readonly=1)
Jan 20 09:45:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:28Z|00320|if_status|INFO|Dropped 2 log messages in last 270 seconds (most recently, 270 seconds ago) due to excessive rate
Jan 20 09:45:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:28Z|00321|if_status|INFO|Not setting lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 down as sb is readonly
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.074 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:28Z|00322|binding|INFO|Removing iface tap019ea2f5-17 ovn-installed in OVS
Jan 20 09:45:28 np0005588919 podman[259840]: 2026-01-20 14:45:28.075732464 +0000 UTC m=+0.055604056 container died e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:45:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:28Z|00323|binding|INFO|Releasing lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 from this chassis (sb_readonly=0)
Jan 20 09:45:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:28Z|00324|binding|INFO|Setting lport 019ea2f5-1721-42e7-9c77-4fc1599f8101 down in Southbound
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.090 225859 DEBUG nova.virt.libvirt.vif [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestBootFromVolume-server-2113030518',display_name='tempest-ServersTestBootFromVolume-server-2113030518',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestbootfromvolume-server-2113030518',id=91,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCQAnzboP+S2NNqULNJ590LALw1e9CwKJIrHyMoISyM6baLxtf4y84xsP0kgRy7bjF2fbaXhodzuoV+0+uj6MQE6N4Q+sHthmobL8XMJ7dekwWVSr0yZf1dgnshwlxyeDQ==',key_name='tempest-keypair-1996933376',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:44:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='144d821b8f624db687f0e009c5e06d8b',ramdisk_id='',reservation_id='r-55pxq2es',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-ServersTestBootFromVolume-628592216',owner_user_name='tempest-ServersTestBootFromVolume-628592216-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:44:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9051b1fd0e0b40c2be07afc6da803903',uuid=d236e7eb-2b7e-4031-b851-ae2790528213,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.091 225859 DEBUG nova.network.os_vif_util [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Converting VIF {"id": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "address": "fa:16:3e:08:2c:78", "network": {"id": "eef55bf0-ad6a-4f88-adac-d746a869d579", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-101625742-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "144d821b8f624db687f0e009c5e06d8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap019ea2f5-17", "ovs_interfaceid": "019ea2f5-1721-42e7-9c77-4fc1599f8101", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.093 225859 DEBUG nova.network.os_vif_util [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:08:2c:78,bridge_name='br-int',has_traffic_filtering=True,id=019ea2f5-1721-42e7-9c77-4fc1599f8101,network=Network(eef55bf0-ad6a-4f88-adac-d746a869d579),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap019ea2f5-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.093 225859 DEBUG os_vif [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:2c:78,bridge_name='br-int',has_traffic_filtering=True,id=019ea2f5-1721-42e7-9c77-4fc1599f8101,network=Network(eef55bf0-ad6a-4f88-adac-d746a869d579),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap019ea2f5-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.094 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:2c:78 10.100.0.10'], port_security=['fa:16:3e:08:2c:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd236e7eb-2b7e-4031-b851-ae2790528213', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eef55bf0-ad6a-4f88-adac-d746a869d579', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '144d821b8f624db687f0e009c5e06d8b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4307ffb0-d161-4f09-96d3-b205cef524f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.208'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd338232-5bc6-43af-b249-74124e0134b1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=019ea2f5-1721-42e7-9c77-4fc1599f8101) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.096 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.098 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap019ea2f5-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.099 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.103 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:45:28 np0005588919 systemd[1]: var-lib-containers-storage-overlay-e6c68af6525807938ca92d3080ecb780abe59857167dc3a5929ab0345a12e53c-merged.mount: Deactivated successfully.
Jan 20 09:45:28 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69-userdata-shm.mount: Deactivated successfully.
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.107 225859 INFO os_vif [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:2c:78,bridge_name='br-int',has_traffic_filtering=True,id=019ea2f5-1721-42e7-9c77-4fc1599f8101,network=Network(eef55bf0-ad6a-4f88-adac-d746a869d579),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap019ea2f5-17')#033[00m
Jan 20 09:45:28 np0005588919 podman[259840]: 2026-01-20 14:45:28.117030644 +0000 UTC m=+0.096902216 container cleanup e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:45:28 np0005588919 systemd[1]: libpod-conmon-e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69.scope: Deactivated successfully.
Jan 20 09:45:28 np0005588919 podman[259860]: 2026-01-20 14:45:28.17481773 +0000 UTC m=+0.079597745 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.180 225859 DEBUG nova.compute.manager [req-6f42f92e-38d2-470e-90cb-7e3d224630a0 req-34cfbf53-c60d-40e1-8f51-a27ebd18c424 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-vif-unplugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.180 225859 DEBUG oslo_concurrency.lockutils [req-6f42f92e-38d2-470e-90cb-7e3d224630a0 req-34cfbf53-c60d-40e1-8f51-a27ebd18c424 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.181 225859 DEBUG oslo_concurrency.lockutils [req-6f42f92e-38d2-470e-90cb-7e3d224630a0 req-34cfbf53-c60d-40e1-8f51-a27ebd18c424 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.181 225859 DEBUG oslo_concurrency.lockutils [req-6f42f92e-38d2-470e-90cb-7e3d224630a0 req-34cfbf53-c60d-40e1-8f51-a27ebd18c424 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.181 225859 DEBUG nova.compute.manager [req-6f42f92e-38d2-470e-90cb-7e3d224630a0 req-34cfbf53-c60d-40e1-8f51-a27ebd18c424 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] No waiting events found dispatching network-vif-unplugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.181 225859 DEBUG nova.compute.manager [req-6f42f92e-38d2-470e-90cb-7e3d224630a0 req-34cfbf53-c60d-40e1-8f51-a27ebd18c424 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-vif-unplugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:45:28 np0005588919 podman[259899]: 2026-01-20 14:45:28.186922503 +0000 UTC m=+0.045234562 container remove e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.191 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a0dd34c5-a525-4cb8-a116-e2099750caf9]: (4, ('Tue Jan 20 02:45:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579 (e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69)\ne01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69\nTue Jan 20 02:45:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579 (e01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69)\ne01522ec633af0a019b4440a7dd3965b02d373680203c840ebe0316866c9cf69\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.193 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2919de25-91b0-4dd4-8b35-522e6e909ff3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.194 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeef55bf0-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.195 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:28 np0005588919 kernel: tapeef55bf0-a0: left promiscuous mode
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.208 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.210 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bab3b0cc-9e66-42b2-94ef-92fa604d1707]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.227 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[92f7e8f5-3162-4913-8f3c-04a0dffc86de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.228 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1c44c6c6-4503-4598-bf00-59d56a322a71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.241 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6212af40-4166-4336-a67e-f68746b4ef84]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532797, 'reachable_time': 36404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259935, 'error': None, 'target': 'ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.244 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eef55bf0-ad6a-4f88-adac-d746a869d579 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.244 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe33c11-8142-464a-beb6-b4294b9c4ef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:28 np0005588919 systemd[1]: run-netns-ovnmeta\x2deef55bf0\x2dad6a\x2d4f88\x2dadac\x2dd746a869d579.mount: Deactivated successfully.
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.245 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 019ea2f5-1721-42e7-9c77-4fc1599f8101 in datapath eef55bf0-ad6a-4f88-adac-d746a869d579 unbound from our chassis#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.247 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eef55bf0-ad6a-4f88-adac-d746a869d579, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.247 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4556303e-a86b-4108-8362-cc08f636d4b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.248 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 019ea2f5-1721-42e7-9c77-4fc1599f8101 in datapath eef55bf0-ad6a-4f88-adac-d746a869d579 unbound from our chassis#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.250 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eef55bf0-ad6a-4f88-adac-d746a869d579, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:45:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:28.250 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[617c3531-d293-4e08-a112-d08b95f0469a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.307 225859 INFO nova.virt.libvirt.driver [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Deleting instance files /var/lib/nova/instances/d236e7eb-2b7e-4031-b851-ae2790528213_del#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.307 225859 INFO nova.virt.libvirt.driver [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Deletion of /var/lib/nova/instances/d236e7eb-2b7e-4031-b851-ae2790528213_del complete#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.375 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.383 225859 INFO nova.compute.manager [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Took 0.56 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.383 225859 DEBUG oslo.service.loopingcall [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.384 225859 DEBUG nova.compute.manager [-] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.384 225859 DEBUG nova.network.neutron [-] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:45:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:28.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.643 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.644 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.644 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:45:28 np0005588919 nova_compute[225855]: 2026-01-20 14:45:28.645 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6586bc3e-3a94-4d22-8e8c-713a86a956fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:29.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:29 np0005588919 nova_compute[225855]: 2026-01-20 14:45:29.713 225859 DEBUG nova.network.neutron [-] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:29 np0005588919 nova_compute[225855]: 2026-01-20 14:45:29.750 225859 INFO nova.compute.manager [-] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Took 1.37 seconds to deallocate network for instance.#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.022 225859 INFO nova.compute.manager [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Took 0.27 seconds to detach 1 volumes for instance.#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.024 225859 DEBUG nova.compute.manager [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Deleting volume: 2441d1fb-fc23-4a6d-b88d-4d82b035b65f _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.090 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updating instance_info_cache with network_info: [{"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.113 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.113 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.114 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.114 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.209 225859 DEBUG oslo_concurrency.lockutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.209 225859 DEBUG oslo_concurrency.lockutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.275 225859 DEBUG nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.276 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.277 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.277 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.277 225859 DEBUG nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] No waiting events found dispatching network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.278 225859 WARNING nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received unexpected event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.278 225859 DEBUG nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.278 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.279 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.279 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.279 225859 DEBUG nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] No waiting events found dispatching network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.279 225859 WARNING nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received unexpected event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.280 225859 DEBUG nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.280 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.280 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.281 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.281 225859 DEBUG nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] No waiting events found dispatching network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.281 225859 WARNING nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received unexpected event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.281 225859 DEBUG nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.282 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.282 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.282 225859 DEBUG oslo_concurrency.lockutils [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.283 225859 DEBUG nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] No waiting events found dispatching network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.283 225859 WARNING nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received unexpected event network-vif-plugged-019ea2f5-1721-42e7-9c77-4fc1599f8101 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.283 225859 DEBUG nova.compute.manager [req-282c4f80-0292-4e02-8ff6-d44e45a351c6 req-e48e2abf-6fdd-4742-9e27-294927ef0d31 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Received event network-vif-deleted-019ea2f5-1721-42e7-9c77-4fc1599f8101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.307 225859 DEBUG oslo_concurrency.processutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:30.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.568 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:45:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1005838163' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.744 225859 DEBUG oslo_concurrency.processutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.749 225859 DEBUG nova.compute.provider_tree [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:45:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:45:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1978237855' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:45:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:45:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1978237855' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.806 225859 DEBUG nova.scheduler.client.report [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.828 225859 DEBUG oslo_concurrency.lockutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.860 225859 INFO nova.scheduler.client.report [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Deleted allocations for instance d236e7eb-2b7e-4031-b851-ae2790528213#033[00m
Jan 20 09:45:30 np0005588919 nova_compute[225855]: 2026-01-20 14:45:30.937 225859 DEBUG oslo_concurrency.lockutils [None req-bcb4d055-d85b-43f3-970b-703623f4eb3b 9051b1fd0e0b40c2be07afc6da803903 144d821b8f624db687f0e009c5e06d8b - - default default] Lock "d236e7eb-2b7e-4031-b851-ae2790528213" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:31.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:32 np0005588919 nova_compute[225855]: 2026-01-20 14:45:32.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:32 np0005588919 nova_compute[225855]: 2026-01-20 14:45:32.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:32 np0005588919 nova_compute[225855]: 2026-01-20 14:45:32.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:45:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:32.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:33 np0005588919 nova_compute[225855]: 2026-01-20 14:45:33.100 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:33.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:34.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:35 np0005588919 nova_compute[225855]: 2026-01-20 14:45:35.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:35.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:35 np0005588919 nova_compute[225855]: 2026-01-20 14:45:35.602 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:35Z|00325|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.041 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.366 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.366 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.366 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.367 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.367 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e233 e233: 3 total, 3 up, 3 in
Jan 20 09:45:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:36.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e233 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:45:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2360956372' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.841 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.901 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000057 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:45:36 np0005588919 nova_compute[225855]: 2026-01-20 14:45:36.902 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000057 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:45:37 np0005588919 nova_compute[225855]: 2026-01-20 14:45:37.051 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:45:37 np0005588919 nova_compute[225855]: 2026-01-20 14:45:37.052 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4352MB free_disk=20.8660888671875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:45:37 np0005588919 nova_compute[225855]: 2026-01-20 14:45:37.053 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:37 np0005588919 nova_compute[225855]: 2026-01-20 14:45:37.053 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:37 np0005588919 nova_compute[225855]: 2026-01-20 14:45:37.347 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 6586bc3e-3a94-4d22-8e8c-713a86a956fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:45:37 np0005588919 nova_compute[225855]: 2026-01-20 14:45:37.347 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:45:37 np0005588919 nova_compute[225855]: 2026-01-20 14:45:37.348 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:45:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:37.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:37 np0005588919 nova_compute[225855]: 2026-01-20 14:45:37.398 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e234 e234: 3 total, 3 up, 3 in
Jan 20 09:45:38 np0005588919 podman[260005]: 2026-01-20 14:45:38.001151885 +0000 UTC m=+0.049483442 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:45:38 np0005588919 nova_compute[225855]: 2026-01-20 14:45:38.103 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:45:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3888261964' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:45:38 np0005588919 nova_compute[225855]: 2026-01-20 14:45:38.290 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.892s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:38 np0005588919 nova_compute[225855]: 2026-01-20 14:45:38.295 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:45:38 np0005588919 nova_compute[225855]: 2026-01-20 14:45:38.316 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:45:38 np0005588919 nova_compute[225855]: 2026-01-20 14:45:38.338 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:45:38 np0005588919 nova_compute[225855]: 2026-01-20 14:45:38.339 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:38.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:39 np0005588919 nova_compute[225855]: 2026-01-20 14:45:39.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:45:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:39.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e235 e235: 3 total, 3 up, 3 in
Jan 20 09:45:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:40.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:40 np0005588919 nova_compute[225855]: 2026-01-20 14:45:40.602 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:41.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.183 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.184 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.219 225859 DEBUG nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.304 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.304 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.311 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.312 225859 INFO nova.compute.claims [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:45:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:42.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.493 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:45:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1623992898' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.906 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.913 225859 DEBUG nova.compute.provider_tree [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.926 225859 DEBUG nova.scheduler.client.report [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.942 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.943 225859 DEBUG nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.991 225859 DEBUG nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:45:42 np0005588919 nova_compute[225855]: 2026-01-20 14:45:42.992 225859 DEBUG nova.network.neutron [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.024 225859 INFO nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.049 225859 DEBUG nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.062 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920328.0617414, d236e7eb-2b7e-4031-b851-ae2790528213 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.063 225859 INFO nova.compute.manager [-] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.090 225859 DEBUG nova.compute.manager [None req-fb0885cb-eb63-46a5-989c-dc6aacdaabd9 - - - - - -] [instance: d236e7eb-2b7e-4031-b851-ae2790528213] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.106 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.181 225859 DEBUG nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.182 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.182 225859 INFO nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Creating image(s)#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.205 225859 DEBUG nova.storage.rbd_utils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.231 225859 DEBUG nova.storage.rbd_utils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.259 225859 DEBUG nova.storage.rbd_utils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.262 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.298 225859 DEBUG nova.policy [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '869086208e10436c9dc96c78bee9a85d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b683fcc0026242e28ba6d8fba638688e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.355 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.356 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.356 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.356 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.380 225859 DEBUG nova.storage.rbd_utils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:45:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:43.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:45:43 np0005588919 nova_compute[225855]: 2026-01-20 14:45:43.384 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:44.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:44 np0005588919 nova_compute[225855]: 2026-01-20 14:45:44.595 225859 DEBUG nova.network.neutron [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Successfully created port: ec1a7a25-a60a-40c3-98bf-710c68019b24 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.073 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.688s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.136 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.189 225859 DEBUG nova.storage.rbd_utils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] resizing rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.319 225859 DEBUG nova.objects.instance [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'migration_context' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.336 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.337 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Ensure instance console log exists: /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.338 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.338 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.338 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:45.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.547 225859 DEBUG nova.network.neutron [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Successfully updated port: ec1a7a25-a60a-40c3-98bf-710c68019b24 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.569 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "refresh_cache-5f56b3e9-af2b-4934-8184-6257994c6b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.570 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquired lock "refresh_cache-5f56b3e9-af2b-4934-8184-6257994c6b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.570 225859 DEBUG nova.network.neutron [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.604 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.729 225859 DEBUG nova.compute.manager [req-f50ed7ab-228a-4843-980d-24e38a82dc08 req-7976d171-feb5-48ca-9297-8b660b552caa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-changed-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.730 225859 DEBUG nova.compute.manager [req-f50ed7ab-228a-4843-980d-24e38a82dc08 req-7976d171-feb5-48ca-9297-8b660b552caa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Refreshing instance network info cache due to event network-changed-ec1a7a25-a60a-40c3-98bf-710c68019b24. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.731 225859 DEBUG oslo_concurrency.lockutils [req-f50ed7ab-228a-4843-980d-24e38a82dc08 req-7976d171-feb5-48ca-9297-8b660b552caa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-5f56b3e9-af2b-4934-8184-6257994c6b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:45:45 np0005588919 nova_compute[225855]: 2026-01-20 14:45:45.869 225859 DEBUG nova.network.neutron [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:45:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:46.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.238 225859 DEBUG nova.network.neutron [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Updating instance_info_cache with network_info: [{"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.262 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Releasing lock "refresh_cache-5f56b3e9-af2b-4934-8184-6257994c6b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.263 225859 DEBUG nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Instance network_info: |[{"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.263 225859 DEBUG oslo_concurrency.lockutils [req-f50ed7ab-228a-4843-980d-24e38a82dc08 req-7976d171-feb5-48ca-9297-8b660b552caa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-5f56b3e9-af2b-4934-8184-6257994c6b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.264 225859 DEBUG nova.network.neutron [req-f50ed7ab-228a-4843-980d-24e38a82dc08 req-7976d171-feb5-48ca-9297-8b660b552caa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Refreshing network info cache for port ec1a7a25-a60a-40c3-98bf-710c68019b24 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.269 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Start _get_guest_xml network_info=[{"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.275 225859 WARNING nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.283 225859 DEBUG nova.virt.libvirt.host [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.284 225859 DEBUG nova.virt.libvirt.host [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.289 225859 DEBUG nova.virt.libvirt.host [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.290 225859 DEBUG nova.virt.libvirt.host [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.291 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.291 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.291 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.291 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.292 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.292 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.292 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.292 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.292 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.292 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.293 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.293 225859 DEBUG nova.virt.hardware [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.295 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:47.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:45:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3346481726' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.772 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.799 225859 DEBUG nova.storage.rbd_utils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:47 np0005588919 nova_compute[225855]: 2026-01-20 14:45:47.803 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.109 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:45:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/703686633' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:45:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:48.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e236 e236: 3 total, 3 up, 3 in
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.538 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.735s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.540 225859 DEBUG nova.virt.libvirt.vif [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:45:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1159529074',display_name='tempest-tempest.common.compute-instance-1159529074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1159529074',id=95,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPQdfZ1kte1uwABu8Yt6GDBCbfLz/7DDRluLfcLvQPu0p1+5pmkgwEfXHuElB+zR6CnPfotcAwIhafKnCBnLfMP+a/6KQqDXsYrSlHbKZFIppFb1eVRM7RProDBMZxNI4Q==',key_name='tempest-keypair-1773413892',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-0vpr4wvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:45:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=5f56b3e9-af2b-4934-8184-6257994c6b6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.541 225859 DEBUG nova.network.os_vif_util [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.541 225859 DEBUG nova.network.os_vif_util [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.543 225859 DEBUG nova.objects.instance [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.557 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <uuid>5f56b3e9-af2b-4934-8184-6257994c6b6a</uuid>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <name>instance-0000005f</name>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <nova:name>tempest-tempest.common.compute-instance-1159529074</nova:name>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:45:47</nova:creationTime>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <nova:user uuid="869086208e10436c9dc96c78bee9a85d">tempest-ServerActionsTestOtherA-967087071-project-member</nova:user>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <nova:project uuid="b683fcc0026242e28ba6d8fba638688e">tempest-ServerActionsTestOtherA-967087071</nova:project>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <nova:port uuid="ec1a7a25-a60a-40c3-98bf-710c68019b24">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <entry name="serial">5f56b3e9-af2b-4934-8184-6257994c6b6a</entry>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <entry name="uuid">5f56b3e9-af2b-4934-8184-6257994c6b6a</entry>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/5f56b3e9-af2b-4934-8184-6257994c6b6a_disk">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:76:c2:c8"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <target dev="tapec1a7a25-a6"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/console.log" append="off"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:45:48 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:45:48 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:45:48 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:45:48 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.558 225859 DEBUG nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Preparing to wait for external event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.559 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.559 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.559 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.560 225859 DEBUG nova.virt.libvirt.vif [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:45:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1159529074',display_name='tempest-tempest.common.compute-instance-1159529074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1159529074',id=95,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPQdfZ1kte1uwABu8Yt6GDBCbfLz/7DDRluLfcLvQPu0p1+5pmkgwEfXHuElB+zR6CnPfotcAwIhafKnCBnLfMP+a/6KQqDXsYrSlHbKZFIppFb1eVRM7RProDBMZxNI4Q==',key_name='tempest-keypair-1773413892',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-0vpr4wvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:45:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=5f56b3e9-af2b-4934-8184-6257994c6b6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.560 225859 DEBUG nova.network.os_vif_util [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.561 225859 DEBUG nova.network.os_vif_util [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.561 225859 DEBUG os_vif [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.562 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.562 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.562 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.565 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.566 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec1a7a25-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.566 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapec1a7a25-a6, col_values=(('external_ids', {'iface-id': 'ec1a7a25-a60a-40c3-98bf-710c68019b24', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:c2:c8', 'vm-uuid': '5f56b3e9-af2b-4934-8184-6257994c6b6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.568 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:48 np0005588919 NetworkManager[49104]: <info>  [1768920348.5695] manager: (tapec1a7a25-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.571 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.577 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.577 225859 INFO os_vif [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6')#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.679 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.679 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.679 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No VIF found with MAC fa:16:3e:76:c2:c8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.680 225859 INFO nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Using config drive#033[00m
Jan 20 09:45:48 np0005588919 nova_compute[225855]: 2026-01-20 14:45:48.751 225859 DEBUG nova.storage.rbd_utils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:49 np0005588919 nova_compute[225855]: 2026-01-20 14:45:49.066 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:49.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:49 np0005588919 nova_compute[225855]: 2026-01-20 14:45:49.417 225859 INFO nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Creating config drive at /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config#033[00m
Jan 20 09:45:49 np0005588919 nova_compute[225855]: 2026-01-20 14:45:49.423 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1zjenn5r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:49 np0005588919 nova_compute[225855]: 2026-01-20 14:45:49.457 225859 DEBUG nova.network.neutron [req-f50ed7ab-228a-4843-980d-24e38a82dc08 req-7976d171-feb5-48ca-9297-8b660b552caa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Updated VIF entry in instance network info cache for port ec1a7a25-a60a-40c3-98bf-710c68019b24. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:45:49 np0005588919 nova_compute[225855]: 2026-01-20 14:45:49.458 225859 DEBUG nova.network.neutron [req-f50ed7ab-228a-4843-980d-24e38a82dc08 req-7976d171-feb5-48ca-9297-8b660b552caa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Updating instance_info_cache with network_info: [{"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:49 np0005588919 nova_compute[225855]: 2026-01-20 14:45:49.472 225859 DEBUG oslo_concurrency.lockutils [req-f50ed7ab-228a-4843-980d-24e38a82dc08 req-7976d171-feb5-48ca-9297-8b660b552caa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-5f56b3e9-af2b-4934-8184-6257994c6b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:45:49 np0005588919 nova_compute[225855]: 2026-01-20 14:45:49.575 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1zjenn5r" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:49 np0005588919 nova_compute[225855]: 2026-01-20 14:45:49.794 225859 DEBUG nova.storage.rbd_utils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:45:49 np0005588919 nova_compute[225855]: 2026-01-20 14:45:49.800 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.355 225859 DEBUG oslo_concurrency.processutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.555s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.357 225859 INFO nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Deleting local config drive /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config because it was imported into RBD.#033[00m
Jan 20 09:45:50 np0005588919 kernel: tapec1a7a25-a6: entered promiscuous mode
Jan 20 09:45:50 np0005588919 NetworkManager[49104]: <info>  [1768920350.4074] manager: (tapec1a7a25-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/141)
Jan 20 09:45:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:50Z|00326|binding|INFO|Claiming lport ec1a7a25-a60a-40c3-98bf-710c68019b24 for this chassis.
Jan 20 09:45:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:50Z|00327|binding|INFO|ec1a7a25-a60a-40c3-98bf-710c68019b24: Claiming fa:16:3e:76:c2:c8 10.100.0.5
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.411 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.418 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:c2:c8 10.100.0.5'], port_security=['fa:16:3e:76:c2:c8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5f56b3e9-af2b-4934-8184-6257994c6b6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '357f03e7-2463-4315-adee-f60d9b0f5500', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=ec1a7a25-a60a-40c3-98bf-710c68019b24) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.420 140354 INFO neutron.agent.ovn.metadata.agent [-] Port ec1a7a25-a60a-40c3-98bf-710c68019b24 in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 bound to our chassis#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.422 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.440 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0a1642-f664-4186-b587-1098143f1243]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:50 np0005588919 systemd-udevd[260406]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:45:50 np0005588919 systemd-machined[194361]: New machine qemu-38-instance-0000005f.
Jan 20 09:45:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:50Z|00328|binding|INFO|Setting lport ec1a7a25-a60a-40c3-98bf-710c68019b24 ovn-installed in OVS
Jan 20 09:45:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:45:50Z|00329|binding|INFO|Setting lport ec1a7a25-a60a-40c3-98bf-710c68019b24 up in Southbound
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.450 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.452 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:50 np0005588919 NetworkManager[49104]: <info>  [1768920350.4535] device (tapec1a7a25-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:45:50 np0005588919 NetworkManager[49104]: <info>  [1768920350.4540] device (tapec1a7a25-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:45:50 np0005588919 systemd[1]: Started Virtual Machine qemu-38-instance-0000005f.
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.474 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d329292b-960d-4c1a-8d63-4f06b98aedd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.478 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[2a91d27c-7bff-48fe-bb04-3ba82ae24fd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:50.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.511 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f32cbae3-ad30-4ee2-9b21-bb05affed1cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.527 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[29b2ea06-5166-4f5a-9484-8024fd5d2140]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529090, 'reachable_time': 16021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260419, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.543 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5017b386-8e67-4366-b6be-b876088ccac3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529100, 'tstamp': 529100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260420, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529104, 'tstamp': 529104}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260420, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.545 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.547 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.548 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.550 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.550 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.550 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.551 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.606 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.729 225859 DEBUG nova.compute.manager [req-ec100eb5-649d-4ab4-a314-b2b439d6effc req-dbe6206b-dc54-4661-9609-aab0a76a27ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.729 225859 DEBUG oslo_concurrency.lockutils [req-ec100eb5-649d-4ab4-a314-b2b439d6effc req-dbe6206b-dc54-4661-9609-aab0a76a27ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.729 225859 DEBUG oslo_concurrency.lockutils [req-ec100eb5-649d-4ab4-a314-b2b439d6effc req-dbe6206b-dc54-4661-9609-aab0a76a27ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.730 225859 DEBUG oslo_concurrency.lockutils [req-ec100eb5-649d-4ab4-a314-b2b439d6effc req-dbe6206b-dc54-4661-9609-aab0a76a27ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.730 225859 DEBUG nova.compute.manager [req-ec100eb5-649d-4ab4-a314-b2b439d6effc req-dbe6206b-dc54-4661-9609-aab0a76a27ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Processing event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.826 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:45:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:50.827 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:45:50 np0005588919 nova_compute[225855]: 2026-01-20 14:45:50.872 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:51.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.765 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920351.7653728, 5f56b3e9-af2b-4934-8184-6257994c6b6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.766 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] VM Started (Lifecycle Event)#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.768 225859 DEBUG nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.771 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.774 225859 INFO nova.virt.libvirt.driver [-] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Instance spawned successfully.#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.774 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.795 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.798 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.805 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.806 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.806 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.807 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.807 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.808 225859 DEBUG nova.virt.libvirt.driver [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.830 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.830 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920351.7662647, 5f56b3e9-af2b-4934-8184-6257994c6b6a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.831 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.859 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.863 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920351.7707226, 5f56b3e9-af2b-4934-8184-6257994c6b6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.863 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.867 225859 INFO nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Took 8.69 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.867 225859 DEBUG nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.894 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.897 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.932 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.946 225859 INFO nova.compute.manager [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Took 9.67 seconds to build instance.#033[00m
Jan 20 09:45:51 np0005588919 nova_compute[225855]: 2026-01-20 14:45:51.962 225859 DEBUG oslo_concurrency.lockutils [None req-370cb0b5-817d-4c64-98bf-e48aa665a7a1 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:52.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:45:52.830 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:45:52 np0005588919 nova_compute[225855]: 2026-01-20 14:45:52.857 225859 DEBUG nova.compute.manager [req-8db341f1-a817-4c47-bb49-1f468bf9deef req-1089f9b1-43a0-4700-9ff6-8659f6bab3ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:52 np0005588919 nova_compute[225855]: 2026-01-20 14:45:52.858 225859 DEBUG oslo_concurrency.lockutils [req-8db341f1-a817-4c47-bb49-1f468bf9deef req-1089f9b1-43a0-4700-9ff6-8659f6bab3ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:45:52 np0005588919 nova_compute[225855]: 2026-01-20 14:45:52.858 225859 DEBUG oslo_concurrency.lockutils [req-8db341f1-a817-4c47-bb49-1f468bf9deef req-1089f9b1-43a0-4700-9ff6-8659f6bab3ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:45:52 np0005588919 nova_compute[225855]: 2026-01-20 14:45:52.859 225859 DEBUG oslo_concurrency.lockutils [req-8db341f1-a817-4c47-bb49-1f468bf9deef req-1089f9b1-43a0-4700-9ff6-8659f6bab3ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:45:52 np0005588919 nova_compute[225855]: 2026-01-20 14:45:52.859 225859 DEBUG nova.compute.manager [req-8db341f1-a817-4c47-bb49-1f468bf9deef req-1089f9b1-43a0-4700-9ff6-8659f6bab3ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] No waiting events found dispatching network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:45:52 np0005588919 nova_compute[225855]: 2026-01-20 14:45:52.859 225859 WARNING nova.compute.manager [req-8db341f1-a817-4c47-bb49-1f468bf9deef req-1089f9b1-43a0-4700-9ff6-8659f6bab3ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received unexpected event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:45:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:45:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:53.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:45:53 np0005588919 nova_compute[225855]: 2026-01-20 14:45:53.569 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:53 np0005588919 nova_compute[225855]: 2026-01-20 14:45:53.891 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:54.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e237 e237: 3 total, 3 up, 3 in
Jan 20 09:45:54 np0005588919 nova_compute[225855]: 2026-01-20 14:45:54.940 225859 DEBUG nova.compute.manager [req-3de9020a-2f72-497b-8787-b9c0a8706f26 req-0ecd794d-3802-42fe-9b7c-80d54016406b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-changed-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:45:54 np0005588919 nova_compute[225855]: 2026-01-20 14:45:54.941 225859 DEBUG nova.compute.manager [req-3de9020a-2f72-497b-8787-b9c0a8706f26 req-0ecd794d-3802-42fe-9b7c-80d54016406b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Refreshing instance network info cache due to event network-changed-ec1a7a25-a60a-40c3-98bf-710c68019b24. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:45:54 np0005588919 nova_compute[225855]: 2026-01-20 14:45:54.941 225859 DEBUG oslo_concurrency.lockutils [req-3de9020a-2f72-497b-8787-b9c0a8706f26 req-0ecd794d-3802-42fe-9b7c-80d54016406b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-5f56b3e9-af2b-4934-8184-6257994c6b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:45:54 np0005588919 nova_compute[225855]: 2026-01-20 14:45:54.942 225859 DEBUG oslo_concurrency.lockutils [req-3de9020a-2f72-497b-8787-b9c0a8706f26 req-0ecd794d-3802-42fe-9b7c-80d54016406b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-5f56b3e9-af2b-4934-8184-6257994c6b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:45:54 np0005588919 nova_compute[225855]: 2026-01-20 14:45:54.942 225859 DEBUG nova.network.neutron [req-3de9020a-2f72-497b-8787-b9c0a8706f26 req-0ecd794d-3802-42fe-9b7c-80d54016406b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Refreshing network info cache for port ec1a7a25-a60a-40c3-98bf-710c68019b24 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:45:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:55.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:55 np0005588919 nova_compute[225855]: 2026-01-20 14:45:55.609 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:56.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:45:56 np0005588919 nova_compute[225855]: 2026-01-20 14:45:56.642 225859 DEBUG nova.network.neutron [req-3de9020a-2f72-497b-8787-b9c0a8706f26 req-0ecd794d-3802-42fe-9b7c-80d54016406b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Updated VIF entry in instance network info cache for port ec1a7a25-a60a-40c3-98bf-710c68019b24. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:45:56 np0005588919 nova_compute[225855]: 2026-01-20 14:45:56.644 225859 DEBUG nova.network.neutron [req-3de9020a-2f72-497b-8787-b9c0a8706f26 req-0ecd794d-3802-42fe-9b7c-80d54016406b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Updating instance_info_cache with network_info: [{"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:45:56 np0005588919 nova_compute[225855]: 2026-01-20 14:45:56.668 225859 DEBUG oslo_concurrency.lockutils [req-3de9020a-2f72-497b-8787-b9c0a8706f26 req-0ecd794d-3802-42fe-9b7c-80d54016406b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-5f56b3e9-af2b-4934-8184-6257994c6b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:45:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:45:57 np0005588919 nova_compute[225855]: 2026-01-20 14:45:57.163 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:57.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:45:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:45:58.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:45:58 np0005588919 nova_compute[225855]: 2026-01-20 14:45:58.571 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:45:59 np0005588919 podman[260467]: 2026-01-20 14:45:59.041807297 +0000 UTC m=+0.086123650 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:45:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:45:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:45:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:45:59.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:00.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:00 np0005588919 nova_compute[225855]: 2026-01-20 14:46:00.611 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:01.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:02.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:03.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:03 np0005588919 nova_compute[225855]: 2026-01-20 14:46:03.574 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 e238: 3 total, 3 up, 3 in
Jan 20 09:46:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:04.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:05.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:05 np0005588919 nova_compute[225855]: 2026-01-20 14:46:05.613 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:06.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:07.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:08Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:76:c2:c8 10.100.0.5
Jan 20 09:46:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:08Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:76:c2:c8 10.100.0.5
Jan 20 09:46:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:08.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:08 np0005588919 nova_compute[225855]: 2026-01-20 14:46:08.612 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:09 np0005588919 podman[260549]: 2026-01-20 14:46:09.019970832 +0000 UTC m=+0.054188755 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:46:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:09.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:10.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:10 np0005588919 nova_compute[225855]: 2026-01-20 14:46:10.615 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:11.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:12.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:13.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:13 np0005588919 nova_compute[225855]: 2026-01-20 14:46:13.614 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:46:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/514796829' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:46:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:46:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/514796829' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:46:14 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:46:14 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:46:14 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:46:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:14.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:15.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:15 np0005588919 nova_compute[225855]: 2026-01-20 14:46:15.617 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:16.407 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:16.407 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:16.408 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:16.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:17.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:18.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:18 np0005588919 nova_compute[225855]: 2026-01-20 14:46:18.662 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:19.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:20.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:20 np0005588919 nova_compute[225855]: 2026-01-20 14:46:20.618 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:21.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:22.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:23.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:23 np0005588919 nova_compute[225855]: 2026-01-20 14:46:23.664 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:46:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:46:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:24.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:25.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:25 np0005588919 nova_compute[225855]: 2026-01-20 14:46:25.621 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:26.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:46:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:27.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.069 225859 DEBUG oslo_concurrency.lockutils [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.070 225859 DEBUG oslo_concurrency.lockutils [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.086 225859 DEBUG nova.objects.instance [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'flavor' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.131 225859 DEBUG oslo_concurrency.lockutils [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.473 225859 DEBUG oslo_concurrency.lockutils [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.473 225859 DEBUG oslo_concurrency.lockutils [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.474 225859 INFO nova.compute.manager [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Attaching volume aab03340-0e79-412e-a963-e216832603c4 to /dev/vdb#033[00m
Jan 20 09:46:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:28.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.648 225859 DEBUG os_brick.utils [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.649 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.660 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.660 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[4d63a53d-89f3-4042-a043-58f5856c2b4d]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.662 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.710 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.710 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.711 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[1df69b61-a6be-4ce5-a73c-2ceb97627750]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.713 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.720 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.720 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[cb653a98-6d80-4cbd-950c-76a3ab70d14f]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.721 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[6e80d0d1-12a5-4a8c-ab48-906a5f64a643]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.722 225859 DEBUG oslo_concurrency.processutils [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.745 225859 DEBUG oslo_concurrency.processutils [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "nvme version" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.748 225859 DEBUG os_brick.initiator.connectors.lightos [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.748 225859 DEBUG os_brick.initiator.connectors.lightos [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.748 225859 DEBUG os_brick.initiator.connectors.lightos [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.749 225859 DEBUG os_brick.utils [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] <== get_connector_properties: return (99ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:46:28 np0005588919 nova_compute[225855]: 2026-01-20 14:46:28.749 225859 DEBUG nova.virt.block_device [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Updating existing volume attachment record: 4982b9c6-6475-4038-b1e1-764e3501e6c5 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:46:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:46:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:46:29 np0005588919 nova_compute[225855]: 2026-01-20 14:46:29.640 225859 DEBUG nova.objects.instance [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'flavor' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:29 np0005588919 nova_compute[225855]: 2026-01-20 14:46:29.661 225859 DEBUG nova.virt.libvirt.driver [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Attempting to attach volume aab03340-0e79-412e-a963-e216832603c4 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 09:46:29 np0005588919 nova_compute[225855]: 2026-01-20 14:46:29.664 225859 DEBUG nova.virt.libvirt.guest [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:46:29 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:46:29 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-aab03340-0e79-412e-a963-e216832603c4">
Jan 20 09:46:29 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:29 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:29 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:29 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:46:29 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 09:46:29 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:29 np0005588919 nova_compute[225855]:  </auth>
Jan 20 09:46:29 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:46:29 np0005588919 nova_compute[225855]:  <serial>aab03340-0e79-412e-a963-e216832603c4</serial>
Jan 20 09:46:29 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:46:29 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.025 225859 DEBUG nova.virt.libvirt.driver [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.026 225859 DEBUG nova.virt.libvirt.driver [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.027 225859 DEBUG nova.virt.libvirt.driver [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.027 225859 DEBUG nova.virt.libvirt.driver [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No VIF found with MAC fa:16:3e:76:c2:c8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:46:30 np0005588919 podman[260839]: 2026-01-20 14:46:30.038804147 +0000 UTC m=+0.081016035 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.223 225859 DEBUG oslo_concurrency.lockutils [None req-d2635026-981d-4a32-8767-4963aa4021cb 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:46:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:30.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.582 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.582 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.582 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.582 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6586bc3e-3a94-4d22-8e8c-713a86a956fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:30 np0005588919 nova_compute[225855]: 2026-01-20 14:46:30.623 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:31.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:32 np0005588919 nova_compute[225855]: 2026-01-20 14:46:32.275 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updating instance_info_cache with network_info: [{"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:46:32 np0005588919 nova_compute[225855]: 2026-01-20 14:46:32.296 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:46:32 np0005588919 nova_compute[225855]: 2026-01-20 14:46:32.296 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:46:32 np0005588919 nova_compute[225855]: 2026-01-20 14:46:32.297 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:32 np0005588919 nova_compute[225855]: 2026-01-20 14:46:32.297 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:32 np0005588919 nova_compute[225855]: 2026-01-20 14:46:32.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:32 np0005588919 nova_compute[225855]: 2026-01-20 14:46:32.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:46:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:32.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:32 np0005588919 nova_compute[225855]: 2026-01-20 14:46:32.838 225859 INFO nova.compute.manager [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Rebuilding instance#033[00m
Jan 20 09:46:33 np0005588919 nova_compute[225855]: 2026-01-20 14:46:33.076 225859 DEBUG nova.objects.instance [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:33 np0005588919 nova_compute[225855]: 2026-01-20 14:46:33.092 225859 DEBUG nova.compute.manager [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:33 np0005588919 nova_compute[225855]: 2026-01-20 14:46:33.165 225859 DEBUG nova.objects.instance [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_requests' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:33 np0005588919 nova_compute[225855]: 2026-01-20 14:46:33.180 225859 DEBUG nova.objects.instance [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:33 np0005588919 nova_compute[225855]: 2026-01-20 14:46:33.193 225859 DEBUG nova.objects.instance [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'resources' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:33 np0005588919 nova_compute[225855]: 2026-01-20 14:46:33.205 225859 DEBUG nova.objects.instance [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'migration_context' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:33 np0005588919 nova_compute[225855]: 2026-01-20 14:46:33.216 225859 DEBUG nova.objects.instance [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:46:33 np0005588919 nova_compute[225855]: 2026-01-20 14:46:33.220 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:46:33 np0005588919 nova_compute[225855]: 2026-01-20 14:46:33.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:33.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:33 np0005588919 nova_compute[225855]: 2026-01-20 14:46:33.712 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:34.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:35.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:35 np0005588919 kernel: tapec1a7a25-a6 (unregistering): left promiscuous mode
Jan 20 09:46:35 np0005588919 NetworkManager[49104]: <info>  [1768920395.5054] device (tapec1a7a25-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:46:35 np0005588919 nova_compute[225855]: 2026-01-20 14:46:35.518 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:35Z|00330|binding|INFO|Releasing lport ec1a7a25-a60a-40c3-98bf-710c68019b24 from this chassis (sb_readonly=0)
Jan 20 09:46:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:35Z|00331|binding|INFO|Setting lport ec1a7a25-a60a-40c3-98bf-710c68019b24 down in Southbound
Jan 20 09:46:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:35Z|00332|binding|INFO|Removing iface tapec1a7a25-a6 ovn-installed in OVS
Jan 20 09:46:35 np0005588919 nova_compute[225855]: 2026-01-20 14:46:35.521 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.527 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:c2:c8 10.100.0.5'], port_security=['fa:16:3e:76:c2:c8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5f56b3e9-af2b-4934-8184-6257994c6b6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '357f03e7-2463-4315-adee-f60d9b0f5500', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=ec1a7a25-a60a-40c3-98bf-710c68019b24) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.529 140354 INFO neutron.agent.ovn.metadata.agent [-] Port ec1a7a25-a60a-40c3-98bf-710c68019b24 in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 unbound from our chassis#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.532 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:46:35 np0005588919 nova_compute[225855]: 2026-01-20 14:46:35.538 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.549 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fb8d31ee-0b0c-4515-90f9-165c869f81cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:35 np0005588919 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Jan 20 09:46:35 np0005588919 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005f.scope: Consumed 14.016s CPU time.
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.584 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[2164f7d9-6783-4eb4-8674-67d0706f1ce0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:35 np0005588919 systemd-machined[194361]: Machine qemu-38-instance-0000005f terminated.
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.587 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8fa9f8-8076-4a45-88d3-6645f60443c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.621 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[15ded6df-d496-48ef-97c3-9f8b594fffc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:35 np0005588919 nova_compute[225855]: 2026-01-20 14:46:35.626 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.637 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d39d3ed6-86d3-421a-ad22-d722fbcf6327]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529090, 'reachable_time': 16021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260880, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.653 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6d7fc794-924e-4efc-9ba7-147111f6d149]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529100, 'tstamp': 529100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260881, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529104, 'tstamp': 529104}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260881, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.655 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:35 np0005588919 nova_compute[225855]: 2026-01-20 14:46:35.656 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:35 np0005588919 nova_compute[225855]: 2026-01-20 14:46:35.662 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.663 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.663 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.663 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:35.664 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:35 np0005588919 kernel: tapec1a7a25-a6: entered promiscuous mode
Jan 20 09:46:35 np0005588919 kernel: tapec1a7a25-a6 (unregistering): left promiscuous mode
Jan 20 09:46:35 np0005588919 NetworkManager[49104]: <info>  [1768920395.7588] manager: (tapec1a7a25-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/142)
Jan 20 09:46:35 np0005588919 nova_compute[225855]: 2026-01-20 14:46:35.803 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.234 225859 INFO nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.240 225859 INFO nova.virt.libvirt.driver [-] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Instance destroyed successfully.#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.424 225859 INFO nova.compute.manager [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Detaching volume aab03340-0e79-412e-a963-e216832603c4#033[00m
Jan 20 09:46:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:46:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:36.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.567 225859 INFO nova.virt.block_device [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Attempting to driver detach volume aab03340-0e79-412e-a963-e216832603c4 from mountpoint /dev/vdb#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.574 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Attempting to detach device vdb from instance 5f56b3e9-af2b-4934-8184-6257994c6b6a from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.575 225859 DEBUG nova.virt.libvirt.guest [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:46:36 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:46:36 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-aab03340-0e79-412e-a963-e216832603c4">
Jan 20 09:46:36 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:36 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:36 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:36 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:46:36 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:46:36 np0005588919 nova_compute[225855]:  <serial>aab03340-0e79-412e-a963-e216832603c4</serial>
Jan 20 09:46:36 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:46:36 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:46:36 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.590 225859 INFO nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully detached device vdb from instance 5f56b3e9-af2b-4934-8184-6257994c6b6a from the persistent domain config.#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.848 225859 INFO nova.virt.libvirt.driver [-] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Instance destroyed successfully.#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.850 225859 DEBUG nova.virt.libvirt.vif [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1159529074',display_name='tempest-ServerActionsTestOtherA-server-1358184153',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1159529074',id=95,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPQdfZ1kte1uwABu8Yt6GDBCbfLz/7DDRluLfcLvQPu0p1+5pmkgwEfXHuElB+zR6CnPfotcAwIhafKnCBnLfMP+a/6KQqDXsYrSlHbKZFIppFb1eVRM7RProDBMZxNI4Q==',key_name='tempest-keypair-1773413892',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-0vpr4wvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:46:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=5f56b3e9-af2b-4934-8184-6257994c6b6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.850 225859 DEBUG nova.network.os_vif_util [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.851 225859 DEBUG nova.network.os_vif_util [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.852 225859 DEBUG os_vif [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.854 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.855 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec1a7a25-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.910 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.913 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:46:36 np0005588919 nova_compute[225855]: 2026-01-20 14:46:36.916 225859 INFO os_vif [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6')#033[00m
Jan 20 09:46:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.260 225859 DEBUG nova.compute.manager [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-vif-unplugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.261 225859 DEBUG oslo_concurrency.lockutils [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.262 225859 DEBUG oslo_concurrency.lockutils [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.262 225859 DEBUG oslo_concurrency.lockutils [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.263 225859 DEBUG nova.compute.manager [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] No waiting events found dispatching network-vif-unplugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.264 225859 WARNING nova.compute.manager [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received unexpected event network-vif-unplugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.264 225859 DEBUG nova.compute.manager [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.265 225859 DEBUG oslo_concurrency.lockutils [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.265 225859 DEBUG oslo_concurrency.lockutils [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.266 225859 DEBUG oslo_concurrency.lockutils [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.267 225859 DEBUG nova.compute.manager [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] No waiting events found dispatching network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.267 225859 WARNING nova.compute.manager [req-bbbf1ebc-b7a9-44c6-9394-23afd4eb75dd req-78a539c9-855c-4428-9882-c1be1da3bf04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received unexpected event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.358 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.359 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.360 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.360 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.361 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:37.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:46:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4020167201' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.817 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.893 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000005f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.894 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000005f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.898 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000057 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:46:37 np0005588919 nova_compute[225855]: 2026-01-20 14:46:37.898 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000057 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.075 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.077 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4319MB free_disk=20.76354217529297GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.078 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.078 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.145 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 6586bc3e-3a94-4d22-8e8c-713a86a956fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.145 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 5f56b3e9-af2b-4934-8184-6257994c6b6a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.146 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.146 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.221 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:38.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:46:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/104493319' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.659 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.666 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.685 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.711 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:46:38 np0005588919 nova_compute[225855]: 2026-01-20 14:46:38.712 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:39.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:40 np0005588919 podman[260958]: 2026-01-20 14:46:40.008634594 +0000 UTC m=+0.060451353 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.025 225859 INFO nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Deleting instance files /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a_del#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.026 225859 INFO nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Deletion of /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a_del complete#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.227 225859 INFO nova.virt.block_device [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Booting with volume aab03340-0e79-412e-a963-e216832603c4 at /dev/vdb#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.392 225859 DEBUG os_brick.utils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.394 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.411 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.411 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[916320ea-6e43-4f02-937d-b6893d058bd9]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.413 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.427 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.427 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[7654e334-f246-4e08-848c-86dbdd302318]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.429 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.442 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.443 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[46a7fa98-095f-4686-a9ba-840a3a13a4a7]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.445 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[8b409b59-249d-44ef-8cbf-2b2cf9827abd]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.446 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.486 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "nvme version" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.491 225859 DEBUG os_brick.initiator.connectors.lightos [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.491 225859 DEBUG os_brick.initiator.connectors.lightos [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.492 225859 DEBUG os_brick.initiator.connectors.lightos [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.493 225859 DEBUG os_brick.utils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] <== get_connector_properties: return (99ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.495 225859 DEBUG nova.virt.block_device [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Updating existing volume attachment record: c9861cd5-6ebe-4ef9-bc30-869a54cc88ab _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:46:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:40.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:40 np0005588919 nova_compute[225855]: 2026-01-20 14:46:40.627 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:41.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.772 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.773 225859 INFO nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Creating image(s)#033[00m
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.819 225859 DEBUG nova.storage.rbd_utils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.864 225859 DEBUG nova.storage.rbd_utils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.903 225859 DEBUG nova.storage.rbd_utils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.908 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.941 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.996 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.997 225859 DEBUG oslo_concurrency.lockutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.997 225859 DEBUG oslo_concurrency.lockutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:41 np0005588919 nova_compute[225855]: 2026-01-20 14:46:41.998 225859 DEBUG oslo_concurrency.lockutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:42 np0005588919 nova_compute[225855]: 2026-01-20 14:46:42.021 225859 DEBUG nova.storage.rbd_utils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:46:42 np0005588919 nova_compute[225855]: 2026-01-20 14:46:42.024 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:42.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:42 np0005588919 nova_compute[225855]: 2026-01-20 14:46:42.902 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.878s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:42 np0005588919 nova_compute[225855]: 2026-01-20 14:46:42.971 225859 DEBUG nova.storage.rbd_utils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] resizing rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.170 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.171 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Ensure instance console log exists: /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.172 225859 DEBUG oslo_concurrency.lockutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.172 225859 DEBUG oslo_concurrency.lockutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.173 225859 DEBUG oslo_concurrency.lockutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.176 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Start _get_guest_xml network_info=[{"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-aab03340-0e79-412e-a963-e216832603c4', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'aab03340-0e79-412e-a963-e216832603c4', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '5f56b3e9-af2b-4934-8184-6257994c6b6a', 'attached_at': '', 'detached_at': '', 'volume_id': 'aab03340-0e79-412e-a963-e216832603c4', 'serial': 'aab03340-0e79-412e-a963-e216832603c4'}, 'guest_format': None, 'boot_index': None, 'mount_device': '/dev/vdb', 'attachment_id': 'c9861cd5-6ebe-4ef9-bc30-869a54cc88ab', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.180 225859 WARNING nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.185 225859 DEBUG nova.virt.libvirt.host [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.185 225859 DEBUG nova.virt.libvirt.host [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.188 225859 DEBUG nova.virt.libvirt.host [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.188 225859 DEBUG nova.virt.libvirt.host [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.190 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.190 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.190 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.191 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.191 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.191 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.192 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.192 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.192 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.192 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.193 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.193 225859 DEBUG nova.virt.hardware [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.193 225859 DEBUG nova.objects.instance [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.217 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:43.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2699107769' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.685 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.709 225859 DEBUG nova.storage.rbd_utils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:46:43 np0005588919 nova_compute[225855]: 2026-01-20 14:46:43.713 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:46:44 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3551339405' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.118 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.138 225859 DEBUG nova.virt.libvirt.vif [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:45:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1159529074',display_name='tempest-ServerActionsTestOtherA-server-1358184153',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1159529074',id=95,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPQdfZ1kte1uwABu8Yt6GDBCbfLz/7DDRluLfcLvQPu0p1+5pmkgwEfXHuElB+zR6CnPfotcAwIhafKnCBnLfMP+a/6KQqDXsYrSlHbKZFIppFb1eVRM7RProDBMZxNI4Q==',key_name='tempest-keypair-1773413892',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-0vpr4wvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:46:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=5f56b3e9-af2b-4934-8184-6257994c6b6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.139 225859 DEBUG nova.network.os_vif_util [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.139 225859 DEBUG nova.network.os_vif_util [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.142 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <uuid>5f56b3e9-af2b-4934-8184-6257994c6b6a</uuid>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <name>instance-0000005f</name>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestOtherA-server-1358184153</nova:name>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:46:43</nova:creationTime>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <nova:user uuid="869086208e10436c9dc96c78bee9a85d">tempest-ServerActionsTestOtherA-967087071-project-member</nova:user>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <nova:project uuid="b683fcc0026242e28ba6d8fba638688e">tempest-ServerActionsTestOtherA-967087071</nova:project>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="26699514-f465-4b50-98b7-36f2cfc6a308"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <nova:port uuid="ec1a7a25-a60a-40c3-98bf-710c68019b24">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <entry name="serial">5f56b3e9-af2b-4934-8184-6257994c6b6a</entry>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <entry name="uuid">5f56b3e9-af2b-4934-8184-6257994c6b6a</entry>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/5f56b3e9-af2b-4934-8184-6257994c6b6a_disk">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-aab03340-0e79-412e-a963-e216832603c4">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <target dev="vdb" bus="virtio"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <serial>aab03340-0e79-412e-a963-e216832603c4</serial>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:76:c2:c8"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <target dev="tapec1a7a25-a6"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/console.log" append="off"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:46:44 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:46:44 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:46:44 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:46:44 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.144 225859 DEBUG nova.virt.libvirt.vif [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:45:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1159529074',display_name='tempest-ServerActionsTestOtherA-server-1358184153',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1159529074',id=95,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPQdfZ1kte1uwABu8Yt6GDBCbfLz/7DDRluLfcLvQPu0p1+5pmkgwEfXHuElB+zR6CnPfotcAwIhafKnCBnLfMP+a/6KQqDXsYrSlHbKZFIppFb1eVRM7RProDBMZxNI4Q==',key_name='tempest-keypair-1773413892',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-0vpr4wvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:46:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=5f56b3e9-af2b-4934-8184-6257994c6b6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.145 225859 DEBUG nova.network.os_vif_util [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.146 225859 DEBUG nova.network.os_vif_util [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.147 225859 DEBUG os_vif [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.148 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.149 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.150 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.154 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.155 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec1a7a25-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.156 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapec1a7a25-a6, col_values=(('external_ids', {'iface-id': 'ec1a7a25-a60a-40c3-98bf-710c68019b24', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:c2:c8', 'vm-uuid': '5f56b3e9-af2b-4934-8184-6257994c6b6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:44 np0005588919 NetworkManager[49104]: <info>  [1768920404.1595] manager: (tapec1a7a25-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.161 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.163 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.163 225859 INFO os_vif [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6')#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.208 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.208 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.209 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.209 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No VIF found with MAC fa:16:3e:76:c2:c8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.209 225859 INFO nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Using config drive#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.233 225859 DEBUG nova.storage.rbd_utils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.253 225859 DEBUG nova.objects.instance [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'ec2_ids' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.346 225859 DEBUG nova.objects.instance [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'keypairs' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:44.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.707 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.852 225859 INFO nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Creating config drive at /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.858 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgshuq8rj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:44 np0005588919 nova_compute[225855]: 2026-01-20 14:46:44.987 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgshuq8rj" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.017 225859 DEBUG nova.storage.rbd_utils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.021 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.189 225859 DEBUG oslo_concurrency.processutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config 5f56b3e9-af2b-4934-8184-6257994c6b6a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.190 225859 INFO nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Deleting local config drive /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a/disk.config because it was imported into RBD.#033[00m
Jan 20 09:46:45 np0005588919 kernel: tapec1a7a25-a6: entered promiscuous mode
Jan 20 09:46:45 np0005588919 NetworkManager[49104]: <info>  [1768920405.2374] manager: (tapec1a7a25-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/144)
Jan 20 09:46:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:45Z|00333|binding|INFO|Claiming lport ec1a7a25-a60a-40c3-98bf-710c68019b24 for this chassis.
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.259 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:45Z|00334|binding|INFO|ec1a7a25-a60a-40c3-98bf-710c68019b24: Claiming fa:16:3e:76:c2:c8 10.100.0.5
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.269 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:c2:c8 10.100.0.5'], port_security=['fa:16:3e:76:c2:c8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5f56b3e9-af2b-4934-8184-6257994c6b6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '357f03e7-2463-4315-adee-f60d9b0f5500', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=ec1a7a25-a60a-40c3-98bf-710c68019b24) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.270 140354 INFO neutron.agent.ovn.metadata.agent [-] Port ec1a7a25-a60a-40c3-98bf-710c68019b24 in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 bound to our chassis#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.272 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:46:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:45Z|00335|binding|INFO|Setting lport ec1a7a25-a60a-40c3-98bf-710c68019b24 ovn-installed in OVS
Jan 20 09:46:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:45Z|00336|binding|INFO|Setting lport ec1a7a25-a60a-40c3-98bf-710c68019b24 up in Southbound
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.279 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.281 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.282 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.289 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a4732483-9c55-438e-8604-24f603c1e39e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:45 np0005588919 systemd-machined[194361]: New machine qemu-39-instance-0000005f.
Jan 20 09:46:45 np0005588919 systemd-udevd[261288]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:46:45 np0005588919 systemd[1]: Started Virtual Machine qemu-39-instance-0000005f.
Jan 20 09:46:45 np0005588919 NetworkManager[49104]: <info>  [1768920405.3079] device (tapec1a7a25-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:46:45 np0005588919 NetworkManager[49104]: <info>  [1768920405.3087] device (tapec1a7a25-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.334 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[2c91af9c-def7-4984-a3ae-8225dfbc873a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.338 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[30a32d5f-5c6e-4f40-b778-0d9ab720031f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.367 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c1adc412-aee2-473c-b83a-66d781cb2e3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.381 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9acded25-d45e-4d0b-bcc5-578887ee98ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529090, 'reachable_time': 16021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261300, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.398 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9ed6a7-13db-4d08-90d9-f849c9aa9454]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529100, 'tstamp': 529100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261302, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529104, 'tstamp': 529104}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261302, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.400 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.401 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.403 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.403 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.403 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.404 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:45.404 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:45.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.629 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.856 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 5f56b3e9-af2b-4934-8184-6257994c6b6a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.856 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920405.855787, 5f56b3e9-af2b-4934-8184-6257994c6b6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.857 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.859 225859 DEBUG nova.compute.manager [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.859 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.862 225859 INFO nova.virt.libvirt.driver [-] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Instance spawned successfully.#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.863 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.943 225859 DEBUG nova.compute.manager [req-64f0176b-9305-4a43-a8ed-ca45fee2da2a req-1e594b73-675e-4012-9322-6a35672bd7ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.944 225859 DEBUG oslo_concurrency.lockutils [req-64f0176b-9305-4a43-a8ed-ca45fee2da2a req-1e594b73-675e-4012-9322-6a35672bd7ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.944 225859 DEBUG oslo_concurrency.lockutils [req-64f0176b-9305-4a43-a8ed-ca45fee2da2a req-1e594b73-675e-4012-9322-6a35672bd7ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.944 225859 DEBUG oslo_concurrency.lockutils [req-64f0176b-9305-4a43-a8ed-ca45fee2da2a req-1e594b73-675e-4012-9322-6a35672bd7ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.944 225859 DEBUG nova.compute.manager [req-64f0176b-9305-4a43-a8ed-ca45fee2da2a req-1e594b73-675e-4012-9322-6a35672bd7ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] No waiting events found dispatching network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.944 225859 WARNING nova.compute.manager [req-64f0176b-9305-4a43-a8ed-ca45fee2da2a req-1e594b73-675e-4012-9322-6a35672bd7ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received unexpected event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.967 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.971 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.971 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.971 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.972 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.972 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.972 225859 DEBUG nova.virt.libvirt.driver [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:46:45 np0005588919 nova_compute[225855]: 2026-01-20 14:46:45.976 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.010 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.011 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920405.856396, 5f56b3e9-af2b-4934-8184-6257994c6b6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.011 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] VM Started (Lifecycle Event)#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.032 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.035 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.040 225859 DEBUG nova.compute.manager [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.051 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.100 225859 DEBUG oslo_concurrency.lockutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.100 225859 DEBUG oslo_concurrency.lockutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.100 225859 DEBUG nova.objects.instance [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:46:46 np0005588919 nova_compute[225855]: 2026-01-20 14:46:46.207 225859 DEBUG oslo_concurrency.lockutils [None req-b4c774ae-1e22-4f05-b320-7417a7a22674 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:46.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:47.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:48 np0005588919 nova_compute[225855]: 2026-01-20 14:46:48.414 225859 DEBUG nova.compute.manager [req-e3219eea-fff0-4a70-bbfc-1609e1e36460 req-4a87b953-a057-40aa-abe8-bdda71cd5687 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:48 np0005588919 nova_compute[225855]: 2026-01-20 14:46:48.415 225859 DEBUG oslo_concurrency.lockutils [req-e3219eea-fff0-4a70-bbfc-1609e1e36460 req-4a87b953-a057-40aa-abe8-bdda71cd5687 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:48 np0005588919 nova_compute[225855]: 2026-01-20 14:46:48.416 225859 DEBUG oslo_concurrency.lockutils [req-e3219eea-fff0-4a70-bbfc-1609e1e36460 req-4a87b953-a057-40aa-abe8-bdda71cd5687 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:48 np0005588919 nova_compute[225855]: 2026-01-20 14:46:48.416 225859 DEBUG oslo_concurrency.lockutils [req-e3219eea-fff0-4a70-bbfc-1609e1e36460 req-4a87b953-a057-40aa-abe8-bdda71cd5687 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:48 np0005588919 nova_compute[225855]: 2026-01-20 14:46:48.417 225859 DEBUG nova.compute.manager [req-e3219eea-fff0-4a70-bbfc-1609e1e36460 req-4a87b953-a057-40aa-abe8-bdda71cd5687 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] No waiting events found dispatching network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:48 np0005588919 nova_compute[225855]: 2026-01-20 14:46:48.417 225859 WARNING nova.compute.manager [req-e3219eea-fff0-4a70-bbfc-1609e1e36460 req-4a87b953-a057-40aa-abe8-bdda71cd5687 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received unexpected event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:46:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:48.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:49 np0005588919 nova_compute[225855]: 2026-01-20 14:46:49.159 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:49.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:50 np0005588919 nova_compute[225855]: 2026-01-20 14:46:50.356 225859 DEBUG oslo_concurrency.lockutils [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:50 np0005588919 nova_compute[225855]: 2026-01-20 14:46:50.357 225859 DEBUG oslo_concurrency.lockutils [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:50 np0005588919 nova_compute[225855]: 2026-01-20 14:46:50.382 225859 INFO nova.compute.manager [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Detaching volume aab03340-0e79-412e-a963-e216832603c4#033[00m
Jan 20 09:46:50 np0005588919 nova_compute[225855]: 2026-01-20 14:46:50.525 225859 INFO nova.virt.block_device [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Attempting to driver detach volume aab03340-0e79-412e-a963-e216832603c4 from mountpoint /dev/vdb#033[00m
Jan 20 09:46:50 np0005588919 nova_compute[225855]: 2026-01-20 14:46:50.537 225859 DEBUG nova.virt.libvirt.driver [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Attempting to detach device vdb from instance 5f56b3e9-af2b-4934-8184-6257994c6b6a from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:46:50 np0005588919 nova_compute[225855]: 2026-01-20 14:46:50.538 225859 DEBUG nova.virt.libvirt.guest [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-aab03340-0e79-412e-a963-e216832603c4">
Jan 20 09:46:50 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  <serial>aab03340-0e79-412e-a963-e216832603c4</serial>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:46:50 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:46:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:46:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:50.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:46:50 np0005588919 nova_compute[225855]: 2026-01-20 14:46:50.632 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:50 np0005588919 nova_compute[225855]: 2026-01-20 14:46:50.707 225859 INFO nova.virt.libvirt.driver [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully detached device vdb from instance 5f56b3e9-af2b-4934-8184-6257994c6b6a from the persistent domain config.#033[00m
Jan 20 09:46:50 np0005588919 nova_compute[225855]: 2026-01-20 14:46:50.707 225859 DEBUG nova.virt.libvirt.driver [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 5f56b3e9-af2b-4934-8184-6257994c6b6a from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:46:50 np0005588919 nova_compute[225855]: 2026-01-20 14:46:50.708 225859 DEBUG nova.virt.libvirt.guest [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-aab03340-0e79-412e-a963-e216832603c4">
Jan 20 09:46:50 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  <serial>aab03340-0e79-412e-a963-e216832603c4</serial>
Jan 20 09:46:50 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 20 09:46:50 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:46:50 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:46:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:51.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:52.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:52 np0005588919 nova_compute[225855]: 2026-01-20 14:46:52.988 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768920412.9880133, 5f56b3e9-af2b-4934-8184-6257994c6b6a => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:46:52 np0005588919 nova_compute[225855]: 2026-01-20 14:46:52.991 225859 DEBUG nova.virt.libvirt.driver [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 5f56b3e9-af2b-4934-8184-6257994c6b6a _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:46:52 np0005588919 nova_compute[225855]: 2026-01-20 14:46:52.993 225859 INFO nova.virt.libvirt.driver [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully detached device vdb from instance 5f56b3e9-af2b-4934-8184-6257994c6b6a from the live domain config.#033[00m
Jan 20 09:46:53 np0005588919 nova_compute[225855]: 2026-01-20 14:46:53.204 225859 DEBUG nova.objects.instance [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'flavor' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:53 np0005588919 nova_compute[225855]: 2026-01-20 14:46:53.253 225859 DEBUG oslo_concurrency.lockutils [None req-6c457338-0d70-4054-b038-16e6ecdb36c4 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 2.896s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:53.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.085 225859 DEBUG oslo_concurrency.lockutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.086 225859 DEBUG oslo_concurrency.lockutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.086 225859 DEBUG oslo_concurrency.lockutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.086 225859 DEBUG oslo_concurrency.lockutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.086 225859 DEBUG oslo_concurrency.lockutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.088 225859 INFO nova.compute.manager [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Terminating instance#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.089 225859 DEBUG nova.compute.manager [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:46:54 np0005588919 kernel: tapec1a7a25-a6 (unregistering): left promiscuous mode
Jan 20 09:46:54 np0005588919 NetworkManager[49104]: <info>  [1768920414.1289] device (tapec1a7a25-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:46:54 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:54Z|00337|binding|INFO|Releasing lport ec1a7a25-a60a-40c3-98bf-710c68019b24 from this chassis (sb_readonly=0)
Jan 20 09:46:54 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:54Z|00338|binding|INFO|Setting lport ec1a7a25-a60a-40c3-98bf-710c68019b24 down in Southbound
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.130 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588919 ovn_controller[130490]: 2026-01-20T14:46:54Z|00339|binding|INFO|Removing iface tapec1a7a25-a6 ovn-installed in OVS
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.133 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.140 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:c2:c8 10.100.0.5'], port_security=['fa:16:3e:76:c2:c8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5f56b3e9-af2b-4934-8184-6257994c6b6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '357f03e7-2463-4315-adee-f60d9b0f5500', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=ec1a7a25-a60a-40c3-98bf-710c68019b24) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.143 140354 INFO neutron.agent.ovn.metadata.agent [-] Port ec1a7a25-a60a-40c3-98bf-710c68019b24 in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 unbound from our chassis#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.146 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.148 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.160 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.168 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[51641b97-68e5-43d9-8508-30eb0bf8bed8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:54 np0005588919 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Jan 20 09:46:54 np0005588919 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000005f.scope: Consumed 8.961s CPU time.
Jan 20 09:46:54 np0005588919 systemd-machined[194361]: Machine qemu-39-instance-0000005f terminated.
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.204 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[7194507d-65fd-441c-b50a-ad6fdb7579f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.210 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[16ed3782-050b-4b57-aa16-2068e161e51c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.251 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[57469ab4-b585-4e5f-9196-d1b488ab8f91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.273 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dc128960-156c-4dd4-b6c2-ac99bd792f00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529090, 'reachable_time': 16021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261432, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.299 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d4dd4440-7abd-40ff-8a7b-55a0d85ca031]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529100, 'tstamp': 529100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261433, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529104, 'tstamp': 529104}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261433, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.301 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.303 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.308 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.309 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.309 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.310 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:54.310 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.337 225859 INFO nova.virt.libvirt.driver [-] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Instance destroyed successfully.#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.338 225859 DEBUG nova.objects.instance [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'resources' on Instance uuid 5f56b3e9-af2b-4934-8184-6257994c6b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:46:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:54.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.688 225859 DEBUG nova.virt.libvirt.vif [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:45:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1159529074',display_name='tempest-ServerActionsTestOtherA-server-1358184153',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1159529074',id=95,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPQdfZ1kte1uwABu8Yt6GDBCbfLz/7DDRluLfcLvQPu0p1+5pmkgwEfXHuElB+zR6CnPfotcAwIhafKnCBnLfMP+a/6KQqDXsYrSlHbKZFIppFb1eVRM7RProDBMZxNI4Q==',key_name='tempest-keypair-1773413892',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:46:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-0vpr4wvx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:46:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=5f56b3e9-af2b-4934-8184-6257994c6b6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.689 225859 DEBUG nova.network.os_vif_util [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "address": "fa:16:3e:76:c2:c8", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec1a7a25-a6", "ovs_interfaceid": "ec1a7a25-a60a-40c3-98bf-710c68019b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.691 225859 DEBUG nova.network.os_vif_util [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.692 225859 DEBUG os_vif [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.695 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.696 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec1a7a25-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.698 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.701 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:46:54 np0005588919 nova_compute[225855]: 2026-01-20 14:46:54.704 225859 INFO os_vif [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:c2:c8,bridge_name='br-int',has_traffic_filtering=True,id=ec1a7a25-a60a-40c3-98bf-710c68019b24,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec1a7a25-a6')#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.127 225859 INFO nova.virt.libvirt.driver [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Deleting instance files /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a_del#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.128 225859 INFO nova.virt.libvirt.driver [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Deletion of /var/lib/nova/instances/5f56b3e9-af2b-4934-8184-6257994c6b6a_del complete#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.199 225859 INFO nova.compute.manager [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Took 1.11 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.200 225859 DEBUG oslo.service.loopingcall [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.200 225859 DEBUG nova.compute.manager [-] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.201 225859 DEBUG nova.network.neutron [-] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:46:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:55.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:55.461 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:46:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:46:55.462 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.462 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.555 225859 DEBUG nova.compute.manager [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-vif-unplugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.556 225859 DEBUG oslo_concurrency.lockutils [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.556 225859 DEBUG oslo_concurrency.lockutils [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.556 225859 DEBUG oslo_concurrency.lockutils [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.557 225859 DEBUG nova.compute.manager [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] No waiting events found dispatching network-vif-unplugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.557 225859 DEBUG nova.compute.manager [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-vif-unplugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.557 225859 DEBUG nova.compute.manager [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.557 225859 DEBUG oslo_concurrency.lockutils [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.557 225859 DEBUG oslo_concurrency.lockutils [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.558 225859 DEBUG oslo_concurrency.lockutils [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.558 225859 DEBUG nova.compute.manager [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] No waiting events found dispatching network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.558 225859 WARNING nova.compute.manager [req-75e7d446-eaae-4984-8052-8be9d3913d08 req-dee04e90-2208-490b-b5a1-51ccd1f718f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received unexpected event network-vif-plugged-ec1a7a25-a60a-40c3-98bf-710c68019b24 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:46:55 np0005588919 nova_compute[225855]: 2026-01-20 14:46:55.634 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:46:56 np0005588919 nova_compute[225855]: 2026-01-20 14:46:56.321 225859 DEBUG nova.network.neutron [-] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:46:56 np0005588919 nova_compute[225855]: 2026-01-20 14:46:56.519 225859 INFO nova.compute.manager [-] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Took 1.32 seconds to deallocate network for instance.#033[00m
Jan 20 09:46:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:56.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:56 np0005588919 nova_compute[225855]: 2026-01-20 14:46:56.638 225859 DEBUG oslo_concurrency.lockutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:46:56 np0005588919 nova_compute[225855]: 2026-01-20 14:46:56.639 225859 DEBUG oslo_concurrency.lockutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:46:56 np0005588919 nova_compute[225855]: 2026-01-20 14:46:56.715 225859 DEBUG oslo_concurrency.processutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:46:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:46:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:46:57 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/876713242' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:46:57 np0005588919 nova_compute[225855]: 2026-01-20 14:46:57.168 225859 DEBUG oslo_concurrency.processutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:46:57 np0005588919 nova_compute[225855]: 2026-01-20 14:46:57.179 225859 DEBUG nova.compute.provider_tree [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:46:57 np0005588919 nova_compute[225855]: 2026-01-20 14:46:57.220 225859 DEBUG nova.scheduler.client.report [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:46:57 np0005588919 nova_compute[225855]: 2026-01-20 14:46:57.291 225859 DEBUG oslo_concurrency.lockutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:57 np0005588919 nova_compute[225855]: 2026-01-20 14:46:57.331 225859 INFO nova.scheduler.client.report [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Deleted allocations for instance 5f56b3e9-af2b-4934-8184-6257994c6b6a#033[00m
Jan 20 09:46:57 np0005588919 nova_compute[225855]: 2026-01-20 14:46:57.408 225859 DEBUG oslo_concurrency.lockutils [None req-2cd59ed3-d902-4d19-b251-a0edc6e0598c 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "5f56b3e9-af2b-4934-8184-6257994c6b6a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:46:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:46:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:57.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:46:57 np0005588919 nova_compute[225855]: 2026-01-20 14:46:57.658 225859 DEBUG nova.compute.manager [req-07a02586-fe04-4644-b47a-f663a1823ff3 req-207c0a5a-82b0-4b18-969a-25a4cd8cff9c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Received event network-vif-deleted-ec1a7a25-a60a-40c3-98bf-710c68019b24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:46:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:46:58.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:46:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:46:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:46:59.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:46:59 np0005588919 nova_compute[225855]: 2026-01-20 14:46:59.699 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:00.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:00 np0005588919 nova_compute[225855]: 2026-01-20 14:47:00.637 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:01 np0005588919 podman[261489]: 2026-01-20 14:47:01.055651545 +0000 UTC m=+0.104316496 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:47:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:01.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:02.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:47:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:03.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:47:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:04.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:04 np0005588919 nova_compute[225855]: 2026-01-20 14:47:04.700 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:05.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:05.464 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:05 np0005588919 nova_compute[225855]: 2026-01-20 14:47:05.640 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:05 np0005588919 nova_compute[225855]: 2026-01-20 14:47:05.786 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:05 np0005588919 nova_compute[225855]: 2026-01-20 14:47:05.787 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:05 np0005588919 nova_compute[225855]: 2026-01-20 14:47:05.810 225859 DEBUG nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:47:05 np0005588919 nova_compute[225855]: 2026-01-20 14:47:05.894 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:05 np0005588919 nova_compute[225855]: 2026-01-20 14:47:05.894 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:05 np0005588919 nova_compute[225855]: 2026-01-20 14:47:05.902 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:47:05 np0005588919 nova_compute[225855]: 2026-01-20 14:47:05.903 225859 INFO nova.compute.claims [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.006 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:06 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3563148152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.437 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.446 225859 DEBUG nova.compute.provider_tree [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.461 225859 DEBUG nova.scheduler.client.report [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.483 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.484 225859 DEBUG nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.528 225859 DEBUG nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.529 225859 DEBUG nova.network.neutron [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.555 225859 INFO nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.571 225859 DEBUG nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:47:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:47:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:06.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.648 225859 DEBUG nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.649 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.650 225859 INFO nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Creating image(s)#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.674 225859 DEBUG nova.storage.rbd_utils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.698 225859 DEBUG nova.storage.rbd_utils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.720 225859 DEBUG nova.storage.rbd_utils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.722 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.782 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.783 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.784 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.784 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.806 225859 DEBUG nova.storage.rbd_utils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.809 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:06 np0005588919 nova_compute[225855]: 2026-01-20 14:47:06.832 225859 DEBUG nova.policy [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '869086208e10436c9dc96c78bee9a85d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b683fcc0026242e28ba6d8fba638688e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:47:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:07 np0005588919 nova_compute[225855]: 2026-01-20 14:47:07.181 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.372s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:07 np0005588919 nova_compute[225855]: 2026-01-20 14:47:07.260 225859 DEBUG nova.storage.rbd_utils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] resizing rbd image 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:47:07 np0005588919 nova_compute[225855]: 2026-01-20 14:47:07.371 225859 DEBUG nova.objects.instance [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'migration_context' on Instance uuid 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:07 np0005588919 nova_compute[225855]: 2026-01-20 14:47:07.386 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:47:07 np0005588919 nova_compute[225855]: 2026-01-20 14:47:07.386 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Ensure instance console log exists: /var/lib/nova/instances/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:47:07 np0005588919 nova_compute[225855]: 2026-01-20 14:47:07.387 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:07 np0005588919 nova_compute[225855]: 2026-01-20 14:47:07.387 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:07 np0005588919 nova_compute[225855]: 2026-01-20 14:47:07.387 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:07.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:07 np0005588919 nova_compute[225855]: 2026-01-20 14:47:07.477 225859 DEBUG nova.network.neutron [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Successfully created port: b93181ae-8a01-468c-adfc-ec8894512d2e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:47:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:08.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.159 225859 DEBUG nova.network.neutron [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Successfully updated port: b93181ae-8a01-468c-adfc-ec8894512d2e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.191 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "refresh_cache-7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.191 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquired lock "refresh_cache-7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.191 225859 DEBUG nova.network.neutron [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.335 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920414.3349535, 5f56b3e9-af2b-4934-8184-6257994c6b6a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.336 225859 INFO nova.compute.manager [-] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.340 225859 DEBUG nova.compute.manager [req-5c82e50f-795e-4840-a075-0fa5388da2d3 req-18b55abe-99cc-4f94-9328-0249bdee132e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Received event network-changed-b93181ae-8a01-468c-adfc-ec8894512d2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.340 225859 DEBUG nova.compute.manager [req-5c82e50f-795e-4840-a075-0fa5388da2d3 req-18b55abe-99cc-4f94-9328-0249bdee132e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Refreshing instance network info cache due to event network-changed-b93181ae-8a01-468c-adfc-ec8894512d2e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.340 225859 DEBUG oslo_concurrency.lockutils [req-5c82e50f-795e-4840-a075-0fa5388da2d3 req-18b55abe-99cc-4f94-9328-0249bdee132e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.358 225859 DEBUG nova.compute.manager [None req-7c26a0e0-90f1-44de-8653-8218bf21d1dd - - - - - -] [instance: 5f56b3e9-af2b-4934-8184-6257994c6b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:47:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:09.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.472 225859 DEBUG nova.network.neutron [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:47:09 np0005588919 nova_compute[225855]: 2026-01-20 14:47:09.702 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.327 225859 DEBUG nova.network.neutron [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Updating instance_info_cache with network_info: [{"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.372 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Releasing lock "refresh_cache-7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.373 225859 DEBUG nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Instance network_info: |[{"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.373 225859 DEBUG oslo_concurrency.lockutils [req-5c82e50f-795e-4840-a075-0fa5388da2d3 req-18b55abe-99cc-4f94-9328-0249bdee132e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.374 225859 DEBUG nova.network.neutron [req-5c82e50f-795e-4840-a075-0fa5388da2d3 req-18b55abe-99cc-4f94-9328-0249bdee132e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Refreshing network info cache for port b93181ae-8a01-468c-adfc-ec8894512d2e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.377 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Start _get_guest_xml network_info=[{"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.381 225859 WARNING nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.387 225859 DEBUG nova.virt.libvirt.host [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.388 225859 DEBUG nova.virt.libvirt.host [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.394 225859 DEBUG nova.virt.libvirt.host [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.395 225859 DEBUG nova.virt.libvirt.host [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.396 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.397 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.397 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.398 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.398 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.398 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.398 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.399 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.399 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.399 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.399 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.400 225859 DEBUG nova.virt.hardware [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.402 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:10.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.641 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:47:10 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3117658322' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.858 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.880 225859 DEBUG nova.storage.rbd_utils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:10 np0005588919 nova_compute[225855]: 2026-01-20 14:47:10.883 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:11 np0005588919 podman[261797]: 2026-01-20 14:47:11.021772479 +0000 UTC m=+0.058812627 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:47:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:47:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3074709641' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.324 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.326 225859 DEBUG nova.virt.libvirt.vif [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:47:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1146964335',display_name='tempest-ServerActionsTestOtherA-server-1146964335',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1146964335',id=100,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-pmljdz8l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActio
nsTestOtherA-967087071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:47:06Z,user_data=None,user_id='869086208e10436c9dc96c78bee9a85d',uuid=7efaa6b8-d1bd-4954-83ec-adcdb8e392bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.326 225859 DEBUG nova.network.os_vif_util [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.327 225859 DEBUG nova.network.os_vif_util [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:63:44,bridge_name='br-int',has_traffic_filtering=True,id=b93181ae-8a01-468c-adfc-ec8894512d2e,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb93181ae-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.328 225859 DEBUG nova.objects.instance [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_devices' on Instance uuid 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.349 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <uuid>7efaa6b8-d1bd-4954-83ec-adcdb8e392bf</uuid>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <name>instance-00000064</name>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestOtherA-server-1146964335</nova:name>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:47:10</nova:creationTime>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <nova:user uuid="869086208e10436c9dc96c78bee9a85d">tempest-ServerActionsTestOtherA-967087071-project-member</nova:user>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <nova:project uuid="b683fcc0026242e28ba6d8fba638688e">tempest-ServerActionsTestOtherA-967087071</nova:project>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <nova:port uuid="b93181ae-8a01-468c-adfc-ec8894512d2e">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <entry name="serial">7efaa6b8-d1bd-4954-83ec-adcdb8e392bf</entry>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <entry name="uuid">7efaa6b8-d1bd-4954-83ec-adcdb8e392bf</entry>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk.config">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:cf:63:44"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <target dev="tapb93181ae-8a"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf/console.log" append="off"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:47:11 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:47:11 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:47:11 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:47:11 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.350 225859 DEBUG nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Preparing to wait for external event network-vif-plugged-b93181ae-8a01-468c-adfc-ec8894512d2e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.350 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.350 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.350 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.351 225859 DEBUG nova.virt.libvirt.vif [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:47:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1146964335',display_name='tempest-ServerActionsTestOtherA-server-1146964335',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1146964335',id=100,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-pmljdz8l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:47:06Z,user_data=None,user_id='869086208e10436c9dc96c78bee9a85d',uuid=7efaa6b8-d1bd-4954-83ec-adcdb8e392bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.351 225859 DEBUG nova.network.os_vif_util [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.352 225859 DEBUG nova.network.os_vif_util [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:63:44,bridge_name='br-int',has_traffic_filtering=True,id=b93181ae-8a01-468c-adfc-ec8894512d2e,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb93181ae-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.352 225859 DEBUG os_vif [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:63:44,bridge_name='br-int',has_traffic_filtering=True,id=b93181ae-8a01-468c-adfc-ec8894512d2e,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb93181ae-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.352 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.353 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.353 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.355 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.355 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb93181ae-8a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.356 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb93181ae-8a, col_values=(('external_ids', {'iface-id': 'b93181ae-8a01-468c-adfc-ec8894512d2e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:63:44', 'vm-uuid': '7efaa6b8-d1bd-4954-83ec-adcdb8e392bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.357 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:11 np0005588919 NetworkManager[49104]: <info>  [1768920431.3586] manager: (tapb93181ae-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.360 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.363 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.364 225859 INFO os_vif [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:63:44,bridge_name='br-int',has_traffic_filtering=True,id=b93181ae-8a01-468c-adfc-ec8894512d2e,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb93181ae-8a')#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.417 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.418 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.418 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No VIF found with MAC fa:16:3e:cf:63:44, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.418 225859 INFO nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Using config drive#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.439 225859 DEBUG nova.storage.rbd_utils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:11.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:11 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:11Z|00340|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.589 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.656 225859 DEBUG nova.network.neutron [req-5c82e50f-795e-4840-a075-0fa5388da2d3 req-18b55abe-99cc-4f94-9328-0249bdee132e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Updated VIF entry in instance network info cache for port b93181ae-8a01-468c-adfc-ec8894512d2e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.657 225859 DEBUG nova.network.neutron [req-5c82e50f-795e-4840-a075-0fa5388da2d3 req-18b55abe-99cc-4f94-9328-0249bdee132e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Updating instance_info_cache with network_info: [{"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.680 225859 DEBUG oslo_concurrency.lockutils [req-5c82e50f-795e-4840-a075-0fa5388da2d3 req-18b55abe-99cc-4f94-9328-0249bdee132e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.885 225859 INFO nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Creating config drive at /var/lib/nova/instances/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf/disk.config#033[00m
Jan 20 09:47:11 np0005588919 nova_compute[225855]: 2026-01-20 14:47:11.896 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmhi2a2b3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:12 np0005588919 nova_compute[225855]: 2026-01-20 14:47:12.043 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmhi2a2b3" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:12 np0005588919 nova_compute[225855]: 2026-01-20 14:47:12.088 225859 DEBUG nova.storage.rbd_utils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] rbd image 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:12 np0005588919 nova_compute[225855]: 2026-01-20 14:47:12.093 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf/disk.config 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:12.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:12 np0005588919 nova_compute[225855]: 2026-01-20 14:47:12.809 225859 DEBUG oslo_concurrency.processutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf/disk.config 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.716s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:12 np0005588919 nova_compute[225855]: 2026-01-20 14:47:12.810 225859 INFO nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Deleting local config drive /var/lib/nova/instances/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf/disk.config because it was imported into RBD.#033[00m
Jan 20 09:47:12 np0005588919 kernel: tapb93181ae-8a: entered promiscuous mode
Jan 20 09:47:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:12Z|00341|binding|INFO|Claiming lport b93181ae-8a01-468c-adfc-ec8894512d2e for this chassis.
Jan 20 09:47:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:12Z|00342|binding|INFO|b93181ae-8a01-468c-adfc-ec8894512d2e: Claiming fa:16:3e:cf:63:44 10.100.0.5
Jan 20 09:47:12 np0005588919 NetworkManager[49104]: <info>  [1768920432.8646] manager: (tapb93181ae-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Jan 20 09:47:12 np0005588919 nova_compute[225855]: 2026-01-20 14:47:12.863 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:12.869 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:63:44 10.100.0.5'], port_security=['fa:16:3e:cf:63:44 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7efaa6b8-d1bd-4954-83ec-adcdb8e392bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ac411cec-795a-42a6-ba83-9468a87a4a14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=b93181ae-8a01-468c-adfc-ec8894512d2e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:47:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:12.870 140354 INFO neutron.agent.ovn.metadata.agent [-] Port b93181ae-8a01-468c-adfc-ec8894512d2e in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 bound to our chassis#033[00m
Jan 20 09:47:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:12.871 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:47:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:12Z|00343|binding|INFO|Setting lport b93181ae-8a01-468c-adfc-ec8894512d2e ovn-installed in OVS
Jan 20 09:47:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:12Z|00344|binding|INFO|Setting lport b93181ae-8a01-468c-adfc-ec8894512d2e up in Southbound
Jan 20 09:47:12 np0005588919 nova_compute[225855]: 2026-01-20 14:47:12.883 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:12 np0005588919 nova_compute[225855]: 2026-01-20 14:47:12.886 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:12.887 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8830fcf9-88dd-4f9a-8c39-77fbe6e42d8c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:12 np0005588919 systemd-machined[194361]: New machine qemu-40-instance-00000064.
Jan 20 09:47:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:12.918 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e47440dd-4e29-4ab3-9cc0-cd397fd25204]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:12 np0005588919 systemd[1]: Started Virtual Machine qemu-40-instance-00000064.
Jan 20 09:47:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:12.924 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[eeeea1cd-aed1-4741-8fd8-7b3405761819]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:12 np0005588919 systemd-udevd[261915]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:47:12 np0005588919 NetworkManager[49104]: <info>  [1768920432.9420] device (tapb93181ae-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:47:12 np0005588919 NetworkManager[49104]: <info>  [1768920432.9433] device (tapb93181ae-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:47:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:12.958 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e59a738c-8e47-4f33-ba07-fc2e146b84aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:12.978 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[97e333c5-017d-475d-bc8f-f9a1fb784887]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529090, 'reachable_time': 16021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261925, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:12.998 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7012ed3a-2097-41f6-b7d0-cce6c001c82d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529100, 'tstamp': 529100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261927, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529104, 'tstamp': 529104}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261927, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:13.001 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.002 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.004 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:13.004 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:13.005 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:13.005 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:13.005 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.114 225859 DEBUG nova.compute.manager [req-e0ad3675-efdb-4cf8-ad73-05a5e0451c82 req-0e6de81c-4d63-457e-b333-f691537ae25a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Received event network-vif-plugged-b93181ae-8a01-468c-adfc-ec8894512d2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.115 225859 DEBUG oslo_concurrency.lockutils [req-e0ad3675-efdb-4cf8-ad73-05a5e0451c82 req-0e6de81c-4d63-457e-b333-f691537ae25a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.116 225859 DEBUG oslo_concurrency.lockutils [req-e0ad3675-efdb-4cf8-ad73-05a5e0451c82 req-0e6de81c-4d63-457e-b333-f691537ae25a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.116 225859 DEBUG oslo_concurrency.lockutils [req-e0ad3675-efdb-4cf8-ad73-05a5e0451c82 req-0e6de81c-4d63-457e-b333-f691537ae25a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.116 225859 DEBUG nova.compute.manager [req-e0ad3675-efdb-4cf8-ad73-05a5e0451c82 req-0e6de81c-4d63-457e-b333-f691537ae25a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Processing event network-vif-plugged-b93181ae-8a01-468c-adfc-ec8894512d2e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.260 225859 DEBUG nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.261 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920433.2600465, 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.261 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] VM Started (Lifecycle Event)#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.265 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.268 225859 INFO nova.virt.libvirt.driver [-] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Instance spawned successfully.#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.268 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.281 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.286 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.290 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.290 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.291 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.291 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.292 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.292 225859 DEBUG nova.virt.libvirt.driver [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.322 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.322 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920433.261002, 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.322 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.348 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.352 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920433.2645874, 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.352 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.374 225859 INFO nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Took 6.73 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.374 225859 DEBUG nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.382 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.386 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.408 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.429 225859 INFO nova.compute.manager [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Took 7.56 seconds to build instance.#033[00m
Jan 20 09:47:13 np0005588919 nova_compute[225855]: 2026-01-20 14:47:13.450 225859 DEBUG oslo_concurrency.lockutils [None req-76f1c954-bbb9-4c35-a4e2-d14ff7715e96 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.664s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:13.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:14.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:14Z|00345|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:47:14 np0005588919 nova_compute[225855]: 2026-01-20 14:47:14.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.220 225859 DEBUG nova.compute.manager [req-2b017cd5-02f6-4140-8df6-9e61646e28cf req-41fe49c6-46e3-4774-95f9-4e86508d82f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Received event network-vif-plugged-b93181ae-8a01-468c-adfc-ec8894512d2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.221 225859 DEBUG oslo_concurrency.lockutils [req-2b017cd5-02f6-4140-8df6-9e61646e28cf req-41fe49c6-46e3-4774-95f9-4e86508d82f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.221 225859 DEBUG oslo_concurrency.lockutils [req-2b017cd5-02f6-4140-8df6-9e61646e28cf req-41fe49c6-46e3-4774-95f9-4e86508d82f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.222 225859 DEBUG oslo_concurrency.lockutils [req-2b017cd5-02f6-4140-8df6-9e61646e28cf req-41fe49c6-46e3-4774-95f9-4e86508d82f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.222 225859 DEBUG nova.compute.manager [req-2b017cd5-02f6-4140-8df6-9e61646e28cf req-41fe49c6-46e3-4774-95f9-4e86508d82f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] No waiting events found dispatching network-vif-plugged-b93181ae-8a01-468c-adfc-ec8894512d2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.222 225859 WARNING nova.compute.manager [req-2b017cd5-02f6-4140-8df6-9e61646e28cf req-41fe49c6-46e3-4774-95f9-4e86508d82f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Received unexpected event network-vif-plugged-b93181ae-8a01-468c-adfc-ec8894512d2e for instance with vm_state active and task_state None.#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.385 225859 DEBUG nova.compute.manager [req-ba5fdde9-fe78-470f-a774-bf9a14ca9719 req-e1b32a4b-73a0-4f97-bd81-fcd5f26f8ca0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Received event network-changed-b93181ae-8a01-468c-adfc-ec8894512d2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.386 225859 DEBUG nova.compute.manager [req-ba5fdde9-fe78-470f-a774-bf9a14ca9719 req-e1b32a4b-73a0-4f97-bd81-fcd5f26f8ca0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Refreshing instance network info cache due to event network-changed-b93181ae-8a01-468c-adfc-ec8894512d2e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.386 225859 DEBUG oslo_concurrency.lockutils [req-ba5fdde9-fe78-470f-a774-bf9a14ca9719 req-e1b32a4b-73a0-4f97-bd81-fcd5f26f8ca0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.386 225859 DEBUG oslo_concurrency.lockutils [req-ba5fdde9-fe78-470f-a774-bf9a14ca9719 req-e1b32a4b-73a0-4f97-bd81-fcd5f26f8ca0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.387 225859 DEBUG nova.network.neutron [req-ba5fdde9-fe78-470f-a774-bf9a14ca9719 req-e1b32a4b-73a0-4f97-bd81-fcd5f26f8ca0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Refreshing network info cache for port b93181ae-8a01-468c-adfc-ec8894512d2e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:47:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:15.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:15 np0005588919 nova_compute[225855]: 2026-01-20 14:47:15.642 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:16 np0005588919 nova_compute[225855]: 2026-01-20 14:47:16.358 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:16.408 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:16.409 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:16.410 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:16.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:16 np0005588919 nova_compute[225855]: 2026-01-20 14:47:16.908 225859 DEBUG nova.network.neutron [req-ba5fdde9-fe78-470f-a774-bf9a14ca9719 req-e1b32a4b-73a0-4f97-bd81-fcd5f26f8ca0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Updated VIF entry in instance network info cache for port b93181ae-8a01-468c-adfc-ec8894512d2e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:47:16 np0005588919 nova_compute[225855]: 2026-01-20 14:47:16.908 225859 DEBUG nova.network.neutron [req-ba5fdde9-fe78-470f-a774-bf9a14ca9719 req-e1b32a4b-73a0-4f97-bd81-fcd5f26f8ca0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Updating instance_info_cache with network_info: [{"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:16 np0005588919 nova_compute[225855]: 2026-01-20 14:47:16.943 225859 DEBUG oslo_concurrency.lockutils [req-ba5fdde9-fe78-470f-a774-bf9a14ca9719 req-e1b32a4b-73a0-4f97-bd81-fcd5f26f8ca0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:47:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:17.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:47:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:18.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:19.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:20.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:20 np0005588919 nova_compute[225855]: 2026-01-20 14:47:20.645 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:21 np0005588919 nova_compute[225855]: 2026-01-20 14:47:21.362 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:21.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:47:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 31K writes, 124K keys, 31K commit groups, 1.0 writes per commit group, ingest: 0.12 GB, 0.04 MB/s#012Cumulative WAL: 31K writes, 10K syncs, 2.88 writes per sync, written: 0.12 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8085 writes, 31K keys, 8085 commit groups, 1.0 writes per commit group, ingest: 34.64 MB, 0.06 MB/s#012Interval WAL: 8084 writes, 3158 syncs, 2.56 writes per sync, written: 0.03 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 09:47:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:22.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:23 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #48. Immutable memtables: 5.
Jan 20 09:47:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:47:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:47:23 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:47:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:23.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:23 np0005588919 nova_compute[225855]: 2026-01-20 14:47:23.558 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:24.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:25.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:25 np0005588919 nova_compute[225855]: 2026-01-20 14:47:25.648 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:26 np0005588919 nova_compute[225855]: 2026-01-20 14:47:26.400 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:26.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:27.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:28.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:47:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:47:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:29.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:30 np0005588919 nova_compute[225855]: 2026-01-20 14:47:30.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:30 np0005588919 nova_compute[225855]: 2026-01-20 14:47:30.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:30 np0005588919 nova_compute[225855]: 2026-01-20 14:47:30.578 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:30.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:30 np0005588919 nova_compute[225855]: 2026-01-20 14:47:30.650 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:31 np0005588919 nova_compute[225855]: 2026-01-20 14:47:31.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:31 np0005588919 nova_compute[225855]: 2026-01-20 14:47:31.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:47:31 np0005588919 nova_compute[225855]: 2026-01-20 14:47:31.369 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:47:31 np0005588919 nova_compute[225855]: 2026-01-20 14:47:31.402 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:31.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:32 np0005588919 podman[262212]: 2026-01-20 14:47:32.056468863 +0000 UTC m=+0.089000152 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:47:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:32 np0005588919 nova_compute[225855]: 2026-01-20 14:47:32.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:32 np0005588919 nova_compute[225855]: 2026-01-20 14:47:32.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:47:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:32.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:33.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:34 np0005588919 nova_compute[225855]: 2026-01-20 14:47:34.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.519046) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454519155, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1636, "num_deletes": 254, "total_data_size": 3488221, "memory_usage": 3537944, "flush_reason": "Manual Compaction"}
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454534645, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 1416448, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41471, "largest_seqno": 43101, "table_properties": {"data_size": 1411216, "index_size": 2436, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 14662, "raw_average_key_size": 21, "raw_value_size": 1399361, "raw_average_value_size": 2031, "num_data_blocks": 108, "num_entries": 689, "num_filter_entries": 689, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920327, "oldest_key_time": 1768920327, "file_creation_time": 1768920454, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 15630 microseconds, and 8246 cpu microseconds.
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.534685) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 1416448 bytes OK
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.534702) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.536354) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.536371) EVENT_LOG_v1 {"time_micros": 1768920454536366, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.536448) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 3480631, prev total WAL file size 3480631, number of live WAL files 2.
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.537932) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323533' seq:72057594037927935, type:22 .. '6D6772737461740031353036' seq:0, type:0; will stop at (end)
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(1383KB)], [78(10MB)]
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454538016, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 12678576, "oldest_snapshot_seqno": -1}
Jan 20 09:47:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:34.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6919 keys, 9676799 bytes, temperature: kUnknown
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454736717, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 9676799, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9632588, "index_size": 25795, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 176842, "raw_average_key_size": 25, "raw_value_size": 9510916, "raw_average_value_size": 1374, "num_data_blocks": 1026, "num_entries": 6919, "num_filter_entries": 6919, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768920454, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.736990) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 9676799 bytes
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.773687) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 63.8 rd, 48.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 10.7 +0.0 blob) out(9.2 +0.0 blob), read-write-amplify(15.8) write-amplify(6.8) OK, records in: 7387, records dropped: 468 output_compression: NoCompression
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.773722) EVENT_LOG_v1 {"time_micros": 1768920454773711, "job": 48, "event": "compaction_finished", "compaction_time_micros": 198763, "compaction_time_cpu_micros": 50897, "output_level": 6, "num_output_files": 1, "total_output_size": 9676799, "num_input_records": 7387, "num_output_records": 6919, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454774194, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920454776099, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.537730) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.776216) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.776223) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.776225) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.776227) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:34 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:47:34.776229) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:47:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:47:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:35.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:47:35 np0005588919 nova_compute[225855]: 2026-01-20 14:47:35.652 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2814695511' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:36 np0005588919 nova_compute[225855]: 2026-01-20 14:47:36.404 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e239 e239: 3 total, 3 up, 3 in
Jan 20 09:47:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:36.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:37 np0005588919 nova_compute[225855]: 2026-01-20 14:47:37.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:37 np0005588919 nova_compute[225855]: 2026-01-20 14:47:37.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:37.363 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:47:37 np0005588919 nova_compute[225855]: 2026-01-20 14:47:37.364 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:37.365 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:47:37 np0005588919 nova_compute[225855]: 2026-01-20 14:47:37.419 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:37 np0005588919 nova_compute[225855]: 2026-01-20 14:47:37.419 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:37 np0005588919 nova_compute[225855]: 2026-01-20 14:47:37.420 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:37 np0005588919 nova_compute[225855]: 2026-01-20 14:47:37.421 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:47:37 np0005588919 nova_compute[225855]: 2026-01-20 14:47:37.421 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:37.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2919054783' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:37 np0005588919 nova_compute[225855]: 2026-01-20 14:47:37.900 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.009 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000057 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.010 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000057 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.015 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.015 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.180 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.182 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4117MB free_disk=20.80602264404297GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.182 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.182 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.279 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 6586bc3e-3a94-4d22-8e8c-713a86a956fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.280 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.280 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.280 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.375 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:38.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3930138932' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.845 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:38 np0005588919 nova_compute[225855]: 2026-01-20 14:47:38.852 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:47:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:39.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:40 np0005588919 nova_compute[225855]: 2026-01-20 14:47:40.082 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:47:40 np0005588919 nova_compute[225855]: 2026-01-20 14:47:40.110 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:47:40 np0005588919 nova_compute[225855]: 2026-01-20 14:47:40.111 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:40.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:40 np0005588919 nova_compute[225855]: 2026-01-20 14:47:40.654 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:41 np0005588919 nova_compute[225855]: 2026-01-20 14:47:41.106 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:41 np0005588919 nova_compute[225855]: 2026-01-20 14:47:41.106 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:47:41 np0005588919 nova_compute[225855]: 2026-01-20 14:47:41.407 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:41.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:42 np0005588919 podman[262292]: 2026-01-20 14:47:42.003659252 +0000 UTC m=+0.050605764 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:47:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:47:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:42.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:47:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:43.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:44.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:45.367 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:45.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:45 np0005588919 nova_compute[225855]: 2026-01-20 14:47:45.656 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:46 np0005588919 nova_compute[225855]: 2026-01-20 14:47:46.408 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:46.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.258 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.259 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.281 225859 DEBUG nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.348 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.349 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.354 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.354 225859 INFO nova.compute.claims [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.407 225859 DEBUG nova.compute.manager [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received event network-changed-2c289e6f-295e-44c3-948a-9a6901251890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.407 225859 DEBUG nova.compute.manager [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Refreshing instance network info cache due to event network-changed-2c289e6f-295e-44c3-948a-9a6901251890. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.407 225859 DEBUG oslo_concurrency.lockutils [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.408 225859 DEBUG oslo_concurrency.lockutils [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.408 225859 DEBUG nova.network.neutron [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Refreshing network info cache for port 2c289e6f-295e-44c3-948a-9a6901251890 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:47:47 np0005588919 nova_compute[225855]: 2026-01-20 14:47:47.471 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:47.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:48 np0005588919 nova_compute[225855]: 2026-01-20 14:47:48.320 225859 DEBUG nova.compute.manager [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 09:47:48 np0005588919 nova_compute[225855]: 2026-01-20 14:47:48.404 225859 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:48.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:48 np0005588919 nova_compute[225855]: 2026-01-20 14:47:48.894 225859 DEBUG nova.network.neutron [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updated VIF entry in instance network info cache for port 2c289e6f-295e-44c3-948a-9a6901251890. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:47:48 np0005588919 nova_compute[225855]: 2026-01-20 14:47:48.896 225859 DEBUG nova.network.neutron [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updating instance_info_cache with network_info: [{"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e240 e240: 3 total, 3 up, 3 in
Jan 20 09:47:48 np0005588919 nova_compute[225855]: 2026-01-20 14:47:48.970 225859 DEBUG oslo_concurrency.lockutils [req-fd34f402-c720-4127-83c3-6789282bb8e3 req-76a6d4c6-0dcc-4359-a4dc-0ff4cb9b16b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:49 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/551235169' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.185 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.714s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.190 225859 DEBUG nova.compute.provider_tree [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.204 225859 DEBUG nova.scheduler.client.report [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.231 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.882s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.232 225859 DEBUG nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.234 225859 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.278 225859 DEBUG nova.objects.instance [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_requests' on Instance uuid 9beb3ec3-721e-4919-9713-a92c82ad189b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.302 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.303 225859 INFO nova.compute.claims [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.303 225859 DEBUG nova.objects.instance [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'resources' on Instance uuid 9beb3ec3-721e-4919-9713-a92c82ad189b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.314 225859 DEBUG nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.315 225859 DEBUG nova.network.neutron [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.318 225859 DEBUG nova.objects.instance [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'pci_devices' on Instance uuid 9beb3ec3-721e-4919-9713-a92c82ad189b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.333 225859 INFO nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.374 225859 INFO nova.compute.resource_tracker [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating resource usage from migration d75f7553-0bf9-4277-b1f7-34600960db53#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.374 225859 DEBUG nova.compute.resource_tracker [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Starting to track incoming migration d75f7553-0bf9-4277-b1f7-34600960db53 with flavor 30c26a27-d918-46d8-a512-4ef3b4ce5955 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.384 225859 DEBUG nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:47:49 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:47:49 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.465 225859 DEBUG nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.466 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.466 225859 INFO nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Creating image(s)#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.488 225859 DEBUG nova.storage.rbd_utils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:49.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.509 225859 DEBUG nova.storage.rbd_utils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.530 225859 DEBUG nova.storage.rbd_utils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.534 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.594 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.595 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.596 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.596 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.625 225859 DEBUG nova.storage.rbd_utils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.630 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.662 225859 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.690 225859 DEBUG nova.policy [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e9278fdb9e645b7938f3edb20c4d3cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.925 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.295s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:49 np0005588919 nova_compute[225855]: 2026-01-20 14:47:49.981 225859 DEBUG nova.storage.rbd_utils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] resizing rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.103 225859 DEBUG nova.objects.instance [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'migration_context' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.116 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.116 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Ensure instance console log exists: /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.117 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.117 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.117 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:47:50 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/619936034' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.136 225859 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.140 225859 DEBUG nova.compute.provider_tree [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.152 225859 DEBUG nova.scheduler.client.report [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.168 225859 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.934s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.168 225859 INFO nova.compute.manager [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Migrating#033[00m
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.659 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:50.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:50 np0005588919 nova_compute[225855]: 2026-01-20 14:47:50.772 225859 DEBUG nova.network.neutron [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Successfully created port: f4f25f14-bc59-4322-86b2-b48f096472a5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:47:51 np0005588919 nova_compute[225855]: 2026-01-20 14:47:51.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:51.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:51 np0005588919 nova_compute[225855]: 2026-01-20 14:47:51.872 225859 DEBUG nova.network.neutron [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Successfully updated port: f4f25f14-bc59-4322-86b2-b48f096472a5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:47:51 np0005588919 nova_compute[225855]: 2026-01-20 14:47:51.908 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-d699cf6a-9c33-400b-8d0f-4d61b8b16916" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:51 np0005588919 nova_compute[225855]: 2026-01-20 14:47:51.908 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-d699cf6a-9c33-400b-8d0f-4d61b8b16916" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:51 np0005588919 nova_compute[225855]: 2026-01-20 14:47:51.908 225859 DEBUG nova.network.neutron [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.047 225859 DEBUG nova.network.neutron [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:47:52 np0005588919 systemd-logind[783]: New session 57 of user nova.
Jan 20 09:47:52 np0005588919 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 09:47:52 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 09:47:52 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 09:47:52 np0005588919 systemd[1]: Starting User Manager for UID 42436...
Jan 20 09:47:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:52 np0005588919 systemd[262580]: Queued start job for default target Main User Target.
Jan 20 09:47:52 np0005588919 systemd[262580]: Created slice User Application Slice.
Jan 20 09:47:52 np0005588919 systemd[262580]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:47:52 np0005588919 systemd[262580]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 09:47:52 np0005588919 systemd[262580]: Reached target Paths.
Jan 20 09:47:52 np0005588919 systemd[262580]: Reached target Timers.
Jan 20 09:47:52 np0005588919 systemd[262580]: Starting D-Bus User Message Bus Socket...
Jan 20 09:47:52 np0005588919 systemd[262580]: Starting Create User's Volatile Files and Directories...
Jan 20 09:47:52 np0005588919 systemd[262580]: Listening on D-Bus User Message Bus Socket.
Jan 20 09:47:52 np0005588919 systemd[262580]: Reached target Sockets.
Jan 20 09:47:52 np0005588919 systemd[262580]: Finished Create User's Volatile Files and Directories.
Jan 20 09:47:52 np0005588919 systemd[262580]: Reached target Basic System.
Jan 20 09:47:52 np0005588919 systemd[262580]: Reached target Main User Target.
Jan 20 09:47:52 np0005588919 systemd[262580]: Startup finished in 160ms.
Jan 20 09:47:52 np0005588919 systemd[1]: Started User Manager for UID 42436.
Jan 20 09:47:52 np0005588919 systemd[1]: Started Session 57 of User nova.
Jan 20 09:47:52 np0005588919 systemd[1]: session-57.scope: Deactivated successfully.
Jan 20 09:47:52 np0005588919 systemd-logind[783]: Session 57 logged out. Waiting for processes to exit.
Jan 20 09:47:52 np0005588919 systemd-logind[783]: Removed session 57.
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.419 225859 DEBUG nova.compute.manager [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-changed-f4f25f14-bc59-4322-86b2-b48f096472a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.419 225859 DEBUG nova.compute.manager [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Refreshing instance network info cache due to event network-changed-f4f25f14-bc59-4322-86b2-b48f096472a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.420 225859 DEBUG oslo_concurrency.lockutils [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d699cf6a-9c33-400b-8d0f-4d61b8b16916" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:47:52 np0005588919 systemd-logind[783]: New session 59 of user nova.
Jan 20 09:47:52 np0005588919 systemd[1]: Started Session 59 of User nova.
Jan 20 09:47:52 np0005588919 systemd[1]: session-59.scope: Deactivated successfully.
Jan 20 09:47:52 np0005588919 systemd-logind[783]: Session 59 logged out. Waiting for processes to exit.
Jan 20 09:47:52 np0005588919 systemd-logind[783]: Removed session 59.
Jan 20 09:47:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:52.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.896 225859 DEBUG nova.network.neutron [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Updating instance_info_cache with network_info: [{"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.915 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-d699cf6a-9c33-400b-8d0f-4d61b8b16916" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.915 225859 DEBUG nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance network_info: |[{"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.915 225859 DEBUG oslo_concurrency.lockutils [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d699cf6a-9c33-400b-8d0f-4d61b8b16916" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.916 225859 DEBUG nova.network.neutron [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Refreshing network info cache for port f4f25f14-bc59-4322-86b2-b48f096472a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.918 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Start _get_guest_xml network_info=[{"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.923 225859 WARNING nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.927 225859 DEBUG nova.virt.libvirt.host [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.928 225859 DEBUG nova.virt.libvirt.host [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.937 225859 DEBUG nova.virt.libvirt.host [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.937 225859 DEBUG nova.virt.libvirt.host [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.938 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.939 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.939 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.939 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.939 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.939 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.940 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.940 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.940 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.940 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.941 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.941 225859 DEBUG nova.virt.hardware [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:47:52 np0005588919 nova_compute[225855]: 2026-01-20 14:47:52.943 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:53.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:47:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1409179445' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:47:53 np0005588919 nova_compute[225855]: 2026-01-20 14:47:53.729 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.786s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:53 np0005588919 nova_compute[225855]: 2026-01-20 14:47:53.764 225859 DEBUG nova.storage.rbd_utils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:53 np0005588919 nova_compute[225855]: 2026-01-20 14:47:53.769 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:54 np0005588919 nova_compute[225855]: 2026-01-20 14:47:54.159 225859 DEBUG nova.network.neutron [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Updated VIF entry in instance network info cache for port f4f25f14-bc59-4322-86b2-b48f096472a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:47:54 np0005588919 nova_compute[225855]: 2026-01-20 14:47:54.159 225859 DEBUG nova.network.neutron [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Updating instance_info_cache with network_info: [{"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:47:54 np0005588919 nova_compute[225855]: 2026-01-20 14:47:54.175 225859 DEBUG oslo_concurrency.lockutils [req-6cfce73f-6457-47c5-be92-1044b6eabc95 req-bfa82b71-22b1-476d-89d9-d78560eed692 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d699cf6a-9c33-400b-8d0f-4d61b8b16916" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:47:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:47:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.0 total, 600.0 interval#012Cumulative writes: 8445 writes, 43K keys, 8445 commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.03 MB/s#012Cumulative WAL: 8444 writes, 8444 syncs, 1.00 writes per sync, written: 0.08 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1598 writes, 7935 keys, 1598 commit groups, 1.0 writes per commit group, ingest: 15.64 MB, 0.03 MB/s#012Interval WAL: 1598 writes, 1598 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     71.3      0.72              0.19        24    0.030       0      0       0.0       0.0#012  L6      1/0    9.23 MB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   3.9    107.3     88.7      2.27              0.68        23    0.099    128K    12K       0.0       0.0#012 Sum      1/0    9.23 MB   0.0      0.2     0.1      0.2       0.2      0.1       0.0   4.9     81.4     84.5      2.99              0.87        47    0.064    128K    12K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.0     82.5     81.9      0.85              0.24        12    0.071     42K   3583       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   0.0    107.3     88.7      2.27              0.68        23    0.099    128K    12K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     71.5      0.72              0.19        23    0.031       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3000.0 total, 600.0 interval#012Flush(GB): cumulative 0.050, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.25 GB write, 0.08 MB/s write, 0.24 GB read, 0.08 MB/s read, 3.0 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 304.00 MB usage: 28.03 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000309 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1623,27.05 MB,8.89745%) FilterBlock(47,366.80 KB,0.117829%) IndexBlock(47,634.67 KB,0.203881%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:47:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:47:54 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2590787595' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:47:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:54.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:55 np0005588919 nova_compute[225855]: 2026-01-20 14:47:55.092 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.323s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:55 np0005588919 nova_compute[225855]: 2026-01-20 14:47:55.094 225859 DEBUG nova.virt.libvirt.vif [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:47:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-354445838',display_name='tempest-tempest.common.compute-instance-354445838',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-354445838',id=103,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-2qck85q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:47:49Z,user_data=None,user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=d699cf6a-9c33-400b-8d0f-4d61b8b16916,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:47:55 np0005588919 nova_compute[225855]: 2026-01-20 14:47:55.094 225859 DEBUG nova.network.os_vif_util [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:55 np0005588919 nova_compute[225855]: 2026-01-20 14:47:55.095 225859 DEBUG nova.network.os_vif_util [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:55 np0005588919 nova_compute[225855]: 2026-01-20 14:47:55.097 225859 DEBUG nova.objects.instance [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:47:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:47:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:55.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:47:55 np0005588919 nova_compute[225855]: 2026-01-20 14:47:55.661 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.412 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.494 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <uuid>d699cf6a-9c33-400b-8d0f-4d61b8b16916</uuid>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <name>instance-00000067</name>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <nova:name>tempest-tempest.common.compute-instance-354445838</nova:name>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:47:52</nova:creationTime>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <nova:port uuid="f4f25f14-bc59-4322-86b2-b48f096472a5">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <entry name="serial">d699cf6a-9c33-400b-8d0f-4d61b8b16916</entry>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <entry name="uuid">d699cf6a-9c33-400b-8d0f-4d61b8b16916</entry>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:41:b2:cf"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <target dev="tapf4f25f14-bc"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/console.log" append="off"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:47:56 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:47:56 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:47:56 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:47:56 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.495 225859 DEBUG nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Preparing to wait for external event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.495 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.495 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.496 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.496 225859 DEBUG nova.virt.libvirt.vif [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:47:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-354445838',display_name='tempest-tempest.common.compute-instance-354445838',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-354445838',id=103,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-2qck85q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-Se
rverActionsTestJSON-1020442335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:47:49Z,user_data=None,user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=d699cf6a-9c33-400b-8d0f-4d61b8b16916,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.496 225859 DEBUG nova.network.os_vif_util [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.497 225859 DEBUG nova.network.os_vif_util [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.497 225859 DEBUG os_vif [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.498 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.498 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.498 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.501 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.502 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4f25f14-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.502 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf4f25f14-bc, col_values=(('external_ids', {'iface-id': 'f4f25f14-bc59-4322-86b2-b48f096472a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:b2:cf', 'vm-uuid': 'd699cf6a-9c33-400b-8d0f-4d61b8b16916'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.503 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:56 np0005588919 NetworkManager[49104]: <info>  [1768920476.5045] manager: (tapf4f25f14-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.505 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.511 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.511 225859 INFO os_vif [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc')#033[00m
Jan 20 09:47:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:56.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.695 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.695 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.696 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No VIF found with MAC fa:16:3e:41:b2:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.696 225859 INFO nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Using config drive#033[00m
Jan 20 09:47:56 np0005588919 nova_compute[225855]: 2026-01-20 14:47:56.725 225859 DEBUG nova.storage.rbd_utils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:47:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:57.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:58 np0005588919 nova_compute[225855]: 2026-01-20 14:47:58.405 225859 INFO nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Creating config drive at /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config#033[00m
Jan 20 09:47:58 np0005588919 nova_compute[225855]: 2026-01-20 14:47:58.410 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjqeviksz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:58 np0005588919 nova_compute[225855]: 2026-01-20 14:47:58.538 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjqeviksz" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:47:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:47:58.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:47:58 np0005588919 nova_compute[225855]: 2026-01-20 14:47:58.718 225859 DEBUG nova.storage.rbd_utils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:47:58 np0005588919 nova_compute[225855]: 2026-01-20 14:47:58.722 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:47:58 np0005588919 nova_compute[225855]: 2026-01-20 14:47:58.956 225859 DEBUG oslo_concurrency.processutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.233s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:47:58 np0005588919 nova_compute[225855]: 2026-01-20 14:47:58.957 225859 INFO nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Deleting local config drive /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config because it was imported into RBD.#033[00m
Jan 20 09:47:59 np0005588919 kernel: tapf4f25f14-bc: entered promiscuous mode
Jan 20 09:47:59 np0005588919 NetworkManager[49104]: <info>  [1768920479.0031] manager: (tapf4f25f14-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/148)
Jan 20 09:47:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:59Z|00346|binding|INFO|Claiming lport f4f25f14-bc59-4322-86b2-b48f096472a5 for this chassis.
Jan 20 09:47:59 np0005588919 nova_compute[225855]: 2026-01-20 14:47:59.003 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:59Z|00347|binding|INFO|f4f25f14-bc59-4322-86b2-b48f096472a5: Claiming fa:16:3e:41:b2:cf 10.100.0.12
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.015 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:b2:cf 10.100.0.12'], port_security=['fa:16:3e:41:b2:cf 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd699cf6a-9c33-400b-8d0f-4d61b8b16916', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e045f96f-e14f-4cbd-a987-42fc8d4d3e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f4f25f14-bc59-4322-86b2-b48f096472a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.016 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f4f25f14-bc59-4322-86b2-b48f096472a5 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.018 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:47:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:59Z|00348|binding|INFO|Setting lport f4f25f14-bc59-4322-86b2-b48f096472a5 ovn-installed in OVS
Jan 20 09:47:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:59Z|00349|binding|INFO|Setting lport f4f25f14-bc59-4322-86b2-b48f096472a5 up in Southbound
Jan 20 09:47:59 np0005588919 nova_compute[225855]: 2026-01-20 14:47:59.021 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:59 np0005588919 nova_compute[225855]: 2026-01-20 14:47:59.025 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.030 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f1cb9283-9f1c-4eba-8b0e-9efc5f211283]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.031 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:47:59 np0005588919 systemd-udevd[262739]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.033 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.033 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4c16b1-be4d-4a55-b50b-4502e1f50843]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.034 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fac6c196-5c46-4c32-bcb3-24d773246333]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.046 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[8d07570a-0b1c-4bfe-bff0-0914939e563c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 NetworkManager[49104]: <info>  [1768920479.0485] device (tapf4f25f14-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:47:59 np0005588919 systemd-machined[194361]: New machine qemu-41-instance-00000067.
Jan 20 09:47:59 np0005588919 NetworkManager[49104]: <info>  [1768920479.0492] device (tapf4f25f14-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:47:59 np0005588919 systemd[1]: Started Virtual Machine qemu-41-instance-00000067.
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.068 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3df8a98b-f465-49c4-924a-975184ad0b30]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.096 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a4da7b29-fbd3-45ea-8c76-b4eccff4498a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.099 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[65909b6b-9e5e-48ac-833c-26c2c052b17e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 systemd-udevd[262744]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:47:59 np0005588919 NetworkManager[49104]: <info>  [1768920479.1013] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/149)
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.125 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c306acde-3320-4da9-8c0d-9f3cf2f7ad28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.128 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3d8abd-1ae7-40dc-bbbc-10d0f564de13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 NetworkManager[49104]: <info>  [1768920479.1504] device (tap762e1859-40): carrier: link connected
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.158 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e041b484-3901-4804-9f0f-9823befb094d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.177 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c14d78-615a-4179-b32f-4eb42f052acc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551807, 'reachable_time': 30286, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262773, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.195 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7dcdc9-a48b-4867-aebf-c851b2fa151c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 551807, 'tstamp': 551807}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262774, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.213 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e52817c3-48ed-40ad-a087-921a5f9c2807]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551807, 'reachable_time': 30286, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 262775, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.248 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a63f90-736b-4d69-b9e8-15617bb81ff8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.320 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f3126656-d170-425a-950c-b2c0c34be43e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.322 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.322 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.322 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:59 np0005588919 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:47:59 np0005588919 nova_compute[225855]: 2026-01-20 14:47:59.324 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:59 np0005588919 nova_compute[225855]: 2026-01-20 14:47:59.325 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.326 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:47:59 np0005588919 NetworkManager[49104]: <info>  [1768920479.3288] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.329 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:47:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:47:59Z|00350|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:47:59 np0005588919 nova_compute[225855]: 2026-01-20 14:47:59.327 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:59 np0005588919 nova_compute[225855]: 2026-01-20 14:47:59.328 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.329 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[587324dc-adad-457a-89a8-a9a4334ab1a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.330 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:47:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:47:59.330 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:47:59 np0005588919 nova_compute[225855]: 2026-01-20 14:47:59.343 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:47:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:47:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:47:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:47:59.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:47:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 e241: 3 total, 3 up, 3 in
Jan 20 09:47:59 np0005588919 podman[262808]: 2026-01-20 14:47:59.669590537 +0000 UTC m=+0.026011398 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:48:00 np0005588919 podman[262808]: 2026-01-20 14:48:00.04578011 +0000 UTC m=+0.402200951 container create f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:48:00 np0005588919 systemd[1]: Started libpod-conmon-f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322.scope.
Jan 20 09:48:00 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:48:00 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe49cb91db3dea07b646b62647ce33597562542cda25b9855d3f6435dcf0497/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.259 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920480.2590883, d699cf6a-9c33-400b-8d0f-4d61b8b16916 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.260 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] VM Started (Lifecycle Event)#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.285 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.289 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920480.2603076, d699cf6a-9c33-400b-8d0f-4d61b8b16916 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.289 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:48:00 np0005588919 podman[262808]: 2026-01-20 14:48:00.30244663 +0000 UTC m=+0.658867491 container init f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:48:00 np0005588919 podman[262808]: 2026-01-20 14:48:00.308029618 +0000 UTC m=+0.664450449 container start f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.317 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.322 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:48:00 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[262861]: [NOTICE]   (262869) : New worker (262871) forked
Jan 20 09:48:00 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[262861]: [NOTICE]   (262869) : Loading success.
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.343 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.667 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:00.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.903 225859 DEBUG nova.compute.manager [req-014a5a36-5b42-42fc-9b45-5954c5122c8d req-ca6620d1-5aa1-44ec-84ec-7a550439409a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.904 225859 DEBUG oslo_concurrency.lockutils [req-014a5a36-5b42-42fc-9b45-5954c5122c8d req-ca6620d1-5aa1-44ec-84ec-7a550439409a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.904 225859 DEBUG oslo_concurrency.lockutils [req-014a5a36-5b42-42fc-9b45-5954c5122c8d req-ca6620d1-5aa1-44ec-84ec-7a550439409a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.904 225859 DEBUG oslo_concurrency.lockutils [req-014a5a36-5b42-42fc-9b45-5954c5122c8d req-ca6620d1-5aa1-44ec-84ec-7a550439409a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.904 225859 DEBUG nova.compute.manager [req-014a5a36-5b42-42fc-9b45-5954c5122c8d req-ca6620d1-5aa1-44ec-84ec-7a550439409a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Processing event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.905 225859 DEBUG nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.909 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920480.9085298, d699cf6a-9c33-400b-8d0f-4d61b8b16916 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.909 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.911 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.914 225859 INFO nova.virt.libvirt.driver [-] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance spawned successfully.#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.915 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.941 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.946 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.946 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.947 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.947 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.947 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.948 225859 DEBUG nova.virt.libvirt.driver [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:00 np0005588919 nova_compute[225855]: 2026-01-20 14:48:00.953 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:48:01 np0005588919 nova_compute[225855]: 2026-01-20 14:48:01.001 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:48:01 np0005588919 nova_compute[225855]: 2026-01-20 14:48:01.019 225859 INFO nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Took 11.55 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:48:01 np0005588919 nova_compute[225855]: 2026-01-20 14:48:01.020 225859 DEBUG nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:01 np0005588919 nova_compute[225855]: 2026-01-20 14:48:01.089 225859 INFO nova.compute.manager [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Took 13.77 seconds to build instance.#033[00m
Jan 20 09:48:01 np0005588919 nova_compute[225855]: 2026-01-20 14:48:01.130 225859 DEBUG oslo_concurrency.lockutils [None req-72afa0d3-f22b-4916-8bef-62f9235254a3 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:01 np0005588919 nova_compute[225855]: 2026-01-20 14:48:01.504 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:48:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:01.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:48:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:02.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:02 np0005588919 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 09:48:02 np0005588919 systemd[262580]: Activating special unit Exit the Session...
Jan 20 09:48:02 np0005588919 systemd[262580]: Stopped target Main User Target.
Jan 20 09:48:02 np0005588919 systemd[262580]: Stopped target Basic System.
Jan 20 09:48:02 np0005588919 systemd[262580]: Stopped target Paths.
Jan 20 09:48:02 np0005588919 systemd[262580]: Stopped target Sockets.
Jan 20 09:48:02 np0005588919 systemd[262580]: Stopped target Timers.
Jan 20 09:48:02 np0005588919 systemd[262580]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:48:02 np0005588919 systemd[262580]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 09:48:02 np0005588919 systemd[262580]: Closed D-Bus User Message Bus Socket.
Jan 20 09:48:02 np0005588919 systemd[262580]: Stopped Create User's Volatile Files and Directories.
Jan 20 09:48:02 np0005588919 systemd[262580]: Removed slice User Application Slice.
Jan 20 09:48:02 np0005588919 systemd[262580]: Reached target Shutdown.
Jan 20 09:48:02 np0005588919 systemd[262580]: Finished Exit the Session.
Jan 20 09:48:02 np0005588919 systemd[262580]: Reached target Exit the Session.
Jan 20 09:48:02 np0005588919 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 09:48:02 np0005588919 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 09:48:02 np0005588919 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 09:48:02 np0005588919 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 09:48:02 np0005588919 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 09:48:02 np0005588919 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 09:48:02 np0005588919 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 09:48:02 np0005588919 podman[262881]: 2026-01-20 14:48:02.805802094 +0000 UTC m=+0.098545983 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:48:03 np0005588919 nova_compute[225855]: 2026-01-20 14:48:03.187 225859 DEBUG nova.compute.manager [req-d5714cd1-b349-4a0e-9b21-8c8b9ede36a6 req-c79947aa-008a-4104-a3c1-2be21c7910e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:03 np0005588919 nova_compute[225855]: 2026-01-20 14:48:03.187 225859 DEBUG oslo_concurrency.lockutils [req-d5714cd1-b349-4a0e-9b21-8c8b9ede36a6 req-c79947aa-008a-4104-a3c1-2be21c7910e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:03 np0005588919 nova_compute[225855]: 2026-01-20 14:48:03.188 225859 DEBUG oslo_concurrency.lockutils [req-d5714cd1-b349-4a0e-9b21-8c8b9ede36a6 req-c79947aa-008a-4104-a3c1-2be21c7910e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:03 np0005588919 nova_compute[225855]: 2026-01-20 14:48:03.188 225859 DEBUG oslo_concurrency.lockutils [req-d5714cd1-b349-4a0e-9b21-8c8b9ede36a6 req-c79947aa-008a-4104-a3c1-2be21c7910e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:03 np0005588919 nova_compute[225855]: 2026-01-20 14:48:03.188 225859 DEBUG nova.compute.manager [req-d5714cd1-b349-4a0e-9b21-8c8b9ede36a6 req-c79947aa-008a-4104-a3c1-2be21c7910e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] No waiting events found dispatching network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:03 np0005588919 nova_compute[225855]: 2026-01-20 14:48:03.188 225859 WARNING nova.compute.manager [req-d5714cd1-b349-4a0e-9b21-8c8b9ede36a6 req-c79947aa-008a-4104-a3c1-2be21c7910e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received unexpected event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:48:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:03.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:04.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:04 np0005588919 nova_compute[225855]: 2026-01-20 14:48:04.837 225859 INFO nova.compute.manager [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Rebuilding instance#033[00m
Jan 20 09:48:04 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:04Z|00351|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:48:04 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:04Z|00352|binding|INFO|Releasing lport 5527ab8d-a985-420b-9d5b-7e5d9baf7004 from this chassis (sb_readonly=0)
Jan 20 09:48:05 np0005588919 nova_compute[225855]: 2026-01-20 14:48:05.033 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:05 np0005588919 nova_compute[225855]: 2026-01-20 14:48:05.192 225859 DEBUG nova.objects.instance [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'trusted_certs' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:05 np0005588919 nova_compute[225855]: 2026-01-20 14:48:05.217 225859 DEBUG nova.compute.manager [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:05 np0005588919 nova_compute[225855]: 2026-01-20 14:48:05.276 225859 DEBUG nova.objects.instance [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_requests' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:05 np0005588919 nova_compute[225855]: 2026-01-20 14:48:05.304 225859 DEBUG nova.objects.instance [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:05 np0005588919 nova_compute[225855]: 2026-01-20 14:48:05.318 225859 DEBUG nova.objects.instance [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:05 np0005588919 nova_compute[225855]: 2026-01-20 14:48:05.331 225859 DEBUG nova.objects.instance [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'migration_context' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:05 np0005588919 nova_compute[225855]: 2026-01-20 14:48:05.342 225859 DEBUG nova.objects.instance [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:48:05 np0005588919 nova_compute[225855]: 2026-01-20 14:48:05.345 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:48:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:05.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:05 np0005588919 nova_compute[225855]: 2026-01-20 14:48:05.670 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588919 nova_compute[225855]: 2026-01-20 14:48:06.506 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:06.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:06 np0005588919 nova_compute[225855]: 2026-01-20 14:48:06.859 225859 DEBUG nova.compute.manager [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-unplugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:06 np0005588919 nova_compute[225855]: 2026-01-20 14:48:06.859 225859 DEBUG oslo_concurrency.lockutils [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:06 np0005588919 nova_compute[225855]: 2026-01-20 14:48:06.860 225859 DEBUG oslo_concurrency.lockutils [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:06 np0005588919 nova_compute[225855]: 2026-01-20 14:48:06.860 225859 DEBUG oslo_concurrency.lockutils [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:06 np0005588919 nova_compute[225855]: 2026-01-20 14:48:06.860 225859 DEBUG nova.compute.manager [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-unplugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:06 np0005588919 nova_compute[225855]: 2026-01-20 14:48:06.860 225859 WARNING nova.compute.manager [req-097c2d3f-04a2-477f-be4d-2ec10e58f733 req-0fac2271-8ace-4584-b943-f32d437f62b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received unexpected event network-vif-unplugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 20 09:48:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:07.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:08.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:08 np0005588919 nova_compute[225855]: 2026-01-20 14:48:08.686 225859 INFO nova.network.neutron [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating port efc8b363-e70d-42f6-9be8-99865e269ec9 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 09:48:09 np0005588919 nova_compute[225855]: 2026-01-20 14:48:09.001 225859 DEBUG nova.compute.manager [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:09 np0005588919 nova_compute[225855]: 2026-01-20 14:48:09.002 225859 DEBUG oslo_concurrency.lockutils [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:09 np0005588919 nova_compute[225855]: 2026-01-20 14:48:09.002 225859 DEBUG oslo_concurrency.lockutils [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:09 np0005588919 nova_compute[225855]: 2026-01-20 14:48:09.003 225859 DEBUG oslo_concurrency.lockutils [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:09 np0005588919 nova_compute[225855]: 2026-01-20 14:48:09.003 225859 DEBUG nova.compute.manager [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:09 np0005588919 nova_compute[225855]: 2026-01-20 14:48:09.003 225859 WARNING nova.compute.manager [req-e20e56d3-b184-482a-bf1c-4bef38a10f5b req-a76ed729-c047-4567-b166-ff20c00ca3d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received unexpected event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 09:48:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:48:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:09.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:48:09 np0005588919 nova_compute[225855]: 2026-01-20 14:48:09.849 225859 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:09 np0005588919 nova_compute[225855]: 2026-01-20 14:48:09.850 225859 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquired lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:09 np0005588919 nova_compute[225855]: 2026-01-20 14:48:09.850 225859 DEBUG nova.network.neutron [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:48:10 np0005588919 nova_compute[225855]: 2026-01-20 14:48:10.672 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:10.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:11 np0005588919 nova_compute[225855]: 2026-01-20 14:48:11.091 225859 DEBUG nova.compute.manager [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-changed-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:11 np0005588919 nova_compute[225855]: 2026-01-20 14:48:11.091 225859 DEBUG nova.compute.manager [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Refreshing instance network info cache due to event network-changed-efc8b363-e70d-42f6-9be8-99865e269ec9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:48:11 np0005588919 nova_compute[225855]: 2026-01-20 14:48:11.092 225859 DEBUG oslo_concurrency.lockutils [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:11 np0005588919 nova_compute[225855]: 2026-01-20 14:48:11.508 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:11.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:11 np0005588919 nova_compute[225855]: 2026-01-20 14:48:11.977 225859 DEBUG nova.network.neutron [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating instance_info_cache with network_info: [{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.012 225859 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Releasing lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.015 225859 DEBUG oslo_concurrency.lockutils [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.015 225859 DEBUG nova.network.neutron [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Refreshing network info cache for port efc8b363-e70d-42f6-9be8-99865e269ec9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.078 225859 DEBUG os_brick.utils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.080 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.092 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.093 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[be49b86a-def5-4b33-a828-844319f1c63e]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.094 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.102 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.102 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[3497dad6-62db-4442-bcf2-7aa93873de6a]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.104 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.112 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.112 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[9d21fe2f-fee5-429e-822f-b4d6c6b13d60]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.114 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1a4dfa-1409-4419-a336-b71a38e101d2]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.114 225859 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.140 225859 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.143 225859 DEBUG os_brick.initiator.connectors.lightos [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.143 225859 DEBUG os_brick.initiator.connectors.lightos [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.143 225859 DEBUG os_brick.initiator.connectors.lightos [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:48:12 np0005588919 nova_compute[225855]: 2026-01-20 14:48:12.144 225859 DEBUG os_brick.utils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] <== get_connector_properties: return (64ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:48:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:12.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:13 np0005588919 podman[262973]: 2026-01-20 14:48:13.020774774 +0000 UTC m=+0.062404953 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:48:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:13.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.556 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.558 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.558 225859 INFO nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Creating image(s)#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.559 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.559 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Ensure instance console log exists: /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.559 225859 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.560 225859 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.560 225859 DEBUG oslo_concurrency.lockutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.562 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Start _get_guest_xml network_info=[{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-916311998-network", "vif_mac": "fa:16:3e:36:66:1d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': True, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-9219aafd-6c66-4f38-9927-85b54b4175ae', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '9219aafd-6c66-4f38-9927-85b54b4175ae', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attaching', 'instance': '9beb3ec3-721e-4919-9713-a92c82ad189b', 'attached_at': '2026-01-20T14:48:13.000000', 'detached_at': '', 'volume_id': '9219aafd-6c66-4f38-9927-85b54b4175ae', 'serial': '9219aafd-6c66-4f38-9927-85b54b4175ae'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': 'a2de2d41-d2d4-4195-90f3-7c5d6054f339', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.565 225859 WARNING nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.572 225859 DEBUG nova.virt.libvirt.host [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.573 225859 DEBUG nova.virt.libvirt.host [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.576 225859 DEBUG nova.virt.libvirt.host [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.576 225859 DEBUG nova.virt.libvirt.host [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.577 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.578 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.578 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.578 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.579 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.579 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.579 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.579 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.580 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.580 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.580 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.580 225859 DEBUG nova.virt.hardware [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.580 225859 DEBUG nova.objects.instance [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9beb3ec3-721e-4919-9713-a92c82ad189b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.625 225859 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.910 225859 DEBUG nova.network.neutron [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updated VIF entry in instance network info cache for port efc8b363-e70d-42f6-9be8-99865e269ec9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.911 225859 DEBUG nova.network.neutron [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating instance_info_cache with network_info: [{"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:13 np0005588919 nova_compute[225855]: 2026-01-20 14:48:13.943 225859 DEBUG oslo_concurrency.lockutils [req-ef09fc88-af27-40ef-880d-cee8107b5ee9 req-33809d9c-85ae-465c-82b7-2ba2a4cbd952 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-9beb3ec3-721e-4919-9713-a92c82ad189b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3187814475' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.059 225859 DEBUG oslo_concurrency.processutils [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.083 225859 DEBUG nova.virt.libvirt.vif [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:47:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-757916410',display_name='tempest-ServerActionsTestOtherA-server-757916410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-757916410',id=101,image_ref='',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOC6AOV9rhSIyyfBXYGEFvhdWE5GLRuNfs0sPvnXoLHIstQY2OpqwhI35UcFW1s96uqz0+j9sMbdcq/NuNcfgrlnXkEz6j2iO7WUECWdPrQW34JB1FTWAvtA4R6RDoaZA==',key_name='tempest-keypair-1611966828',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:47:42Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-042ihw9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:48:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=9beb3ec3-721e-4919-9713-a92c82ad189b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-916311998-network", "vif_mac": "fa:16:3e:36:66:1d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.084 225859 DEBUG nova.network.os_vif_util [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-916311998-network", "vif_mac": "fa:16:3e:36:66:1d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.084 225859 DEBUG nova.network.os_vif_util [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.087 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <uuid>9beb3ec3-721e-4919-9713-a92c82ad189b</uuid>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <name>instance-00000065</name>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <memory>196608</memory>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestOtherA-server-757916410</nova:name>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:48:13</nova:creationTime>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.micro">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <nova:memory>192</nova:memory>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <nova:user uuid="869086208e10436c9dc96c78bee9a85d">tempest-ServerActionsTestOtherA-967087071-project-member</nova:user>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <nova:project uuid="b683fcc0026242e28ba6d8fba638688e">tempest-ServerActionsTestOtherA-967087071</nova:project>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <nova:port uuid="efc8b363-e70d-42f6-9be8-99865e269ec9">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <entry name="serial">9beb3ec3-721e-4919-9713-a92c82ad189b</entry>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <entry name="uuid">9beb3ec3-721e-4919-9713-a92c82ad189b</entry>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/9beb3ec3-721e-4919-9713-a92c82ad189b_disk.config">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-9219aafd-6c66-4f38-9927-85b54b4175ae">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <serial>9219aafd-6c66-4f38-9927-85b54b4175ae</serial>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:36:66:1d"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <target dev="tapefc8b363-e7"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b/console.log" append="off"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:48:14 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:48:14 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:48:14 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:48:14 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.088 225859 DEBUG nova.virt.libvirt.vif [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:47:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-757916410',display_name='tempest-ServerActionsTestOtherA-server-757916410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-757916410',id=101,image_ref='',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOC6AOV9rhSIyyfBXYGEFvhdWE5GLRuNfs0sPvnXoLHIstQY2OpqwhI35UcFW1s96uqz0+j9sMbdcq/NuNcfgrlnXkEz6j2iO7WUECWdPrQW34JB1FTWAvtA4R6RDoaZA==',key_name='tempest-keypair-1611966828',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:47:42Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-042ihw9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:48:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=9beb3ec3-721e-4919-9713-a92c82ad189b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-916311998-network", "vif_mac": "fa:16:3e:36:66:1d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.089 225859 DEBUG nova.network.os_vif_util [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-916311998-network", "vif_mac": "fa:16:3e:36:66:1d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.089 225859 DEBUG nova.network.os_vif_util [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.089 225859 DEBUG os_vif [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.090 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.091 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.093 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.093 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefc8b363-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.094 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapefc8b363-e7, col_values=(('external_ids', {'iface-id': 'efc8b363-e70d-42f6-9be8-99865e269ec9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:66:1d', 'vm-uuid': '9beb3ec3-721e-4919-9713-a92c82ad189b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.095 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:14 np0005588919 NetworkManager[49104]: <info>  [1768920494.0964] manager: (tapefc8b363-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.098 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.102 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.103 225859 INFO os_vif [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7')#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.161 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.161 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.162 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] No VIF found with MAC fa:16:3e:36:66:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.162 225859 INFO nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Using config drive#033[00m
Jan 20 09:48:14 np0005588919 kernel: tapefc8b363-e7: entered promiscuous mode
Jan 20 09:48:14 np0005588919 NetworkManager[49104]: <info>  [1768920494.2506] manager: (tapefc8b363-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Jan 20 09:48:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:14Z|00353|binding|INFO|Claiming lport efc8b363-e70d-42f6-9be8-99865e269ec9 for this chassis.
Jan 20 09:48:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:14Z|00354|binding|INFO|efc8b363-e70d-42f6-9be8-99865e269ec9: Claiming fa:16:3e:36:66:1d 10.100.0.8
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.254 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.261 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:66:1d 10.100.0.8'], port_security=['fa:16:3e:36:66:1d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9beb3ec3-721e-4919-9713-a92c82ad189b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7ceb05b5-53ff-444a-b0ef-2ba8294d585b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.213'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=efc8b363-e70d-42f6-9be8-99865e269ec9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.264 140354 INFO neutron.agent.ovn.metadata.agent [-] Port efc8b363-e70d-42f6-9be8-99865e269ec9 in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 bound to our chassis#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.266 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:48:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:14Z|00355|binding|INFO|Setting lport efc8b363-e70d-42f6-9be8-99865e269ec9 ovn-installed in OVS
Jan 20 09:48:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:14Z|00356|binding|INFO|Setting lport efc8b363-e70d-42f6-9be8-99865e269ec9 up in Southbound
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.272 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.275 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.285 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef20bc6-0f59-4f68-9da9-6e7bfe7103ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:14 np0005588919 systemd-machined[194361]: New machine qemu-42-instance-00000065.
Jan 20 09:48:14 np0005588919 systemd-udevd[263067]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:48:14 np0005588919 systemd[1]: Started Virtual Machine qemu-42-instance-00000065.
Jan 20 09:48:14 np0005588919 NetworkManager[49104]: <info>  [1768920494.3070] device (tapefc8b363-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:48:14 np0005588919 NetworkManager[49104]: <info>  [1768920494.3078] device (tapefc8b363-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.324 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5d78ae86-e771-49c0-8883-2e5aae495642]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.329 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c01fadcf-3a79-4e42-8025-bba70cb6226b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.357 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5c26abcb-8f6b-4348-b983-f1d34cf0a41d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.377 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[61fae8ab-7d9b-4f23-a0e8-e5d1998c0040]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529090, 'reachable_time': 16021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263079, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.396 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ba799d20-2cc9-4cc0-933b-029e44808f3c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529100, 'tstamp': 529100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 263081, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529104, 'tstamp': 529104}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 263081, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.398 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.399 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.401 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.401 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.402 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:14.402 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.538481) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494538529, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 699, "num_deletes": 258, "total_data_size": 1110707, "memory_usage": 1129176, "flush_reason": "Manual Compaction"}
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494545839, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 732123, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 43107, "largest_seqno": 43800, "table_properties": {"data_size": 728727, "index_size": 1240, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8094, "raw_average_key_size": 19, "raw_value_size": 721713, "raw_average_value_size": 1702, "num_data_blocks": 55, "num_entries": 424, "num_filter_entries": 424, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920455, "oldest_key_time": 1768920455, "file_creation_time": 1768920494, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 7432 microseconds, and 2913 cpu microseconds.
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.545907) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 732123 bytes OK
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.545930) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.547913) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.547929) EVENT_LOG_v1 {"time_micros": 1768920494547924, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.547951) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 1106864, prev total WAL file size 1106864, number of live WAL files 2.
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.548510) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031323631' seq:72057594037927935, type:22 .. '6C6F676D0031353134' seq:0, type:0; will stop at (end)
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(714KB)], [81(9449KB)]
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494548538, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 10408922, "oldest_snapshot_seqno": -1}
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6814 keys, 10279653 bytes, temperature: kUnknown
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494635640, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 10279653, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10235100, "index_size": 26393, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17093, "raw_key_size": 175705, "raw_average_key_size": 25, "raw_value_size": 10114193, "raw_average_value_size": 1484, "num_data_blocks": 1049, "num_entries": 6814, "num_filter_entries": 6814, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768920494, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.635947) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 10279653 bytes
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.638556) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 119.3 rd, 117.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 9.2 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(28.3) write-amplify(14.0) OK, records in: 7343, records dropped: 529 output_compression: NoCompression
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.638579) EVENT_LOG_v1 {"time_micros": 1768920494638569, "job": 50, "event": "compaction_finished", "compaction_time_micros": 87220, "compaction_time_cpu_micros": 29583, "output_level": 6, "num_output_files": 1, "total_output_size": 10279653, "num_input_records": 7343, "num_output_records": 6814, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494639007, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920494641053, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.548399) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.641123) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.641127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.641130) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.641131) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:14.641133) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:14.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.710 225859 DEBUG nova.compute.manager [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.714 225859 DEBUG oslo_concurrency.lockutils [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.714 225859 DEBUG oslo_concurrency.lockutils [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.715 225859 DEBUG oslo_concurrency.lockutils [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.715 225859 DEBUG nova.compute.manager [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:14 np0005588919 nova_compute[225855]: 2026-01-20 14:48:14.715 225859 WARNING nova.compute.manager [req-52bf3fba-92ea-4c19-a011-4c91a4e991d1 req-bddb78a8-ee80-4baf-b8cf-d6ceb0db8bb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received unexpected event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.096 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920495.0961964, 9beb3ec3-721e-4919-9713-a92c82ad189b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.098 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.101 225859 DEBUG nova.compute.manager [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.104 225859 INFO nova.virt.libvirt.driver [-] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance running successfully.#033[00m
Jan 20 09:48:15 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.108 225859 DEBUG nova.virt.libvirt.guest [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.108 225859 DEBUG nova.virt.libvirt.driver [None req-4d25a047-bfcc-47e3-8e2e-07a0e595cea9 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.128 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.132 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.161 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.162 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920495.0972342, 9beb3ec3-721e-4919-9713-a92c82ad189b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.162 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] VM Started (Lifecycle Event)#033[00m
Jan 20 09:48:15 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:15Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:41:b2:cf 10.100.0.12
Jan 20 09:48:15 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:15Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:41:b2:cf 10.100.0.12
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.186 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.190 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.210 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.391 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.463 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:15.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.584 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:15.585 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:15.586 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:48:15 np0005588919 nova_compute[225855]: 2026-01-20 14:48:15.674 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:16.409 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:16.410 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:16.411 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:16.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.463 225859 DEBUG nova.compute.manager [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.463 225859 DEBUG oslo_concurrency.lockutils [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.464 225859 DEBUG oslo_concurrency.lockutils [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.464 225859 DEBUG oslo_concurrency.lockutils [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.464 225859 DEBUG nova.compute.manager [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.464 225859 WARNING nova.compute.manager [req-01de2e77-9b49-4448-b97a-9c6cba352038 req-93fc5194-f820-4253-a593-05bfd11c4a53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received unexpected event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:48:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:17.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:17 np0005588919 kernel: tapf4f25f14-bc (unregistering): left promiscuous mode
Jan 20 09:48:17 np0005588919 NetworkManager[49104]: <info>  [1768920497.6692] device (tapf4f25f14-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:48:17 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:17Z|00357|binding|INFO|Releasing lport f4f25f14-bc59-4322-86b2-b48f096472a5 from this chassis (sb_readonly=0)
Jan 20 09:48:17 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:17Z|00358|binding|INFO|Setting lport f4f25f14-bc59-4322-86b2-b48f096472a5 down in Southbound
Jan 20 09:48:17 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:17Z|00359|binding|INFO|Removing iface tapf4f25f14-bc ovn-installed in OVS
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.683 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.686 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:17.693 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:b2:cf 10.100.0.12'], port_security=['fa:16:3e:41:b2:cf 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd699cf6a-9c33-400b-8d0f-4d61b8b16916', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e045f96f-e14f-4cbd-a987-42fc8d4d3e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f4f25f14-bc59-4322-86b2-b48f096472a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:17.694 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f4f25f14-bc59-4322-86b2-b48f096472a5 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:48:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:17.696 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:48:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:17.697 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[58416d86-89ad-4367-a830-0747ec0a4064]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:17.698 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.702 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:17 np0005588919 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000067.scope: Deactivated successfully.
Jan 20 09:48:17 np0005588919 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000067.scope: Consumed 14.332s CPU time.
Jan 20 09:48:17 np0005588919 systemd-machined[194361]: Machine qemu-41-instance-00000067 terminated.
Jan 20 09:48:17 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[262861]: [NOTICE]   (262869) : haproxy version is 2.8.14-c23fe91
Jan 20 09:48:17 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[262861]: [NOTICE]   (262869) : path to executable is /usr/sbin/haproxy
Jan 20 09:48:17 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[262861]: [WARNING]  (262869) : Exiting Master process...
Jan 20 09:48:17 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[262861]: [WARNING]  (262869) : Exiting Master process...
Jan 20 09:48:17 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[262861]: [ALERT]    (262869) : Current worker (262871) exited with code 143 (Terminated)
Jan 20 09:48:17 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[262861]: [WARNING]  (262869) : All workers exited. Exiting... (0)
Jan 20 09:48:17 np0005588919 systemd[1]: libpod-f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322.scope: Deactivated successfully.
Jan 20 09:48:17 np0005588919 podman[263149]: 2026-01-20 14:48:17.849121383 +0000 UTC m=+0.048565342 container died f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:48:17 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322-userdata-shm.mount: Deactivated successfully.
Jan 20 09:48:17 np0005588919 systemd[1]: var-lib-containers-storage-overlay-abe49cb91db3dea07b646b62647ce33597562542cda25b9855d3f6435dcf0497-merged.mount: Deactivated successfully.
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.891 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.897 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:17 np0005588919 podman[263149]: 2026-01-20 14:48:17.89966452 +0000 UTC m=+0.099108479 container cleanup f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:48:17 np0005588919 systemd[1]: libpod-conmon-f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322.scope: Deactivated successfully.
Jan 20 09:48:17 np0005588919 podman[263190]: 2026-01-20 14:48:17.973805623 +0000 UTC m=+0.053525952 container remove f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 09:48:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:17.982 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8888409a-042f-453f-962e-38f354a9df27]: (4, ('Tue Jan 20 02:48:17 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322)\nf664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322\nTue Jan 20 02:48:17 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (f664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322)\nf664c1788de7c7e153bea50ec21bff6e40c26e5e54caf93495340481c02af322\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:17.985 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[29be0599-1bd5-4e2e-add3-087564f27654]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:17.986 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:17 np0005588919 nova_compute[225855]: 2026-01-20 14:48:17.987 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:17 np0005588919 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.003 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:18.007 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3941797a-32b6-458e-9037-45817e7152d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:18.020 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[016dc9b4-e59e-4774-b8eb-091ca7623898]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:18.022 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a44ba73b-01bb-45c3-8754-b69febab8ff8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:18.040 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[677a5b2f-730c-4e62-91d2-0878c91e7194]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551801, 'reachable_time': 20664, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263209, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:18 np0005588919 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:48:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:18.043 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:48:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:18.043 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[541fd44b-c3ff-49b6-8858-0a5dfc505d09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.405 225859 INFO nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.410 225859 INFO nova.virt.libvirt.driver [-] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance destroyed successfully.#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.415 225859 INFO nova.virt.libvirt.driver [-] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance destroyed successfully.#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.416 225859 DEBUG nova.virt.libvirt.vif [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:47:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-354445838',display_name='tempest-ServerActionsTestJSON-server-583331137',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-354445838',id=103,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:48:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-2qck85q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-p
roject-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:48:04Z,user_data=None,user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=d699cf6a-9c33-400b-8d0f-4d61b8b16916,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.416 225859 DEBUG nova.network.os_vif_util [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.417 225859 DEBUG nova.network.os_vif_util [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.417 225859 DEBUG os_vif [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.419 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.419 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4f25f14-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.420 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.421 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.423 225859 INFO os_vif [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc')#033[00m
Jan 20 09:48:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:18.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.792 225859 INFO nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Deleting instance files /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916_del#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.794 225859 INFO nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Deletion of /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916_del complete#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.994 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:48:18 np0005588919 nova_compute[225855]: 2026-01-20 14:48:18.994 225859 INFO nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Creating image(s)#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.014 225859 DEBUG nova.storage.rbd_utils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.034 225859 DEBUG nova.storage.rbd_utils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.057 225859 DEBUG nova.storage.rbd_utils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.061 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.143 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.144 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.145 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.145 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.174 225859 DEBUG nova.storage.rbd_utils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.178 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:19.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.743 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.565s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.806 225859 DEBUG nova.storage.rbd_utils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] resizing rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.924 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.925 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Ensure instance console log exists: /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.925 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.925 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.926 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.928 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Start _get_guest_xml network_info=[{"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.934 225859 WARNING nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.939 225859 DEBUG nova.compute.manager [req-da530277-c926-4503-ad7b-cc756476fb6a req-44f41c41-64c2-4895-90af-3ff5915019a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-vif-unplugged-f4f25f14-bc59-4322-86b2-b48f096472a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.939 225859 DEBUG oslo_concurrency.lockutils [req-da530277-c926-4503-ad7b-cc756476fb6a req-44f41c41-64c2-4895-90af-3ff5915019a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.939 225859 DEBUG oslo_concurrency.lockutils [req-da530277-c926-4503-ad7b-cc756476fb6a req-44f41c41-64c2-4895-90af-3ff5915019a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.940 225859 DEBUG oslo_concurrency.lockutils [req-da530277-c926-4503-ad7b-cc756476fb6a req-44f41c41-64c2-4895-90af-3ff5915019a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.940 225859 DEBUG nova.compute.manager [req-da530277-c926-4503-ad7b-cc756476fb6a req-44f41c41-64c2-4895-90af-3ff5915019a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] No waiting events found dispatching network-vif-unplugged-f4f25f14-bc59-4322-86b2-b48f096472a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.940 225859 WARNING nova.compute.manager [req-da530277-c926-4503-ad7b-cc756476fb6a req-44f41c41-64c2-4895-90af-3ff5915019a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received unexpected event network-vif-unplugged-f4f25f14-bc59-4322-86b2-b48f096472a5 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.945 225859 DEBUG nova.virt.libvirt.host [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.946 225859 DEBUG nova.virt.libvirt.host [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.948 225859 DEBUG nova.virt.libvirt.host [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.948 225859 DEBUG nova.virt.libvirt.host [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.950 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.950 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.950 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.950 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.951 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.951 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.951 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.951 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.952 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.952 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.952 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.953 225859 DEBUG nova.virt.hardware [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.953 225859 DEBUG nova.objects.instance [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'vcpu_model' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:19 np0005588919 nova_compute[225855]: 2026-01-20 14:48:19.981 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:48:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1269976554' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.399 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.423 225859 DEBUG nova.storage.rbd_utils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.426 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.677 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:20.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:48:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3194524310' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.925 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.928 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.930 225859 DEBUG nova.virt.libvirt.vif [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:47:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-354445838',display_name='tempest-ServerActionsTestJSON-server-583331137',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-354445838',id=103,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:48:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-2qck85q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:48:18Z,user_data=None,user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=d699cf6a-9c33-400b-8d0f-4d61b8b16916,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.930 225859 DEBUG nova.network.os_vif_util [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.931 225859 DEBUG nova.network.os_vif_util [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.936 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <uuid>d699cf6a-9c33-400b-8d0f-4d61b8b16916</uuid>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <name>instance-00000067</name>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestJSON-server-583331137</nova:name>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:48:19</nova:creationTime>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="26699514-f465-4b50-98b7-36f2cfc6a308"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <nova:port uuid="f4f25f14-bc59-4322-86b2-b48f096472a5">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <entry name="serial">d699cf6a-9c33-400b-8d0f-4d61b8b16916</entry>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <entry name="uuid">d699cf6a-9c33-400b-8d0f-4d61b8b16916</entry>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:41:b2:cf"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <target dev="tapf4f25f14-bc"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/console.log" append="off"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:48:20 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:48:20 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:48:20 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:48:20 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.938 225859 DEBUG nova.compute.manager [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Preparing to wait for external event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.938 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.938 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.939 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.940 225859 DEBUG nova.virt.libvirt.vif [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:47:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-354445838',display_name='tempest-ServerActionsTestJSON-server-583331137',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-354445838',id=103,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:48:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-2qck85q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:48:18Z,user_data=None,user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=d699cf6a-9c33-400b-8d0f-4d61b8b16916,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.940 225859 DEBUG nova.network.os_vif_util [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.941 225859 DEBUG nova.network.os_vif_util [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.941 225859 DEBUG os_vif [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.942 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.942 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.943 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.945 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.946 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4f25f14-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.946 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf4f25f14-bc, col_values=(('external_ids', {'iface-id': 'f4f25f14-bc59-4322-86b2-b48f096472a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:b2:cf', 'vm-uuid': 'd699cf6a-9c33-400b-8d0f-4d61b8b16916'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:20 np0005588919 NetworkManager[49104]: <info>  [1768920500.9490] manager: (tapf4f25f14-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.951 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.955 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:20 np0005588919 nova_compute[225855]: 2026-01-20 14:48:20.956 225859 INFO os_vif [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc')#033[00m
Jan 20 09:48:21 np0005588919 nova_compute[225855]: 2026-01-20 14:48:21.027 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:48:21 np0005588919 nova_compute[225855]: 2026-01-20 14:48:21.027 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:48:21 np0005588919 nova_compute[225855]: 2026-01-20 14:48:21.028 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No VIF found with MAC fa:16:3e:41:b2:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:48:21 np0005588919 nova_compute[225855]: 2026-01-20 14:48:21.028 225859 INFO nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Using config drive#033[00m
Jan 20 09:48:21 np0005588919 nova_compute[225855]: 2026-01-20 14:48:21.054 225859 DEBUG nova.storage.rbd_utils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:48:21 np0005588919 nova_compute[225855]: 2026-01-20 14:48:21.085 225859 DEBUG nova.objects.instance [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'ec2_ids' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:21 np0005588919 nova_compute[225855]: 2026-01-20 14:48:21.178 225859 DEBUG nova.objects.instance [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'keypairs' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:21.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.097 225859 INFO nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Creating config drive at /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.101 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpixw68mkq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.244 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpixw68mkq" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.273 225859 DEBUG nova.storage.rbd_utils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.279 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.475 225859 DEBUG oslo_concurrency.processutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config d699cf6a-9c33-400b-8d0f-4d61b8b16916_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.196s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.476 225859 INFO nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Deleting local config drive /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916/disk.config because it was imported into RBD.#033[00m
Jan 20 09:48:22 np0005588919 kernel: tapf4f25f14-bc: entered promiscuous mode
Jan 20 09:48:22 np0005588919 NetworkManager[49104]: <info>  [1768920502.5237] manager: (tapf4f25f14-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.527 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:22Z|00360|binding|INFO|Claiming lport f4f25f14-bc59-4322-86b2-b48f096472a5 for this chassis.
Jan 20 09:48:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:22Z|00361|binding|INFO|f4f25f14-bc59-4322-86b2-b48f096472a5: Claiming fa:16:3e:41:b2:cf 10.100.0.12
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.541 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:b2:cf 10.100.0.12'], port_security=['fa:16:3e:41:b2:cf 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd699cf6a-9c33-400b-8d0f-4d61b8b16916', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e045f96f-e14f-4cbd-a987-42fc8d4d3e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f4f25f14-bc59-4322-86b2-b48f096472a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.542 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f4f25f14-bc59-4322-86b2-b48f096472a5 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.544 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:48:22 np0005588919 systemd-udevd[263532]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:48:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:22Z|00362|binding|INFO|Setting lport f4f25f14-bc59-4322-86b2-b48f096472a5 ovn-installed in OVS
Jan 20 09:48:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:22Z|00363|binding|INFO|Setting lport f4f25f14-bc59-4322-86b2-b48f096472a5 up in Southbound
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.553 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.555 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb09e0d-5933-43b8-91b6-5a39745f175d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.557 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:48:22 np0005588919 systemd-machined[194361]: New machine qemu-43-instance-00000067.
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.563 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.563 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6dddbacd-3131-4f72-8495-831676b71902]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.564 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4bdb1c-fa3e-4b63-9dda-933903a0a56f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 NetworkManager[49104]: <info>  [1768920502.5654] device (tapf4f25f14-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:48:22 np0005588919 NetworkManager[49104]: <info>  [1768920502.5665] device (tapf4f25f14-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:48:22 np0005588919 systemd[1]: Started Virtual Machine qemu-43-instance-00000067.
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.577 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[3f890100-f78f-4aa9-af39-7303d10e9cd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.582 225859 DEBUG nova.compute.manager [req-4f314d23-7114-4566-a35c-a3935b22e8fa req-9627b464-fec3-428f-a19b-2d9a5f7c4690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.582 225859 DEBUG oslo_concurrency.lockutils [req-4f314d23-7114-4566-a35c-a3935b22e8fa req-9627b464-fec3-428f-a19b-2d9a5f7c4690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.583 225859 DEBUG oslo_concurrency.lockutils [req-4f314d23-7114-4566-a35c-a3935b22e8fa req-9627b464-fec3-428f-a19b-2d9a5f7c4690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.583 225859 DEBUG oslo_concurrency.lockutils [req-4f314d23-7114-4566-a35c-a3935b22e8fa req-9627b464-fec3-428f-a19b-2d9a5f7c4690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.583 225859 DEBUG nova.compute.manager [req-4f314d23-7114-4566-a35c-a3935b22e8fa req-9627b464-fec3-428f-a19b-2d9a5f7c4690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Processing event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.602 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6692b38e-d57a-4704-adef-73e535d33538]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.630 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d6aa10-8a59-40f5-b178-313d94ba924c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 systemd-udevd[263537]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:48:22 np0005588919 NetworkManager[49104]: <info>  [1768920502.6406] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/155)
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.637 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[467f107f-e409-4e9c-ba1f-c9802f1f7b96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.672 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[daff0557-be31-47f9-98bb-96e0b1466b37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.676 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[53a81891-3d4b-4390-aaa1-ae4db0c1d8c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 NetworkManager[49104]: <info>  [1768920502.6999] device (tap762e1859-40): carrier: link connected
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.704 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[60f72cb6-5d07-4f1c-a7b1-eef00f6b5477]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:22.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.725 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6bdbaf-a95b-43ae-90e8-3dff8a370131]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554162, 'reachable_time': 23605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263566, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.740 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[91a5f0b6-13d6-4910-a7bb-909b80a852c6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554162, 'tstamp': 554162}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 263567, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.760 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9c779c1f-2b5c-4518-8fe6-1305a6b1b5b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554162, 'reachable_time': 23605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 263568, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.786 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c7cece-ef0e-4660-98a5-6ba270f7456f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.846 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[45dc11e0-1844-49aa-ba68-7c171b528e8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.848 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.848 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.850 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:22 np0005588919 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.851 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:22 np0005588919 NetworkManager[49104]: <info>  [1768920502.8524] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.857 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.859 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:22 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:22Z|00364|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.860 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.862 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.863 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5e9426ba-1304-4f28-ad08-3761d5cf7497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.864 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:48:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:22.867 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:48:22 np0005588919 nova_compute[225855]: 2026-01-20 14:48:22.880 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.076 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for d699cf6a-9c33-400b-8d0f-4d61b8b16916 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.077 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920503.0762124, d699cf6a-9c33-400b-8d0f-4d61b8b16916 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.078 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] VM Started (Lifecycle Event)#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.081 225859 DEBUG nova.compute.manager [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.107 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.111 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.114 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.118 225859 INFO nova.virt.libvirt.driver [-] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance spawned successfully.#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.118 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.150 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.151 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.151 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.152 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.152 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.153 225859 DEBUG nova.virt.libvirt.driver [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.158 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.158 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920503.080373, d699cf6a-9c33-400b-8d0f-4d61b8b16916 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.160 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.192 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.196 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920503.0866337, d699cf6a-9c33-400b-8d0f-4d61b8b16916 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.196 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.227 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.231 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.264 225859 DEBUG nova.compute.manager [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.266 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 09:48:23 np0005588919 podman[263642]: 2026-01-20 14:48:23.276571476 +0000 UTC m=+0.043955902 container create ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:48:23 np0005588919 systemd[1]: Started libpod-conmon-ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8.scope.
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.330 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.335 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.336 225859 DEBUG nova.objects.instance [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 09:48:23 np0005588919 podman[263642]: 2026-01-20 14:48:23.253071113 +0000 UTC m=+0.020455559 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:48:23 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:48:23 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d891a9ccda42ab217a336e792c976565d831731c01eb0b300ac8c6c159413fa7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:48:23 np0005588919 podman[263642]: 2026-01-20 14:48:23.376364974 +0000 UTC m=+0.143749410 container init ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:48:23 np0005588919 podman[263642]: 2026-01-20 14:48:23.383292779 +0000 UTC m=+0.150677205 container start ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:48:23 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[263657]: [NOTICE]   (263661) : New worker (263663) forked
Jan 20 09:48:23 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[263657]: [NOTICE]   (263661) : Loading success.
Jan 20 09:48:23 np0005588919 nova_compute[225855]: 2026-01-20 14:48:23.428 225859 DEBUG oslo_concurrency.lockutils [None req-57a9ea68-ec8a-451a-b907-3ce30cfe2680 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:23.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:24.589 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:24.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.714 225859 DEBUG nova.compute.manager [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.715 225859 DEBUG oslo_concurrency.lockutils [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.715 225859 DEBUG oslo_concurrency.lockutils [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.716 225859 DEBUG oslo_concurrency.lockutils [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.716 225859 DEBUG nova.compute.manager [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] No waiting events found dispatching network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.716 225859 WARNING nova.compute.manager [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received unexpected event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.717 225859 DEBUG nova.compute.manager [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.717 225859 DEBUG oslo_concurrency.lockutils [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.717 225859 DEBUG oslo_concurrency.lockutils [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.718 225859 DEBUG oslo_concurrency.lockutils [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.718 225859 DEBUG nova.compute.manager [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] No waiting events found dispatching network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:24 np0005588919 nova_compute[225855]: 2026-01-20 14:48:24.718 225859 WARNING nova.compute.manager [req-21e407b4-2abf-4807-b912-78d5a33ce861 req-74399323-fab2-4c07-a5a8-57ca63e698d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received unexpected event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:48:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:25.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:25 np0005588919 nova_compute[225855]: 2026-01-20 14:48:25.679 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:25 np0005588919 nova_compute[225855]: 2026-01-20 14:48:25.948 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.545 225859 DEBUG oslo_concurrency.lockutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.545 225859 DEBUG oslo_concurrency.lockutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.546 225859 DEBUG oslo_concurrency.lockutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.546 225859 DEBUG oslo_concurrency.lockutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.546 225859 DEBUG oslo_concurrency.lockutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.547 225859 INFO nova.compute.manager [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Terminating instance#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.548 225859 DEBUG nova.compute.manager [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:48:26 np0005588919 kernel: tapf4f25f14-bc (unregistering): left promiscuous mode
Jan 20 09:48:26 np0005588919 NetworkManager[49104]: <info>  [1768920506.5909] device (tapf4f25f14-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.598 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:26Z|00365|binding|INFO|Releasing lport f4f25f14-bc59-4322-86b2-b48f096472a5 from this chassis (sb_readonly=0)
Jan 20 09:48:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:26Z|00366|binding|INFO|Setting lport f4f25f14-bc59-4322-86b2-b48f096472a5 down in Southbound
Jan 20 09:48:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:26Z|00367|binding|INFO|Removing iface tapf4f25f14-bc ovn-installed in OVS
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.601 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.604 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:b2:cf 10.100.0.12'], port_security=['fa:16:3e:41:b2:cf 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd699cf6a-9c33-400b-8d0f-4d61b8b16916', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'e045f96f-e14f-4cbd-a987-42fc8d4d3e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f4f25f14-bc59-4322-86b2-b48f096472a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.605 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f4f25f14-bc59-4322-86b2-b48f096472a5 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.607 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.609 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1e52f0-ce42-428a-90b4-5454cb311919]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.609 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.617 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:26 np0005588919 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000067.scope: Deactivated successfully.
Jan 20 09:48:26 np0005588919 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000067.scope: Consumed 4.055s CPU time.
Jan 20 09:48:26 np0005588919 systemd-machined[194361]: Machine qemu-43-instance-00000067 terminated.
Jan 20 09:48:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:26.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:26 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[263657]: [NOTICE]   (263661) : haproxy version is 2.8.14-c23fe91
Jan 20 09:48:26 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[263657]: [NOTICE]   (263661) : path to executable is /usr/sbin/haproxy
Jan 20 09:48:26 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[263657]: [WARNING]  (263661) : Exiting Master process...
Jan 20 09:48:26 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[263657]: [ALERT]    (263661) : Current worker (263663) exited with code 143 (Terminated)
Jan 20 09:48:26 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[263657]: [WARNING]  (263661) : All workers exited. Exiting... (0)
Jan 20 09:48:26 np0005588919 systemd[1]: libpod-ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8.scope: Deactivated successfully.
Jan 20 09:48:26 np0005588919 podman[263697]: 2026-01-20 14:48:26.729441185 +0000 UTC m=+0.040479364 container died ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:48:26 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8-userdata-shm.mount: Deactivated successfully.
Jan 20 09:48:26 np0005588919 systemd[1]: var-lib-containers-storage-overlay-d891a9ccda42ab217a336e792c976565d831731c01eb0b300ac8c6c159413fa7-merged.mount: Deactivated successfully.
Jan 20 09:48:26 np0005588919 podman[263697]: 2026-01-20 14:48:26.769524026 +0000 UTC m=+0.080562195 container cleanup ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.780 225859 INFO nova.virt.libvirt.driver [-] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Instance destroyed successfully.#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.781 225859 DEBUG nova.objects.instance [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid d699cf6a-9c33-400b-8d0f-4d61b8b16916 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:26 np0005588919 systemd[1]: libpod-conmon-ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8.scope: Deactivated successfully.
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.812 225859 DEBUG nova.virt.libvirt.vif [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:47:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-354445838',display_name='tempest-ServerActionsTestJSON-server-583331137',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-354445838',id=103,image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:48:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-2qck85q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='26699514-f465-4b50-98b7-36f2cfc6a308',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:48:23Z,user_data=None,user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=d699cf6a-9c33-400b-8d0f-4d61b8b16916,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.812 225859 DEBUG nova.network.os_vif_util [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "f4f25f14-bc59-4322-86b2-b48f096472a5", "address": "fa:16:3e:41:b2:cf", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4f25f14-bc", "ovs_interfaceid": "f4f25f14-bc59-4322-86b2-b48f096472a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.813 225859 DEBUG nova.network.os_vif_util [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.813 225859 DEBUG os_vif [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.816 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.816 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4f25f14-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.818 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.821 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.823 225859 INFO os_vif [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:b2:cf,bridge_name='br-int',has_traffic_filtering=True,id=f4f25f14-bc59-4322-86b2-b48f096472a5,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4f25f14-bc')#033[00m
Jan 20 09:48:26 np0005588919 podman[263737]: 2026-01-20 14:48:26.847305962 +0000 UTC m=+0.052385030 container remove ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.853 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c998381c-5e43-4113-aeb1-2803f6ed3d49]: (4, ('Tue Jan 20 02:48:26 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8)\nec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8\nTue Jan 20 02:48:26 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (ec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8)\nec3d4d4109b3c2649aa7a2cc822b599f8cdcf1d31e564f6ab678ff971e9f60b8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.855 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1d83cf61-6116-4209-89e3-5e35cff9f185]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.857 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.859 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:26 np0005588919 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.875 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:26 np0005588919 nova_compute[225855]: 2026-01-20 14:48:26.875 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.887 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[41f7cec2-df87-48ab-acbe-2fa62a3df980]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.900 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[184b8f32-d054-46d0-bfbe-7ecd126c56dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.901 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7693357f-b52c-4346-bf8b-431266a6988e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.920 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7b587cec-c283-4971-91dc-3cc50fec07a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554155, 'reachable_time': 23416, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263768, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.923 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:48:26 np0005588919 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:48:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:26.923 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[a01d017a-22bb-4d52-b89f-a95c3e9aeca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.104 225859 DEBUG nova.compute.manager [req-4cb968c4-f4bb-4a37-99b9-dca7fec25782 req-b4874409-7297-4269-95a6-eb91358681fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-vif-unplugged-f4f25f14-bc59-4322-86b2-b48f096472a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.104 225859 DEBUG oslo_concurrency.lockutils [req-4cb968c4-f4bb-4a37-99b9-dca7fec25782 req-b4874409-7297-4269-95a6-eb91358681fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.104 225859 DEBUG oslo_concurrency.lockutils [req-4cb968c4-f4bb-4a37-99b9-dca7fec25782 req-b4874409-7297-4269-95a6-eb91358681fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.105 225859 DEBUG oslo_concurrency.lockutils [req-4cb968c4-f4bb-4a37-99b9-dca7fec25782 req-b4874409-7297-4269-95a6-eb91358681fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.105 225859 DEBUG nova.compute.manager [req-4cb968c4-f4bb-4a37-99b9-dca7fec25782 req-b4874409-7297-4269-95a6-eb91358681fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] No waiting events found dispatching network-vif-unplugged-f4f25f14-bc59-4322-86b2-b48f096472a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.105 225859 DEBUG nova.compute.manager [req-4cb968c4-f4bb-4a37-99b9-dca7fec25782 req-b4874409-7297-4269-95a6-eb91358681fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-vif-unplugged-f4f25f14-bc59-4322-86b2-b48f096472a5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.175 225859 INFO nova.virt.libvirt.driver [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Deleting instance files /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916_del#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.175 225859 INFO nova.virt.libvirt.driver [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Deletion of /var/lib/nova/instances/d699cf6a-9c33-400b-8d0f-4d61b8b16916_del complete#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.286 225859 INFO nova.compute.manager [None req-447e4e19-2a81-4c40-b517-6a53dcbf479f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Get console output#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.287 225859 INFO nova.compute.manager [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Took 0.74 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.287 225859 DEBUG oslo.service.loopingcall [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.287 225859 DEBUG nova.compute.manager [-] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.288 225859 DEBUG nova.network.neutron [-] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:48:27 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.297 225859 INFO oslo.privsep.daemon [None req-447e4e19-2a81-4c40-b517-6a53dcbf479f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpilubip9q/privsep.sock']#033[00m
Jan 20 09:48:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:27Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:36:66:1d 10.100.0.8
Jan 20 09:48:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:27Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:36:66:1d 10.100.0.8
Jan 20 09:48:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:27.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.046 225859 INFO oslo.privsep.daemon [None req-447e4e19-2a81-4c40-b517-6a53dcbf479f 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.927 263775 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.931 263775 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.933 263775 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:27.933 263775 INFO oslo.privsep.daemon [-] privsep daemon running as pid 263775#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.137 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.201 225859 DEBUG nova.network.neutron [-] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.229 225859 INFO nova.compute.manager [-] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Took 0.94 seconds to deallocate network for instance.#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.303 225859 DEBUG oslo_concurrency.lockutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.303 225859 DEBUG oslo_concurrency.lockutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.313 225859 DEBUG nova.compute.manager [req-cf920533-f375-47fd-8e0a-7ef907abd58e req-3ec179e0-535d-4d00-a87f-8de22b99899f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-vif-deleted-f4f25f14-bc59-4322-86b2-b48f096472a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.406 225859 DEBUG oslo_concurrency.processutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.695 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:28.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:48:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/679145408' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.836 225859 DEBUG oslo_concurrency.processutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.845 225859 DEBUG nova.compute.provider_tree [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.875 225859 DEBUG nova.scheduler.client.report [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.898 225859 DEBUG oslo_concurrency.lockutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:28 np0005588919 nova_compute[225855]: 2026-01-20 14:48:28.949 225859 INFO nova.scheduler.client.report [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Deleted allocations for instance d699cf6a-9c33-400b-8d0f-4d61b8b16916#033[00m
Jan 20 09:48:29 np0005588919 nova_compute[225855]: 2026-01-20 14:48:29.043 225859 DEBUG oslo_concurrency.lockutils [None req-90f3966b-c5d6-464f-b832-87927f85c1a1 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.497s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:29 np0005588919 nova_compute[225855]: 2026-01-20 14:48:29.310 225859 DEBUG nova.compute.manager [req-685fdab1-6547-45ee-bcf6-d20d8c05769c req-d72213b2-47a5-41c8-b984-4440d79c2ea1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:29 np0005588919 nova_compute[225855]: 2026-01-20 14:48:29.311 225859 DEBUG oslo_concurrency.lockutils [req-685fdab1-6547-45ee-bcf6-d20d8c05769c req-d72213b2-47a5-41c8-b984-4440d79c2ea1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:29 np0005588919 nova_compute[225855]: 2026-01-20 14:48:29.311 225859 DEBUG oslo_concurrency.lockutils [req-685fdab1-6547-45ee-bcf6-d20d8c05769c req-d72213b2-47a5-41c8-b984-4440d79c2ea1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:29 np0005588919 nova_compute[225855]: 2026-01-20 14:48:29.311 225859 DEBUG oslo_concurrency.lockutils [req-685fdab1-6547-45ee-bcf6-d20d8c05769c req-d72213b2-47a5-41c8-b984-4440d79c2ea1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d699cf6a-9c33-400b-8d0f-4d61b8b16916-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:29 np0005588919 nova_compute[225855]: 2026-01-20 14:48:29.311 225859 DEBUG nova.compute.manager [req-685fdab1-6547-45ee-bcf6-d20d8c05769c req-d72213b2-47a5-41c8-b984-4440d79c2ea1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] No waiting events found dispatching network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:29 np0005588919 nova_compute[225855]: 2026-01-20 14:48:29.311 225859 WARNING nova.compute.manager [req-685fdab1-6547-45ee-bcf6-d20d8c05769c req-d72213b2-47a5-41c8-b984-4440d79c2ea1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Received unexpected event network-vif-plugged-f4f25f14-bc59-4322-86b2-b48f096472a5 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:48:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:29.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:48:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:48:30 np0005588919 nova_compute[225855]: 2026-01-20 14:48:30.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:30 np0005588919 nova_compute[225855]: 2026-01-20 14:48:30.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:30 np0005588919 nova_compute[225855]: 2026-01-20 14:48:30.682 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:30.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:48:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:48:30 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:48:31 np0005588919 nova_compute[225855]: 2026-01-20 14:48:31.354 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:31 np0005588919 nova_compute[225855]: 2026-01-20 14:48:31.354 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:48:31 np0005588919 nova_compute[225855]: 2026-01-20 14:48:31.355 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:48:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:31.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:31 np0005588919 nova_compute[225855]: 2026-01-20 14:48:31.659 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:31 np0005588919 nova_compute[225855]: 2026-01-20 14:48:31.659 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:31 np0005588919 nova_compute[225855]: 2026-01-20 14:48:31.660 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:48:31 np0005588919 nova_compute[225855]: 2026-01-20 14:48:31.660 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6586bc3e-3a94-4d22-8e8c-713a86a956fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:31 np0005588919 nova_compute[225855]: 2026-01-20 14:48:31.818 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:32.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:33 np0005588919 podman[263982]: 2026-01-20 14:48:33.036057608 +0000 UTC m=+0.076021437 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:48:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:33.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:34.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:34 np0005588919 nova_compute[225855]: 2026-01-20 14:48:34.827 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updating instance_info_cache with network_info: [{"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:34 np0005588919 nova_compute[225855]: 2026-01-20 14:48:34.854 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-6586bc3e-3a94-4d22-8e8c-713a86a956fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:48:34 np0005588919 nova_compute[225855]: 2026-01-20 14:48:34.855 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:48:34 np0005588919 nova_compute[225855]: 2026-01-20 14:48:34.855 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:34 np0005588919 nova_compute[225855]: 2026-01-20 14:48:34.856 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:34 np0005588919 nova_compute[225855]: 2026-01-20 14:48:34.856 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:48:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:35.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:35 np0005588919 nova_compute[225855]: 2026-01-20 14:48:35.684 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:35 np0005588919 nova_compute[225855]: 2026-01-20 14:48:35.881 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.372 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:48:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:36.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.820 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.852 225859 DEBUG nova.compute.manager [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.925 225859 DEBUG oslo_concurrency.lockutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.926 225859 DEBUG oslo_concurrency.lockutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.926 225859 DEBUG oslo_concurrency.lockutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.926 225859 DEBUG oslo_concurrency.lockutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.927 225859 DEBUG oslo_concurrency.lockutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.927 225859 INFO nova.compute.manager [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Terminating instance#033[00m
Jan 20 09:48:36 np0005588919 nova_compute[225855]: 2026-01-20 14:48:36.928 225859 DEBUG nova.compute.manager [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:48:37 np0005588919 kernel: tapefc8b363-e7 (unregistering): left promiscuous mode
Jan 20 09:48:37 np0005588919 NetworkManager[49104]: <info>  [1768920517.0215] device (tapefc8b363-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:48:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:37Z|00368|binding|INFO|Releasing lport efc8b363-e70d-42f6-9be8-99865e269ec9 from this chassis (sb_readonly=0)
Jan 20 09:48:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:37Z|00369|binding|INFO|Setting lport efc8b363-e70d-42f6-9be8-99865e269ec9 down in Southbound
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.032 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:37Z|00370|binding|INFO|Removing iface tapefc8b363-e7 ovn-installed in OVS
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.035 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.044 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:66:1d 10.100.0.8'], port_security=['fa:16:3e:36:66:1d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9beb3ec3-721e-4919-9713-a92c82ad189b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7ceb05b5-53ff-444a-b0ef-2ba8294d585b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.213', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=efc8b363-e70d-42f6-9be8-99865e269ec9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.046 140354 INFO neutron.agent.ovn.metadata.agent [-] Port efc8b363-e70d-42f6-9be8-99865e269ec9 in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 unbound from our chassis#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.048 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.049 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.066 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[08886395-767d-44cb-949f-c0da714aba05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:37 np0005588919 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000065.scope: Deactivated successfully.
Jan 20 09:48:37 np0005588919 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000065.scope: Consumed 13.781s CPU time.
Jan 20 09:48:37 np0005588919 systemd-machined[194361]: Machine qemu-42-instance-00000065 terminated.
Jan 20 09:48:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:48:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.096 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[fe03281a-0b82-479e-b1bf-4206dc14a1b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.100 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d85983c1-6229-446f-af39-5fbc92b9cb85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.125 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a6eaff-b802-44f5-89a9-5fc03a4d92ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.144 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9879718c-b53a-4063-b076-71aa27737dc4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 17, 'rx_bytes': 784, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 17, 'rx_bytes': 784, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529090, 'reachable_time': 16021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264072, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.149 225859 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.149 225859 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.161 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9d1a3948-823b-49d9-bc27-4ac163e47b0c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529100, 'tstamp': 529100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264077, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529104, 'tstamp': 529104}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264077, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.162 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.163 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.167 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.168 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.168 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.169 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:37.169 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.172 225859 INFO nova.virt.libvirt.driver [-] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Instance destroyed successfully.#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.173 225859 DEBUG nova.objects.instance [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'resources' on Instance uuid 9beb3ec3-721e-4919-9713-a92c82ad189b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.186 225859 DEBUG nova.objects.instance [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_requests' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.200 225859 DEBUG nova.virt.libvirt.vif [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:47:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-757916410',display_name='tempest-ServerActionsTestOtherA-server-757916410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-757916410',id=101,image_ref='',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOC6AOV9rhSIyyfBXYGEFvhdWE5GLRuNfs0sPvnXoLHIstQY2OpqwhI35UcFW1s96uqz0+j9sMbdcq/NuNcfgrlnXkEz6j2iO7WUECWdPrQW34JB1FTWAvtA4R6RDoaZA==',key_name='tempest-keypair-1611966828',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:48:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-042ihw9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:48:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=9beb3ec3-721e-4919-9713-a92c82ad189b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.201 225859 DEBUG nova.network.os_vif_util [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "efc8b363-e70d-42f6-9be8-99865e269ec9", "address": "fa:16:3e:36:66:1d", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefc8b363-e7", "ovs_interfaceid": "efc8b363-e70d-42f6-9be8-99865e269ec9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.201 225859 DEBUG nova.network.os_vif_util [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.201 225859 DEBUG os_vif [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.203 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.203 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefc8b363-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.204 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.207 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.207 225859 INFO nova.compute.claims [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.208 225859 DEBUG nova.objects.instance [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.208 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.210 225859 INFO os_vif [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:66:1d,bridge_name='br-int',has_traffic_filtering=True,id=efc8b363-e70d-42f6-9be8-99865e269ec9,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefc8b363-e7')#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.228 225859 DEBUG nova.objects.instance [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.289 225859 INFO nova.compute.resource_tracker [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating resource usage from migration 37b266a8-8f13-40bc-ab16-470d7fe422ef#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.290 225859 DEBUG nova.compute.resource_tracker [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Starting to track incoming migration 37b266a8-8f13-40bc-ab16-470d7fe422ef with flavor 30c26a27-d918-46d8-a512-4ef3b4ce5955 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.343 225859 DEBUG nova.compute.manager [req-597a8e63-0e57-4475-bd02-05abbca4aadd req-01335e5b-a54a-4f80-8eec-0476c9866a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-unplugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.343 225859 DEBUG oslo_concurrency.lockutils [req-597a8e63-0e57-4475-bd02-05abbca4aadd req-01335e5b-a54a-4f80-8eec-0476c9866a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.343 225859 DEBUG oslo_concurrency.lockutils [req-597a8e63-0e57-4475-bd02-05abbca4aadd req-01335e5b-a54a-4f80-8eec-0476c9866a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.343 225859 DEBUG oslo_concurrency.lockutils [req-597a8e63-0e57-4475-bd02-05abbca4aadd req-01335e5b-a54a-4f80-8eec-0476c9866a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.344 225859 DEBUG nova.compute.manager [req-597a8e63-0e57-4475-bd02-05abbca4aadd req-01335e5b-a54a-4f80-8eec-0476c9866a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-unplugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.344 225859 DEBUG nova.compute.manager [req-597a8e63-0e57-4475-bd02-05abbca4aadd req-01335e5b-a54a-4f80-8eec-0476c9866a60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-unplugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:48:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.371 225859 DEBUG nova.scheduler.client.report [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.373 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.394 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.396 225859 DEBUG nova.scheduler.client.report [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.396 225859 DEBUG nova.compute.provider_tree [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.431 225859 DEBUG nova.scheduler.client.report [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.454 225859 DEBUG nova.scheduler.client.report [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:48:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:37.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.608 225859 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.655 225859 INFO nova.virt.libvirt.driver [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Deleting instance files /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b_del#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.656 225859 INFO nova.virt.libvirt.driver [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Deletion of /var/lib/nova/instances/9beb3ec3-721e-4919-9713-a92c82ad189b_del complete#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.703 225859 INFO nova.compute.manager [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Took 0.77 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.704 225859 DEBUG oslo.service.loopingcall [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.704 225859 DEBUG nova.compute.manager [-] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:48:37 np0005588919 nova_compute[225855]: 2026-01-20 14:48:37.705 225859 DEBUG nova.network.neutron [-] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:48:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:48:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4259718524' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.048 225859 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.053 225859 DEBUG nova.compute.provider_tree [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.072 225859 DEBUG nova.scheduler.client.report [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.099 225859 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.950s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.100 225859 INFO nova.compute.manager [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Migrating#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.105 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.105 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.105 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.106 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:48:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3717095713' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.548 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.671 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000057 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.672 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000057 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.678 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.679 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:48:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:38.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.846 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.847 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4057MB free_disk=20.843402862548828GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.848 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.848 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.921 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Migration for instance 75736b87-b14e-45b7-b43b-5129cf7d3279 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.948 225859 INFO nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating resource usage from migration 37b266a8-8f13-40bc-ab16-470d7fe422ef#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.949 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Starting to track incoming migration 37b266a8-8f13-40bc-ab16-470d7fe422ef with flavor 30c26a27-d918-46d8-a512-4ef3b4ce5955 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.987 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 6586bc3e-3a94-4d22-8e8c-713a86a956fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.987 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:48:38 np0005588919 nova_compute[225855]: 2026-01-20 14:48:38.988 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 9beb3ec3-721e-4919-9713-a92c82ad189b actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.012 225859 WARNING nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 75736b87-b14e-45b7-b43b-5129cf7d3279 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.012 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.013 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1152MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.212 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.491 225859 DEBUG nova.compute.manager [req-45654040-a3ab-4549-b68a-4aca467f45c4 req-b0e17b8a-4715-4e4b-a593-f613fdd388d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.492 225859 DEBUG oslo_concurrency.lockutils [req-45654040-a3ab-4549-b68a-4aca467f45c4 req-b0e17b8a-4715-4e4b-a593-f613fdd388d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.492 225859 DEBUG oslo_concurrency.lockutils [req-45654040-a3ab-4549-b68a-4aca467f45c4 req-b0e17b8a-4715-4e4b-a593-f613fdd388d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.492 225859 DEBUG oslo_concurrency.lockutils [req-45654040-a3ab-4549-b68a-4aca467f45c4 req-b0e17b8a-4715-4e4b-a593-f613fdd388d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.493 225859 DEBUG nova.compute.manager [req-45654040-a3ab-4549-b68a-4aca467f45c4 req-b0e17b8a-4715-4e4b-a593-f613fdd388d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] No waiting events found dispatching network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.493 225859 WARNING nova.compute.manager [req-45654040-a3ab-4549-b68a-4aca467f45c4 req-b0e17b8a-4715-4e4b-a593-f613fdd388d3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received unexpected event network-vif-plugged-efc8b363-e70d-42f6-9be8-99865e269ec9 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:48:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:39.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:48:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/946020620' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.625 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.630 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.652 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.690 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.691 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.703 225859 DEBUG nova.network.neutron [-] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:39 np0005588919 nova_compute[225855]: 2026-01-20 14:48:39.749 225859 INFO nova.compute.manager [-] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Took 2.04 seconds to deallocate network for instance.#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.007 225859 DEBUG nova.compute.manager [req-1d5cab74-782c-4f3e-a367-363132bb2350 req-9498e3af-b963-476a-aac7-f1e4bb3eada9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Received event network-vif-deleted-efc8b363-e70d-42f6-9be8-99865e269ec9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.106 225859 INFO nova.compute.manager [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Took 0.36 seconds to detach 1 volumes for instance.#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.108 225859 DEBUG nova.compute.manager [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Deleting volume: 9219aafd-6c66-4f38-9927-85b54b4175ae _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.399 225859 DEBUG oslo_concurrency.lockutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.400 225859 DEBUG oslo_concurrency.lockutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.523 225859 DEBUG oslo_concurrency.processutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.657 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.658 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.658 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.689 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:40.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:48:40 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4024521406' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.945 225859 DEBUG oslo_concurrency.processutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.950 225859 DEBUG nova.compute.provider_tree [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:48:40 np0005588919 nova_compute[225855]: 2026-01-20 14:48:40.974 225859 DEBUG nova.scheduler.client.report [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:48:41 np0005588919 nova_compute[225855]: 2026-01-20 14:48:41.001 225859 DEBUG oslo_concurrency.lockutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:41 np0005588919 nova_compute[225855]: 2026-01-20 14:48:41.050 225859 INFO nova.scheduler.client.report [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Deleted allocations for instance 9beb3ec3-721e-4919-9713-a92c82ad189b#033[00m
Jan 20 09:48:41 np0005588919 nova_compute[225855]: 2026-01-20 14:48:41.119 225859 DEBUG oslo_concurrency.lockutils [None req-ef76f3be-b647-42f5-a3c1-090076f98780 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "9beb3ec3-721e-4919-9713-a92c82ad189b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:41 np0005588919 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 09:48:41 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 09:48:41 np0005588919 systemd-logind[783]: New session 60 of user nova.
Jan 20 09:48:41 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 09:48:41 np0005588919 systemd[1]: Starting User Manager for UID 42436...
Jan 20 09:48:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:41.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:41 np0005588919 systemd[264200]: Queued start job for default target Main User Target.
Jan 20 09:48:41 np0005588919 systemd[264200]: Created slice User Application Slice.
Jan 20 09:48:41 np0005588919 systemd[264200]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:48:41 np0005588919 systemd[264200]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 09:48:41 np0005588919 systemd[264200]: Reached target Paths.
Jan 20 09:48:41 np0005588919 systemd[264200]: Reached target Timers.
Jan 20 09:48:41 np0005588919 systemd[264200]: Starting D-Bus User Message Bus Socket...
Jan 20 09:48:41 np0005588919 systemd[264200]: Starting Create User's Volatile Files and Directories...
Jan 20 09:48:41 np0005588919 systemd[264200]: Listening on D-Bus User Message Bus Socket.
Jan 20 09:48:41 np0005588919 systemd[264200]: Reached target Sockets.
Jan 20 09:48:41 np0005588919 systemd[264200]: Finished Create User's Volatile Files and Directories.
Jan 20 09:48:41 np0005588919 systemd[264200]: Reached target Basic System.
Jan 20 09:48:41 np0005588919 systemd[264200]: Reached target Main User Target.
Jan 20 09:48:41 np0005588919 systemd[264200]: Startup finished in 168ms.
Jan 20 09:48:41 np0005588919 systemd[1]: Started User Manager for UID 42436.
Jan 20 09:48:41 np0005588919 systemd[1]: Started Session 60 of User nova.
Jan 20 09:48:41 np0005588919 systemd[1]: session-60.scope: Deactivated successfully.
Jan 20 09:48:41 np0005588919 systemd-logind[783]: Session 60 logged out. Waiting for processes to exit.
Jan 20 09:48:41 np0005588919 systemd-logind[783]: Removed session 60.
Jan 20 09:48:41 np0005588919 nova_compute[225855]: 2026-01-20 14:48:41.779 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920506.7775161, d699cf6a-9c33-400b-8d0f-4d61b8b16916 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:41 np0005588919 nova_compute[225855]: 2026-01-20 14:48:41.779 225859 INFO nova.compute.manager [-] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:48:41 np0005588919 nova_compute[225855]: 2026-01-20 14:48:41.824 225859 DEBUG nova.compute.manager [None req-2471905c-4f04-4a5c-9447-dfb915c4b9fc - - - - - -] [instance: d699cf6a-9c33-400b-8d0f-4d61b8b16916] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:41 np0005588919 systemd-logind[783]: New session 62 of user nova.
Jan 20 09:48:41 np0005588919 systemd[1]: Started Session 62 of User nova.
Jan 20 09:48:41 np0005588919 systemd[1]: session-62.scope: Deactivated successfully.
Jan 20 09:48:41 np0005588919 systemd-logind[783]: Session 62 logged out. Waiting for processes to exit.
Jan 20 09:48:41 np0005588919 systemd-logind[783]: Removed session 62.
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #85. Immutable memtables: 0.
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.159546) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 51] Flushing memtable with next log file: 85
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522159611, "job": 51, "event": "flush_started", "num_memtables": 1, "num_entries": 611, "num_deletes": 251, "total_data_size": 885578, "memory_usage": 897424, "flush_reason": "Manual Compaction"}
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 51] Level-0 flush table #86: started
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522165781, "cf_name": "default", "job": 51, "event": "table_file_creation", "file_number": 86, "file_size": 572733, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 43805, "largest_seqno": 44411, "table_properties": {"data_size": 569651, "index_size": 990, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7616, "raw_average_key_size": 19, "raw_value_size": 563367, "raw_average_value_size": 1444, "num_data_blocks": 43, "num_entries": 390, "num_filter_entries": 390, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920495, "oldest_key_time": 1768920495, "file_creation_time": 1768920522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 86, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 51] Flush lasted 6276 microseconds, and 2499 cpu microseconds.
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.165820) [db/flush_job.cc:967] [default] [JOB 51] Level-0 flush table #86: 572733 bytes OK
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.165839) [db/memtable_list.cc:519] [default] Level-0 commit table #86 started
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.167771) [db/memtable_list.cc:722] [default] Level-0 commit table #86: memtable #1 done
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.167792) EVENT_LOG_v1 {"time_micros": 1768920522167787, "job": 51, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.167811) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 51] Try to delete WAL files size 882106, prev total WAL file size 882106, number of live WAL files 2.
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000082.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.168321) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033353134' seq:72057594037927935, type:22 .. '7061786F730033373636' seq:0, type:0; will stop at (end)
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 52] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 51 Base level 0, inputs: [86(559KB)], [84(10038KB)]
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522168352, "job": 52, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [86], "files_L6": [84], "score": -1, "input_data_size": 10852386, "oldest_snapshot_seqno": -1}
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.205 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 52] Generated table #87: 6690 keys, 8960919 bytes, temperature: kUnknown
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522221176, "cf_name": "default", "job": 52, "event": "table_file_creation", "file_number": 87, "file_size": 8960919, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8918265, "index_size": 24814, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16773, "raw_key_size": 173883, "raw_average_key_size": 25, "raw_value_size": 8800557, "raw_average_value_size": 1315, "num_data_blocks": 976, "num_entries": 6690, "num_filter_entries": 6690, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768920522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 87, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.221402) [db/compaction/compaction_job.cc:1663] [default] [JOB 52] Compacted 1@0 + 1@6 files to L6 => 8960919 bytes
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.250721) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 205.1 rd, 169.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 9.8 +0.0 blob) out(8.5 +0.0 blob), read-write-amplify(34.6) write-amplify(15.6) OK, records in: 7204, records dropped: 514 output_compression: NoCompression
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.250764) EVENT_LOG_v1 {"time_micros": 1768920522250748, "job": 52, "event": "compaction_finished", "compaction_time_micros": 52902, "compaction_time_cpu_micros": 19660, "output_level": 6, "num_output_files": 1, "total_output_size": 8960919, "num_input_records": 7204, "num_output_records": 6690, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000086.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522251133, "job": 52, "event": "table_file_deletion", "file_number": 86}
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000084.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920522252710, "job": 52, "event": "table_file_deletion", "file_number": 84}
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.168259) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.252740) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.252744) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.252748) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.252750) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:48:42.252752) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:48:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.634 225859 DEBUG oslo_concurrency.lockutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.635 225859 DEBUG oslo_concurrency.lockutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.635 225859 DEBUG oslo_concurrency.lockutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.635 225859 DEBUG oslo_concurrency.lockutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.635 225859 DEBUG oslo_concurrency.lockutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.637 225859 INFO nova.compute.manager [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Terminating instance#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.638 225859 DEBUG nova.compute.manager [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:48:42 np0005588919 kernel: tapb93181ae-8a (unregistering): left promiscuous mode
Jan 20 09:48:42 np0005588919 NetworkManager[49104]: <info>  [1768920522.6982] device (tapb93181ae-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.712 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:42Z|00371|binding|INFO|Releasing lport b93181ae-8a01-468c-adfc-ec8894512d2e from this chassis (sb_readonly=0)
Jan 20 09:48:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:42Z|00372|binding|INFO|Setting lport b93181ae-8a01-468c-adfc-ec8894512d2e down in Southbound
Jan 20 09:48:42 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:42Z|00373|binding|INFO|Removing iface tapb93181ae-8a ovn-installed in OVS
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.715 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.720 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:63:44 10.100.0.5'], port_security=['fa:16:3e:cf:63:44 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7efaa6b8-d1bd-4954-83ec-adcdb8e392bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=b93181ae-8a01-468c-adfc-ec8894512d2e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.722 140354 INFO neutron.agent.ovn.metadata.agent [-] Port b93181ae-8a01-468c-adfc-ec8894512d2e in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 unbound from our chassis#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.726 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a19e9d1a-864f-41ee-bdea-188e65973ea5#033[00m
Jan 20 09:48:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:42.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.740 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.747 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7895391a-2856-4666-9ca9-8671651e37f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:42 np0005588919 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000064.scope: Deactivated successfully.
Jan 20 09:48:42 np0005588919 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000064.scope: Consumed 16.119s CPU time.
Jan 20 09:48:42 np0005588919 systemd-machined[194361]: Machine qemu-40-instance-00000064 terminated.
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.776 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa3f0aa-dc68-44c0-9be4-a0c19846e6cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.781 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6357f34f-283b-4e3a-b678-57c12f757e4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.809 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[35fbeb86-0504-447b-bb13-47dd36b80b0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.827 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1d1f1aff-c0bb-407a-86b0-b0b014d31e5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa19e9d1a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:53:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 19, 'rx_bytes': 784, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 19, 'rx_bytes': 784, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529090, 'reachable_time': 16021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264236, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.843 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5541d1-d1ca-4a29-8e26-03529b409b90]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529100, 'tstamp': 529100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264237, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa19e9d1a-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529104, 'tstamp': 529104}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264237, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.845 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.847 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.851 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.851 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19e9d1a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.852 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.852 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa19e9d1a-80, col_values=(('external_ids', {'iface-id': '5527ab8d-a985-420b-9d5b-7e5d9baf7004'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:42.852 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.875 225859 INFO nova.virt.libvirt.driver [-] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Instance destroyed successfully.#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.875 225859 DEBUG nova.objects.instance [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'resources' on Instance uuid 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.897 225859 DEBUG nova.virt.libvirt.vif [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:47:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1146964335',display_name='tempest-ServerActionsTestOtherA-server-1146964335',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1146964335',id=100,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:47:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-pmljdz8l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:47:13Z,user_data=None,user_id='869086208e10436c9dc96c78bee9a85d',uuid=7efaa6b8-d1bd-4954-83ec-adcdb8e392bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.897 225859 DEBUG nova.network.os_vif_util [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "b93181ae-8a01-468c-adfc-ec8894512d2e", "address": "fa:16:3e:cf:63:44", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93181ae-8a", "ovs_interfaceid": "b93181ae-8a01-468c-adfc-ec8894512d2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.898 225859 DEBUG nova.network.os_vif_util [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:63:44,bridge_name='br-int',has_traffic_filtering=True,id=b93181ae-8a01-468c-adfc-ec8894512d2e,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb93181ae-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.898 225859 DEBUG os_vif [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:63:44,bridge_name='br-int',has_traffic_filtering=True,id=b93181ae-8a01-468c-adfc-ec8894512d2e,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb93181ae-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.900 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.900 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb93181ae-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.902 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.903 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.906 225859 INFO os_vif [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:63:44,bridge_name='br-int',has_traffic_filtering=True,id=b93181ae-8a01-468c-adfc-ec8894512d2e,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb93181ae-8a')#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.975 225859 DEBUG nova.compute.manager [req-cde4c0fd-4fe7-4fb1-9cf5-7ed3c872c46d req-fe7fa7a2-bef1-4986-b2e3-1901ee724735 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Received event network-vif-unplugged-b93181ae-8a01-468c-adfc-ec8894512d2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.975 225859 DEBUG oslo_concurrency.lockutils [req-cde4c0fd-4fe7-4fb1-9cf5-7ed3c872c46d req-fe7fa7a2-bef1-4986-b2e3-1901ee724735 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.975 225859 DEBUG oslo_concurrency.lockutils [req-cde4c0fd-4fe7-4fb1-9cf5-7ed3c872c46d req-fe7fa7a2-bef1-4986-b2e3-1901ee724735 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.976 225859 DEBUG oslo_concurrency.lockutils [req-cde4c0fd-4fe7-4fb1-9cf5-7ed3c872c46d req-fe7fa7a2-bef1-4986-b2e3-1901ee724735 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.976 225859 DEBUG nova.compute.manager [req-cde4c0fd-4fe7-4fb1-9cf5-7ed3c872c46d req-fe7fa7a2-bef1-4986-b2e3-1901ee724735 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] No waiting events found dispatching network-vif-unplugged-b93181ae-8a01-468c-adfc-ec8894512d2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:42 np0005588919 nova_compute[225855]: 2026-01-20 14:48:42.976 225859 DEBUG nova.compute.manager [req-cde4c0fd-4fe7-4fb1-9cf5-7ed3c872c46d req-fe7fa7a2-bef1-4986-b2e3-1901ee724735 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Received event network-vif-unplugged-b93181ae-8a01-468c-adfc-ec8894512d2e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:48:43 np0005588919 nova_compute[225855]: 2026-01-20 14:48:43.274 225859 INFO nova.virt.libvirt.driver [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Deleting instance files /var/lib/nova/instances/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_del#033[00m
Jan 20 09:48:43 np0005588919 nova_compute[225855]: 2026-01-20 14:48:43.275 225859 INFO nova.virt.libvirt.driver [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Deletion of /var/lib/nova/instances/7efaa6b8-d1bd-4954-83ec-adcdb8e392bf_del complete#033[00m
Jan 20 09:48:43 np0005588919 nova_compute[225855]: 2026-01-20 14:48:43.353 225859 INFO nova.compute.manager [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Took 0.71 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:48:43 np0005588919 nova_compute[225855]: 2026-01-20 14:48:43.353 225859 DEBUG oslo.service.loopingcall [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:48:43 np0005588919 nova_compute[225855]: 2026-01-20 14:48:43.353 225859 DEBUG nova.compute.manager [-] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:48:43 np0005588919 nova_compute[225855]: 2026-01-20 14:48:43.353 225859 DEBUG nova.network.neutron [-] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:48:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:43.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:44 np0005588919 podman[264270]: 2026-01-20 14:48:44.030973978 +0000 UTC m=+0.064931215 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:48:44 np0005588919 nova_compute[225855]: 2026-01-20 14:48:44.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:44.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.043 225859 DEBUG nova.compute.manager [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.043 225859 DEBUG oslo_concurrency.lockutils [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.043 225859 DEBUG oslo_concurrency.lockutils [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.043 225859 DEBUG oslo_concurrency.lockutils [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.044 225859 DEBUG nova.compute.manager [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.044 225859 WARNING nova.compute.manager [req-2e1deb02-3da8-4fe0-a4bc-1d10c3f9ec01 req-61da54de-835e-4ae7-8081-4457a5260c2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.069 225859 DEBUG nova.compute.manager [req-6197acc9-1c76-4bc6-8961-3bf11bcf4ba1 req-a13cf42a-2b94-42bc-a876-3aa115131942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Received event network-vif-plugged-b93181ae-8a01-468c-adfc-ec8894512d2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.070 225859 DEBUG oslo_concurrency.lockutils [req-6197acc9-1c76-4bc6-8961-3bf11bcf4ba1 req-a13cf42a-2b94-42bc-a876-3aa115131942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.070 225859 DEBUG oslo_concurrency.lockutils [req-6197acc9-1c76-4bc6-8961-3bf11bcf4ba1 req-a13cf42a-2b94-42bc-a876-3aa115131942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.070 225859 DEBUG oslo_concurrency.lockutils [req-6197acc9-1c76-4bc6-8961-3bf11bcf4ba1 req-a13cf42a-2b94-42bc-a876-3aa115131942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.070 225859 DEBUG nova.compute.manager [req-6197acc9-1c76-4bc6-8961-3bf11bcf4ba1 req-a13cf42a-2b94-42bc-a876-3aa115131942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] No waiting events found dispatching network-vif-plugged-b93181ae-8a01-468c-adfc-ec8894512d2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.070 225859 WARNING nova.compute.manager [req-6197acc9-1c76-4bc6-8961-3bf11bcf4ba1 req-a13cf42a-2b94-42bc-a876-3aa115131942 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Received unexpected event network-vif-plugged-b93181ae-8a01-468c-adfc-ec8894512d2e for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.465 225859 DEBUG nova.network.neutron [-] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.498 225859 INFO nova.compute.manager [-] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Took 2.14 seconds to deallocate network for instance.#033[00m
Jan 20 09:48:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:45.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.570 225859 DEBUG oslo_concurrency.lockutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.570 225859 DEBUG oslo_concurrency.lockutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.691 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.703 225859 DEBUG oslo_concurrency.processutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:45 np0005588919 nova_compute[225855]: 2026-01-20 14:48:45.929 225859 INFO nova.network.neutron [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 09:48:46 np0005588919 nova_compute[225855]: 2026-01-20 14:48:46.182 225859 DEBUG oslo_concurrency.processutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:46 np0005588919 nova_compute[225855]: 2026-01-20 14:48:46.189 225859 DEBUG nova.compute.provider_tree [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:48:46 np0005588919 nova_compute[225855]: 2026-01-20 14:48:46.213 225859 DEBUG nova.scheduler.client.report [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:48:46 np0005588919 nova_compute[225855]: 2026-01-20 14:48:46.242 225859 DEBUG oslo_concurrency.lockutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:46 np0005588919 nova_compute[225855]: 2026-01-20 14:48:46.298 225859 INFO nova.scheduler.client.report [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Deleted allocations for instance 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf#033[00m
Jan 20 09:48:46 np0005588919 nova_compute[225855]: 2026-01-20 14:48:46.476 225859 DEBUG oslo_concurrency.lockutils [None req-dd3dec03-bb58-43fc-bfa9-07351fe03dc3 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "7efaa6b8-d1bd-4954-83ec-adcdb8e392bf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.841s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:46.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.151 225859 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.152 225859 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.152 225859 DEBUG nova.network.neutron [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.291 225859 DEBUG nova.compute.manager [req-cda4b8e7-6845-4233-8241-8c88c481ce34 req-0d70c841-8e77-47f9-98b3-5d28834685d6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Received event network-vif-deleted-b93181ae-8a01-468c-adfc-ec8894512d2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.325 225859 DEBUG nova.compute.manager [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.326 225859 DEBUG oslo_concurrency.lockutils [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.326 225859 DEBUG oslo_concurrency.lockutils [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.327 225859 DEBUG oslo_concurrency.lockutils [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.327 225859 DEBUG nova.compute.manager [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.327 225859 WARNING nova.compute.manager [req-3a41699a-0367-4973-ac53-161ff8f9a117 req-226c3ec0-7a42-441c-bda7-ac430516d7d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:48:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.398 225859 DEBUG nova.compute.manager [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-changed-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.399 225859 DEBUG nova.compute.manager [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Refreshing instance network info cache due to event network-changed-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.399 225859 DEBUG oslo_concurrency.lockutils [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:48:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:47.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:47 np0005588919 nova_compute[225855]: 2026-01-20 14:48:47.901 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:48.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:48:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1043995877' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:48:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:48:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1043995877' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.003 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:48:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:49.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.646 225859 DEBUG oslo_concurrency.lockutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.647 225859 DEBUG oslo_concurrency.lockutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.647 225859 DEBUG oslo_concurrency.lockutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.648 225859 DEBUG oslo_concurrency.lockutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.648 225859 DEBUG oslo_concurrency.lockutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.649 225859 INFO nova.compute.manager [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Terminating instance#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.650 225859 DEBUG nova.compute.manager [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:48:49 np0005588919 kernel: tap2c289e6f-29 (unregistering): left promiscuous mode
Jan 20 09:48:49 np0005588919 NetworkManager[49104]: <info>  [1768920529.7127] device (tap2c289e6f-29): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:48:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:49Z|00374|binding|INFO|Releasing lport 2c289e6f-295e-44c3-948a-9a6901251890 from this chassis (sb_readonly=0)
Jan 20 09:48:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:49Z|00375|binding|INFO|Setting lport 2c289e6f-295e-44c3-948a-9a6901251890 down in Southbound
Jan 20 09:48:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:49Z|00376|binding|INFO|Removing iface tap2c289e6f-29 ovn-installed in OVS
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.721 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.748 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:49 np0005588919 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000057.scope: Deactivated successfully.
Jan 20 09:48:49 np0005588919 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000057.scope: Consumed 24.485s CPU time.
Jan 20 09:48:49 np0005588919 systemd-machined[194361]: Machine qemu-36-instance-00000057 terminated.
Jan 20 09:48:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:49.769 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:4c:e2 10.100.0.9'], port_security=['fa:16:3e:2f:4c:e2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6586bc3e-3a94-4d22-8e8c-713a86a956fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b683fcc0026242e28ba6d8fba638688e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ceb05b5-53ff-444a-b0ef-2ba8294d585b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=361f9a69-30a6-4be4-89ad-2a8f92877af2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=2c289e6f-295e-44c3-948a-9a6901251890) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:49.771 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 2c289e6f-295e-44c3-948a-9a6901251890 in datapath a19e9d1a-864f-41ee-bdea-188e65973ea5 unbound from our chassis#033[00m
Jan 20 09:48:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:49.772 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a19e9d1a-864f-41ee-bdea-188e65973ea5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:48:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:49.773 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8eecead7-49ee-4b1f-a71f-f64143abdd8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:49.774 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 namespace which is not needed anymore#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.875 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.880 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.892 225859 INFO nova.virt.libvirt.driver [-] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Instance destroyed successfully.#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.893 225859 DEBUG nova.objects.instance [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lazy-loading 'resources' on Instance uuid 6586bc3e-3a94-4d22-8e8c-713a86a956fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:49 np0005588919 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258963]: [NOTICE]   (258968) : haproxy version is 2.8.14-c23fe91
Jan 20 09:48:49 np0005588919 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258963]: [NOTICE]   (258968) : path to executable is /usr/sbin/haproxy
Jan 20 09:48:49 np0005588919 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258963]: [WARNING]  (258968) : Exiting Master process...
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.916 225859 DEBUG nova.virt.libvirt.vif [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:44:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1533521351',display_name='tempest-ServerActionsTestOtherA-server-1533521351',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1533521351',id=87,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDOC6AOV9rhSIyyfBXYGEFvhdWE5GLRuNfs0sPvnXoLHIstQY2OpqwhI35UcFW1s96uqz0+j9sMbdcq/NuNcfgrlnXkEz6j2iO7WUECWdPrQW34JB1FTWAvtA4R6RDoaZA==',key_name='tempest-keypair-1611966828',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:44:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b683fcc0026242e28ba6d8fba638688e',ramdisk_id='',reservation_id='r-47bmn591',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-967087071',owner_user_name='tempest-ServerActionsTestOtherA-967087071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:44:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='869086208e10436c9dc96c78bee9a85d',uuid=6586bc3e-3a94-4d22-8e8c-713a86a956fb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.917 225859 DEBUG nova.network.os_vif_util [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converting VIF {"id": "2c289e6f-295e-44c3-948a-9a6901251890", "address": "fa:16:3e:2f:4c:e2", "network": {"id": "a19e9d1a-864f-41ee-bdea-188e65973ea5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-916311998-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b683fcc0026242e28ba6d8fba638688e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c289e6f-29", "ovs_interfaceid": "2c289e6f-295e-44c3-948a-9a6901251890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:49 np0005588919 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258963]: [ALERT]    (258968) : Current worker (258970) exited with code 143 (Terminated)
Jan 20 09:48:49 np0005588919 neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5[258963]: [WARNING]  (258968) : All workers exited. Exiting... (0)
Jan 20 09:48:49 np0005588919 systemd[1]: libpod-c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8.scope: Deactivated successfully.
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.917 225859 DEBUG nova.network.os_vif_util [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:4c:e2,bridge_name='br-int',has_traffic_filtering=True,id=2c289e6f-295e-44c3-948a-9a6901251890,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c289e6f-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.918 225859 DEBUG os_vif [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:4c:e2,bridge_name='br-int',has_traffic_filtering=True,id=2c289e6f-295e-44c3-948a-9a6901251890,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c289e6f-29') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.920 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.920 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c289e6f-29, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.922 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.924 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:49 np0005588919 podman[264388]: 2026-01-20 14:48:49.925199748 +0000 UTC m=+0.051361620 container died c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:48:49 np0005588919 nova_compute[225855]: 2026-01-20 14:48:49.926 225859 INFO os_vif [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:4c:e2,bridge_name='br-int',has_traffic_filtering=True,id=2c289e6f-295e-44c3-948a-9a6901251890,network=Network(a19e9d1a-864f-41ee-bdea-188e65973ea5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c289e6f-29')#033[00m
Jan 20 09:48:49 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8-userdata-shm.mount: Deactivated successfully.
Jan 20 09:48:49 np0005588919 systemd[1]: var-lib-containers-storage-overlay-b6713c84d9e47876707c1459896eab206f8324b8157aad0c912932cf9613e3a8-merged.mount: Deactivated successfully.
Jan 20 09:48:49 np0005588919 podman[264388]: 2026-01-20 14:48:49.984194144 +0000 UTC m=+0.110356016 container cleanup c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:48:49 np0005588919 systemd[1]: libpod-conmon-c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8.scope: Deactivated successfully.
Jan 20 09:48:50 np0005588919 podman[264444]: 2026-01-20 14:48:50.047281625 +0000 UTC m=+0.041997797 container remove c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:48:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:50.052 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[be14569f-29cb-4ac0-8dc4-586302af2c5e]: (4, ('Tue Jan 20 02:48:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 (c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8)\nc79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8\nTue Jan 20 02:48:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 (c79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8)\nc79270ac4f85aee516fb775db6a3d75d0fa33264a9044c47814a555bf5216ad8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:50.054 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c8dcfb51-f2cb-42b2-9162-1807220be49a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:50.055 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19e9d1a-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.057 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:50 np0005588919 kernel: tapa19e9d1a-80: left promiscuous mode
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.072 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:50.074 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1f8d162a-54fa-4431-acb6-232fb80df2ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:50.090 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9c24a6bf-dd8d-4a2f-b734-5f3f5ded4394]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:50.091 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7c45b1-0979-4e2d-95d6-0f2b83ae1ae4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:50.105 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[253bfee5-e4de-47da-92ca-26ad48c91a56]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529083, 'reachable_time': 37181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264459, 'error': None, 'target': 'ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:50.107 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a19e9d1a-864f-41ee-bdea-188e65973ea5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:48:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:50.107 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[1140f12b-3011-46e4-ac39-17585bfbc955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:50 np0005588919 systemd[1]: run-netns-ovnmeta\x2da19e9d1a\x2d864f\x2d41ee\x2dbdea\x2d188e65973ea5.mount: Deactivated successfully.
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.149 225859 DEBUG nova.compute.manager [req-69bd00ab-0273-4c70-8bf8-2334e1571b10 req-b1223372-3acc-4082-8aea-32a9527d38f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received event network-vif-unplugged-2c289e6f-295e-44c3-948a-9a6901251890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.149 225859 DEBUG oslo_concurrency.lockutils [req-69bd00ab-0273-4c70-8bf8-2334e1571b10 req-b1223372-3acc-4082-8aea-32a9527d38f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.149 225859 DEBUG oslo_concurrency.lockutils [req-69bd00ab-0273-4c70-8bf8-2334e1571b10 req-b1223372-3acc-4082-8aea-32a9527d38f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.149 225859 DEBUG oslo_concurrency.lockutils [req-69bd00ab-0273-4c70-8bf8-2334e1571b10 req-b1223372-3acc-4082-8aea-32a9527d38f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.150 225859 DEBUG nova.compute.manager [req-69bd00ab-0273-4c70-8bf8-2334e1571b10 req-b1223372-3acc-4082-8aea-32a9527d38f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] No waiting events found dispatching network-vif-unplugged-2c289e6f-295e-44c3-948a-9a6901251890 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.150 225859 DEBUG nova.compute.manager [req-69bd00ab-0273-4c70-8bf8-2334e1571b10 req-b1223372-3acc-4082-8aea-32a9527d38f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received event network-vif-unplugged-2c289e6f-295e-44c3-948a-9a6901251890 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.187 225859 DEBUG nova.network.neutron [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.225 225859 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.228 225859 DEBUG oslo_concurrency.lockutils [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.228 225859 DEBUG nova.network.neutron [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Refreshing network info cache for port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.393 225859 INFO nova.virt.libvirt.driver [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Deleting instance files /var/lib/nova/instances/6586bc3e-3a94-4d22-8e8c-713a86a956fb_del#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.394 225859 INFO nova.virt.libvirt.driver [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Deletion of /var/lib/nova/instances/6586bc3e-3a94-4d22-8e8c-713a86a956fb_del complete#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.419 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.422 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.422 225859 INFO nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Creating image(s)#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.461 225859 DEBUG nova.storage.rbd_utils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] creating snapshot(nova-resize) on rbd image(75736b87-b14e-45b7-b43b-5129cf7d3279_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.497 225859 INFO nova.compute.manager [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Took 0.85 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.498 225859 DEBUG oslo.service.loopingcall [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.498 225859 DEBUG nova.compute.manager [-] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.498 225859 DEBUG nova.network.neutron [-] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:48:50 np0005588919 nova_compute[225855]: 2026-01-20 14:48:50.693 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:50.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e242 e242: 3 total, 3 up, 3 in
Jan 20 09:48:51 np0005588919 nova_compute[225855]: 2026-01-20 14:48:51.535 225859 DEBUG nova.network.neutron [-] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:51.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:48:51 np0005588919 nova_compute[225855]: 2026-01-20 14:48:51.576 225859 INFO nova.compute.manager [-] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Took 1.08 seconds to deallocate network for instance.#033[00m
Jan 20 09:48:51 np0005588919 nova_compute[225855]: 2026-01-20 14:48:51.657 225859 DEBUG nova.compute.manager [req-1385451d-2826-4c94-a2ef-20f16e39ab4d req-da878352-bd74-4018-9da6-51a9356ea8b6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received event network-vif-deleted-2c289e6f-295e-44c3-948a-9a6901251890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:51 np0005588919 nova_compute[225855]: 2026-01-20 14:48:51.710 225859 DEBUG oslo_concurrency.lockutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:51 np0005588919 nova_compute[225855]: 2026-01-20 14:48:51.711 225859 DEBUG oslo_concurrency.lockutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:51 np0005588919 nova_compute[225855]: 2026-01-20 14:48:51.835 225859 DEBUG oslo_concurrency.processutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:51 np0005588919 nova_compute[225855]: 2026-01-20 14:48:51.908 225859 DEBUG nova.objects.instance [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.053 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.054 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Ensure instance console log exists: /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.054 225859 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.055 225859 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.055 225859 DEBUG oslo_concurrency.lockutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.057 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Start _get_guest_xml network_info=[{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:22:f9:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.062 225859 WARNING nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.069 225859 DEBUG nova.virt.libvirt.host [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.069 225859 DEBUG nova.virt.libvirt.host [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.072 225859 DEBUG nova.virt.libvirt.host [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.073 225859 DEBUG nova.virt.libvirt.host [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.074 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.074 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.074 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.075 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.075 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.075 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.075 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.075 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.076 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.076 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.076 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.076 225859 DEBUG nova.virt.hardware [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.076 225859 DEBUG nova.objects.instance [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.109 225859 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.171 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920517.170466, 9beb3ec3-721e-4919-9713-a92c82ad189b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.172 225859 INFO nova.compute.manager [-] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:48:52 np0005588919 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 09:48:52 np0005588919 systemd[264200]: Activating special unit Exit the Session...
Jan 20 09:48:52 np0005588919 systemd[264200]: Stopped target Main User Target.
Jan 20 09:48:52 np0005588919 systemd[264200]: Stopped target Basic System.
Jan 20 09:48:52 np0005588919 systemd[264200]: Stopped target Paths.
Jan 20 09:48:52 np0005588919 systemd[264200]: Stopped target Sockets.
Jan 20 09:48:52 np0005588919 systemd[264200]: Stopped target Timers.
Jan 20 09:48:52 np0005588919 systemd[264200]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:48:52 np0005588919 systemd[264200]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 09:48:52 np0005588919 systemd[264200]: Closed D-Bus User Message Bus Socket.
Jan 20 09:48:52 np0005588919 systemd[264200]: Stopped Create User's Volatile Files and Directories.
Jan 20 09:48:52 np0005588919 systemd[264200]: Removed slice User Application Slice.
Jan 20 09:48:52 np0005588919 systemd[264200]: Reached target Shutdown.
Jan 20 09:48:52 np0005588919 systemd[264200]: Finished Exit the Session.
Jan 20 09:48:52 np0005588919 systemd[264200]: Reached target Exit the Session.
Jan 20 09:48:52 np0005588919 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 09:48:52 np0005588919 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.197 225859 DEBUG nova.compute.manager [None req-ba82be76-e88e-45e8-88fd-add7eef3e220 - - - - - -] [instance: 9beb3ec3-721e-4919-9713-a92c82ad189b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:52 np0005588919 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 09:48:52 np0005588919 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 09:48:52 np0005588919 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 09:48:52 np0005588919 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 09:48:52 np0005588919 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 09:48:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:48:52 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1726552496' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.308 225859 DEBUG oslo_concurrency.processutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.314 225859 DEBUG nova.compute.provider_tree [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.327 225859 DEBUG nova.scheduler.client.report [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.338 225859 DEBUG nova.compute.manager [req-bb9a0090-f25b-4de4-99f0-c9ce9ec45b1e req-3357901e-40a1-4923-a73a-8561f5e40d45 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received event network-vif-plugged-2c289e6f-295e-44c3-948a-9a6901251890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.339 225859 DEBUG oslo_concurrency.lockutils [req-bb9a0090-f25b-4de4-99f0-c9ce9ec45b1e req-3357901e-40a1-4923-a73a-8561f5e40d45 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.339 225859 DEBUG oslo_concurrency.lockutils [req-bb9a0090-f25b-4de4-99f0-c9ce9ec45b1e req-3357901e-40a1-4923-a73a-8561f5e40d45 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.339 225859 DEBUG oslo_concurrency.lockutils [req-bb9a0090-f25b-4de4-99f0-c9ce9ec45b1e req-3357901e-40a1-4923-a73a-8561f5e40d45 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.339 225859 DEBUG nova.compute.manager [req-bb9a0090-f25b-4de4-99f0-c9ce9ec45b1e req-3357901e-40a1-4923-a73a-8561f5e40d45 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] No waiting events found dispatching network-vif-plugged-2c289e6f-295e-44c3-948a-9a6901251890 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.340 225859 WARNING nova.compute.manager [req-bb9a0090-f25b-4de4-99f0-c9ce9ec45b1e req-3357901e-40a1-4923-a73a-8561f5e40d45 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Received unexpected event network-vif-plugged-2c289e6f-295e-44c3-948a-9a6901251890 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:48:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.353 225859 DEBUG oslo_concurrency.lockutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.427 225859 INFO nova.scheduler.client.report [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Deleted allocations for instance 6586bc3e-3a94-4d22-8e8c-713a86a956fb#033[00m
Jan 20 09:48:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:48:52 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1290154729' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.590 225859 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.626 225859 DEBUG oslo_concurrency.lockutils [None req-76dd6255-29d1-4c9b-ac8c-fcebbd3c6878 869086208e10436c9dc96c78bee9a85d b683fcc0026242e28ba6d8fba638688e - - default default] Lock "6586bc3e-3a94-4d22-8e8c-713a86a956fb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:52 np0005588919 nova_compute[225855]: 2026-01-20 14:48:52.631 225859 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:48:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:52.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:48:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/344938960' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.098 225859 DEBUG oslo_concurrency.processutils [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.101 225859 DEBUG nova.virt.libvirt.vif [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:48:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:22:f9:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.102 225859 DEBUG nova.network.os_vif_util [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:22:f9:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.103 225859 DEBUG nova.network.os_vif_util [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.106 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <uuid>75736b87-b14e-45b7-b43b-5129cf7d3279</uuid>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <name>instance-0000005e</name>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <memory>196608</memory>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestJSON-server-1202945337</nova:name>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:48:52</nova:creationTime>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.micro">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <nova:memory>192</nova:memory>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <nova:port uuid="d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <entry name="serial">75736b87-b14e-45b7-b43b-5129cf7d3279</entry>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <entry name="uuid">75736b87-b14e-45b7-b43b-5129cf7d3279</entry>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/75736b87-b14e-45b7-b43b-5129cf7d3279_disk">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/75736b87-b14e-45b7-b43b-5129cf7d3279_disk.config">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:22:f9:d2"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <target dev="tapd3a9a684-c9"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279/console.log" append="off"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:48:53 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:48:53 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:48:53 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:48:53 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.108 225859 DEBUG nova.virt.libvirt.vif [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:45:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:48:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:22:f9:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.109 225859 DEBUG nova.network.os_vif_util [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:22:f9:d2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.110 225859 DEBUG nova.network.os_vif_util [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.110 225859 DEBUG os_vif [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.111 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.112 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.112 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.115 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.115 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3a9a684-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.115 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3a9a684-c9, col_values=(('external_ids', {'iface-id': 'd3a9a684-c9a7-4abc-a085-9dcd17bfc2e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:f9:d2', 'vm-uuid': '75736b87-b14e-45b7-b43b-5129cf7d3279'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:53 np0005588919 NetworkManager[49104]: <info>  [1768920533.1178] manager: (tapd3a9a684-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.120 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.180 225859 DEBUG nova.network.neutron [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updated VIF entry in instance network info cache for port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.180 225859 DEBUG nova.network.neutron [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [{"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.182 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.183 225859 INFO os_vif [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.196 225859 DEBUG oslo_concurrency.lockutils [req-c7e8339e-e267-4fef-95d8-eea084a60576 req-e982c00d-bc4f-4669-a1f6-4f7f2fd55d8b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-75736b87-b14e-45b7-b43b-5129cf7d3279" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.248 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.249 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.249 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No VIF found with MAC fa:16:3e:22:f9:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.249 225859 INFO nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Using config drive#033[00m
Jan 20 09:48:53 np0005588919 kernel: tapd3a9a684-c9: entered promiscuous mode
Jan 20 09:48:53 np0005588919 NetworkManager[49104]: <info>  [1768920533.3252] manager: (tapd3a9a684-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.326 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:53Z|00377|binding|INFO|Claiming lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for this chassis.
Jan 20 09:48:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:53Z|00378|binding|INFO|d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6: Claiming fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.336 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:f9:d2 10.100.0.4'], port_security=['fa:16:3e:22:f9:d2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '75736b87-b14e-45b7-b43b-5129cf7d3279', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '12', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.338 140354 INFO neutron.agent.ovn.metadata.agent [-] Port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.339 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:48:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:53Z|00379|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 ovn-installed in OVS
Jan 20 09:48:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:53Z|00380|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 up in Southbound
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.345 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.347 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.350 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[85587314-e3e5-4fe6-b980-2f992728d917]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.350 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.352 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.352 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9820f3-a4e5-43aa-a5cd-18d186d3046b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.353 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1e8f0a-d786-4756-8f01-3eca05564d2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 systemd-udevd[264653]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:48:53 np0005588919 systemd-machined[194361]: New machine qemu-44-instance-0000005e.
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.365 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[f086f5a8-bca5-4ae6-af14-7f8ff78468f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 NetworkManager[49104]: <info>  [1768920533.3714] device (tapd3a9a684-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:48:53 np0005588919 NetworkManager[49104]: <info>  [1768920533.3720] device (tapd3a9a684-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:48:53 np0005588919 systemd[1]: Started Virtual Machine qemu-44-instance-0000005e.
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.390 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[29a55f88-66b2-470f-8a76-4d5760654fd2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.414 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b03a9a9d-d809-4bc2-a269-2bffe02325a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 NetworkManager[49104]: <info>  [1768920533.4218] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/159)
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.421 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fc6bc1bd-18cf-4c85-9540-1cb694842b78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.453 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b0b88f-5b25-4d02-b7bb-c6b7e5642631]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.456 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8df730-fb36-431e-a23b-4f5acc9602dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 NetworkManager[49104]: <info>  [1768920533.4812] device (tap762e1859-40): carrier: link connected
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.486 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[efe25fcc-4276-469e-ae13-688f61a23cc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.502 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4ed959-b2cb-4dff-a372-f1de6e8fb613]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557240, 'reachable_time': 19112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264685, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.518 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[99ce43c4-dc47-47ee-a100-96db6a694876]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 557240, 'tstamp': 557240}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264686, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.533 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[286dd4e1-bd5c-4f8b-ad20-ad07d0e0c8fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557240, 'reachable_time': 19112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 264687, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.563 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fdbb038a-7466-4d41-acb2-5c39186d686a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:53.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.628 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5c950255-6f08-4197-85ad-2fabeb08736c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.629 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.630 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.630 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:53 np0005588919 NetworkManager[49104]: <info>  [1768920533.6327] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Jan 20 09:48:53 np0005588919 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.631 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.635 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:48:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:48:53Z|00381|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.636 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.656 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.658 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.658 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.659 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5b2b9250-70ba-4b87-a0b3-77b763fc0fe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.660 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:48:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:48:53.660 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.868 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920533.8684597, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.870 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.871 225859 DEBUG nova.compute.manager [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.875 225859 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance running successfully.#033[00m
Jan 20 09:48:53 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.877 225859 DEBUG nova.virt.libvirt.guest [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.877 225859 DEBUG nova.virt.libvirt.driver [None req-4796ccce-2601-44d6-9482-2234d4c7191f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.901 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.910 225859 DEBUG nova.compute.manager [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.910 225859 DEBUG oslo_concurrency.lockutils [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.911 225859 DEBUG oslo_concurrency.lockutils [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.911 225859 DEBUG oslo_concurrency.lockutils [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.911 225859 DEBUG nova.compute.manager [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.911 225859 WARNING nova.compute.manager [req-2bff57d0-bcb0-4eb7-8fce-7079b335a84b req-b0d9ae89-0168-4e69-9785-4e65cbc3b1f8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.914 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.981 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.982 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920533.869505, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:53 np0005588919 nova_compute[225855]: 2026-01-20 14:48:53.982 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Started (Lifecycle Event)#033[00m
Jan 20 09:48:54 np0005588919 podman[264763]: 2026-01-20 14:48:54.028620042 +0000 UTC m=+0.049923350 container create ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 20 09:48:54 np0005588919 nova_compute[225855]: 2026-01-20 14:48:54.031 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:54 np0005588919 nova_compute[225855]: 2026-01-20 14:48:54.038 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:48:54 np0005588919 systemd[1]: Started libpod-conmon-ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03.scope.
Jan 20 09:48:54 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:48:54 np0005588919 podman[264763]: 2026-01-20 14:48:54.00198228 +0000 UTC m=+0.023285608 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:48:54 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55ffadcb61cfffe61ed679266c9eaf04220f431cb7218fbabad4d52c6fc4d512/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:48:54 np0005588919 podman[264763]: 2026-01-20 14:48:54.11637858 +0000 UTC m=+0.137681908 container init ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:48:54 np0005588919 podman[264763]: 2026-01-20 14:48:54.121487704 +0000 UTC m=+0.142791012 container start ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:48:54 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[264778]: [NOTICE]   (264782) : New worker (264784) forked
Jan 20 09:48:54 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[264778]: [NOTICE]   (264782) : Loading success.
Jan 20 09:48:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:54.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:55.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:55 np0005588919 nova_compute[225855]: 2026-01-20 14:48:55.696 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:56 np0005588919 nova_compute[225855]: 2026-01-20 14:48:56.305 225859 DEBUG nova.compute.manager [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:48:56 np0005588919 nova_compute[225855]: 2026-01-20 14:48:56.306 225859 DEBUG oslo_concurrency.lockutils [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:48:56 np0005588919 nova_compute[225855]: 2026-01-20 14:48:56.306 225859 DEBUG oslo_concurrency.lockutils [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:48:56 np0005588919 nova_compute[225855]: 2026-01-20 14:48:56.306 225859 DEBUG oslo_concurrency.lockutils [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:48:56 np0005588919 nova_compute[225855]: 2026-01-20 14:48:56.306 225859 DEBUG nova.compute.manager [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:48:56 np0005588919 nova_compute[225855]: 2026-01-20 14:48:56.306 225859 WARNING nova.compute.manager [req-1defb927-5b7a-454c-bb4b-437cc9b76850 req-fc38b063-5446-4aac-9ebb-fafe81e2b080 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:48:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:56.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:48:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:48:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:57.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:48:57 np0005588919 nova_compute[225855]: 2026-01-20 14:48:57.873 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920522.8722076, 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:48:57 np0005588919 nova_compute[225855]: 2026-01-20 14:48:57.874 225859 INFO nova.compute.manager [-] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:48:57 np0005588919 nova_compute[225855]: 2026-01-20 14:48:57.906 225859 DEBUG nova.compute.manager [None req-c036993b-9bf6-4549-91a7-0cbadab63652 - - - - - -] [instance: 7efaa6b8-d1bd-4954-83ec-adcdb8e392bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:48:58 np0005588919 nova_compute[225855]: 2026-01-20 14:48:58.118 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:48:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:48:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:48:58.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:48:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:48:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:48:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:48:59.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:00 np0005588919 nova_compute[225855]: 2026-01-20 14:49:00.729 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:49:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:00.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:49:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e243 e243: 3 total, 3 up, 3 in
Jan 20 09:49:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:01.237 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:49:01 np0005588919 nova_compute[225855]: 2026-01-20 14:49:01.237 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:01.238 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:49:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:01.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:01 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:01Z|00382|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:49:01 np0005588919 nova_compute[225855]: 2026-01-20 14:49:01.794 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:02.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:03 np0005588919 nova_compute[225855]: 2026-01-20 14:49:03.121 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:03.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:04 np0005588919 podman[264798]: 2026-01-20 14:49:04.127261559 +0000 UTC m=+0.153636258 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:49:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:04.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:04 np0005588919 nova_compute[225855]: 2026-01-20 14:49:04.889 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920529.8875377, 6586bc3e-3a94-4d22-8e8c-713a86a956fb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:49:04 np0005588919 nova_compute[225855]: 2026-01-20 14:49:04.889 225859 INFO nova.compute.manager [-] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:49:04 np0005588919 nova_compute[225855]: 2026-01-20 14:49:04.917 225859 DEBUG nova.compute.manager [None req-79affb54-6c9e-4259-9844-0b9906e85fc1 - - - - - -] [instance: 6586bc3e-3a94-4d22-8e8c-713a86a956fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:49:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:05.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:05 np0005588919 nova_compute[225855]: 2026-01-20 14:49:05.730 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:06.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:07 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:07Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:f9:d2 10.100.0.4
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.331 225859 DEBUG oslo_concurrency.lockutils [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.332 225859 DEBUG oslo_concurrency.lockutils [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.332 225859 DEBUG oslo_concurrency.lockutils [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.333 225859 DEBUG oslo_concurrency.lockutils [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.333 225859 DEBUG oslo_concurrency.lockutils [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.334 225859 INFO nova.compute.manager [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Terminating instance#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.335 225859 DEBUG nova.compute.manager [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:49:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:07 np0005588919 kernel: tapd3a9a684-c9 (unregistering): left promiscuous mode
Jan 20 09:49:07 np0005588919 NetworkManager[49104]: <info>  [1768920547.3914] device (tapd3a9a684-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:49:07 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:07Z|00383|binding|INFO|Releasing lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 from this chassis (sb_readonly=0)
Jan 20 09:49:07 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:07Z|00384|binding|INFO|Setting lport d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 down in Southbound
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.416 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:07 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:07Z|00385|binding|INFO|Removing iface tapd3a9a684-c9 ovn-installed in OVS
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.418 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.433 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:07 np0005588919 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Jan 20 09:49:07 np0005588919 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000005e.scope: Consumed 12.920s CPU time.
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.464 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:f9:d2 10.100.0.4'], port_security=['fa:16:3e:22:f9:d2 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '75736b87-b14e-45b7-b43b-5129cf7d3279', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '14', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:49:07 np0005588919 systemd-machined[194361]: Machine qemu-44-instance-0000005e terminated.
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.466 140354 INFO neutron.agent.ovn.metadata.agent [-] Port d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.468 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.469 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8c43fcaa-ad2c-424f-806d-7e9819bd6304]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.470 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.578 225859 INFO nova.virt.libvirt.driver [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Instance destroyed successfully.#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.579 225859 DEBUG nova.objects.instance [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid 75736b87-b14e-45b7-b43b-5129cf7d3279 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:07.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:07 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[264778]: [NOTICE]   (264782) : haproxy version is 2.8.14-c23fe91
Jan 20 09:49:07 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[264778]: [NOTICE]   (264782) : path to executable is /usr/sbin/haproxy
Jan 20 09:49:07 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[264778]: [WARNING]  (264782) : Exiting Master process...
Jan 20 09:49:07 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[264778]: [ALERT]    (264782) : Current worker (264784) exited with code 143 (Terminated)
Jan 20 09:49:07 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[264778]: [WARNING]  (264782) : All workers exited. Exiting... (0)
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.612 225859 DEBUG nova.virt.libvirt.vif [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:45:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1202945337',display_name='tempest-ServerActionsTestJSON-server-1202945337',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1202945337',id=94,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:48:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-luaqa362',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:49:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=75736b87-b14e-45b7-b43b-5129cf7d3279,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.612 225859 DEBUG nova.network.os_vif_util [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "address": "fa:16:3e:22:f9:d2", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a9a684-c9", "ovs_interfaceid": "d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.613 225859 DEBUG nova.network.os_vif_util [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.613 225859 DEBUG os_vif [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.615 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.615 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3a9a684-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.616 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:07 np0005588919 systemd[1]: libpod-ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03.scope: Deactivated successfully.
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.618 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:07 np0005588919 podman[264851]: 2026-01-20 14:49:07.62024937 +0000 UTC m=+0.058222515 container died ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.621 225859 INFO os_vif [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a9a684-c9')#033[00m
Jan 20 09:49:07 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03-userdata-shm.mount: Deactivated successfully.
Jan 20 09:49:07 np0005588919 systemd[1]: var-lib-containers-storage-overlay-55ffadcb61cfffe61ed679266c9eaf04220f431cb7218fbabad4d52c6fc4d512-merged.mount: Deactivated successfully.
Jan 20 09:49:07 np0005588919 podman[264851]: 2026-01-20 14:49:07.664136299 +0000 UTC m=+0.102109444 container cleanup ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:49:07 np0005588919 systemd[1]: libpod-conmon-ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03.scope: Deactivated successfully.
Jan 20 09:49:07 np0005588919 podman[264910]: 2026-01-20 14:49:07.72546094 +0000 UTC m=+0.042112350 container remove ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.730 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[646f28a9-84b1-446a-80e4-b71d9f055138]: (4, ('Tue Jan 20 02:49:07 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03)\nee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03\nTue Jan 20 02:49:07 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (ee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03)\nee1710db7c595c0c81475ce7f12613d2dc3b797c65c02c5878ca7132755baf03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.732 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[465a40b6-ef20-4fd4-828c-0e2170fe4144]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.733 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:07 np0005588919 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.735 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:07 np0005588919 nova_compute[225855]: 2026-01-20 14:49:07.749 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.754 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[422e665f-6d13-4fcc-86ad-c65c832cd366]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.766 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[920026fc-3902-4b41-bc0f-8193a0b1c677]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.767 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6e8402-8ba4-4acb-a4d0-093f87490a1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.786 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1f2aff16-742d-4e21-978b-ddbe977d1164]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557233, 'reachable_time': 30637, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264925, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.789 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:49:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:07.789 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[072e108f-485a-49e9-bf8f-dfe5512946e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:07 np0005588919 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.088 225859 DEBUG nova.compute.manager [req-1f5aa4b2-4c37-479c-87c2-ec511ad8fe27 req-2d662357-cf98-4ed9-a5ae-55130c883826 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.089 225859 DEBUG oslo_concurrency.lockutils [req-1f5aa4b2-4c37-479c-87c2-ec511ad8fe27 req-2d662357-cf98-4ed9-a5ae-55130c883826 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.089 225859 DEBUG oslo_concurrency.lockutils [req-1f5aa4b2-4c37-479c-87c2-ec511ad8fe27 req-2d662357-cf98-4ed9-a5ae-55130c883826 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.089 225859 DEBUG oslo_concurrency.lockutils [req-1f5aa4b2-4c37-479c-87c2-ec511ad8fe27 req-2d662357-cf98-4ed9-a5ae-55130c883826 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.089 225859 DEBUG nova.compute.manager [req-1f5aa4b2-4c37-479c-87c2-ec511ad8fe27 req-2d662357-cf98-4ed9-a5ae-55130c883826 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.089 225859 DEBUG nova.compute.manager [req-1f5aa4b2-4c37-479c-87c2-ec511ad8fe27 req-2d662357-cf98-4ed9-a5ae-55130c883826 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-unplugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.221 225859 INFO nova.virt.libvirt.driver [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Deleting instance files /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279_del#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.221 225859 INFO nova.virt.libvirt.driver [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Deletion of /var/lib/nova/instances/75736b87-b14e-45b7-b43b-5129cf7d3279_del complete#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.295 225859 INFO nova.compute.manager [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Took 0.96 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.295 225859 DEBUG oslo.service.loopingcall [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.295 225859 DEBUG nova.compute.manager [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:49:08 np0005588919 nova_compute[225855]: 2026-01-20 14:49:08.295 225859 DEBUG nova.network.neutron [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:49:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:08.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:09.240 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e244 e244: 3 total, 3 up, 3 in
Jan 20 09:49:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:09.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.291 225859 DEBUG nova.compute.manager [req-18149ca3-5934-47fc-b6b4-80be69e6df90 req-be87d01f-bfcb-446e-bb0e-ab37c82e1d03 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.292 225859 DEBUG oslo_concurrency.lockutils [req-18149ca3-5934-47fc-b6b4-80be69e6df90 req-be87d01f-bfcb-446e-bb0e-ab37c82e1d03 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.292 225859 DEBUG oslo_concurrency.lockutils [req-18149ca3-5934-47fc-b6b4-80be69e6df90 req-be87d01f-bfcb-446e-bb0e-ab37c82e1d03 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.292 225859 DEBUG oslo_concurrency.lockutils [req-18149ca3-5934-47fc-b6b4-80be69e6df90 req-be87d01f-bfcb-446e-bb0e-ab37c82e1d03 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.292 225859 DEBUG nova.compute.manager [req-18149ca3-5934-47fc-b6b4-80be69e6df90 req-be87d01f-bfcb-446e-bb0e-ab37c82e1d03 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] No waiting events found dispatching network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.293 225859 WARNING nova.compute.manager [req-18149ca3-5934-47fc-b6b4-80be69e6df90 req-be87d01f-bfcb-446e-bb0e-ab37c82e1d03 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received unexpected event network-vif-plugged-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.630 225859 DEBUG nova.network.neutron [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.653 225859 INFO nova.compute.manager [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Took 2.36 seconds to deallocate network for instance.#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.759 225859 DEBUG oslo_concurrency.lockutils [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.760 225859 DEBUG oslo_concurrency.lockutils [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.767 225859 DEBUG oslo_concurrency.lockutils [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:10.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.776 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.865 225859 INFO nova.scheduler.client.report [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Deleted allocations for instance 75736b87-b14e-45b7-b43b-5129cf7d3279#033[00m
Jan 20 09:49:10 np0005588919 nova_compute[225855]: 2026-01-20 14:49:10.946 225859 DEBUG oslo_concurrency.lockutils [None req-d8dff4d9-0f65-4fc2-a251-6a9f41e97b0f 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "75736b87-b14e-45b7-b43b-5129cf7d3279" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:11 np0005588919 nova_compute[225855]: 2026-01-20 14:49:11.051 225859 DEBUG nova.compute.manager [req-f8b41801-f187-4b08-b871-3094109b926d req-f98e7df1-1fb5-4220-89b9-1ff17541ab00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Received event network-vif-deleted-d3a9a684-c9a7-4abc-a085-9dcd17bfc2e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:11.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:12 np0005588919 nova_compute[225855]: 2026-01-20 14:49:12.618 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:12.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:13.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:14.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:15 np0005588919 podman[264981]: 2026-01-20 14:49:15.021916779 +0000 UTC m=+0.061875198 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 09:49:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:15.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:15 np0005588919 nova_compute[225855]: 2026-01-20 14:49:15.778 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:16 np0005588919 nova_compute[225855]: 2026-01-20 14:49:16.273 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:16.409 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:16.410 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:16.410 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:16.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:16 np0005588919 nova_compute[225855]: 2026-01-20 14:49:16.950 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:17.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:17 np0005588919 nova_compute[225855]: 2026-01-20 14:49:17.618 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:17 np0005588919 nova_compute[225855]: 2026-01-20 14:49:17.619 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:17 np0005588919 nova_compute[225855]: 2026-01-20 14:49:17.620 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:17 np0005588919 nova_compute[225855]: 2026-01-20 14:49:17.639 225859 DEBUG nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:49:17 np0005588919 nova_compute[225855]: 2026-01-20 14:49:17.798 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:17 np0005588919 nova_compute[225855]: 2026-01-20 14:49:17.799 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:17 np0005588919 nova_compute[225855]: 2026-01-20 14:49:17.805 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:49:17 np0005588919 nova_compute[225855]: 2026-01-20 14:49:17.805 225859 INFO nova.compute.claims [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:49:17 np0005588919 nova_compute[225855]: 2026-01-20 14:49:17.943 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:49:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3509042914' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.527 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.584s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.533 225859 DEBUG nova.compute.provider_tree [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.552 225859 DEBUG nova.scheduler.client.report [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.575 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.576 225859 DEBUG nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.627 225859 DEBUG nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.628 225859 DEBUG nova.network.neutron [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.656 225859 INFO nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.682 225859 DEBUG nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:49:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:18.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.914 225859 DEBUG nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.915 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.916 225859 INFO nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Creating image(s)#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.940 225859 DEBUG nova.storage.rbd_utils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.963 225859 DEBUG nova.storage.rbd_utils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.986 225859 DEBUG nova.storage.rbd_utils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:49:18 np0005588919 nova_compute[225855]: 2026-01-20 14:49:18.989 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.049 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.050 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.051 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.051 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.077 225859 DEBUG nova.storage.rbd_utils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.081 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.341 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.400 225859 DEBUG nova.storage.rbd_utils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] resizing rbd image fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.522 225859 DEBUG nova.policy [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e9278fdb9e645b7938f3edb20c4d3cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.530 225859 DEBUG nova.objects.instance [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'migration_context' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.552 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.553 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Ensure instance console log exists: /var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.554 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.554 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:19 np0005588919 nova_compute[225855]: 2026-01-20 14:49:19.555 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:19.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:20 np0005588919 nova_compute[225855]: 2026-01-20 14:49:20.780 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:20.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:21.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:22 np0005588919 nova_compute[225855]: 2026-01-20 14:49:22.578 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920547.5760756, 75736b87-b14e-45b7-b43b-5129cf7d3279 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:49:22 np0005588919 nova_compute[225855]: 2026-01-20 14:49:22.578 225859 INFO nova.compute.manager [-] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:49:22 np0005588919 nova_compute[225855]: 2026-01-20 14:49:22.600 225859 DEBUG nova.network.neutron [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Successfully created port: 6855cb4f-4178-4447-af36-126ade033206 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:49:22 np0005588919 nova_compute[225855]: 2026-01-20 14:49:22.605 225859 DEBUG nova.compute.manager [None req-d7baab77-16c4-4e7e-8eb8-44506a890a56 - - - - - -] [instance: 75736b87-b14e-45b7-b43b-5129cf7d3279] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:49:22 np0005588919 nova_compute[225855]: 2026-01-20 14:49:22.622 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:22.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:23.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:24.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:24 np0005588919 nova_compute[225855]: 2026-01-20 14:49:24.830 225859 DEBUG nova.network.neutron [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Successfully updated port: 6855cb4f-4178-4447-af36-126ade033206 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:49:24 np0005588919 nova_compute[225855]: 2026-01-20 14:49:24.850 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:24 np0005588919 nova_compute[225855]: 2026-01-20 14:49:24.850 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:24 np0005588919 nova_compute[225855]: 2026-01-20 14:49:24.850 225859 DEBUG nova.network.neutron [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:49:24 np0005588919 nova_compute[225855]: 2026-01-20 14:49:24.983 225859 DEBUG nova.compute.manager [req-5d4600d2-324d-4c3d-9ccb-031d08986f0d req-96153e43-6aba-458d-845c-a4772785b531 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-changed-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:24 np0005588919 nova_compute[225855]: 2026-01-20 14:49:24.984 225859 DEBUG nova.compute.manager [req-5d4600d2-324d-4c3d-9ccb-031d08986f0d req-96153e43-6aba-458d-845c-a4772785b531 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Refreshing instance network info cache due to event network-changed-6855cb4f-4178-4447-af36-126ade033206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:49:24 np0005588919 nova_compute[225855]: 2026-01-20 14:49:24.984 225859 DEBUG oslo_concurrency.lockutils [req-5d4600d2-324d-4c3d-9ccb-031d08986f0d req-96153e43-6aba-458d-845c-a4772785b531 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:25 np0005588919 nova_compute[225855]: 2026-01-20 14:49:25.161 225859 DEBUG nova.network.neutron [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:49:25 np0005588919 nova_compute[225855]: 2026-01-20 14:49:25.357 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:25.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:25 np0005588919 nova_compute[225855]: 2026-01-20 14:49:25.782 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:49:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/617682879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.746 225859 DEBUG nova.network.neutron [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updating instance_info_cache with network_info: [{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:26.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.800 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.801 225859 DEBUG nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance network_info: |[{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.802 225859 DEBUG oslo_concurrency.lockutils [req-5d4600d2-324d-4c3d-9ccb-031d08986f0d req-96153e43-6aba-458d-845c-a4772785b531 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.802 225859 DEBUG nova.network.neutron [req-5d4600d2-324d-4c3d-9ccb-031d08986f0d req-96153e43-6aba-458d-845c-a4772785b531 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Refreshing network info cache for port 6855cb4f-4178-4447-af36-126ade033206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.807 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Start _get_guest_xml network_info=[{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.813 225859 WARNING nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.820 225859 DEBUG nova.virt.libvirt.host [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.820 225859 DEBUG nova.virt.libvirt.host [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.824 225859 DEBUG nova.virt.libvirt.host [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.824 225859 DEBUG nova.virt.libvirt.host [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.825 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.825 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.825 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.826 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.826 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.826 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.826 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.826 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.826 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.827 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.827 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.827 225859 DEBUG nova.virt.hardware [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:49:26 np0005588919 nova_compute[225855]: 2026-01-20 14:49:26.829 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:49:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4113171858' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.288 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.311 225859 DEBUG nova.storage.rbd_utils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.315 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:27.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.625 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:49:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2640238559' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.775 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.776 225859 DEBUG nova.virt.libvirt.vif [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:49:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2012792656',display_name='tempest-ServerActionsTestJSON-server-2012792656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2012792656',id=105,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-c8be97ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=fdeb13eb-edb4-4bff-aeef-2671ba9d4618,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.777 225859 DEBUG nova.network.os_vif_util [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.778 225859 DEBUG nova.network.os_vif_util [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.779 225859 DEBUG nova.objects.instance [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.802 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <uuid>fdeb13eb-edb4-4bff-aeef-2671ba9d4618</uuid>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <name>instance-00000069</name>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestJSON-server-2012792656</nova:name>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:49:26</nova:creationTime>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <nova:port uuid="6855cb4f-4178-4447-af36-126ade033206">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <entry name="serial">fdeb13eb-edb4-4bff-aeef-2671ba9d4618</entry>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <entry name="uuid">fdeb13eb-edb4-4bff-aeef-2671ba9d4618</entry>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk.config">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:4f:3f:20"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <target dev="tap6855cb4f-41"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/console.log" append="off"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:49:27 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:49:27 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:49:27 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:49:27 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.803 225859 DEBUG nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Preparing to wait for external event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.804 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.804 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.804 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.805 225859 DEBUG nova.virt.libvirt.vif [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:49:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2012792656',display_name='tempest-ServerActionsTestJSON-server-2012792656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2012792656',id=105,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-c8be97ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=fdeb13eb-edb4-4bff-aeef-2671ba9d4618,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.805 225859 DEBUG nova.network.os_vif_util [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.806 225859 DEBUG nova.network.os_vif_util [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.807 225859 DEBUG os_vif [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.807 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.807 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.808 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.810 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.811 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6855cb4f-41, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.811 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6855cb4f-41, col_values=(('external_ids', {'iface-id': '6855cb4f-4178-4447-af36-126ade033206', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:3f:20', 'vm-uuid': 'fdeb13eb-edb4-4bff-aeef-2671ba9d4618'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:27 np0005588919 NetworkManager[49104]: <info>  [1768920567.8384] manager: (tap6855cb4f-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.837 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.840 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.842 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.843 225859 INFO os_vif [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41')#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.918 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.919 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.919 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No VIF found with MAC fa:16:3e:4f:3f:20, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.919 225859 INFO nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Using config drive#033[00m
Jan 20 09:49:27 np0005588919 nova_compute[225855]: 2026-01-20 14:49:27.945 225859 DEBUG nova.storage.rbd_utils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:49:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:49:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2854563677' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:49:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:28.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:28 np0005588919 nova_compute[225855]: 2026-01-20 14:49:28.864 225859 INFO nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Creating config drive at /var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/disk.config#033[00m
Jan 20 09:49:28 np0005588919 nova_compute[225855]: 2026-01-20 14:49:28.872 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdx1xjkoz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.009 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdx1xjkoz" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.044 225859 DEBUG nova.storage.rbd_utils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rbd image fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.049 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/disk.config fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.198 225859 DEBUG oslo_concurrency.processutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/disk.config fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.199 225859 INFO nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Deleting local config drive /var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/disk.config because it was imported into RBD.#033[00m
Jan 20 09:49:29 np0005588919 virtqemud[225396]: End of file while reading data: Input/output error
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.222 225859 DEBUG nova.network.neutron [req-5d4600d2-324d-4c3d-9ccb-031d08986f0d req-96153e43-6aba-458d-845c-a4772785b531 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updated VIF entry in instance network info cache for port 6855cb4f-4178-4447-af36-126ade033206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.223 225859 DEBUG nova.network.neutron [req-5d4600d2-324d-4c3d-9ccb-031d08986f0d req-96153e43-6aba-458d-845c-a4772785b531 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updating instance_info_cache with network_info: [{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:29 np0005588919 kernel: tap6855cb4f-41: entered promiscuous mode
Jan 20 09:49:29 np0005588919 NetworkManager[49104]: <info>  [1768920569.2537] manager: (tap6855cb4f-41): new Tun device (/org/freedesktop/NetworkManager/Devices/162)
Jan 20 09:49:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:29Z|00386|binding|INFO|Claiming lport 6855cb4f-4178-4447-af36-126ade033206 for this chassis.
Jan 20 09:49:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:29Z|00387|binding|INFO|6855cb4f-4178-4447-af36-126ade033206: Claiming fa:16:3e:4f:3f:20 10.100.0.12
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.255 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.258 225859 DEBUG oslo_concurrency.lockutils [req-5d4600d2-324d-4c3d-9ccb-031d08986f0d req-96153e43-6aba-458d-845c-a4772785b531 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.270 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:3f:20 10.100.0.12'], port_security=['fa:16:3e:4f:3f:20 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fdeb13eb-edb4-4bff-aeef-2671ba9d4618', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6855cb4f-4178-4447-af36-126ade033206) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:49:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:29Z|00388|binding|INFO|Setting lport 6855cb4f-4178-4447-af36-126ade033206 ovn-installed in OVS
Jan 20 09:49:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:29Z|00389|binding|INFO|Setting lport 6855cb4f-4178-4447-af36-126ade033206 up in Southbound
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.273 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6855cb4f-4178-4447-af36-126ade033206 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.272 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.276 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.275 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:29 np0005588919 systemd-machined[194361]: New machine qemu-45-instance-00000069.
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.288 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed7e455-c12d-43dd-8271-f1407e17098d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.289 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.290 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.290 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e35025cd-c1f4-4bc6-8a9b-49466b605191]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.291 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[75c7c797-4a56-4b4a-b6a7-295d3c563fc4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 systemd[1]: Started Virtual Machine qemu-45-instance-00000069.
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.302 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[9fedebcd-f6f4-449b-9271-bd9688a85772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 systemd-udevd[265337]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.313 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bbaf61d0-dab7-4abc-a08e-8cbdb95022b0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 NetworkManager[49104]: <info>  [1768920569.3217] device (tap6855cb4f-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:49:29 np0005588919 NetworkManager[49104]: <info>  [1768920569.3226] device (tap6855cb4f-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.344 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[7a461368-535a-43de-bf15-4adc6392e5a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.349 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5dbdc42b-9299-4e08-8e21-377bd05143a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 systemd-udevd[265344]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:49:29 np0005588919 NetworkManager[49104]: <info>  [1768920569.3500] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/163)
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.379 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ee5e8c-780a-43ce-b0c0-68f6cee6a405]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.382 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[aa81fde7-dfd5-487d-bb0f-bddb68e52246]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 NetworkManager[49104]: <info>  [1768920569.4049] device (tap762e1859-40): carrier: link connected
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.410 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[9f443492-5488-47b3-b744-82963b45966e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.426 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[53b190dc-5d7d-43e7-9edf-1ff50e923a8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560833, 'reachable_time': 26364, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265413, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.439 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a646cc12-b68a-488c-bdd9-f408f7e7d528]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560833, 'tstamp': 560833}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 265417, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.454 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2bbf9f93-562b-41ef-bd75-2b04fec56149]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560833, 'reachable_time': 26364, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 265419, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.482 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc773ba-94a3-4b04-b155-d33809b14cf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.551 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[55534203-d7d6-493a-b9c2-756a91ced44f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.554 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.555 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.557 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:29 np0005588919 NetworkManager[49104]: <info>  [1768920569.5604] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Jan 20 09:49:29 np0005588919 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.562 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.568 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:29Z|00390|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.569 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.572 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.575 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cd41b288-5364-49f3-b2f4-7b0d824b55c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.576 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:49:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:29.576 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.586 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:29.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.663 225859 DEBUG nova.compute.manager [req-6d57ca86-6392-482b-a026-819d87a8f42d req-03c56d4e-19af-4621-b239-d47eb44b5ab2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.664 225859 DEBUG oslo_concurrency.lockutils [req-6d57ca86-6392-482b-a026-819d87a8f42d req-03c56d4e-19af-4621-b239-d47eb44b5ab2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.664 225859 DEBUG oslo_concurrency.lockutils [req-6d57ca86-6392-482b-a026-819d87a8f42d req-03c56d4e-19af-4621-b239-d47eb44b5ab2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.664 225859 DEBUG oslo_concurrency.lockutils [req-6d57ca86-6392-482b-a026-819d87a8f42d req-03c56d4e-19af-4621-b239-d47eb44b5ab2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.665 225859 DEBUG nova.compute.manager [req-6d57ca86-6392-482b-a026-819d87a8f42d req-03c56d4e-19af-4621-b239-d47eb44b5ab2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Processing event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.923 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920569.9228275, fdeb13eb-edb4-4bff-aeef-2671ba9d4618 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:49:29 np0005588919 podman[265493]: 2026-01-20 14:49:29.923928858 +0000 UTC m=+0.048790299 container create 6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.924 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] VM Started (Lifecycle Event)#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.929 225859 DEBUG nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.934 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.937 225859 INFO nova.virt.libvirt.driver [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance spawned successfully.#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.937 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.965 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.968 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.969 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.969 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.970 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.970 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.970 225859 DEBUG nova.virt.libvirt.driver [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:49:29 np0005588919 systemd[1]: Started libpod-conmon-6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902.scope.
Jan 20 09:49:29 np0005588919 nova_compute[225855]: 2026-01-20 14:49:29.974 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:49:29 np0005588919 podman[265493]: 2026-01-20 14:49:29.897037649 +0000 UTC m=+0.021899110 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:49:30 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:49:30 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b9ea598c29d8f95feac6dbbb7186bfee63e25d53008923443ae675d2d1a3a93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.015 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.015 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920569.923164, fdeb13eb-edb4-4bff-aeef-2671ba9d4618 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.016 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:49:30 np0005588919 podman[265493]: 2026-01-20 14:49:30.024413185 +0000 UTC m=+0.149274646 container init 6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 09:49:30 np0005588919 podman[265493]: 2026-01-20 14:49:30.031085333 +0000 UTC m=+0.155946774 container start 6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.042 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.045 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920569.9343972, fdeb13eb-edb4-4bff-aeef-2671ba9d4618 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.046 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.052 225859 INFO nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Took 11.14 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.052 225859 DEBUG nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:49:30 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265510]: [NOTICE]   (265514) : New worker (265516) forked
Jan 20 09:49:30 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265510]: [NOTICE]   (265514) : Loading success.
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.065 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.068 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.109 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.148 225859 INFO nova.compute.manager [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Took 12.40 seconds to build instance.#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.168 225859 DEBUG oslo_concurrency.lockutils [None req-a8cdcb09-d473-426d-b0ef-0d4cc8c09fe2 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:30 np0005588919 nova_compute[225855]: 2026-01-20 14:49:30.783 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:30.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:31 np0005588919 nova_compute[225855]: 2026-01-20 14:49:31.372 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:31 np0005588919 nova_compute[225855]: 2026-01-20 14:49:31.372 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:31.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:31 np0005588919 nova_compute[225855]: 2026-01-20 14:49:31.908 225859 DEBUG nova.compute.manager [req-c6a8c477-5ed9-401f-b340-4c4b10d37be4 req-1432fb4c-2214-47c0-9208-b7ea48468fb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:31 np0005588919 nova_compute[225855]: 2026-01-20 14:49:31.908 225859 DEBUG oslo_concurrency.lockutils [req-c6a8c477-5ed9-401f-b340-4c4b10d37be4 req-1432fb4c-2214-47c0-9208-b7ea48468fb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:31 np0005588919 nova_compute[225855]: 2026-01-20 14:49:31.909 225859 DEBUG oslo_concurrency.lockutils [req-c6a8c477-5ed9-401f-b340-4c4b10d37be4 req-1432fb4c-2214-47c0-9208-b7ea48468fb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:31 np0005588919 nova_compute[225855]: 2026-01-20 14:49:31.909 225859 DEBUG oslo_concurrency.lockutils [req-c6a8c477-5ed9-401f-b340-4c4b10d37be4 req-1432fb4c-2214-47c0-9208-b7ea48468fb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:31 np0005588919 nova_compute[225855]: 2026-01-20 14:49:31.910 225859 DEBUG nova.compute.manager [req-c6a8c477-5ed9-401f-b340-4c4b10d37be4 req-1432fb4c-2214-47c0-9208-b7ea48468fb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:49:31 np0005588919 nova_compute[225855]: 2026-01-20 14:49:31.910 225859 WARNING nova.compute.manager [req-c6a8c477-5ed9-401f-b340-4c4b10d37be4 req-1432fb4c-2214-47c0-9208-b7ea48468fb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:49:32 np0005588919 nova_compute[225855]: 2026-01-20 14:49:32.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:32 np0005588919 nova_compute[225855]: 2026-01-20 14:49:32.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:49:32 np0005588919 nova_compute[225855]: 2026-01-20 14:49:32.404 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:49:32 np0005588919 nova_compute[225855]: 2026-01-20 14:49:32.405 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:32 np0005588919 nova_compute[225855]: 2026-01-20 14:49:32.405 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:49:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:32.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:32 np0005588919 nova_compute[225855]: 2026-01-20 14:49:32.839 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:33.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:34 np0005588919 nova_compute[225855]: 2026-01-20 14:49:34.262 225859 DEBUG nova.compute.manager [req-a3c9f776-91dd-4a21-bdfc-563f84cdfd1c req-c3611bb3-4e3a-4067-97b8-ff5a0e079494 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-changed-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:34 np0005588919 nova_compute[225855]: 2026-01-20 14:49:34.262 225859 DEBUG nova.compute.manager [req-a3c9f776-91dd-4a21-bdfc-563f84cdfd1c req-c3611bb3-4e3a-4067-97b8-ff5a0e079494 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Refreshing instance network info cache due to event network-changed-6855cb4f-4178-4447-af36-126ade033206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:49:34 np0005588919 nova_compute[225855]: 2026-01-20 14:49:34.262 225859 DEBUG oslo_concurrency.lockutils [req-a3c9f776-91dd-4a21-bdfc-563f84cdfd1c req-c3611bb3-4e3a-4067-97b8-ff5a0e079494 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:34 np0005588919 nova_compute[225855]: 2026-01-20 14:49:34.263 225859 DEBUG oslo_concurrency.lockutils [req-a3c9f776-91dd-4a21-bdfc-563f84cdfd1c req-c3611bb3-4e3a-4067-97b8-ff5a0e079494 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:34 np0005588919 nova_compute[225855]: 2026-01-20 14:49:34.263 225859 DEBUG nova.network.neutron [req-a3c9f776-91dd-4a21-bdfc-563f84cdfd1c req-c3611bb3-4e3a-4067-97b8-ff5a0e079494 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Refreshing network info cache for port 6855cb4f-4178-4447-af36-126ade033206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:49:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:34.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:35 np0005588919 podman[265527]: 2026-01-20 14:49:35.05305651 +0000 UTC m=+0.084525558 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:49:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:35.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:35 np0005588919 nova_compute[225855]: 2026-01-20 14:49:35.785 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:36 np0005588919 nova_compute[225855]: 2026-01-20 14:49:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:36.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:36 np0005588919 nova_compute[225855]: 2026-01-20 14:49:36.888 225859 DEBUG nova.network.neutron [req-a3c9f776-91dd-4a21-bdfc-563f84cdfd1c req-c3611bb3-4e3a-4067-97b8-ff5a0e079494 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updated VIF entry in instance network info cache for port 6855cb4f-4178-4447-af36-126ade033206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:49:36 np0005588919 nova_compute[225855]: 2026-01-20 14:49:36.889 225859 DEBUG nova.network.neutron [req-a3c9f776-91dd-4a21-bdfc-563f84cdfd1c req-c3611bb3-4e3a-4067-97b8-ff5a0e079494 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updating instance_info_cache with network_info: [{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:36 np0005588919 nova_compute[225855]: 2026-01-20 14:49:36.926 225859 DEBUG oslo_concurrency.lockutils [req-a3c9f776-91dd-4a21-bdfc-563f84cdfd1c req-c3611bb3-4e3a-4067-97b8-ff5a0e079494 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:37 np0005588919 nova_compute[225855]: 2026-01-20 14:49:37.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:37 np0005588919 nova_compute[225855]: 2026-01-20 14:49:37.419 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:37 np0005588919 nova_compute[225855]: 2026-01-20 14:49:37.419 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:37 np0005588919 nova_compute[225855]: 2026-01-20 14:49:37.420 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:37 np0005588919 nova_compute[225855]: 2026-01-20 14:49:37.420 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:49:37 np0005588919 nova_compute[225855]: 2026-01-20 14:49:37.420 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:37.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:37 np0005588919 nova_compute[225855]: 2026-01-20 14:49:37.842 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:49:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2025183021' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:49:37 np0005588919 nova_compute[225855]: 2026-01-20 14:49:37.952 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:37 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.039 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000069 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.040 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000069 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.205 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.207 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4292MB free_disk=20.935195922851562GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.207 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.208 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.362 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance fdeb13eb-edb4-4bff-aeef-2671ba9d4618 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.363 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.363 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.414 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:38.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:49:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1374144073' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.854 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.859 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.897 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.936 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:49:38 np0005588919 nova_compute[225855]: 2026-01-20 14:49:38.937 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:49:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:49:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.376 225859 DEBUG nova.compute.manager [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.575 225859 DEBUG oslo_concurrency.lockutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.576 225859 DEBUG oslo_concurrency.lockutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.616 225859 DEBUG nova.objects.instance [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'pci_requests' on Instance uuid 52477e64-7989-4aa2-88e1-31600bfae2ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:39.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.635 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.636 225859 INFO nova.compute.claims [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.636 225859 DEBUG nova.objects.instance [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'resources' on Instance uuid 52477e64-7989-4aa2-88e1-31600bfae2ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.691 225859 DEBUG nova.objects.instance [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'pci_devices' on Instance uuid 52477e64-7989-4aa2-88e1-31600bfae2ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.809 225859 INFO nova.compute.resource_tracker [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Updating resource usage from migration 4a873a64-1379-4cac-913e-e81f3f300ec7#033[00m
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.810 225859 DEBUG nova.compute.resource_tracker [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Starting to track incoming migration 4a873a64-1379-4cac-913e-e81f3f300ec7 with flavor 30c26a27-d918-46d8-a512-4ef3b4ce5955 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 09:49:39 np0005588919 nova_compute[225855]: 2026-01-20 14:49:39.922 225859 DEBUG oslo_concurrency.processutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:49:40 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/728206553' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:49:40 np0005588919 nova_compute[225855]: 2026-01-20 14:49:40.387 225859 DEBUG oslo_concurrency.processutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:40 np0005588919 nova_compute[225855]: 2026-01-20 14:49:40.396 225859 DEBUG nova.compute.provider_tree [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:49:40 np0005588919 nova_compute[225855]: 2026-01-20 14:49:40.418 225859 DEBUG nova.scheduler.client.report [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:49:40 np0005588919 nova_compute[225855]: 2026-01-20 14:49:40.451 225859 DEBUG oslo_concurrency.lockutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:40 np0005588919 nova_compute[225855]: 2026-01-20 14:49:40.452 225859 INFO nova.compute.manager [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Migrating#033[00m
Jan 20 09:49:40 np0005588919 nova_compute[225855]: 2026-01-20 14:49:40.787 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:40.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:41.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:41 np0005588919 nova_compute[225855]: 2026-01-20 14:49:41.933 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:41 np0005588919 nova_compute[225855]: 2026-01-20 14:49:41.933 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:41 np0005588919 nova_compute[225855]: 2026-01-20 14:49:41.934 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:49:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:42.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:42 np0005588919 nova_compute[225855]: 2026-01-20 14:49:42.847 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:43 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:43Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4f:3f:20 10.100.0.12
Jan 20 09:49:43 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:43Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4f:3f:20 10.100.0.12
Jan 20 09:49:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:43.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:43 np0005588919 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 09:49:43 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 09:49:43 np0005588919 systemd-logind[783]: New session 63 of user nova.
Jan 20 09:49:43 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 09:49:43 np0005588919 systemd[1]: Starting User Manager for UID 42436...
Jan 20 09:49:43 np0005588919 systemd[265882]: Queued start job for default target Main User Target.
Jan 20 09:49:43 np0005588919 systemd[265882]: Created slice User Application Slice.
Jan 20 09:49:43 np0005588919 systemd[265882]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:49:43 np0005588919 systemd[265882]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 09:49:43 np0005588919 systemd[265882]: Reached target Paths.
Jan 20 09:49:43 np0005588919 systemd[265882]: Reached target Timers.
Jan 20 09:49:43 np0005588919 systemd[265882]: Starting D-Bus User Message Bus Socket...
Jan 20 09:49:43 np0005588919 systemd[265882]: Starting Create User's Volatile Files and Directories...
Jan 20 09:49:43 np0005588919 systemd[265882]: Finished Create User's Volatile Files and Directories.
Jan 20 09:49:43 np0005588919 systemd[265882]: Listening on D-Bus User Message Bus Socket.
Jan 20 09:49:43 np0005588919 systemd[265882]: Reached target Sockets.
Jan 20 09:49:43 np0005588919 systemd[265882]: Reached target Basic System.
Jan 20 09:49:43 np0005588919 systemd[265882]: Reached target Main User Target.
Jan 20 09:49:43 np0005588919 systemd[265882]: Startup finished in 176ms.
Jan 20 09:49:43 np0005588919 systemd[1]: Started User Manager for UID 42436.
Jan 20 09:49:43 np0005588919 systemd[1]: Started Session 63 of User nova.
Jan 20 09:49:44 np0005588919 systemd[1]: session-63.scope: Deactivated successfully.
Jan 20 09:49:44 np0005588919 systemd-logind[783]: Session 63 logged out. Waiting for processes to exit.
Jan 20 09:49:44 np0005588919 systemd-logind[783]: Removed session 63.
Jan 20 09:49:44 np0005588919 systemd-logind[783]: New session 65 of user nova.
Jan 20 09:49:44 np0005588919 systemd[1]: Started Session 65 of User nova.
Jan 20 09:49:44 np0005588919 systemd[1]: session-65.scope: Deactivated successfully.
Jan 20 09:49:44 np0005588919 systemd-logind[783]: Session 65 logged out. Waiting for processes to exit.
Jan 20 09:49:44 np0005588919 systemd-logind[783]: Removed session 65.
Jan 20 09:49:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:44.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:49:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:45.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:45 np0005588919 nova_compute[225855]: 2026-01-20 14:49:45.789 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:46 np0005588919 podman[265955]: 2026-01-20 14:49:46.021609585 +0000 UTC m=+0.061468676 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:49:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:46.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:47.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:47.818 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:49:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:47.819 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:49:47 np0005588919 nova_compute[225855]: 2026-01-20 14:49:47.819 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:47 np0005588919 nova_compute[225855]: 2026-01-20 14:49:47.849 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:49:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:48.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:49:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:48.822 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:49 np0005588919 nova_compute[225855]: 2026-01-20 14:49:49.090 225859 DEBUG nova.compute.manager [req-ab956100-da64-4328-aa1c-51e8ba0af36c req-e0293cd6-6f0d-403c-84eb-63a433002bb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received event network-vif-unplugged-8286e975-4b57-4b5a-9018-82187a854a2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:49 np0005588919 nova_compute[225855]: 2026-01-20 14:49:49.090 225859 DEBUG oslo_concurrency.lockutils [req-ab956100-da64-4328-aa1c-51e8ba0af36c req-e0293cd6-6f0d-403c-84eb-63a433002bb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:49 np0005588919 nova_compute[225855]: 2026-01-20 14:49:49.090 225859 DEBUG oslo_concurrency.lockutils [req-ab956100-da64-4328-aa1c-51e8ba0af36c req-e0293cd6-6f0d-403c-84eb-63a433002bb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:49 np0005588919 nova_compute[225855]: 2026-01-20 14:49:49.091 225859 DEBUG oslo_concurrency.lockutils [req-ab956100-da64-4328-aa1c-51e8ba0af36c req-e0293cd6-6f0d-403c-84eb-63a433002bb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:49 np0005588919 nova_compute[225855]: 2026-01-20 14:49:49.091 225859 DEBUG nova.compute.manager [req-ab956100-da64-4328-aa1c-51e8ba0af36c req-e0293cd6-6f0d-403c-84eb-63a433002bb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] No waiting events found dispatching network-vif-unplugged-8286e975-4b57-4b5a-9018-82187a854a2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:49:49 np0005588919 nova_compute[225855]: 2026-01-20 14:49:49.091 225859 WARNING nova.compute.manager [req-ab956100-da64-4328-aa1c-51e8ba0af36c req-e0293cd6-6f0d-403c-84eb-63a433002bb7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received unexpected event network-vif-unplugged-8286e975-4b57-4b5a-9018-82187a854a2d for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 09:49:49 np0005588919 nova_compute[225855]: 2026-01-20 14:49:49.574 225859 INFO nova.network.neutron [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Updating port 8286e975-4b57-4b5a-9018-82187a854a2d with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 09:49:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:49.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:50.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:50 np0005588919 nova_compute[225855]: 2026-01-20 14:49:50.824 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:51 np0005588919 nova_compute[225855]: 2026-01-20 14:49:51.502 225859 DEBUG nova.compute.manager [req-3e6b9d77-46e1-46f6-8611-3612121dda9d req-a1d4fdc6-a417-458e-a594-1368b2e84d06 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received event network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:51 np0005588919 nova_compute[225855]: 2026-01-20 14:49:51.502 225859 DEBUG oslo_concurrency.lockutils [req-3e6b9d77-46e1-46f6-8611-3612121dda9d req-a1d4fdc6-a417-458e-a594-1368b2e84d06 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:51 np0005588919 nova_compute[225855]: 2026-01-20 14:49:51.503 225859 DEBUG oslo_concurrency.lockutils [req-3e6b9d77-46e1-46f6-8611-3612121dda9d req-a1d4fdc6-a417-458e-a594-1368b2e84d06 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:51 np0005588919 nova_compute[225855]: 2026-01-20 14:49:51.503 225859 DEBUG oslo_concurrency.lockutils [req-3e6b9d77-46e1-46f6-8611-3612121dda9d req-a1d4fdc6-a417-458e-a594-1368b2e84d06 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:51 np0005588919 nova_compute[225855]: 2026-01-20 14:49:51.503 225859 DEBUG nova.compute.manager [req-3e6b9d77-46e1-46f6-8611-3612121dda9d req-a1d4fdc6-a417-458e-a594-1368b2e84d06 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] No waiting events found dispatching network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:49:51 np0005588919 nova_compute[225855]: 2026-01-20 14:49:51.503 225859 WARNING nova.compute.manager [req-3e6b9d77-46e1-46f6-8611-3612121dda9d req-a1d4fdc6-a417-458e-a594-1368b2e84d06 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received unexpected event network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 09:49:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:51.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:51 np0005588919 nova_compute[225855]: 2026-01-20 14:49:51.840 225859 DEBUG oslo_concurrency.lockutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "refresh_cache-52477e64-7989-4aa2-88e1-31600bfae2ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:51 np0005588919 nova_compute[225855]: 2026-01-20 14:49:51.840 225859 DEBUG oslo_concurrency.lockutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquired lock "refresh_cache-52477e64-7989-4aa2-88e1-31600bfae2ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:51 np0005588919 nova_compute[225855]: 2026-01-20 14:49:51.841 225859 DEBUG nova.network.neutron [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:49:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:52.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:52 np0005588919 nova_compute[225855]: 2026-01-20 14:49:52.852 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.178 225859 DEBUG nova.compute.manager [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.314 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.315 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.337 225859 DEBUG nova.objects.instance [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_requests' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.374 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.374 225859 INFO nova.compute.claims [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.375 225859 DEBUG nova.objects.instance [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.388 225859 DEBUG nova.objects.instance [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'pci_devices' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.458 225859 INFO nova.compute.resource_tracker [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updating resource usage from migration d2604dec-40c2-43fa-9566-b7cf6ab6e7a7#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.557 225859 DEBUG oslo_concurrency.processutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:53.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.656 225859 DEBUG nova.compute.manager [req-0bffe202-7fe9-492b-bcf2-9e81adca93b4 req-9be68d86-36e4-4aba-afff-4deea996d0d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received event network-changed-8286e975-4b57-4b5a-9018-82187a854a2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.656 225859 DEBUG nova.compute.manager [req-0bffe202-7fe9-492b-bcf2-9e81adca93b4 req-9be68d86-36e4-4aba-afff-4deea996d0d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Refreshing instance network info cache due to event network-changed-8286e975-4b57-4b5a-9018-82187a854a2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:49:53 np0005588919 nova_compute[225855]: 2026-01-20 14:49:53.657 225859 DEBUG oslo_concurrency.lockutils [req-0bffe202-7fe9-492b-bcf2-9e81adca93b4 req-9be68d86-36e4-4aba-afff-4deea996d0d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-52477e64-7989-4aa2-88e1-31600bfae2ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:49:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2195194502' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.003 225859 DEBUG oslo_concurrency.processutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.009 225859 DEBUG nova.compute.provider_tree [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.035 225859 DEBUG nova.scheduler.client.report [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.057 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.058 225859 INFO nova.compute.manager [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Migrating#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.079 225859 DEBUG nova.network.neutron [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Updating instance_info_cache with network_info: [{"id": "8286e975-4b57-4b5a-9018-82187a854a2d", "address": "fa:16:3e:19:a9:8c", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8286e975-4b", "ovs_interfaceid": "8286e975-4b57-4b5a-9018-82187a854a2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.114 225859 DEBUG oslo_concurrency.lockutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Releasing lock "refresh_cache-52477e64-7989-4aa2-88e1-31600bfae2ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.119 225859 DEBUG oslo_concurrency.lockutils [req-0bffe202-7fe9-492b-bcf2-9e81adca93b4 req-9be68d86-36e4-4aba-afff-4deea996d0d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-52477e64-7989-4aa2-88e1-31600bfae2ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.119 225859 DEBUG nova.network.neutron [req-0bffe202-7fe9-492b-bcf2-9e81adca93b4 req-9be68d86-36e4-4aba-afff-4deea996d0d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Refreshing network info cache for port 8286e975-4b57-4b5a-9018-82187a854a2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.121 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.121 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.121 225859 DEBUG nova.network.neutron [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.221 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.223 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.223 225859 INFO nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Creating image(s)#033[00m
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.257 225859 DEBUG nova.storage.rbd_utils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] creating snapshot(nova-resize) on rbd image(52477e64-7989-4aa2-88e1-31600bfae2ef_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:49:54 np0005588919 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 09:49:54 np0005588919 systemd[265882]: Activating special unit Exit the Session...
Jan 20 09:49:54 np0005588919 systemd[265882]: Stopped target Main User Target.
Jan 20 09:49:54 np0005588919 systemd[265882]: Stopped target Basic System.
Jan 20 09:49:54 np0005588919 systemd[265882]: Stopped target Paths.
Jan 20 09:49:54 np0005588919 systemd[265882]: Stopped target Sockets.
Jan 20 09:49:54 np0005588919 systemd[265882]: Stopped target Timers.
Jan 20 09:49:54 np0005588919 systemd[265882]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:49:54 np0005588919 systemd[265882]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 09:49:54 np0005588919 systemd[265882]: Closed D-Bus User Message Bus Socket.
Jan 20 09:49:54 np0005588919 systemd[265882]: Stopped Create User's Volatile Files and Directories.
Jan 20 09:49:54 np0005588919 systemd[265882]: Removed slice User Application Slice.
Jan 20 09:49:54 np0005588919 systemd[265882]: Reached target Shutdown.
Jan 20 09:49:54 np0005588919 systemd[265882]: Finished Exit the Session.
Jan 20 09:49:54 np0005588919 systemd[265882]: Reached target Exit the Session.
Jan 20 09:49:54 np0005588919 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 09:49:54 np0005588919 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 09:49:54 np0005588919 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 09:49:54 np0005588919 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 09:49:54 np0005588919 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 09:49:54 np0005588919 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 09:49:54 np0005588919 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 09:49:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e245 e245: 3 total, 3 up, 3 in
Jan 20 09:49:54 np0005588919 nova_compute[225855]: 2026-01-20 14:49:54.670 225859 DEBUG nova.objects.instance [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 52477e64-7989-4aa2-88e1-31600bfae2ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:49:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:54.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.239 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.239 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Ensure instance console log exists: /var/lib/nova/instances/52477e64-7989-4aa2-88e1-31600bfae2ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.240 225859 DEBUG oslo_concurrency.lockutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.241 225859 DEBUG oslo_concurrency.lockutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.241 225859 DEBUG oslo_concurrency.lockutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.244 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Start _get_guest_xml network_info=[{"id": "8286e975-4b57-4b5a-9018-82187a854a2d", "address": "fa:16:3e:19:a9:8c", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:19:a9:8c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8286e975-4b", "ovs_interfaceid": "8286e975-4b57-4b5a-9018-82187a854a2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.250 225859 WARNING nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.267 225859 DEBUG nova.virt.libvirt.host [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.269 225859 DEBUG nova.virt.libvirt.host [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.276 225859 DEBUG nova.virt.libvirt.host [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.277 225859 DEBUG nova.virt.libvirt.host [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.279 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.280 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.281 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.281 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.282 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.282 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.283 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.283 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.284 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.284 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.284 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.285 225859 DEBUG nova.virt.hardware [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.286 225859 DEBUG nova.objects.instance [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'vcpu_model' on Instance uuid 52477e64-7989-4aa2-88e1-31600bfae2ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.333 225859 DEBUG oslo_concurrency.processutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:55.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:49:55 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/87628794' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.789 225859 DEBUG oslo_concurrency.processutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.831 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:55 np0005588919 nova_compute[225855]: 2026-01-20 14:49:55.841 225859 DEBUG oslo_concurrency.processutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:49:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:49:56 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/848721760' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.269 225859 DEBUG oslo_concurrency.processutils [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.271 225859 DEBUG nova.virt.libvirt.vif [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1663251192',display_name='tempest-ServerDiskConfigTestJSON-server-1663251192',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1663251192',id=106,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:49:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acb30fbc0e3749e390d7f867060b5a2a',ramdisk_id='',reservation_id='r-nykd0j3m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1806346246',owner_user_name='tempest-ServerDiskConfigTestJSON-1806346246-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:48Z,user_data=None,user_id='a1bd93d04cc4468abe1d5c61f5144191',uuid=52477e64-7989-4aa2-88e1-31600bfae2ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8286e975-4b57-4b5a-9018-82187a854a2d", "address": "fa:16:3e:19:a9:8c", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:19:a9:8c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8286e975-4b", "ovs_interfaceid": "8286e975-4b57-4b5a-9018-82187a854a2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.272 225859 DEBUG nova.network.os_vif_util [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converting VIF {"id": "8286e975-4b57-4b5a-9018-82187a854a2d", "address": "fa:16:3e:19:a9:8c", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:19:a9:8c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8286e975-4b", "ovs_interfaceid": "8286e975-4b57-4b5a-9018-82187a854a2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.273 225859 DEBUG nova.network.os_vif_util [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=8286e975-4b57-4b5a-9018-82187a854a2d,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8286e975-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.277 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <uuid>52477e64-7989-4aa2-88e1-31600bfae2ef</uuid>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <name>instance-0000006a</name>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <memory>196608</memory>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1663251192</nova:name>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:49:55</nova:creationTime>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.micro">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <nova:memory>192</nova:memory>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <nova:user uuid="a1bd93d04cc4468abe1d5c61f5144191">tempest-ServerDiskConfigTestJSON-1806346246-project-member</nova:user>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <nova:project uuid="acb30fbc0e3749e390d7f867060b5a2a">tempest-ServerDiskConfigTestJSON-1806346246</nova:project>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <nova:port uuid="8286e975-4b57-4b5a-9018-82187a854a2d">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <entry name="serial">52477e64-7989-4aa2-88e1-31600bfae2ef</entry>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <entry name="uuid">52477e64-7989-4aa2-88e1-31600bfae2ef</entry>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/52477e64-7989-4aa2-88e1-31600bfae2ef_disk">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/52477e64-7989-4aa2-88e1-31600bfae2ef_disk.config">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:19:a9:8c"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <target dev="tap8286e975-4b"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/52477e64-7989-4aa2-88e1-31600bfae2ef/console.log" append="off"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:49:56 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:49:56 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:49:56 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:49:56 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.279 225859 DEBUG nova.virt.libvirt.vif [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1663251192',display_name='tempest-ServerDiskConfigTestJSON-server-1663251192',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1663251192',id=106,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:49:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acb30fbc0e3749e390d7f867060b5a2a',ramdisk_id='',reservation_id='r-nykd0j3m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virti
o',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1806346246',owner_user_name='tempest-ServerDiskConfigTestJSON-1806346246-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:49:48Z,user_data=None,user_id='a1bd93d04cc4468abe1d5c61f5144191',uuid=52477e64-7989-4aa2-88e1-31600bfae2ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8286e975-4b57-4b5a-9018-82187a854a2d", "address": "fa:16:3e:19:a9:8c", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:19:a9:8c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8286e975-4b", "ovs_interfaceid": "8286e975-4b57-4b5a-9018-82187a854a2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.280 225859 DEBUG nova.network.os_vif_util [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converting VIF {"id": "8286e975-4b57-4b5a-9018-82187a854a2d", "address": "fa:16:3e:19:a9:8c", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:19:a9:8c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8286e975-4b", "ovs_interfaceid": "8286e975-4b57-4b5a-9018-82187a854a2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.281 225859 DEBUG nova.network.os_vif_util [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=8286e975-4b57-4b5a-9018-82187a854a2d,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8286e975-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.281 225859 DEBUG os_vif [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=8286e975-4b57-4b5a-9018-82187a854a2d,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8286e975-4b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.282 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.283 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.284 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.287 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.288 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8286e975-4b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.289 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8286e975-4b, col_values=(('external_ids', {'iface-id': '8286e975-4b57-4b5a-9018-82187a854a2d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:a9:8c', 'vm-uuid': '52477e64-7989-4aa2-88e1-31600bfae2ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.292 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 NetworkManager[49104]: <info>  [1768920596.2931] manager: (tap8286e975-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.295 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.300 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.301 225859 INFO os_vif [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=8286e975-4b57-4b5a-9018-82187a854a2d,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8286e975-4b')#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.384 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.385 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.385 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] No VIF found with MAC fa:16:3e:19:a9:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.386 225859 INFO nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Using config drive#033[00m
Jan 20 09:49:56 np0005588919 NetworkManager[49104]: <info>  [1768920596.4839] manager: (tap8286e975-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/166)
Jan 20 09:49:56 np0005588919 kernel: tap8286e975-4b: entered promiscuous mode
Jan 20 09:49:56 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:56Z|00391|binding|INFO|Claiming lport 8286e975-4b57-4b5a-9018-82187a854a2d for this chassis.
Jan 20 09:49:56 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:56Z|00392|binding|INFO|8286e975-4b57-4b5a-9018-82187a854a2d: Claiming fa:16:3e:19:a9:8c 10.100.0.6
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.487 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.499 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:a9:8c 10.100.0.6'], port_security=['fa:16:3e:19:a9:8c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '52477e64-7989-4aa2-88e1-31600bfae2ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acb30fbc0e3749e390d7f867060b5a2a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '19fab802-7db8-4c89-8f8e-8dcfc14d4627', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0e287ba-f88b-46f5-bb7f-3cc2a74be88e, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=8286e975-4b57-4b5a-9018-82187a854a2d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.501 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 8286e975-4b57-4b5a-9018-82187a854a2d in datapath 3379e2b3-ffb2-4391-969b-c9dc51bfbe25 bound to our chassis#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.504 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3379e2b3-ffb2-4391-969b-c9dc51bfbe25#033[00m
Jan 20 09:49:56 np0005588919 systemd-udevd[266216]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.514 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3d84cb-b751-441d-a8c7-6c69dc86d3a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.515 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3379e2b3-f1 in ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:49:56 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:56Z|00393|binding|INFO|Setting lport 8286e975-4b57-4b5a-9018-82187a854a2d up in Southbound
Jan 20 09:49:56 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:56Z|00394|binding|INFO|Setting lport 8286e975-4b57-4b5a-9018-82187a854a2d ovn-installed in OVS
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.517 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.517 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3379e2b3-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.517 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6cdb7f-2a51-4cd2-b2ec-106770226966]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.518 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ba72bb1a-decb-43fe-b701-ec107305bf96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 NetworkManager[49104]: <info>  [1768920596.5280] device (tap8286e975-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.527 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 NetworkManager[49104]: <info>  [1768920596.5294] device (tap8286e975-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.537 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[2f5e28c6-2519-4752-9e42-539efdd9050c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 systemd-machined[194361]: New machine qemu-46-instance-0000006a.
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.550 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1d94e960-234f-4d18-8005-d284317ff21a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 systemd[1]: Started Virtual Machine qemu-46-instance-0000006a.
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.586 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[070b952a-3170-4917-821e-86bb1ea9f85e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.591 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[af12d2a2-0dee-4a39-af44-9baa65fbd6f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 NetworkManager[49104]: <info>  [1768920596.5925] manager: (tap3379e2b3-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/167)
Jan 20 09:49:56 np0005588919 systemd-udevd[266220]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.628 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[cc914986-3503-4c88-be99-a3528a0a4c62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.632 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[02206a43-5a15-4cf5-9b02-df728fa36a05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 NetworkManager[49104]: <info>  [1768920596.6587] device (tap3379e2b3-f0): carrier: link connected
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.668 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[499a62eb-d370-48bc-aea6-f17636994e59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.688 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9fac98a5-5892-4c8f-a9cf-51079fde6f49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3379e2b3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:86:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563558, 'reachable_time': 23408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266250, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.704 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0a7e190b-5a06-4f33-9df6-706c4513cbf9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef1:86fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 563558, 'tstamp': 563558}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 266251, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.720 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f01ee486-c155-49ca-bf10-b57c2f3ef069]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3379e2b3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:86:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563558, 'reachable_time': 23408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 266252, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.734 225859 DEBUG nova.network.neutron [req-0bffe202-7fe9-492b-bcf2-9e81adca93b4 req-9be68d86-36e4-4aba-afff-4deea996d0d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Updated VIF entry in instance network info cache for port 8286e975-4b57-4b5a-9018-82187a854a2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.735 225859 DEBUG nova.network.neutron [req-0bffe202-7fe9-492b-bcf2-9e81adca93b4 req-9be68d86-36e4-4aba-afff-4deea996d0d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Updating instance_info_cache with network_info: [{"id": "8286e975-4b57-4b5a-9018-82187a854a2d", "address": "fa:16:3e:19:a9:8c", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8286e975-4b", "ovs_interfaceid": "8286e975-4b57-4b5a-9018-82187a854a2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.757 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4839a9-9e07-4198-9dab-38de1956bf95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.765 225859 DEBUG oslo_concurrency.lockutils [req-0bffe202-7fe9-492b-bcf2-9e81adca93b4 req-9be68d86-36e4-4aba-afff-4deea996d0d0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-52477e64-7989-4aa2-88e1-31600bfae2ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:56.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.851 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0334e3b4-dcdd-4c11-8818-c07d7b194fd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.852 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3379e2b3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.853 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.853 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3379e2b3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.897 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 NetworkManager[49104]: <info>  [1768920596.8988] manager: (tap3379e2b3-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Jan 20 09:49:56 np0005588919 kernel: tap3379e2b3-f0: entered promiscuous mode
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.901 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.904 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3379e2b3-f0, col_values=(('external_ids', {'iface-id': 'b32ddf23-a8dd-4e6d-a410-ccb24b214d35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.905 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.906 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:56 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:56Z|00395|binding|INFO|Releasing lport b32ddf23-a8dd-4e6d-a410-ccb24b214d35 from this chassis (sb_readonly=0)
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.906 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.909 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2c74f1c9-1190-4f18-ac2f-a804b83e4b91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.909 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-3379e2b3-ffb2-4391-969b-c9dc51bfbe25
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.pid.haproxy
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 3379e2b3-ffb2-4391-969b-c9dc51bfbe25
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:49:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:56.910 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'env', 'PROCESS_TAG=haproxy-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:49:56 np0005588919 nova_compute[225855]: 2026-01-20 14:49:56.919 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.017 225859 DEBUG nova.compute.manager [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.018 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920597.0175753, 52477e64-7989-4aa2-88e1-31600bfae2ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.018 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.026 225859 INFO nova.virt.libvirt.driver [-] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Instance running successfully.#033[00m
Jan 20 09:49:57 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.028 225859 DEBUG nova.virt.libvirt.guest [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.029 225859 DEBUG nova.virt.libvirt.driver [None req-daed7e12-e146-4686-9444-14f2e75c6ad9 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.053 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.056 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.122 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.123 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920597.0176432, 52477e64-7989-4aa2-88e1-31600bfae2ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.123 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] VM Started (Lifecycle Event)#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.161 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.164 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.246 225859 DEBUG nova.network.neutron [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updating instance_info_cache with network_info: [{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.287 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:49:57 np0005588919 podman[266326]: 2026-01-20 14:49:57.306022529 +0000 UTC m=+0.093779898 container create 7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:49:57 np0005588919 podman[266326]: 2026-01-20 14:49:57.233636616 +0000 UTC m=+0.021394005 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.376 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 20 09:49:57 np0005588919 nova_compute[225855]: 2026-01-20 14:49:57.379 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:49:57 np0005588919 systemd[1]: Started libpod-conmon-7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49.scope.
Jan 20 09:49:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:49:57 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:49:57 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48e67e17409de7385c0e5c04d479fefc117c5bbc0e8751e2852e2bf32fe6c3ee/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:49:57 np0005588919 podman[266326]: 2026-01-20 14:49:57.505317416 +0000 UTC m=+0.293074795 container init 7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 09:49:57 np0005588919 podman[266326]: 2026-01-20 14:49:57.510884223 +0000 UTC m=+0.298641632 container start 7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 09:49:57 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[266341]: [NOTICE]   (266345) : New worker (266347) forked
Jan 20 09:49:57 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[266341]: [NOTICE]   (266345) : Loading success.
Jan 20 09:49:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:57.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:58 np0005588919 nova_compute[225855]: 2026-01-20 14:49:58.086 225859 DEBUG nova.compute.manager [req-baf2153e-8186-4fec-aa51-62f15ccad980 req-55bc6dfb-46b7-4520-8731-ddad6e0e4c46 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received event network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:49:58 np0005588919 nova_compute[225855]: 2026-01-20 14:49:58.086 225859 DEBUG oslo_concurrency.lockutils [req-baf2153e-8186-4fec-aa51-62f15ccad980 req-55bc6dfb-46b7-4520-8731-ddad6e0e4c46 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:49:58 np0005588919 nova_compute[225855]: 2026-01-20 14:49:58.086 225859 DEBUG oslo_concurrency.lockutils [req-baf2153e-8186-4fec-aa51-62f15ccad980 req-55bc6dfb-46b7-4520-8731-ddad6e0e4c46 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:49:58 np0005588919 nova_compute[225855]: 2026-01-20 14:49:58.086 225859 DEBUG oslo_concurrency.lockutils [req-baf2153e-8186-4fec-aa51-62f15ccad980 req-55bc6dfb-46b7-4520-8731-ddad6e0e4c46 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:49:58 np0005588919 nova_compute[225855]: 2026-01-20 14:49:58.087 225859 DEBUG nova.compute.manager [req-baf2153e-8186-4fec-aa51-62f15ccad980 req-55bc6dfb-46b7-4520-8731-ddad6e0e4c46 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] No waiting events found dispatching network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:49:58 np0005588919 nova_compute[225855]: 2026-01-20 14:49:58.087 225859 WARNING nova.compute.manager [req-baf2153e-8186-4fec-aa51-62f15ccad980 req-55bc6dfb-46b7-4520-8731-ddad6e0e4c46 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received unexpected event network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:49:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:49:58.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:49:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:49:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:49:59.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:49:59 np0005588919 kernel: tap6855cb4f-41 (unregistering): left promiscuous mode
Jan 20 09:49:59 np0005588919 NetworkManager[49104]: <info>  [1768920599.9794] device (tap6855cb4f-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:49:59 np0005588919 nova_compute[225855]: 2026-01-20 14:49:59.983 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:59Z|00396|binding|INFO|Releasing lport 6855cb4f-4178-4447-af36-126ade033206 from this chassis (sb_readonly=0)
Jan 20 09:49:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:59Z|00397|binding|INFO|Setting lport 6855cb4f-4178-4447-af36-126ade033206 down in Southbound
Jan 20 09:49:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:49:59Z|00398|binding|INFO|Removing iface tap6855cb4f-41 ovn-installed in OVS
Jan 20 09:49:59 np0005588919 nova_compute[225855]: 2026-01-20 14:49:59.986 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:49:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:59.994 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:3f:20 10.100.0.12'], port_security=['fa:16:3e:4f:3f:20 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fdeb13eb-edb4-4bff-aeef-2671ba9d4618', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6855cb4f-4178-4447-af36-126ade033206) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:49:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:59.996 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6855cb4f-4178-4447-af36-126ade033206 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:49:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:49:59.999 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:49:59 np0005588919 nova_compute[225855]: 2026-01-20 14:49:59.999 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.000 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e78048c7-6504-46ca-a339-7f550c5365ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.001 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:50:00 np0005588919 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000069.scope: Deactivated successfully.
Jan 20 09:50:00 np0005588919 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000069.scope: Consumed 14.237s CPU time.
Jan 20 09:50:00 np0005588919 systemd-machined[194361]: Machine qemu-45-instance-00000069 terminated.
Jan 20 09:50:00 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265510]: [NOTICE]   (265514) : haproxy version is 2.8.14-c23fe91
Jan 20 09:50:00 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265510]: [NOTICE]   (265514) : path to executable is /usr/sbin/haproxy
Jan 20 09:50:00 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265510]: [WARNING]  (265514) : Exiting Master process...
Jan 20 09:50:00 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265510]: [ALERT]    (265514) : Current worker (265516) exited with code 143 (Terminated)
Jan 20 09:50:00 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[265510]: [WARNING]  (265514) : All workers exited. Exiting... (0)
Jan 20 09:50:00 np0005588919 systemd[1]: libpod-6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902.scope: Deactivated successfully.
Jan 20 09:50:00 np0005588919 podman[266383]: 2026-01-20 14:50:00.124003024 +0000 UTC m=+0.043561191 container died 6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 20 09:50:00 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902-userdata-shm.mount: Deactivated successfully.
Jan 20 09:50:00 np0005588919 systemd[1]: var-lib-containers-storage-overlay-5b9ea598c29d8f95feac6dbbb7186bfee63e25d53008923443ae675d2d1a3a93-merged.mount: Deactivated successfully.
Jan 20 09:50:00 np0005588919 podman[266383]: 2026-01-20 14:50:00.162338036 +0000 UTC m=+0.081896193 container cleanup 6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:50:00 np0005588919 systemd[1]: libpod-conmon-6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902.scope: Deactivated successfully.
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.200 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.205 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:00 np0005588919 podman[266413]: 2026-01-20 14:50:00.238294591 +0000 UTC m=+0.051762023 container remove 6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.245 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[168ca7fa-99a6-4848-b5c9-0f88157a1906]: (4, ('Tue Jan 20 02:50:00 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902)\n6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902\nTue Jan 20 02:50:00 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902)\n6832a1e9e9fbd88072418ee5f45181e7252b473c2cddc2205a332351b82ef902\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.246 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc2f632-c19c-459c-beb7-fb96e0309acb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.247 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:00 np0005588919 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.248 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.261 225859 DEBUG nova.compute.manager [req-6439b544-5584-467c-92b7-5a2058bf7b71 req-ec47bd55-e9a1-4561-b20d-116bb2520b35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received event network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.262 225859 DEBUG oslo_concurrency.lockutils [req-6439b544-5584-467c-92b7-5a2058bf7b71 req-ec47bd55-e9a1-4561-b20d-116bb2520b35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.262 225859 DEBUG oslo_concurrency.lockutils [req-6439b544-5584-467c-92b7-5a2058bf7b71 req-ec47bd55-e9a1-4561-b20d-116bb2520b35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.263 225859 DEBUG oslo_concurrency.lockutils [req-6439b544-5584-467c-92b7-5a2058bf7b71 req-ec47bd55-e9a1-4561-b20d-116bb2520b35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.263 225859 DEBUG nova.compute.manager [req-6439b544-5584-467c-92b7-5a2058bf7b71 req-ec47bd55-e9a1-4561-b20d-116bb2520b35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] No waiting events found dispatching network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.263 225859 WARNING nova.compute.manager [req-6439b544-5584-467c-92b7-5a2058bf7b71 req-ec47bd55-e9a1-4561-b20d-116bb2520b35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received unexpected event network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.265 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.267 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6742d4ea-0e20-4e3e-bb41-8088dd4be3d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.279 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8a9a19f2-c827-409f-ba46-a420981afd02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.283 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2b12193d-e66e-4491-90cf-218adc63a748]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.297 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8d753c9a-17a7-49f4-b6a1-b52a9b4292f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560826, 'reachable_time': 41751, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266441, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.299 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:50:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:00.299 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[5d5eb5ae-d8c8-40e1-b5d6-07e679b5d4f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:00 np0005588919 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.396 225859 INFO nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.402 225859 INFO nova.virt.libvirt.driver [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance destroyed successfully.#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.403 225859 DEBUG nova.virt.libvirt.vif [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2012792656',display_name='tempest-ServerActionsTestJSON-server-2012792656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2012792656',id=105,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:49:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-c8be97ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:49:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=fdeb13eb-edb4-4bff-aeef-2671ba9d4618,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:4f:3f:20"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.403 225859 DEBUG nova.network.os_vif_util [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:4f:3f:20"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.403 225859 DEBUG nova.network.os_vif_util [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.404 225859 DEBUG os_vif [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.407 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.407 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6855cb4f-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.412 225859 INFO os_vif [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41')#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.416 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-00000069 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.416 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] skipping disk for instance-00000069 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.828 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:00.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.866 225859 DEBUG nova.compute.manager [req-462e149f-6411-4175-8385-0b50623bc042 req-d545fb15-00ec-476c-9f22-3fd434e0c6a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-unplugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.867 225859 DEBUG oslo_concurrency.lockutils [req-462e149f-6411-4175-8385-0b50623bc042 req-d545fb15-00ec-476c-9f22-3fd434e0c6a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.867 225859 DEBUG oslo_concurrency.lockutils [req-462e149f-6411-4175-8385-0b50623bc042 req-d545fb15-00ec-476c-9f22-3fd434e0c6a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.867 225859 DEBUG oslo_concurrency.lockutils [req-462e149f-6411-4175-8385-0b50623bc042 req-d545fb15-00ec-476c-9f22-3fd434e0c6a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.867 225859 DEBUG nova.compute.manager [req-462e149f-6411-4175-8385-0b50623bc042 req-d545fb15-00ec-476c-9f22-3fd434e0c6a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-unplugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.868 225859 WARNING nova.compute.manager [req-462e149f-6411-4175-8385-0b50623bc042 req-d545fb15-00ec-476c-9f22-3fd434e0c6a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-unplugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 20 09:50:00 np0005588919 nova_compute[225855]: 2026-01-20 14:50:00.945 225859 DEBUG nova.network.neutron [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Port 6855cb4f-4178-4447-af36-126ade033206 binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Jan 20 09:50:01 np0005588919 nova_compute[225855]: 2026-01-20 14:50:01.102 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:01 np0005588919 nova_compute[225855]: 2026-01-20 14:50:01.102 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:01 np0005588919 nova_compute[225855]: 2026-01-20 14:50:01.103 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:01 np0005588919 nova_compute[225855]: 2026-01-20 14:50:01.420 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:01 np0005588919 nova_compute[225855]: 2026-01-20 14:50:01.421 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:01 np0005588919 nova_compute[225855]: 2026-01-20 14:50:01.421 225859 DEBUG nova.network.neutron [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:50:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:01.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:02.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.101 225859 DEBUG nova.compute.manager [req-dd4dc49c-e7a9-466d-9669-09c2f672c44d req-1a5ddf2a-9eff-4ed7-bc3b-572d19fba915 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.101 225859 DEBUG oslo_concurrency.lockutils [req-dd4dc49c-e7a9-466d-9669-09c2f672c44d req-1a5ddf2a-9eff-4ed7-bc3b-572d19fba915 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.101 225859 DEBUG oslo_concurrency.lockutils [req-dd4dc49c-e7a9-466d-9669-09c2f672c44d req-1a5ddf2a-9eff-4ed7-bc3b-572d19fba915 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.102 225859 DEBUG oslo_concurrency.lockutils [req-dd4dc49c-e7a9-466d-9669-09c2f672c44d req-1a5ddf2a-9eff-4ed7-bc3b-572d19fba915 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.102 225859 DEBUG nova.compute.manager [req-dd4dc49c-e7a9-466d-9669-09c2f672c44d req-1a5ddf2a-9eff-4ed7-bc3b-572d19fba915 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.102 225859 WARNING nova.compute.manager [req-dd4dc49c-e7a9-466d-9669-09c2f672c44d req-1a5ddf2a-9eff-4ed7-bc3b-572d19fba915 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.471 225859 DEBUG nova.network.neutron [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updating instance_info_cache with network_info: [{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.503 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.624 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.626 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.626 225859 INFO nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Creating image(s)#033[00m
Jan 20 09:50:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:03.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.662 225859 DEBUG nova.storage.rbd_utils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] creating snapshot(nova-resize) on rbd image(fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:50:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e246 e246: 3 total, 3 up, 3 in
Jan 20 09:50:03 np0005588919 nova_compute[225855]: 2026-01-20 14:50:03.952 225859 DEBUG nova.objects.instance [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'trusted_certs' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.066 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.066 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Ensure instance console log exists: /var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.067 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.067 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.067 225859 DEBUG oslo_concurrency.lockutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.070 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Start _get_guest_xml network_info=[{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:4f:3f:20"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.074 225859 WARNING nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.086 225859 DEBUG nova.virt.libvirt.host [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.087 225859 DEBUG nova.virt.libvirt.host [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.094 225859 DEBUG nova.virt.libvirt.host [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.094 225859 DEBUG nova.virt.libvirt.host [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.095 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.096 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.096 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.096 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.097 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.097 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.097 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.097 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.098 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.098 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.098 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.098 225859 DEBUG nova.virt.hardware [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.099 225859 DEBUG nova.objects.instance [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'vcpu_model' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.119 225859 DEBUG oslo_concurrency.processutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:04 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:04Z|00399|binding|INFO|Releasing lport b32ddf23-a8dd-4e6d-a410-ccb24b214d35 from this chassis (sb_readonly=0)
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.462 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:50:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2075039030' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.567 225859 DEBUG oslo_concurrency.processutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.608 225859 DEBUG oslo_concurrency.processutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:04 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:04Z|00400|binding|INFO|Releasing lport b32ddf23-a8dd-4e6d-a410-ccb24b214d35 from this chassis (sb_readonly=0)
Jan 20 09:50:04 np0005588919 nova_compute[225855]: 2026-01-20 14:50:04.636 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:04.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e247 e247: 3 total, 3 up, 3 in
Jan 20 09:50:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:50:05 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1676165450' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.050 225859 DEBUG oslo_concurrency.processutils [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.053 225859 DEBUG nova.virt.libvirt.vif [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2012792656',display_name='tempest-ServerActionsTestJSON-server-2012792656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2012792656',id=105,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:49:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-c8be97ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:50:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=fdeb13eb-edb4-4bff-aeef-2671ba9d4618,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:4f:3f:20"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.054 225859 DEBUG nova.network.os_vif_util [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:4f:3f:20"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.055 225859 DEBUG nova.network.os_vif_util [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.060 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <uuid>fdeb13eb-edb4-4bff-aeef-2671ba9d4618</uuid>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <name>instance-00000069</name>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <memory>196608</memory>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestJSON-server-2012792656</nova:name>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:50:04</nova:creationTime>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.micro">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <nova:memory>192</nova:memory>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <nova:port uuid="6855cb4f-4178-4447-af36-126ade033206">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <entry name="serial">fdeb13eb-edb4-4bff-aeef-2671ba9d4618</entry>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <entry name="uuid">fdeb13eb-edb4-4bff-aeef-2671ba9d4618</entry>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk.config">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:4f:3f:20"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <target dev="tap6855cb4f-41"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/console.log" append="off"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:50:05 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:50:05 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:50:05 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:50:05 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.062 225859 DEBUG nova.virt.libvirt.vif [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2012792656',display_name='tempest-ServerActionsTestJSON-server-2012792656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2012792656',id=105,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:49:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-c8be97ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:50:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=fdeb13eb-edb4-4bff-aeef-2671ba9d4618,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:4f:3f:20"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.062 225859 DEBUG nova.network.os_vif_util [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1917526237-network", "vif_mac": "fa:16:3e:4f:3f:20"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.063 225859 DEBUG nova.network.os_vif_util [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.064 225859 DEBUG os_vif [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.065 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.066 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.066 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.069 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.070 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6855cb4f-41, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.071 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6855cb4f-41, col_values=(('external_ids', {'iface-id': '6855cb4f-4178-4447-af36-126ade033206', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:3f:20', 'vm-uuid': 'fdeb13eb-edb4-4bff-aeef-2671ba9d4618'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.072 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 NetworkManager[49104]: <info>  [1768920605.0737] manager: (tap6855cb4f-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.075 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.077 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.078 225859 INFO os_vif [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41')#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.153 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.153 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.154 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] No VIF found with MAC fa:16:3e:4f:3f:20, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.155 225859 INFO nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Using config drive#033[00m
Jan 20 09:50:05 np0005588919 podman[266581]: 2026-01-20 14:50:05.224082394 +0000 UTC m=+0.106675142 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 09:50:05 np0005588919 kernel: tap6855cb4f-41: entered promiscuous mode
Jan 20 09:50:05 np0005588919 NetworkManager[49104]: <info>  [1768920605.2460] manager: (tap6855cb4f-41): new Tun device (/org/freedesktop/NetworkManager/Devices/170)
Jan 20 09:50:05 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:05Z|00401|binding|INFO|Claiming lport 6855cb4f-4178-4447-af36-126ade033206 for this chassis.
Jan 20 09:50:05 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:05Z|00402|binding|INFO|6855cb4f-4178-4447-af36-126ade033206: Claiming fa:16:3e:4f:3f:20 10.100.0.12
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.258 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 systemd-machined[194361]: New machine qemu-47-instance-00000069.
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.280 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 NetworkManager[49104]: <info>  [1768920605.2822] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Jan 20 09:50:05 np0005588919 NetworkManager[49104]: <info>  [1768920605.2828] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.284 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:3f:20 10.100.0.12'], port_security=['fa:16:3e:4f:3f:20 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fdeb13eb-edb4-4bff-aeef-2671ba9d4618', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6855cb4f-4178-4447-af36-126ade033206) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:05 np0005588919 systemd[1]: Started Virtual Machine qemu-47-instance-00000069.
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.286 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6855cb4f-4178-4447-af36-126ade033206 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.289 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.301 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5547a843-6a11-4d5c-887f-118cf30c2862]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.302 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.305 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.305 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8c62f464-bb68-4960-a889-d5f3f698d7e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.307 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd87729-5bd7-4a98-b77d-01dcdb898f95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 systemd-udevd[266640]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.321 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3757ad-39bb-43c3-b242-9aff85c213aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 NetworkManager[49104]: <info>  [1768920605.3384] device (tap6855cb4f-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:05 np0005588919 NetworkManager[49104]: <info>  [1768920605.3392] device (tap6855cb4f-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.349 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e8730987-bfb2-4242-b41c-8ecd5634e032]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.379 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f7066776-453e-476b-8b49-0cc56244aff1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 NetworkManager[49104]: <info>  [1768920605.3871] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/173)
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.386 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a94f5ee3-d0ce-4a5e-830b-b2b3c9ec2ade]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 systemd-udevd[266643]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.421 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[745e2ee9-63fa-4f80-af2a-c08ec3b856b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.425 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[71d4b5d5-f408-43bc-800d-3336f6d78002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 NetworkManager[49104]: <info>  [1768920605.4451] device (tap762e1859-40): carrier: link connected
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.449 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ba3fd097-4ce9-492c-be8b-6bcdd8e82b77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.468 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[28ceffc7-51f2-4e53-86da-7e0e1641d585]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564437, 'reachable_time': 22954, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266671, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.476 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.485 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[90d2fe77-e85d-4687-a1f0-7c5143a9e82e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564437, 'tstamp': 564437}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 266672, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:05Z|00403|binding|INFO|Releasing lport b32ddf23-a8dd-4e6d-a410-ccb24b214d35 from this chassis (sb_readonly=0)
Jan 20 09:50:05 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:05Z|00404|binding|INFO|Setting lport 6855cb4f-4178-4447-af36-126ade033206 up in Southbound
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.502 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[31fd9935-ab99-4330-87ff-b82e605089bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564437, 'reachable_time': 22954, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 266673, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.524 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:05Z|00405|binding|INFO|Setting lport 6855cb4f-4178-4447-af36-126ade033206 ovn-installed in OVS
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.526 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.551 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3db07b07-2d89-463b-9ce7-65fb4d5b8431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.609 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdefb70-23bb-4ab4-a012-0094733b43da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.610 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.610 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.610 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.612 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:50:05 np0005588919 NetworkManager[49104]: <info>  [1768920605.6132] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.615 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.616 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:05Z|00406|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.618 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.618 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.620 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[29cd99b0-abbd-4222-a88d-95beff064149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.621 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:50:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:05.622 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.640 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:05.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.830 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.906 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for fdeb13eb-edb4-4bff-aeef-2671ba9d4618 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.907 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920605.9062374, fdeb13eb-edb4-4bff-aeef-2671ba9d4618 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.907 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.910 225859 DEBUG nova.compute.manager [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.914 225859 INFO nova.virt.libvirt.driver [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance running successfully.#033[00m
Jan 20 09:50:05 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.916 225859 DEBUG nova.virt.libvirt.guest [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 09:50:05 np0005588919 nova_compute[225855]: 2026-01-20 14:50:05.916 225859 DEBUG nova.virt.libvirt.driver [None req-c7d66eda-2386-42cf-82ee-49bf287fa76d 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 09:50:06 np0005588919 podman[266745]: 2026-01-20 14:50:06.026060245 +0000 UTC m=+0.054529430 container create 5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 09:50:06 np0005588919 systemd[1]: Started libpod-conmon-5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3.scope.
Jan 20 09:50:06 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:50:06 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c88cd4daf2bc9c6dc43439ff4fce93da549f6cbad7034349a6852b667007fb0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:50:06 np0005588919 podman[266745]: 2026-01-20 14:50:05.998299512 +0000 UTC m=+0.026768727 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:50:06 np0005588919 podman[266745]: 2026-01-20 14:50:06.106468015 +0000 UTC m=+0.134937200 container init 5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:50:06 np0005588919 podman[266745]: 2026-01-20 14:50:06.111868828 +0000 UTC m=+0.140338013 container start 5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.127 225859 DEBUG nova.compute.manager [req-1ad58b4c-22bd-43c8-9fd1-b5ae04bb3e08 req-92099aa4-02c1-4401-a220-3082698d89ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.128 225859 DEBUG oslo_concurrency.lockutils [req-1ad58b4c-22bd-43c8-9fd1-b5ae04bb3e08 req-92099aa4-02c1-4401-a220-3082698d89ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.128 225859 DEBUG oslo_concurrency.lockutils [req-1ad58b4c-22bd-43c8-9fd1-b5ae04bb3e08 req-92099aa4-02c1-4401-a220-3082698d89ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.128 225859 DEBUG oslo_concurrency.lockutils [req-1ad58b4c-22bd-43c8-9fd1-b5ae04bb3e08 req-92099aa4-02c1-4401-a220-3082698d89ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.128 225859 DEBUG nova.compute.manager [req-1ad58b4c-22bd-43c8-9fd1-b5ae04bb3e08 req-92099aa4-02c1-4401-a220-3082698d89ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.129 225859 WARNING nova.compute.manager [req-1ad58b4c-22bd-43c8-9fd1-b5ae04bb3e08 req-92099aa4-02c1-4401-a220-3082698d89ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 20 09:50:06 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266761]: [NOTICE]   (266765) : New worker (266767) forked
Jan 20 09:50:06 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266761]: [NOTICE]   (266765) : Loading success.
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.135 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.140 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.183 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.183 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920605.909545, fdeb13eb-edb4-4bff-aeef-2671ba9d4618 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.183 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] VM Started (Lifecycle Event)#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.238 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.240 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:50:06 np0005588919 nova_compute[225855]: 2026-01-20 14:50:06.286 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:50:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:06.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:50:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:07.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:50:08 np0005588919 nova_compute[225855]: 2026-01-20 14:50:08.352 225859 DEBUG nova.compute.manager [req-71321ab6-dee9-4566-bb70-e6c2b135fad2 req-59900e74-bcd3-4594-947d-3a6029833b7b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:08 np0005588919 nova_compute[225855]: 2026-01-20 14:50:08.353 225859 DEBUG oslo_concurrency.lockutils [req-71321ab6-dee9-4566-bb70-e6c2b135fad2 req-59900e74-bcd3-4594-947d-3a6029833b7b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:08 np0005588919 nova_compute[225855]: 2026-01-20 14:50:08.353 225859 DEBUG oslo_concurrency.lockutils [req-71321ab6-dee9-4566-bb70-e6c2b135fad2 req-59900e74-bcd3-4594-947d-3a6029833b7b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:08 np0005588919 nova_compute[225855]: 2026-01-20 14:50:08.353 225859 DEBUG oslo_concurrency.lockutils [req-71321ab6-dee9-4566-bb70-e6c2b135fad2 req-59900e74-bcd3-4594-947d-3a6029833b7b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:08 np0005588919 nova_compute[225855]: 2026-01-20 14:50:08.354 225859 DEBUG nova.compute.manager [req-71321ab6-dee9-4566-bb70-e6c2b135fad2 req-59900e74-bcd3-4594-947d-3a6029833b7b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:08 np0005588919 nova_compute[225855]: 2026-01-20 14:50:08.354 225859 WARNING nova.compute.manager [req-71321ab6-dee9-4566-bb70-e6c2b135fad2 req-59900e74-bcd3-4594-947d-3a6029833b7b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:50:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:08.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:09.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:09 np0005588919 nova_compute[225855]: 2026-01-20 14:50:09.833 225859 DEBUG nova.network.neutron [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Port 6855cb4f-4178-4447-af36-126ade033206 binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Jan 20 09:50:09 np0005588919 nova_compute[225855]: 2026-01-20 14:50:09.835 225859 DEBUG oslo_concurrency.lockutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:09 np0005588919 nova_compute[225855]: 2026-01-20 14:50:09.835 225859 DEBUG oslo_concurrency.lockutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:09 np0005588919 nova_compute[225855]: 2026-01-20 14:50:09.835 225859 DEBUG nova.network.neutron [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:50:10 np0005588919 nova_compute[225855]: 2026-01-20 14:50:10.110 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:10Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:a9:8c 10.100.0.6
Jan 20 09:50:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:10Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:a9:8c 10.100.0.6
Jan 20 09:50:10 np0005588919 nova_compute[225855]: 2026-01-20 14:50:10.835 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:10.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:11.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:12.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.303 225859 DEBUG nova.network.neutron [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updating instance_info_cache with network_info: [{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.334 225859 DEBUG oslo_concurrency.lockutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:13 np0005588919 kernel: tap6855cb4f-41 (unregistering): left promiscuous mode
Jan 20 09:50:13 np0005588919 NetworkManager[49104]: <info>  [1768920613.4020] device (tap6855cb4f-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:13 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:13Z|00407|binding|INFO|Releasing lport 6855cb4f-4178-4447-af36-126ade033206 from this chassis (sb_readonly=0)
Jan 20 09:50:13 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:13Z|00408|binding|INFO|Setting lport 6855cb4f-4178-4447-af36-126ade033206 down in Southbound
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.415 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:13Z|00409|binding|INFO|Removing iface tap6855cb4f-41 ovn-installed in OVS
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.417 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.423 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:3f:20 10.100.0.12'], port_security=['fa:16:3e:4f:3f:20 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fdeb13eb-edb4-4bff-aeef-2671ba9d4618', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6855cb4f-4178-4447-af36-126ade033206) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.426 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6855cb4f-4178-4447-af36-126ade033206 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.430 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.431 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.431 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1921d084-8e37-4406-8e25-b8ee8424b630]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.432 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:50:13 np0005588919 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000069.scope: Deactivated successfully.
Jan 20 09:50:13 np0005588919 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000069.scope: Consumed 8.272s CPU time.
Jan 20 09:50:13 np0005588919 systemd-machined[194361]: Machine qemu-47-instance-00000069 terminated.
Jan 20 09:50:13 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266761]: [NOTICE]   (266765) : haproxy version is 2.8.14-c23fe91
Jan 20 09:50:13 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266761]: [NOTICE]   (266765) : path to executable is /usr/sbin/haproxy
Jan 20 09:50:13 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266761]: [WARNING]  (266765) : Exiting Master process...
Jan 20 09:50:13 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266761]: [ALERT]    (266765) : Current worker (266767) exited with code 143 (Terminated)
Jan 20 09:50:13 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[266761]: [WARNING]  (266765) : All workers exited. Exiting... (0)
Jan 20 09:50:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:50:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/276800458' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:50:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:50:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/276800458' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:50:13 np0005588919 systemd[1]: libpod-5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3.scope: Deactivated successfully.
Jan 20 09:50:13 np0005588919 podman[266854]: 2026-01-20 14:50:13.593890854 +0000 UTC m=+0.065901301 container died 5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.597 225859 INFO nova.virt.libvirt.driver [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance destroyed successfully.#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.598 225859 DEBUG nova.objects.instance [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.613 225859 DEBUG nova.virt.libvirt.vif [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2012792656',display_name='tempest-ServerActionsTestJSON-server-2012792656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2012792656',id=105,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-c8be97ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=fdeb13eb-edb4-4bff-aeef-2671ba9d4618,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.615 225859 DEBUG nova.network.os_vif_util [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.616 225859 DEBUG nova.network.os_vif_util [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.616 225859 DEBUG os_vif [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.618 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.618 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6855cb4f-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.620 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.621 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.624 225859 INFO os_vif [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41')#033[00m
Jan 20 09:50:13 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3-userdata-shm.mount: Deactivated successfully.
Jan 20 09:50:13 np0005588919 systemd[1]: var-lib-containers-storage-overlay-c88cd4daf2bc9c6dc43439ff4fce93da549f6cbad7034349a6852b667007fb0f-merged.mount: Deactivated successfully.
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.631 225859 DEBUG oslo_concurrency.lockutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.632 225859 DEBUG oslo_concurrency.lockutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:13 np0005588919 podman[266854]: 2026-01-20 14:50:13.63730409 +0000 UTC m=+0.109314537 container cleanup 5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:50:13 np0005588919 systemd[1]: libpod-conmon-5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3.scope: Deactivated successfully.
Jan 20 09:50:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:13.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.653 225859 DEBUG nova.objects.instance [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'migration_context' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:13 np0005588919 podman[266893]: 2026-01-20 14:50:13.700985028 +0000 UTC m=+0.040974058 container remove 5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.706 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fe89cc8a-99b0-4a2a-9d91-9e09a7fc6533]: (4, ('Tue Jan 20 02:50:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3)\n5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3\nTue Jan 20 02:50:13 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3)\n5631707afb8ee465811db7586f33c72cad56b7fd83200a0452445ac0954ef0e3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.707 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e0b497b8-4e1d-456c-8b7e-18c4004103de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.708 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:13 np0005588919 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.710 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.724 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.725 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.727 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ebc013c2-2196-482d-a060-cc7b57f62bff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.741 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9795bc0f-e7a3-462e-8a8f-b994bbc6f820]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.742 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[206aa3f4-42c7-427b-afe0-a2ac90cf94eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.757 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[63567e4e-3e9a-40db-9ba3-0089b3806d1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564430, 'reachable_time': 38669, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266908, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588919 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.760 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:50:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:13.760 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[adea26ad-2c4b-4f9f-88c0-62d7ae8b79bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:13 np0005588919 nova_compute[225855]: 2026-01-20 14:50:13.822 225859 DEBUG oslo_concurrency.processutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.266 225859 DEBUG oslo_concurrency.processutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.273 225859 DEBUG nova.compute.provider_tree [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.327 225859 DEBUG nova.scheduler.client.report [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.401 225859 DEBUG oslo_concurrency.lockutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e248 e248: 3 total, 3 up, 3 in
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.585 225859 INFO nova.compute.manager [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Swapping old allocation on dict_keys(['bbb02880-a710-4ac1-8b2c-5c09765848d1']) held by migration d2604dec-40c2-43fa-9566-b7cf6ab6e7a7 for instance#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.662 225859 DEBUG nova.scheduler.client.report [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Overwriting current allocation {'allocations': {'bbb02880-a710-4ac1-8b2c-5c09765848d1': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}, 'generation': 58}}, 'project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'user_id': '3e9278fdb9e645b7938f3edb20c4d3cf', 'consumer_generation': 1} on consumer fdeb13eb-edb4-4bff-aeef-2671ba9d4618 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 20 09:50:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:14.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.976 225859 DEBUG oslo_concurrency.lockutils [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "52477e64-7989-4aa2-88e1-31600bfae2ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.977 225859 DEBUG oslo_concurrency.lockutils [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.978 225859 DEBUG oslo_concurrency.lockutils [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.979 225859 DEBUG oslo_concurrency.lockutils [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.979 225859 DEBUG oslo_concurrency.lockutils [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.981 225859 INFO nova.compute.manager [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Terminating instance#033[00m
Jan 20 09:50:14 np0005588919 nova_compute[225855]: 2026-01-20 14:50:14.983 225859 DEBUG nova.compute.manager [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:50:15 np0005588919 kernel: tap8286e975-4b (unregistering): left promiscuous mode
Jan 20 09:50:15 np0005588919 NetworkManager[49104]: <info>  [1768920615.0507] device (tap8286e975-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:15 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:15Z|00410|binding|INFO|Releasing lport 8286e975-4b57-4b5a-9018-82187a854a2d from this chassis (sb_readonly=0)
Jan 20 09:50:15 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:15Z|00411|binding|INFO|Setting lport 8286e975-4b57-4b5a-9018-82187a854a2d down in Southbound
Jan 20 09:50:15 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:15Z|00412|binding|INFO|Removing iface tap8286e975-4b ovn-installed in OVS
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.059 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.061 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.068 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:a9:8c 10.100.0.6'], port_security=['fa:16:3e:19:a9:8c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '52477e64-7989-4aa2-88e1-31600bfae2ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acb30fbc0e3749e390d7f867060b5a2a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '19fab802-7db8-4c89-8f8e-8dcfc14d4627', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0e287ba-f88b-46f5-bb7f-3cc2a74be88e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=8286e975-4b57-4b5a-9018-82187a854a2d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.069 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 8286e975-4b57-4b5a-9018-82187a854a2d in datapath 3379e2b3-ffb2-4391-969b-c9dc51bfbe25 unbound from our chassis#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.070 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3379e2b3-ffb2-4391-969b-c9dc51bfbe25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.071 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f66de3dd-252e-4893-a181-5c0963858ab1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.071 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 namespace which is not needed anymore#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.079 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.106 225859 DEBUG oslo_concurrency.lockutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.106 225859 DEBUG oslo_concurrency.lockutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquired lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.107 225859 DEBUG nova.network.neutron [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:50:15 np0005588919 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Jan 20 09:50:15 np0005588919 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000006a.scope: Consumed 12.746s CPU time.
Jan 20 09:50:15 np0005588919 systemd-machined[194361]: Machine qemu-46-instance-0000006a terminated.
Jan 20 09:50:15 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[266341]: [NOTICE]   (266345) : haproxy version is 2.8.14-c23fe91
Jan 20 09:50:15 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[266341]: [NOTICE]   (266345) : path to executable is /usr/sbin/haproxy
Jan 20 09:50:15 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[266341]: [WARNING]  (266345) : Exiting Master process...
Jan 20 09:50:15 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[266341]: [ALERT]    (266345) : Current worker (266347) exited with code 143 (Terminated)
Jan 20 09:50:15 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[266341]: [WARNING]  (266345) : All workers exited. Exiting... (0)
Jan 20 09:50:15 np0005588919 NetworkManager[49104]: <info>  [1768920615.2026] manager: (tap8286e975-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/175)
Jan 20 09:50:15 np0005588919 systemd[1]: libpod-7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49.scope: Deactivated successfully.
Jan 20 09:50:15 np0005588919 podman[266953]: 2026-01-20 14:50:15.210526204 +0000 UTC m=+0.051355640 container died 7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.219 225859 INFO nova.virt.libvirt.driver [-] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Instance destroyed successfully.#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.219 225859 DEBUG nova.objects.instance [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'resources' on Instance uuid 52477e64-7989-4aa2-88e1-31600bfae2ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.239 225859 DEBUG nova.virt.libvirt.vif [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1663251192',display_name='tempest-ServerDiskConfigTestJSON-server-1663251192',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1663251192',id=106,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:49:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acb30fbc0e3749e390d7f867060b5a2a',ramdisk_id='',reservation_id='r-nykd0j3m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1806346246',owner_user_name='tempest-ServerDiskConfigTestJSON-1806346246-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:05Z,user_data=None,user_id='a1bd93d04cc4468abe1d5c61f5144191',uuid=52477e64-7989-4aa2-88e1-31600bfae2ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8286e975-4b57-4b5a-9018-82187a854a2d", "address": "fa:16:3e:19:a9:8c", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8286e975-4b", "ovs_interfaceid": "8286e975-4b57-4b5a-9018-82187a854a2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.239 225859 DEBUG nova.network.os_vif_util [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converting VIF {"id": "8286e975-4b57-4b5a-9018-82187a854a2d", "address": "fa:16:3e:19:a9:8c", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8286e975-4b", "ovs_interfaceid": "8286e975-4b57-4b5a-9018-82187a854a2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.241 225859 DEBUG nova.network.os_vif_util [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:19:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=8286e975-4b57-4b5a-9018-82187a854a2d,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8286e975-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.241 225859 DEBUG os_vif [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=8286e975-4b57-4b5a-9018-82187a854a2d,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8286e975-4b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:15 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49-userdata-shm.mount: Deactivated successfully.
Jan 20 09:50:15 np0005588919 systemd[1]: var-lib-containers-storage-overlay-48e67e17409de7385c0e5c04d479fefc117c5bbc0e8751e2852e2bf32fe6c3ee-merged.mount: Deactivated successfully.
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.247 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.247 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8286e975-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.249 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.252 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.253 225859 INFO os_vif [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=8286e975-4b57-4b5a-9018-82187a854a2d,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8286e975-4b')#033[00m
Jan 20 09:50:15 np0005588919 podman[266953]: 2026-01-20 14:50:15.254691081 +0000 UTC m=+0.095520517 container cleanup 7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:50:15 np0005588919 systemd[1]: libpod-conmon-7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49.scope: Deactivated successfully.
Jan 20 09:50:15 np0005588919 podman[267008]: 2026-01-20 14:50:15.322174026 +0000 UTC m=+0.042735657 container remove 7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.329 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f669b8f1-3a68-4cde-ae4c-5e9677d7f256]: (4, ('Tue Jan 20 02:50:15 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 (7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49)\n7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49\nTue Jan 20 02:50:15 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 (7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49)\n7e472ad75c7c5324371d8342e82f1f2120ae128110e20722e399ec6b1ff46d49\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.331 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c5adb3b8-bd5b-49a0-8b1b-1d146f2b5a0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.331 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3379e2b3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.333 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588919 kernel: tap3379e2b3-f0: left promiscuous mode
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.354 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.356 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aa21d3b8-e12e-46e6-b92a-b3082ab8f330]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.379 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[69f7a621-8489-4178-8a86-3f70246cfc66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.380 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0d929afa-0ad9-4487-8465-30a5f2d44daf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.398 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aac0478d-257f-4100-a91a-510d828e0a8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563550, 'reachable_time': 38403, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267027, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.400 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:50:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:15.400 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[72c5e031-b8ec-4870-a875-66251c30c6ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:15 np0005588919 systemd[1]: run-netns-ovnmeta\x2d3379e2b3\x2dffb2\x2d4391\x2d969b\x2dc9dc51bfbe25.mount: Deactivated successfully.
Jan 20 09:50:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:15.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.667 225859 INFO nova.virt.libvirt.driver [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Deleting instance files /var/lib/nova/instances/52477e64-7989-4aa2-88e1-31600bfae2ef_del#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.667 225859 INFO nova.virt.libvirt.driver [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Deletion of /var/lib/nova/instances/52477e64-7989-4aa2-88e1-31600bfae2ef_del complete#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.739 225859 INFO nova.compute.manager [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Took 0.75 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.739 225859 DEBUG oslo.service.loopingcall [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.740 225859 DEBUG nova.compute.manager [-] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.740 225859 DEBUG nova.network.neutron [-] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:50:15 np0005588919 nova_compute[225855]: 2026-01-20 14:50:15.835 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:16.410 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:16.411 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:16.411 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:16.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.880 225859 DEBUG nova.compute.manager [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received event network-vif-unplugged-8286e975-4b57-4b5a-9018-82187a854a2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.881 225859 DEBUG oslo_concurrency.lockutils [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.881 225859 DEBUG oslo_concurrency.lockutils [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.881 225859 DEBUG oslo_concurrency.lockutils [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.881 225859 DEBUG nova.compute.manager [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] No waiting events found dispatching network-vif-unplugged-8286e975-4b57-4b5a-9018-82187a854a2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.881 225859 DEBUG nova.compute.manager [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received event network-vif-unplugged-8286e975-4b57-4b5a-9018-82187a854a2d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.882 225859 DEBUG nova.compute.manager [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received event network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.882 225859 DEBUG oslo_concurrency.lockutils [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.882 225859 DEBUG oslo_concurrency.lockutils [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.882 225859 DEBUG oslo_concurrency.lockutils [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.882 225859 DEBUG nova.compute.manager [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] No waiting events found dispatching network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:16 np0005588919 nova_compute[225855]: 2026-01-20 14:50:16.882 225859 WARNING nova.compute.manager [req-5461f9f8-25cf-4b91-aaa7-639b4f1ccca6 req-66c129e6-9d1d-419f-953d-b0be27ade3b9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received unexpected event network-vif-plugged-8286e975-4b57-4b5a-9018-82187a854a2d for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.009 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:17 np0005588919 podman[267031]: 2026-01-20 14:50:17.019748371 +0000 UTC m=+0.058522323 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 09:50:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.611 225859 DEBUG nova.compute.manager [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-unplugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.611 225859 DEBUG oslo_concurrency.lockutils [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.612 225859 DEBUG oslo_concurrency.lockutils [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.612 225859 DEBUG oslo_concurrency.lockutils [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.613 225859 DEBUG nova.compute.manager [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-unplugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.613 225859 WARNING nova.compute.manager [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-unplugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.614 225859 DEBUG nova.compute.manager [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.614 225859 DEBUG oslo_concurrency.lockutils [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.615 225859 DEBUG oslo_concurrency.lockutils [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.615 225859 DEBUG oslo_concurrency.lockutils [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.615 225859 DEBUG nova.compute.manager [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:17 np0005588919 nova_compute[225855]: 2026-01-20 14:50:17.616 225859 WARNING nova.compute.manager [req-238b57a2-08fa-428f-926e-00dc3e4027cf req-e43ff5b0-466e-4dd1-86ab-2af9205b9631 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 20 09:50:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:17.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.195 225859 DEBUG nova.network.neutron [-] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.213 225859 INFO nova.compute.manager [-] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Took 2.47 seconds to deallocate network for instance.#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.275 225859 DEBUG oslo_concurrency.lockutils [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.275 225859 DEBUG oslo_concurrency.lockutils [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.286 225859 DEBUG oslo_concurrency.lockutils [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.329 225859 INFO nova.scheduler.client.report [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Deleted allocations for instance 52477e64-7989-4aa2-88e1-31600bfae2ef#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.371 225859 DEBUG nova.network.neutron [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updating instance_info_cache with network_info: [{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.404 225859 DEBUG oslo_concurrency.lockutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Releasing lock "refresh_cache-fdeb13eb-edb4-4bff-aeef-2671ba9d4618" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.404 225859 DEBUG nova.virt.libvirt.driver [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.440 225859 DEBUG oslo_concurrency.lockutils [None req-f42363f2-02ba-471b-bd6f-2f81d785ebce a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "52477e64-7989-4aa2-88e1-31600bfae2ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.463s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.485 225859 DEBUG nova.storage.rbd_utils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] rolling back rbd image(fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.599 225859 DEBUG nova.storage.rbd_utils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] removing snapshot(nova-resize) on rbd image(fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:50:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:50:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:18.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:50:18 np0005588919 nova_compute[225855]: 2026-01-20 14:50:18.997 225859 DEBUG nova.compute.manager [req-7db177bc-2502-44a0-a2b6-161baac839a2 req-8a43a978-0815-43af-86f9-0a18f0bcf7be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Received event network-vif-deleted-8286e975-4b57-4b5a-9018-82187a854a2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e249 e249: 3 total, 3 up, 3 in
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.241 225859 DEBUG nova.virt.libvirt.driver [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Start _get_guest_xml network_info=[{"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.245 225859 WARNING nova.virt.libvirt.driver [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.251 225859 DEBUG nova.virt.libvirt.host [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.252 225859 DEBUG nova.virt.libvirt.host [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.256 225859 DEBUG nova.virt.libvirt.host [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.257 225859 DEBUG nova.virt.libvirt.host [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.258 225859 DEBUG nova.virt.libvirt.driver [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.258 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.258 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.259 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.259 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.259 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.259 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.259 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.260 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.260 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.260 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.260 225859 DEBUG nova.virt.hardware [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.261 225859 DEBUG nova.objects.instance [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'vcpu_model' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.290 225859 DEBUG oslo_concurrency.processutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.615 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.616 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.634 225859 DEBUG nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:50:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:19.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.719 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.720 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.726 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.727 225859 INFO nova.compute.claims [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:50:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:50:19 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/172413703' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.832 225859 DEBUG oslo_concurrency.processutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.542s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.866 225859 DEBUG oslo_concurrency.processutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:19 np0005588919 nova_compute[225855]: 2026-01-20 14:50:19.889 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.251 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:50:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4223586813' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.382 225859 DEBUG oslo_concurrency.processutils [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.385 225859 DEBUG nova.virt.libvirt.vif [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2012792656',display_name='tempest-ServerActionsTestJSON-server-2012792656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2012792656',id=105,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-c8be97ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=fdeb13eb-edb4-4bff-aeef-2671ba9d4618,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.385 225859 DEBUG nova.network.os_vif_util [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.386 225859 DEBUG nova.network.os_vif_util [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.389 225859 DEBUG nova.virt.libvirt.driver [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <uuid>fdeb13eb-edb4-4bff-aeef-2671ba9d4618</uuid>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <name>instance-00000069</name>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestJSON-server-2012792656</nova:name>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:50:19</nova:creationTime>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <nova:user uuid="3e9278fdb9e645b7938f3edb20c4d3cf">tempest-ServerActionsTestJSON-1020442335-project-member</nova:user>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <nova:project uuid="1c5f03d46c0c4162a3b2f1530850bb6c">tempest-ServerActionsTestJSON-1020442335</nova:project>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <nova:port uuid="6855cb4f-4178-4447-af36-126ade033206">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <entry name="serial">fdeb13eb-edb4-4bff-aeef-2671ba9d4618</entry>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <entry name="uuid">fdeb13eb-edb4-4bff-aeef-2671ba9d4618</entry>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/fdeb13eb-edb4-4bff-aeef-2671ba9d4618_disk.config">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:4f:3f:20"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <target dev="tap6855cb4f-41"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618/console.log" append="off"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <input type="keyboard" bus="usb"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:50:20 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:50:20 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:50:20 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:50:20 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.391 225859 DEBUG nova.virt.libvirt.vif [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2012792656',display_name='tempest-ServerActionsTestJSON-server-2012792656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2012792656',id=105,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-c8be97ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=fdeb13eb-edb4-4bff-aeef-2671ba9d4618,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.392 225859 DEBUG nova.network.os_vif_util [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.393 225859 DEBUG nova.network.os_vif_util [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.393 225859 DEBUG os_vif [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.393 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.394 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.394 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.397 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.398 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6855cb4f-41, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.399 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6855cb4f-41, col_values=(('external_ids', {'iface-id': '6855cb4f-4178-4447-af36-126ade033206', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:3f:20', 'vm-uuid': 'fdeb13eb-edb4-4bff-aeef-2671ba9d4618'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:20 np0005588919 NetworkManager[49104]: <info>  [1768920620.4027] manager: (tap6855cb4f-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.402 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.405 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.409 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.410 225859 INFO os_vif [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41')#033[00m
Jan 20 09:50:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:50:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3610407819' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.543 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.653s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.547 225859 DEBUG nova.compute.provider_tree [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.577 225859 DEBUG nova.scheduler.client.report [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:50:20 np0005588919 kernel: tap6855cb4f-41: entered promiscuous mode
Jan 20 09:50:20 np0005588919 NetworkManager[49104]: <info>  [1768920620.6133] manager: (tap6855cb4f-41): new Tun device (/org/freedesktop/NetworkManager/Devices/177)
Jan 20 09:50:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:20Z|00413|binding|INFO|Claiming lport 6855cb4f-4178-4447-af36-126ade033206 for this chassis.
Jan 20 09:50:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:20Z|00414|binding|INFO|6855cb4f-4178-4447-af36-126ade033206: Claiming fa:16:3e:4f:3f:20 10.100.0.12
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.614 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.894s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.615 225859 DEBUG nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.620 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.624 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:3f:20 10.100.0.12'], port_security=['fa:16:3e:4f:3f:20 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fdeb13eb-edb4-4bff-aeef-2671ba9d4618', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6855cb4f-4178-4447-af36-126ade033206) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.625 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6855cb4f-4178-4447-af36-126ade033206 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 bound to our chassis#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.627 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 762e1859-4db4-4d9e-b66f-d50316f80df4#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.629 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.636 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.640 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[15a6def7-b306-43e4-9e04-218f92c12d57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.641 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap762e1859-41 in ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.643 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap762e1859-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.643 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e488f571-9b74-4488-86e2-cbf6c8e21ddb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.644 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c4036340-9956-404a-8bbb-41667030d3a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 systemd-udevd[267207]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:50:20 np0005588919 systemd-machined[194361]: New machine qemu-48-instance-00000069.
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.659 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[8296d0e7-f9bd-41ee-8830-ad589aef3345]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.659 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:20Z|00415|binding|INFO|Setting lport 6855cb4f-4178-4447-af36-126ade033206 ovn-installed in OVS
Jan 20 09:50:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:20Z|00416|binding|INFO|Setting lport 6855cb4f-4178-4447-af36-126ade033206 up in Southbound
Jan 20 09:50:20 np0005588919 NetworkManager[49104]: <info>  [1768920620.6631] device (tap6855cb4f-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.664 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 NetworkManager[49104]: <info>  [1768920620.6651] device (tap6855cb4f-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:20 np0005588919 systemd[1]: Started Virtual Machine qemu-48-instance-00000069.
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.672 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[99531cbf-2f43-460c-a16d-32f8f3378938]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.681 225859 DEBUG nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.681 225859 DEBUG nova.network.neutron [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.700 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f9ae7cf6-bbf2-4899-b586-54a2fa6261d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 NetworkManager[49104]: <info>  [1768920620.7069] manager: (tap762e1859-40): new Veth device (/org/freedesktop/NetworkManager/Devices/178)
Jan 20 09:50:20 np0005588919 systemd-udevd[267210]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.706 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[34d68dbc-1475-4c93-9d8a-0b712b2ee5d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.708 225859 INFO nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.727 225859 DEBUG nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.738 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f2276c6c-896c-4218-8bef-fa033df5b1f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.740 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[59ae2534-9a53-43d1-8589-d436561d7ed5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 NetworkManager[49104]: <info>  [1768920620.7602] device (tap762e1859-40): carrier: link connected
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.764 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[bd2f3479-1221-470e-8481-5d103741918c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.780 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[09270f6b-16e5-4293-876b-10bcb0c03d2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565968, 'reachable_time': 26783, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267239, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.793 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1a9ccdf7-b171-4ea9-8d1c-defe4d197602]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:f1da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565968, 'tstamp': 565968}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267240, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.807 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[39363bd3-95e7-49fe-aa70-ee19da8a26e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap762e1859-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:f1:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565968, 'reachable_time': 26783, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267241, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.830 225859 DEBUG nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.831 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.832 225859 INFO nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Creating image(s)#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.831 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9162b7-f05b-4535-98d6-2c3b13c4a267]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:20.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.867 225859 DEBUG nova.storage.rbd_utils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] rbd image d5167284-086d-4b37-98b0-3853baabf418_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.893 225859 DEBUG nova.storage.rbd_utils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] rbd image d5167284-086d-4b37-98b0-3853baabf418_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.898 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8b1132eb-b211-45c1-8e83-0d663f0b482b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.900 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.900 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.901 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap762e1859-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:20 np0005588919 NetworkManager[49104]: <info>  [1768920620.9036] manager: (tap762e1859-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Jan 20 09:50:20 np0005588919 kernel: tap762e1859-40: entered promiscuous mode
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.906 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap762e1859-40, col_values=(('external_ids', {'iface-id': '9e775c45-1646-436d-a0cb-a5b5ec356e1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:20 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:20Z|00417|binding|INFO|Releasing lport 9e775c45-1646-436d-a0cb-a5b5ec356e1b from this chassis (sb_readonly=0)
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.908 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.909 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9268ce-0707-4770-87d8-9d690690090f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.910 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/762e1859-4db4-4d9e-b66f-d50316f80df4.pid.haproxy
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 762e1859-4db4-4d9e-b66f-d50316f80df4
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:50:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:20.911 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'env', 'PROCESS_TAG=haproxy-762e1859-4db4-4d9e-b66f-d50316f80df4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/762e1859-4db4-4d9e-b66f-d50316f80df4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.927 225859 DEBUG nova.storage.rbd_utils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] rbd image d5167284-086d-4b37-98b0-3853baabf418_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.932 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.964 225859 DEBUG nova.policy [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1bd93d04cc4468abe1d5c61f5144191', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acb30fbc0e3749e390d7f867060b5a2a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.969 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.998 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.998 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:20 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.999 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:20.999 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.024 225859 DEBUG nova.storage.rbd_utils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] rbd image d5167284-086d-4b37-98b0-3853baabf418_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.028 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d5167284-086d-4b37-98b0-3853baabf418_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.087 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for fdeb13eb-edb4-4bff-aeef-2671ba9d4618 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.088 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920621.0855782, fdeb13eb-edb4-4bff-aeef-2671ba9d4618 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.088 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.090 225859 DEBUG nova.compute.manager [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.100 225859 INFO nova.virt.libvirt.driver [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance running successfully.#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.101 225859 DEBUG nova.virt.libvirt.driver [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.116 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.121 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.148 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.149 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920621.0872164, fdeb13eb-edb4-4bff-aeef-2671ba9d4618 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.149 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] VM Started (Lifecycle Event)#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.193 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.198 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.218 225859 INFO nova.compute.manager [None req-a32fc634-b215-4655-b844-13e732281d75 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updating instance to original state: 'active'#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.229 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.292 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 d5167284-086d-4b37-98b0-3853baabf418_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.263s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:21 np0005588919 podman[267407]: 2026-01-20 14:50:21.297388414 +0000 UTC m=+0.050066974 container create 77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:50:21 np0005588919 systemd[1]: Started libpod-conmon-77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552.scope.
Jan 20 09:50:21 np0005588919 podman[267407]: 2026-01-20 14:50:21.270456314 +0000 UTC m=+0.023134904 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:50:21 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:50:21 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e4bf7e696f3dd4d151e1f99ee68b890d05b859756f606d057e896cbb8a0594b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:50:21 np0005588919 podman[267407]: 2026-01-20 14:50:21.388055584 +0000 UTC m=+0.140734144 container init 77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:50:21 np0005588919 podman[267407]: 2026-01-20 14:50:21.394841576 +0000 UTC m=+0.147520136 container start 77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:50:21 np0005588919 nova_compute[225855]: 2026-01-20 14:50:21.398 225859 DEBUG nova.storage.rbd_utils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] resizing rbd image d5167284-086d-4b37-98b0-3853baabf418_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:50:21 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[267440]: [NOTICE]   (267462) : New worker (267478) forked
Jan 20 09:50:21 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[267440]: [NOTICE]   (267462) : Loading success.
Jan 20 09:50:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:21.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:22 np0005588919 nova_compute[225855]: 2026-01-20 14:50:22.116 225859 DEBUG nova.objects.instance [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'migration_context' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:22 np0005588919 nova_compute[225855]: 2026-01-20 14:50:22.303 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:50:22 np0005588919 nova_compute[225855]: 2026-01-20 14:50:22.303 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Ensure instance console log exists: /var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:50:22 np0005588919 nova_compute[225855]: 2026-01-20 14:50:22.304 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:22 np0005588919 nova_compute[225855]: 2026-01-20 14:50:22.304 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:22 np0005588919 nova_compute[225855]: 2026-01-20 14:50:22.304 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:22 np0005588919 nova_compute[225855]: 2026-01-20 14:50:22.525 225859 DEBUG nova.network.neutron [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Successfully created port: 86cabae0-8599-4330-b71c-91eb2e6b76d8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:50:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:22.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:23.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:23 np0005588919 nova_compute[225855]: 2026-01-20 14:50:23.949 225859 DEBUG nova.network.neutron [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Successfully updated port: 86cabae0-8599-4330-b71c-91eb2e6b76d8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:50:23 np0005588919 nova_compute[225855]: 2026-01-20 14:50:23.969 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:23 np0005588919 nova_compute[225855]: 2026-01-20 14:50:23.970 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquired lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:23 np0005588919 nova_compute[225855]: 2026-01-20 14:50:23.970 225859 DEBUG nova.network.neutron [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:50:24 np0005588919 nova_compute[225855]: 2026-01-20 14:50:24.026 225859 DEBUG nova.compute.manager [req-5c2ab044-1c73-4429-b3b0-ee4004432102 req-1ea80a80-520b-4504-815f-0d1408c99c2e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-changed-86cabae0-8599-4330-b71c-91eb2e6b76d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:24 np0005588919 nova_compute[225855]: 2026-01-20 14:50:24.027 225859 DEBUG nova.compute.manager [req-5c2ab044-1c73-4429-b3b0-ee4004432102 req-1ea80a80-520b-4504-815f-0d1408c99c2e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Refreshing instance network info cache due to event network-changed-86cabae0-8599-4330-b71c-91eb2e6b76d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:50:24 np0005588919 nova_compute[225855]: 2026-01-20 14:50:24.027 225859 DEBUG oslo_concurrency.lockutils [req-5c2ab044-1c73-4429-b3b0-ee4004432102 req-1ea80a80-520b-4504-815f-0d1408c99c2e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:24 np0005588919 nova_compute[225855]: 2026-01-20 14:50:24.160 225859 DEBUG nova.network.neutron [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:50:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:24.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.403 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:25.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.840 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.908 225859 DEBUG nova.network.neutron [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updating instance_info_cache with network_info: [{"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.936 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Releasing lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.936 225859 DEBUG nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance network_info: |[{"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.937 225859 DEBUG oslo_concurrency.lockutils [req-5c2ab044-1c73-4429-b3b0-ee4004432102 req-1ea80a80-520b-4504-815f-0d1408c99c2e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.937 225859 DEBUG nova.network.neutron [req-5c2ab044-1c73-4429-b3b0-ee4004432102 req-1ea80a80-520b-4504-815f-0d1408c99c2e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Refreshing network info cache for port 86cabae0-8599-4330-b71c-91eb2e6b76d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.943 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Start _get_guest_xml network_info=[{"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.948 225859 WARNING nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.954 225859 DEBUG nova.virt.libvirt.host [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.955 225859 DEBUG nova.virt.libvirt.host [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.959 225859 DEBUG nova.virt.libvirt.host [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.960 225859 DEBUG nova.virt.libvirt.host [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.961 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.962 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.962 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.963 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.963 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.964 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.964 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.964 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.965 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.965 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.966 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.967 225859 DEBUG nova.virt.hardware [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:50:25 np0005588919 nova_compute[225855]: 2026-01-20 14:50:25.971 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.111 225859 DEBUG oslo_concurrency.lockutils [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.112 225859 DEBUG oslo_concurrency.lockutils [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.113 225859 DEBUG oslo_concurrency.lockutils [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.113 225859 DEBUG oslo_concurrency.lockutils [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.113 225859 DEBUG oslo_concurrency.lockutils [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.115 225859 INFO nova.compute.manager [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Terminating instance#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.117 225859 DEBUG nova.compute.manager [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.148 225859 DEBUG nova.compute.manager [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.149 225859 DEBUG oslo_concurrency.lockutils [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.149 225859 DEBUG oslo_concurrency.lockutils [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.150 225859 DEBUG oslo_concurrency.lockutils [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.150 225859 DEBUG nova.compute.manager [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.151 225859 WARNING nova.compute.manager [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.151 225859 DEBUG nova.compute.manager [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.151 225859 DEBUG oslo_concurrency.lockutils [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.152 225859 DEBUG oslo_concurrency.lockutils [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.152 225859 DEBUG oslo_concurrency.lockutils [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.153 225859 DEBUG nova.compute.manager [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:26 np0005588919 kernel: tap6855cb4f-41 (unregistering): left promiscuous mode
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.153 225859 WARNING nova.compute.manager [req-189f70e9-8765-4736-98cf-bae84dd0851d req-cd13bed3-b2fb-4b74-b7ca-a84746d67410 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:50:26 np0005588919 NetworkManager[49104]: <info>  [1768920626.1584] device (tap6855cb4f-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:26Z|00418|binding|INFO|Releasing lport 6855cb4f-4178-4447-af36-126ade033206 from this chassis (sb_readonly=0)
Jan 20 09:50:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:26Z|00419|binding|INFO|Setting lport 6855cb4f-4178-4447-af36-126ade033206 down in Southbound
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.168 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:26Z|00420|binding|INFO|Removing iface tap6855cb4f-41 ovn-installed in OVS
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.173 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.183 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:3f:20 10.100.0.12'], port_security=['fa:16:3e:4f:3f:20 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fdeb13eb-edb4-4bff-aeef-2671ba9d4618', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-762e1859-4db4-4d9e-b66f-d50316f80df4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c5f03d46c0c4162a3b2f1530850bb6c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '80535eda-fa59-4edc-8e3d-9bfea6517730', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2474a8ca-bb96-4cae-9133-23419b81a9fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6855cb4f-4178-4447-af36-126ade033206) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.184 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6855cb4f-4178-4447-af36-126ade033206 in datapath 762e1859-4db4-4d9e-b66f-d50316f80df4 unbound from our chassis#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.186 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 762e1859-4db4-4d9e-b66f-d50316f80df4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.186 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1046d355-d0d0-4971-918a-36badc8972f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.187 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 namespace which is not needed anymore#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.195 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000069.scope: Deactivated successfully.
Jan 20 09:50:26 np0005588919 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000069.scope: Consumed 5.707s CPU time.
Jan 20 09:50:26 np0005588919 systemd-machined[194361]: Machine qemu-48-instance-00000069 terminated.
Jan 20 09:50:26 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[267440]: [NOTICE]   (267462) : haproxy version is 2.8.14-c23fe91
Jan 20 09:50:26 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[267440]: [NOTICE]   (267462) : path to executable is /usr/sbin/haproxy
Jan 20 09:50:26 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[267440]: [WARNING]  (267462) : Exiting Master process...
Jan 20 09:50:26 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[267440]: [ALERT]    (267462) : Current worker (267478) exited with code 143 (Terminated)
Jan 20 09:50:26 np0005588919 neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4[267440]: [WARNING]  (267462) : All workers exited. Exiting... (0)
Jan 20 09:50:26 np0005588919 systemd[1]: libpod-77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552.scope: Deactivated successfully.
Jan 20 09:50:26 np0005588919 podman[267556]: 2026-01-20 14:50:26.315336167 +0000 UTC m=+0.044087886 container died 77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.333 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.340 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.345 225859 INFO nova.virt.libvirt.driver [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Instance destroyed successfully.#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.345 225859 DEBUG nova.objects.instance [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lazy-loading 'resources' on Instance uuid fdeb13eb-edb4-4bff-aeef-2671ba9d4618 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:26 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552-userdata-shm.mount: Deactivated successfully.
Jan 20 09:50:26 np0005588919 systemd[1]: var-lib-containers-storage-overlay-3e4bf7e696f3dd4d151e1f99ee68b890d05b859756f606d057e896cbb8a0594b-merged.mount: Deactivated successfully.
Jan 20 09:50:26 np0005588919 podman[267556]: 2026-01-20 14:50:26.357213519 +0000 UTC m=+0.085965238 container cleanup 77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:50:26 np0005588919 systemd[1]: libpod-conmon-77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552.scope: Deactivated successfully.
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.367 225859 DEBUG nova.virt.libvirt.vif [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:49:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2012792656',display_name='tempest-ServerActionsTestJSON-server-2012792656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-2012792656',id=105,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLgEzx5mLsSqRL7L9WKOzM+WdeJ40U103wY9H3VMZ41G/sN5tQtQSt9lXWKTyc6pt00bfKD0E9GPugNMpy+dzSSpK23o3CgadkzAfAvjQCeCPOSp3fX13FGApomGd1HRCQ==',key_name='tempest-keypair-1602241722',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1c5f03d46c0c4162a3b2f1530850bb6c',ramdisk_id='',reservation_id='r-c8be97ji',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1020442335',owner_user_name='tempest-ServerActionsTestJSON-1020442335-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e9278fdb9e645b7938f3edb20c4d3cf',uuid=fdeb13eb-edb4-4bff-aeef-2671ba9d4618,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.367 225859 DEBUG nova.network.os_vif_util [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converting VIF {"id": "6855cb4f-4178-4447-af36-126ade033206", "address": "fa:16:3e:4f:3f:20", "network": {"id": "762e1859-4db4-4d9e-b66f-d50316f80df4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1917526237-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1c5f03d46c0c4162a3b2f1530850bb6c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6855cb4f-41", "ovs_interfaceid": "6855cb4f-4178-4447-af36-126ade033206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.368 225859 DEBUG nova.network.os_vif_util [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.369 225859 DEBUG os_vif [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.371 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.371 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6855cb4f-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.373 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.375 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.377 225859 INFO os_vif [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:3f:20,bridge_name='br-int',has_traffic_filtering=True,id=6855cb4f-4178-4447-af36-126ade033206,network=Network(762e1859-4db4-4d9e-b66f-d50316f80df4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6855cb4f-41')#033[00m
Jan 20 09:50:26 np0005588919 podman[267593]: 2026-01-20 14:50:26.420689581 +0000 UTC m=+0.041786600 container remove 77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.426 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0ad1e5-f3fd-4439-813b-fafa80cec591]: (4, ('Tue Jan 20 02:50:26 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552)\n77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552\nTue Jan 20 02:50:26 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 (77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552)\n77c1e4defe56387ddf8a0a85cd09f24ebb67ce03630c8150f6659c6c85bb6552\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.428 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3bfc06-33ba-454b-bb5b-626206412c58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.429 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap762e1859-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.430 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 kernel: tap762e1859-40: left promiscuous mode
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.444 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:50:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2577488927' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.446 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fcbd5b33-97b9-4847-9165-d2dcc687a541]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.458 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[84c9d80d-cd96-49ef-ad86-edb08e5ae557]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.459 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[148c4d4c-cb0e-447d-ba3f-4af735b0cde1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.470 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.477 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7654c02b-c2c7-40ef-b5dc-07baeffceeb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565962, 'reachable_time': 39487, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267626, 'error': None, 'target': 'ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.480 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-762e1859-4db4-4d9e-b66f-d50316f80df4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:50:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:26.480 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[b0da330c-4325-4077-8935-f455e3a2561b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:26 np0005588919 systemd[1]: run-netns-ovnmeta\x2d762e1859\x2d4db4\x2d4d9e\x2db66f\x2dd50316f80df4.mount: Deactivated successfully.
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.496 225859 DEBUG nova.storage.rbd_utils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] rbd image d5167284-086d-4b37-98b0-3853baabf418_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.499 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.751 225859 INFO nova.virt.libvirt.driver [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Deleting instance files /var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618_del#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.753 225859 INFO nova.virt.libvirt.driver [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Deletion of /var/lib/nova/instances/fdeb13eb-edb4-4bff-aeef-2671ba9d4618_del complete#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.823 225859 INFO nova.compute.manager [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Took 0.71 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.823 225859 DEBUG oslo.service.loopingcall [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.824 225859 DEBUG nova.compute.manager [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.824 225859 DEBUG nova.network.neutron [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:50:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:26.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.912 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.915 225859 DEBUG nova.virt.libvirt.vif [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:50:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1906505493',display_name='tempest-ServerDiskConfigTestJSON-server-1906505493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1906505493',id=109,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acb30fbc0e3749e390d7f867060b5a2a',ramdisk_id='',reservation_id='r-5s3il9ii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1806346246',owner_user_name='tempest-Server
DiskConfigTestJSON-1806346246-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:50:20Z,user_data=None,user_id='a1bd93d04cc4468abe1d5c61f5144191',uuid=d5167284-086d-4b37-98b0-3853baabf418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.915 225859 DEBUG nova.network.os_vif_util [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converting VIF {"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.916 225859 DEBUG nova.network.os_vif_util [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.919 225859 DEBUG nova.objects.instance [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'pci_devices' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.934 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <uuid>d5167284-086d-4b37-98b0-3853baabf418</uuid>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <name>instance-0000006d</name>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1906505493</nova:name>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:50:25</nova:creationTime>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <nova:user uuid="a1bd93d04cc4468abe1d5c61f5144191">tempest-ServerDiskConfigTestJSON-1806346246-project-member</nova:user>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <nova:project uuid="acb30fbc0e3749e390d7f867060b5a2a">tempest-ServerDiskConfigTestJSON-1806346246</nova:project>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <nova:port uuid="86cabae0-8599-4330-b71c-91eb2e6b76d8">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <entry name="serial">d5167284-086d-4b37-98b0-3853baabf418</entry>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <entry name="uuid">d5167284-086d-4b37-98b0-3853baabf418</entry>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d5167284-086d-4b37-98b0-3853baabf418_disk">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d5167284-086d-4b37-98b0-3853baabf418_disk.config">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:30:05:34"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <target dev="tap86cabae0-85"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418/console.log" append="off"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:50:26 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:50:26 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:50:26 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:50:26 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.939 225859 DEBUG nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Preparing to wait for external event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.940 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.941 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.941 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.942 225859 DEBUG nova.virt.libvirt.vif [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:50:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1906505493',display_name='tempest-ServerDiskConfigTestJSON-server-1906505493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1906505493',id=109,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acb30fbc0e3749e390d7f867060b5a2a',ramdisk_id='',reservation_id='r-5s3il9ii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1806346246',owner_user_name='tempest-ServerDiskConfigTestJSON-1806346246-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:50:20Z,user_data=None,user_id='a1bd93d04cc4468abe1d5c61f5144191',uuid=d5167284-086d-4b37-98b0-3853baabf418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.943 225859 DEBUG nova.network.os_vif_util [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converting VIF {"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.944 225859 DEBUG nova.network.os_vif_util [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.945 225859 DEBUG os_vif [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.946 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.947 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.948 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.952 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.953 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86cabae0-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:26 np0005588919 nova_compute[225855]: 2026-01-20 14:50:26.954 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap86cabae0-85, col_values=(('external_ids', {'iface-id': '86cabae0-8599-4330-b71c-91eb2e6b76d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:05:34', 'vm-uuid': 'd5167284-086d-4b37-98b0-3853baabf418'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.006 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:27 np0005588919 NetworkManager[49104]: <info>  [1768920627.0070] manager: (tap86cabae0-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.010 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.014 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.016 225859 INFO os_vif [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85')#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.211 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.212 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.212 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] No VIF found with MAC fa:16:3e:30:05:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.213 225859 INFO nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Using config drive#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.250 225859 DEBUG nova.storage.rbd_utils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] rbd image d5167284-086d-4b37-98b0-3853baabf418_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:27.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.884 225859 INFO nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Creating config drive at /var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418/disk.config#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.894 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpapn263wm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.928 225859 DEBUG nova.network.neutron [req-5c2ab044-1c73-4429-b3b0-ee4004432102 req-1ea80a80-520b-4504-815f-0d1408c99c2e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updated VIF entry in instance network info cache for port 86cabae0-8599-4330-b71c-91eb2e6b76d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.930 225859 DEBUG nova.network.neutron [req-5c2ab044-1c73-4429-b3b0-ee4004432102 req-1ea80a80-520b-4504-815f-0d1408c99c2e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updating instance_info_cache with network_info: [{"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:27 np0005588919 nova_compute[225855]: 2026-01-20 14:50:27.953 225859 DEBUG oslo_concurrency.lockutils [req-5c2ab044-1c73-4429-b3b0-ee4004432102 req-1ea80a80-520b-4504-815f-0d1408c99c2e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.005 225859 DEBUG nova.network.neutron [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.033 225859 INFO nova.compute.manager [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Took 1.21 seconds to deallocate network for instance.#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.033 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpapn263wm" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.122 225859 DEBUG nova.storage.rbd_utils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] rbd image d5167284-086d-4b37-98b0-3853baabf418_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.128 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418/disk.config d5167284-086d-4b37-98b0-3853baabf418_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.155 225859 DEBUG nova.compute.manager [req-8fbe0e71-504a-41aa-bdd4-3671a5504300 req-17469b22-9505-4f3c-b0ab-2202d3c8ac5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-deleted-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.214 225859 DEBUG oslo_concurrency.lockutils [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.215 225859 DEBUG oslo_concurrency.lockutils [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.219 225859 DEBUG oslo_concurrency.lockutils [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.257 225859 DEBUG nova.compute.manager [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-unplugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.258 225859 DEBUG oslo_concurrency.lockutils [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.258 225859 DEBUG oslo_concurrency.lockutils [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.258 225859 DEBUG oslo_concurrency.lockutils [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.259 225859 DEBUG nova.compute.manager [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-unplugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.259 225859 WARNING nova.compute.manager [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-unplugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.259 225859 DEBUG nova.compute.manager [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.260 225859 DEBUG oslo_concurrency.lockutils [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.260 225859 DEBUG oslo_concurrency.lockutils [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.260 225859 DEBUG oslo_concurrency.lockutils [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.261 225859 DEBUG nova.compute.manager [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] No waiting events found dispatching network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.261 225859 WARNING nova.compute.manager [req-ee2a81e4-df19-4a56-bde7-2b39e6040aca req-d5b89990-8a5f-4a46-a2ce-b327d61d4035 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Received unexpected event network-vif-plugged-6855cb4f-4178-4447-af36-126ade033206 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.263 225859 INFO nova.scheduler.client.report [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Deleted allocations for instance fdeb13eb-edb4-4bff-aeef-2671ba9d4618#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.283 225859 DEBUG oslo_concurrency.processutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418/disk.config d5167284-086d-4b37-98b0-3853baabf418_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.283 225859 INFO nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Deleting local config drive /var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418/disk.config because it was imported into RBD.#033[00m
Jan 20 09:50:28 np0005588919 kernel: tap86cabae0-85: entered promiscuous mode
Jan 20 09:50:28 np0005588919 NetworkManager[49104]: <info>  [1768920628.3301] manager: (tap86cabae0-85): new Tun device (/org/freedesktop/NetworkManager/Devices/181)
Jan 20 09:50:28 np0005588919 systemd-udevd[267538]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:50:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:28Z|00421|binding|INFO|Claiming lport 86cabae0-8599-4330-b71c-91eb2e6b76d8 for this chassis.
Jan 20 09:50:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:28Z|00422|binding|INFO|86cabae0-8599-4330-b71c-91eb2e6b76d8: Claiming fa:16:3e:30:05:34 10.100.0.3
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.330 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.335 225859 DEBUG oslo_concurrency.lockutils [None req-2c5030c6-41ba-4de4-a518-9b488a30855c 3e9278fdb9e645b7938f3edb20c4d3cf 1c5f03d46c0c4162a3b2f1530850bb6c - - default default] Lock "fdeb13eb-edb4-4bff-aeef-2671ba9d4618" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.337 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:05:34 10.100.0.3'], port_security=['fa:16:3e:30:05:34 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd5167284-086d-4b37-98b0-3853baabf418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acb30fbc0e3749e390d7f867060b5a2a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '19fab802-7db8-4c89-8f8e-8dcfc14d4627', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0e287ba-f88b-46f5-bb7f-3cc2a74be88e, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=86cabae0-8599-4330-b71c-91eb2e6b76d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.338 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 86cabae0-8599-4330-b71c-91eb2e6b76d8 in datapath 3379e2b3-ffb2-4391-969b-c9dc51bfbe25 bound to our chassis#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.340 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3379e2b3-ffb2-4391-969b-c9dc51bfbe25#033[00m
Jan 20 09:50:28 np0005588919 NetworkManager[49104]: <info>  [1768920628.3434] device (tap86cabae0-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:50:28 np0005588919 NetworkManager[49104]: <info>  [1768920628.3442] device (tap86cabae0-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:50:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:28Z|00423|binding|INFO|Setting lport 86cabae0-8599-4330-b71c-91eb2e6b76d8 ovn-installed in OVS
Jan 20 09:50:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:28Z|00424|binding|INFO|Setting lport 86cabae0-8599-4330-b71c-91eb2e6b76d8 up in Southbound
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.350 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.350 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a64921f7-51f6-406b-97a6-993a9cd6fdb2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.351 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3379e2b3-f1 in ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.353 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.353 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3379e2b3-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.354 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[48d08744-8721-4a89-89be-e2a7d5e869c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.355 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e647e381-1bfe-4428-945b-f7e300aeb7ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.365 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbd88b4-25c2-48fa-a320-7ff501d5e959]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 systemd-machined[194361]: New machine qemu-49-instance-0000006d.
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.376 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[67ef57a4-9016-43e4-981e-dfb4b1814373]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 systemd[1]: Started Virtual Machine qemu-49-instance-0000006d.
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.401 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[62237bf6-d323-4151-8c84-4496ce098f03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.405 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[634033a4-f27c-44be-857c-2584cbeb9abc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 NetworkManager[49104]: <info>  [1768920628.4069] manager: (tap3379e2b3-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/182)
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.437 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a4b857-6e6f-4fb1-8dde-a711efd096f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.440 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[764a0b4e-44d9-4b71-9286-cd23644a9b82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 NetworkManager[49104]: <info>  [1768920628.4597] device (tap3379e2b3-f0): carrier: link connected
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.464 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[528ce9ca-6ef1-4b4e-8e26-3932db02f901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.479 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ac50602c-7e7d-4ef0-a18f-d46634a2059b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3379e2b3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:86:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566738, 'reachable_time': 15831, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267773, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.491 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[57683e64-b9d0-4bfc-bc5a-d200616d6646]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef1:86fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 566738, 'tstamp': 566738}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267774, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.504 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6fbfce60-221c-4cd9-9ef6-1dc4e7fc4115]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3379e2b3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:86:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566738, 'reachable_time': 15831, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267775, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.528 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8dfc5a90-f8f0-4d53-b92a-cab1b35fa632]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.577 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c8afac1b-2cba-455a-8afc-e20186bc8cf0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.578 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3379e2b3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.579 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.579 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3379e2b3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.581 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:28 np0005588919 NetworkManager[49104]: <info>  [1768920628.5817] manager: (tap3379e2b3-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Jan 20 09:50:28 np0005588919 kernel: tap3379e2b3-f0: entered promiscuous mode
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.583 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.585 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3379e2b3-f0, col_values=(('external_ids', {'iface-id': 'b32ddf23-a8dd-4e6d-a410-ccb24b214d35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.586 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:28 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:28Z|00425|binding|INFO|Releasing lport b32ddf23-a8dd-4e6d-a410-ccb24b214d35 from this chassis (sb_readonly=0)
Jan 20 09:50:28 np0005588919 nova_compute[225855]: 2026-01-20 14:50:28.600 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.602 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.603 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[66a243b6-0601-44cd-8389-15d8fefa0a38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.603 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-3379e2b3-ffb2-4391-969b-c9dc51bfbe25
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.pid.haproxy
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 3379e2b3-ffb2-4391-969b-c9dc51bfbe25
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:50:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:28.604 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'env', 'PROCESS_TAG=haproxy-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:50:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:28.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:28 np0005588919 podman[267823]: 2026-01-20 14:50:28.954881653 +0000 UTC m=+0.054831018 container create 731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:50:29 np0005588919 systemd[1]: Started libpod-conmon-731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949.scope.
Jan 20 09:50:29 np0005588919 podman[267823]: 2026-01-20 14:50:28.929213729 +0000 UTC m=+0.029163124 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:50:29 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:50:29 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9eeb75a2b56f63e1e54b32f2517ebb6272ef57e2b63ce8662cb464dfa749ef5b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:50:29 np0005588919 nova_compute[225855]: 2026-01-20 14:50:29.058 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920629.0572743, d5167284-086d-4b37-98b0-3853baabf418 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:29 np0005588919 nova_compute[225855]: 2026-01-20 14:50:29.059 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] VM Started (Lifecycle Event)#033[00m
Jan 20 09:50:29 np0005588919 podman[267823]: 2026-01-20 14:50:29.064538299 +0000 UTC m=+0.164487674 container init 731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:50:29 np0005588919 podman[267823]: 2026-01-20 14:50:29.07096017 +0000 UTC m=+0.170909525 container start 731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 20 09:50:29 np0005588919 nova_compute[225855]: 2026-01-20 14:50:29.081 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:29 np0005588919 nova_compute[225855]: 2026-01-20 14:50:29.088 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920629.0588503, d5167284-086d-4b37-98b0-3853baabf418 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:29 np0005588919 nova_compute[225855]: 2026-01-20 14:50:29.090 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:50:29 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[267863]: [NOTICE]   (267868) : New worker (267870) forked
Jan 20 09:50:29 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[267863]: [NOTICE]   (267868) : Loading success.
Jan 20 09:50:29 np0005588919 nova_compute[225855]: 2026-01-20 14:50:29.112 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:29 np0005588919 nova_compute[225855]: 2026-01-20 14:50:29.116 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:50:29 np0005588919 nova_compute[225855]: 2026-01-20 14:50:29.261 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:50:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e250 e250: 3 total, 3 up, 3 in
Jan 20 09:50:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:29.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.217 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920615.2164834, 52477e64-7989-4aa2-88e1-31600bfae2ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.218 225859 INFO nova.compute.manager [-] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.321 225859 DEBUG nova.compute.manager [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.321 225859 DEBUG oslo_concurrency.lockutils [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.322 225859 DEBUG oslo_concurrency.lockutils [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.322 225859 DEBUG oslo_concurrency.lockutils [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.323 225859 DEBUG nova.compute.manager [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Processing event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.323 225859 DEBUG nova.compute.manager [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.323 225859 DEBUG oslo_concurrency.lockutils [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.324 225859 DEBUG oslo_concurrency.lockutils [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.324 225859 DEBUG oslo_concurrency.lockutils [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.325 225859 DEBUG nova.compute.manager [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] No waiting events found dispatching network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.325 225859 WARNING nova.compute.manager [req-41911b97-d75c-484f-a56a-e8e35ce25ebc req-74038d00-378c-4b00-9565-e96a39fac7de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received unexpected event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.326 225859 DEBUG nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.332 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920630.3318157, d5167284-086d-4b37-98b0-3853baabf418 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.332 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.338 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.343 225859 INFO nova.virt.libvirt.driver [-] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance spawned successfully.#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.344 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.545 225859 DEBUG nova.compute.manager [None req-0c1dbbb2-011b-483d-aff5-9986902b6e38 - - - - - -] [instance: 52477e64-7989-4aa2-88e1-31600bfae2ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.551 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.556 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.556 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.557 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.557 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.558 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.558 225859 DEBUG nova.virt.libvirt.driver [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.563 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.637 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.658 225859 INFO nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Took 9.83 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.658 225859 DEBUG nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.748 225859 INFO nova.compute.manager [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Took 11.06 seconds to build instance.#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.780 225859 DEBUG oslo_concurrency.lockutils [None req-caeffa0b-6abc-4875-a420-9b31a2b0b774 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:30 np0005588919 nova_compute[225855]: 2026-01-20 14:50:30.843 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:50:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:30.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:50:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:31.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:32 np0005588919 nova_compute[225855]: 2026-01-20 14:50:32.006 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:32 np0005588919 nova_compute[225855]: 2026-01-20 14:50:32.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:32 np0005588919 nova_compute[225855]: 2026-01-20 14:50:32.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:32 np0005588919 nova_compute[225855]: 2026-01-20 14:50:32.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:50:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:32.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:33 np0005588919 nova_compute[225855]: 2026-01-20 14:50:33.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:33 np0005588919 nova_compute[225855]: 2026-01-20 14:50:33.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:50:33 np0005588919 nova_compute[225855]: 2026-01-20 14:50:33.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:50:33 np0005588919 nova_compute[225855]: 2026-01-20 14:50:33.359 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:33 np0005588919 nova_compute[225855]: 2026-01-20 14:50:33.359 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:33 np0005588919 nova_compute[225855]: 2026-01-20 14:50:33.359 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:50:33 np0005588919 nova_compute[225855]: 2026-01-20 14:50:33.360 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:33.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:34.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:34.999 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updating instance_info_cache with network_info: [{"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.019 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.020 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.025 225859 DEBUG nova.compute.manager [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.030 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:35.100 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:35.101 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.102 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.131 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.132 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e251 e251: 3 total, 3 up, 3 in
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.157 225859 DEBUG nova.objects.instance [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'pci_requests' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.183 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.183 225859 INFO nova.compute.claims [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.183 225859 DEBUG nova.objects.instance [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'resources' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.205 225859 DEBUG nova.objects.instance [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'pci_devices' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.261 225859 INFO nova.compute.resource_tracker [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updating resource usage from migration 135cd3df-9888-44b9-a278-94d967ab2b1d#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.359 225859 DEBUG oslo_concurrency.processutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:35.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:50:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/170477161' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.806 225859 DEBUG oslo_concurrency.processutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.812 225859 DEBUG nova.compute.provider_tree [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.835 225859 DEBUG nova.scheduler.client.report [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.845 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.860 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:35 np0005588919 nova_compute[225855]: 2026-01-20 14:50:35.861 225859 INFO nova.compute.manager [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Migrating#033[00m
Jan 20 09:50:36 np0005588919 nova_compute[225855]: 2026-01-20 14:50:36.071 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:36 np0005588919 nova_compute[225855]: 2026-01-20 14:50:36.073 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquired lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:36 np0005588919 nova_compute[225855]: 2026-01-20 14:50:36.073 225859 DEBUG nova.network.neutron [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:50:36 np0005588919 podman[267955]: 2026-01-20 14:50:36.090855639 +0000 UTC m=+0.129474176 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 09:50:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e252 e252: 3 total, 3 up, 3 in
Jan 20 09:50:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:36.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:37 np0005588919 nova_compute[225855]: 2026-01-20 14:50:37.008 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e253 e253: 3 total, 3 up, 3 in
Jan 20 09:50:37 np0005588919 nova_compute[225855]: 2026-01-20 14:50:37.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:37 np0005588919 nova_compute[225855]: 2026-01-20 14:50:37.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:37 np0005588919 nova_compute[225855]: 2026-01-20 14:50:37.363 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:37 np0005588919 nova_compute[225855]: 2026-01-20 14:50:37.363 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:37 np0005588919 nova_compute[225855]: 2026-01-20 14:50:37.364 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:37 np0005588919 nova_compute[225855]: 2026-01-20 14:50:37.364 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:50:37 np0005588919 nova_compute[225855]: 2026-01-20 14:50:37.365 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:37.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:50:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/547360188' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:50:37 np0005588919 nova_compute[225855]: 2026-01-20 14:50:37.824 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.197 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000006d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.198 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000006d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.352 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.353 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4273MB free_disk=20.93305206298828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.353 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.354 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.413 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Applying migration context for instance d5167284-086d-4b37-98b0-3853baabf418 as it has an incoming, in-progress migration 135cd3df-9888-44b9-a278-94d967ab2b1d. Migration status is pre-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.413 225859 INFO nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updating resource usage from migration 135cd3df-9888-44b9-a278-94d967ab2b1d#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.430 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Migration 135cd3df-9888-44b9-a278-94d967ab2b1d is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.430 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance d5167284-086d-4b37-98b0-3853baabf418 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.431 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.431 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:50:38 np0005588919 nova_compute[225855]: 2026-01-20 14:50:38.627 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:50:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:38.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:39 np0005588919 nova_compute[225855]: 2026-01-20 14:50:39.008 225859 DEBUG nova.network.neutron [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updating instance_info_cache with network_info: [{"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:50:39 np0005588919 nova_compute[225855]: 2026-01-20 14:50:39.030 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Releasing lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:50:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:50:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2975627258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:50:39 np0005588919 nova_compute[225855]: 2026-01-20 14:50:39.050 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:50:39 np0005588919 nova_compute[225855]: 2026-01-20 14:50:39.061 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:50:39 np0005588919 nova_compute[225855]: 2026-01-20 14:50:39.092 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:50:39 np0005588919 nova_compute[225855]: 2026-01-20 14:50:39.128 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:50:39 np0005588919 nova_compute[225855]: 2026-01-20 14:50:39.129 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.775s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:39 np0005588919 nova_compute[225855]: 2026-01-20 14:50:39.154 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 20 09:50:39 np0005588919 nova_compute[225855]: 2026-01-20 14:50:39.158 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:50:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:39.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:40 np0005588919 nova_compute[225855]: 2026-01-20 14:50:40.848 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:40.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:41 np0005588919 nova_compute[225855]: 2026-01-20 14:50:41.343 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920626.342226, fdeb13eb-edb4-4bff-aeef-2671ba9d4618 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:50:41 np0005588919 nova_compute[225855]: 2026-01-20 14:50:41.344 225859 INFO nova.compute.manager [-] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:50:41 np0005588919 nova_compute[225855]: 2026-01-20 14:50:41.365 225859 DEBUG nova.compute.manager [None req-05b2f404-44cc-40a8-80c4-e9e94641f940 - - - - - -] [instance: fdeb13eb-edb4-4bff-aeef-2671ba9d4618] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:50:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:41.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:42 np0005588919 nova_compute[225855]: 2026-01-20 14:50:42.011 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:42 np0005588919 nova_compute[225855]: 2026-01-20 14:50:42.127 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:42 np0005588919 nova_compute[225855]: 2026-01-20 14:50:42.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:42 np0005588919 nova_compute[225855]: 2026-01-20 14:50:42.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:42.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:43.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:44.103 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:44Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:05:34 10.100.0.3
Jan 20 09:50:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:44Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:05:34 10.100.0.3
Jan 20 09:50:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e254 e254: 3 total, 3 up, 3 in
Jan 20 09:50:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 09:50:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:44.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 09:50:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:45.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:45 np0005588919 nova_compute[225855]: 2026-01-20 14:50:45.850 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 09:50:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:50:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:50:45 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:50:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:50:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:46.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:50:47 np0005588919 nova_compute[225855]: 2026-01-20 14:50:47.014 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:47.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:48 np0005588919 podman[268163]: 2026-01-20 14:50:48.041236322 +0000 UTC m=+0.083844968 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 20 09:50:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:48.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:49 np0005588919 nova_compute[225855]: 2026-01-20 14:50:49.201 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:50:49 np0005588919 nova_compute[225855]: 2026-01-20 14:50:49.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:50:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:50:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:49.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:50:50 np0005588919 nova_compute[225855]: 2026-01-20 14:50:50.853 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:50.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:51.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.016 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:52 np0005588919 kernel: tap86cabae0-85 (unregistering): left promiscuous mode
Jan 20 09:50:52 np0005588919 NetworkManager[49104]: <info>  [1768920652.0478] device (tap86cabae0-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.059 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:52 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:52Z|00426|binding|INFO|Releasing lport 86cabae0-8599-4330-b71c-91eb2e6b76d8 from this chassis (sb_readonly=0)
Jan 20 09:50:52 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:52Z|00427|binding|INFO|Setting lport 86cabae0-8599-4330-b71c-91eb2e6b76d8 down in Southbound
Jan 20 09:50:52 np0005588919 ovn_controller[130490]: 2026-01-20T14:50:52Z|00428|binding|INFO|Removing iface tap86cabae0-85 ovn-installed in OVS
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.104 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.104 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:05:34 10.100.0.3'], port_security=['fa:16:3e:30:05:34 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd5167284-086d-4b37-98b0-3853baabf418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acb30fbc0e3749e390d7f867060b5a2a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '19fab802-7db8-4c89-8f8e-8dcfc14d4627', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0e287ba-f88b-46f5-bb7f-3cc2a74be88e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=86cabae0-8599-4330-b71c-91eb2e6b76d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.106 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 86cabae0-8599-4330-b71c-91eb2e6b76d8 in datapath 3379e2b3-ffb2-4391-969b-c9dc51bfbe25 unbound from our chassis#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.108 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3379e2b3-ffb2-4391-969b-c9dc51bfbe25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.109 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[21bb2ab8-484a-42c5-a83a-5b2ce5b3e359]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.110 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 namespace which is not needed anymore#033[00m
Jan 20 09:50:52 np0005588919 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Jan 20 09:50:52 np0005588919 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006d.scope: Consumed 13.958s CPU time.
Jan 20 09:50:52 np0005588919 systemd-machined[194361]: Machine qemu-49-instance-0000006d terminated.
Jan 20 09:50:52 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[267863]: [NOTICE]   (267868) : haproxy version is 2.8.14-c23fe91
Jan 20 09:50:52 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[267863]: [NOTICE]   (267868) : path to executable is /usr/sbin/haproxy
Jan 20 09:50:52 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[267863]: [WARNING]  (267868) : Exiting Master process...
Jan 20 09:50:52 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[267863]: [WARNING]  (267868) : Exiting Master process...
Jan 20 09:50:52 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[267863]: [ALERT]    (267868) : Current worker (267870) exited with code 143 (Terminated)
Jan 20 09:50:52 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[267863]: [WARNING]  (267868) : All workers exited. Exiting... (0)
Jan 20 09:50:52 np0005588919 systemd[1]: libpod-731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949.scope: Deactivated successfully.
Jan 20 09:50:52 np0005588919 podman[268311]: 2026-01-20 14:50:52.235631914 +0000 UTC m=+0.045575008 container died 731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Jan 20 09:50:52 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949-userdata-shm.mount: Deactivated successfully.
Jan 20 09:50:52 np0005588919 systemd[1]: var-lib-containers-storage-overlay-9eeb75a2b56f63e1e54b32f2517ebb6272ef57e2b63ce8662cb464dfa749ef5b-merged.mount: Deactivated successfully.
Jan 20 09:50:52 np0005588919 podman[268311]: 2026-01-20 14:50:52.279164293 +0000 UTC m=+0.089107387 container cleanup 731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:50:52 np0005588919 systemd[1]: libpod-conmon-731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949.scope: Deactivated successfully.
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.338 225859 INFO nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.345 225859 INFO nova.virt.libvirt.driver [-] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance destroyed successfully.#033[00m
Jan 20 09:50:52 np0005588919 podman[268343]: 2026-01-20 14:50:52.345925458 +0000 UTC m=+0.045783674 container remove 731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.347 225859 DEBUG nova.virt.libvirt.vif [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:50:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1906505493',display_name='tempest-ServerDiskConfigTestJSON-server-1906505493',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1906505493',id=109,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acb30fbc0e3749e390d7f867060b5a2a',ramdisk_id='',reservation_id='r-5s3il9ii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1806346246',owner_user_name='tempest-ServerDiskConfigTestJSON-1806346246-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:50:35Z,user_data=None,user_id='a1bd93d04cc4468abe1d5c61f5144191',uuid=d5167284-086d-4b37-98b0-3853baabf418,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:30:05:34"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.347 225859 DEBUG nova.network.os_vif_util [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converting VIF {"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:30:05:34"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.348 225859 DEBUG nova.network.os_vif_util [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.349 225859 DEBUG os_vif [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.352 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.353 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86cabae0-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.354 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7baec480-5fe1-4191-a76f-7f4dfc374430]: (4, ('Tue Jan 20 02:50:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 (731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949)\n731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949\nTue Jan 20 02:50:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 (731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949)\n731bfd77941a55fc6ba26854288cff85a728673e94493d76f8d51354912a4949\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.355 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.356 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.357 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9b23a01e-398f-47bc-88ae-69dc4025aed6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.358 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3379e2b3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.359 225859 INFO os_vif [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85')#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.359 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:52 np0005588919 kernel: tap3379e2b3-f0: left promiscuous mode
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.365 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] skipping disk for instance-0000006d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.366 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] skipping disk for instance-0000006d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.389 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.392 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[db0246f8-ad38-40ae-8d67-01b611bf2880]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.415 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[949fdb46-500a-4249-ad9d-f45356a2d76c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.416 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ff6a267d-3869-46cf-b110-a281c26d4b20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.433 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[14e27d9b-6d6e-4f78-bc80-7a8d14d55688]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566732, 'reachable_time': 19078, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268367, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.436 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:50:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:50:52.436 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[d076fe63-98ce-4881-aa5c-2cda8e69f687]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:50:52 np0005588919 systemd[1]: run-netns-ovnmeta\x2d3379e2b3\x2dffb2\x2d4391\x2d969b\x2dc9dc51bfbe25.mount: Deactivated successfully.
Jan 20 09:50:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.559 225859 DEBUG nova.network.neutron [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Port 86cabae0-8599-4330-b71c-91eb2e6b76d8 binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.567 225859 DEBUG nova.compute.manager [req-4c947333-3739-4fd8-a571-f7155aed0aca req-0e01e685-b93a-428d-ab11-d5a364545294 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-vif-unplugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.567 225859 DEBUG oslo_concurrency.lockutils [req-4c947333-3739-4fd8-a571-f7155aed0aca req-0e01e685-b93a-428d-ab11-d5a364545294 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.568 225859 DEBUG oslo_concurrency.lockutils [req-4c947333-3739-4fd8-a571-f7155aed0aca req-0e01e685-b93a-428d-ab11-d5a364545294 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.568 225859 DEBUG oslo_concurrency.lockutils [req-4c947333-3739-4fd8-a571-f7155aed0aca req-0e01e685-b93a-428d-ab11-d5a364545294 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.568 225859 DEBUG nova.compute.manager [req-4c947333-3739-4fd8-a571-f7155aed0aca req-0e01e685-b93a-428d-ab11-d5a364545294 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] No waiting events found dispatching network-vif-unplugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.569 225859 WARNING nova.compute.manager [req-4c947333-3739-4fd8-a571-f7155aed0aca req-0e01e685-b93a-428d-ab11-d5a364545294 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received unexpected event network-vif-unplugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 20 09:50:52 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:50:52 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.734 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.735 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.736 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:50:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:52.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.912 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.913 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquired lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:50:52 np0005588919 nova_compute[225855]: 2026-01-20 14:50:52.913 225859 DEBUG nova.network.neutron [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:50:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:53.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:54.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:50:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:55.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:50:55 np0005588919 nova_compute[225855]: 2026-01-20 14:50:55.713 225859 DEBUG nova.compute.manager [req-b68ff91a-7207-483b-9d85-d9928c04c25e req-fc83dc6f-b2b1-4598-b05b-74579a970938 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:50:55 np0005588919 nova_compute[225855]: 2026-01-20 14:50:55.715 225859 DEBUG oslo_concurrency.lockutils [req-b68ff91a-7207-483b-9d85-d9928c04c25e req-fc83dc6f-b2b1-4598-b05b-74579a970938 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:50:55 np0005588919 nova_compute[225855]: 2026-01-20 14:50:55.715 225859 DEBUG oslo_concurrency.lockutils [req-b68ff91a-7207-483b-9d85-d9928c04c25e req-fc83dc6f-b2b1-4598-b05b-74579a970938 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:50:55 np0005588919 nova_compute[225855]: 2026-01-20 14:50:55.716 225859 DEBUG oslo_concurrency.lockutils [req-b68ff91a-7207-483b-9d85-d9928c04c25e req-fc83dc6f-b2b1-4598-b05b-74579a970938 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:50:55 np0005588919 nova_compute[225855]: 2026-01-20 14:50:55.716 225859 DEBUG nova.compute.manager [req-b68ff91a-7207-483b-9d85-d9928c04c25e req-fc83dc6f-b2b1-4598-b05b-74579a970938 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] No waiting events found dispatching network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:50:55 np0005588919 nova_compute[225855]: 2026-01-20 14:50:55.717 225859 WARNING nova.compute.manager [req-b68ff91a-7207-483b-9d85-d9928c04c25e req-fc83dc6f-b2b1-4598-b05b-74579a970938 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received unexpected event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 09:50:55 np0005588919 nova_compute[225855]: 2026-01-20 14:50:55.857 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:56.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:57 np0005588919 nova_compute[225855]: 2026-01-20 14:50:57.356 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:50:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:50:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:57.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:50:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:50:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:50:58.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:50:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:50:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:50:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:50:59.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:00 np0005588919 nova_compute[225855]: 2026-01-20 14:51:00.859 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:00.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:00 np0005588919 nova_compute[225855]: 2026-01-20 14:51:00.987 225859 DEBUG nova.network.neutron [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updating instance_info_cache with network_info: [{"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:01 np0005588919 nova_compute[225855]: 2026-01-20 14:51:01.009 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Releasing lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:01 np0005588919 nova_compute[225855]: 2026-01-20 14:51:01.174 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 09:51:01 np0005588919 nova_compute[225855]: 2026-01-20 14:51:01.176 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:51:01 np0005588919 nova_compute[225855]: 2026-01-20 14:51:01.177 225859 INFO nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Creating image(s)#033[00m
Jan 20 09:51:01 np0005588919 nova_compute[225855]: 2026-01-20 14:51:01.221 225859 DEBUG nova.storage.rbd_utils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] creating snapshot(nova-resize) on rbd image(d5167284-086d-4b37-98b0-3853baabf418_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:51:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:01.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e255 e255: 3 total, 3 up, 3 in
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.112 225859 DEBUG nova.objects.instance [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'trusted_certs' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.223 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.224 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Ensure instance console log exists: /var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.224 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.225 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.225 225859 DEBUG oslo_concurrency.lockutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.227 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Start _get_guest_xml network_info=[{"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:30:05:34"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.231 225859 WARNING nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.236 225859 DEBUG nova.virt.libvirt.host [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.237 225859 DEBUG nova.virt.libvirt.host [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.240 225859 DEBUG nova.virt.libvirt.host [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.240 225859 DEBUG nova.virt.libvirt.host [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.241 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.241 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.242 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.242 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.242 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.243 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.243 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.243 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.243 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.244 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.244 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.244 225859 DEBUG nova.virt.hardware [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.244 225859 DEBUG nova.objects.instance [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'vcpu_model' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.258 225859 DEBUG oslo_concurrency.processutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.398 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1368014388' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.722 225859 DEBUG oslo_concurrency.processutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:02 np0005588919 nova_compute[225855]: 2026-01-20 14:51:02.756 225859 DEBUG oslo_concurrency.processutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:02.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:03 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2263803824' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.199 225859 DEBUG oslo_concurrency.processutils [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.203 225859 DEBUG nova.virt.libvirt.vif [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:50:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1906505493',display_name='tempest-ServerDiskConfigTestJSON-server-1906505493',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1906505493',id=109,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acb30fbc0e3749e390d7f867060b5a2a',ramdisk_id='',reservation_id='r-5s3il9ii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio
',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1806346246',owner_user_name='tempest-ServerDiskConfigTestJSON-1806346246-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:50:52Z,user_data=None,user_id='a1bd93d04cc4468abe1d5c61f5144191',uuid=d5167284-086d-4b37-98b0-3853baabf418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:30:05:34"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.203 225859 DEBUG nova.network.os_vif_util [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converting VIF {"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:30:05:34"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.204 225859 DEBUG nova.network.os_vif_util [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.207 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <uuid>d5167284-086d-4b37-98b0-3853baabf418</uuid>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <name>instance-0000006d</name>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <memory>196608</memory>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1906505493</nova:name>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:51:02</nova:creationTime>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.micro">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <nova:memory>192</nova:memory>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <nova:user uuid="a1bd93d04cc4468abe1d5c61f5144191">tempest-ServerDiskConfigTestJSON-1806346246-project-member</nova:user>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <nova:project uuid="acb30fbc0e3749e390d7f867060b5a2a">tempest-ServerDiskConfigTestJSON-1806346246</nova:project>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <nova:port uuid="86cabae0-8599-4330-b71c-91eb2e6b76d8">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <entry name="serial">d5167284-086d-4b37-98b0-3853baabf418</entry>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <entry name="uuid">d5167284-086d-4b37-98b0-3853baabf418</entry>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d5167284-086d-4b37-98b0-3853baabf418_disk">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/d5167284-086d-4b37-98b0-3853baabf418_disk.config">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:30:05:34"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <target dev="tap86cabae0-85"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418/console.log" append="off"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:51:03 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:51:03 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:51:03 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:51:03 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.208 225859 DEBUG nova.virt.libvirt.vif [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:50:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1906505493',display_name='tempest-ServerDiskConfigTestJSON-server-1906505493',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1906505493',id=109,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:50:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acb30fbc0e3749e390d7f867060b5a2a',ramdisk_id='',reservation_id='r-5s3il9ii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio
',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1806346246',owner_user_name='tempest-ServerDiskConfigTestJSON-1806346246-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:50:52Z,user_data=None,user_id='a1bd93d04cc4468abe1d5c61f5144191',uuid=d5167284-086d-4b37-98b0-3853baabf418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:30:05:34"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.209 225859 DEBUG nova.network.os_vif_util [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converting VIF {"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "vif_mac": "fa:16:3e:30:05:34"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.209 225859 DEBUG nova.network.os_vif_util [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.210 225859 DEBUG os_vif [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.210 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.211 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.211 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.214 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.214 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86cabae0-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.214 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap86cabae0-85, col_values=(('external_ids', {'iface-id': '86cabae0-8599-4330-b71c-91eb2e6b76d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:05:34', 'vm-uuid': 'd5167284-086d-4b37-98b0-3853baabf418'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.216 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:03 np0005588919 NetworkManager[49104]: <info>  [1768920663.2170] manager: (tap86cabae0-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.218 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.221 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.222 225859 INFO os_vif [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85')#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.283 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.285 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.286 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] No VIF found with MAC fa:16:3e:30:05:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.288 225859 INFO nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Using config drive#033[00m
Jan 20 09:51:03 np0005588919 kernel: tap86cabae0-85: entered promiscuous mode
Jan 20 09:51:03 np0005588919 NetworkManager[49104]: <info>  [1768920663.3676] manager: (tap86cabae0-85): new Tun device (/org/freedesktop/NetworkManager/Devices/185)
Jan 20 09:51:03 np0005588919 ovn_controller[130490]: 2026-01-20T14:51:03Z|00429|binding|INFO|Claiming lport 86cabae0-8599-4330-b71c-91eb2e6b76d8 for this chassis.
Jan 20 09:51:03 np0005588919 ovn_controller[130490]: 2026-01-20T14:51:03Z|00430|binding|INFO|86cabae0-8599-4330-b71c-91eb2e6b76d8: Claiming fa:16:3e:30:05:34 10.100.0.3
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.369 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.374 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:05:34 10.100.0.3'], port_security=['fa:16:3e:30:05:34 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd5167284-086d-4b37-98b0-3853baabf418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acb30fbc0e3749e390d7f867060b5a2a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '19fab802-7db8-4c89-8f8e-8dcfc14d4627', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0e287ba-f88b-46f5-bb7f-3cc2a74be88e, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=86cabae0-8599-4330-b71c-91eb2e6b76d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.375 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 86cabae0-8599-4330-b71c-91eb2e6b76d8 in datapath 3379e2b3-ffb2-4391-969b-c9dc51bfbe25 bound to our chassis#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.376 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3379e2b3-ffb2-4391-969b-c9dc51bfbe25#033[00m
Jan 20 09:51:03 np0005588919 ovn_controller[130490]: 2026-01-20T14:51:03Z|00431|binding|INFO|Setting lport 86cabae0-8599-4330-b71c-91eb2e6b76d8 ovn-installed in OVS
Jan 20 09:51:03 np0005588919 ovn_controller[130490]: 2026-01-20T14:51:03Z|00432|binding|INFO|Setting lport 86cabae0-8599-4330-b71c-91eb2e6b76d8 up in Southbound
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.386 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.388 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9985c9-e96b-445a-92a4-f66563a39db9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.389 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3379e2b3-f1 in ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.390 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3379e2b3-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.390 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[51c5fcea-84ca-40bc-b3f9-6f7f0d45b486]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.391 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3efd6fee-5af3-4746-800b-c36064697b99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 systemd-udevd[268541]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.402 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[3e15361f-7552-499f-bbe1-a5344302f285]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 systemd-machined[194361]: New machine qemu-50-instance-0000006d.
Jan 20 09:51:03 np0005588919 NetworkManager[49104]: <info>  [1768920663.4099] device (tap86cabae0-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:51:03 np0005588919 NetworkManager[49104]: <info>  [1768920663.4108] device (tap86cabae0-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:51:03 np0005588919 systemd[1]: Started Virtual Machine qemu-50-instance-0000006d.
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.424 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd8b98b-53da-4aca-9038-16d9910084d4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.457 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3cbc86fd-f43a-4c7e-b393-0759459510c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.461 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f4c982-4143-4a2f-8128-4bc11c29b8a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 NetworkManager[49104]: <info>  [1768920663.4632] manager: (tap3379e2b3-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/186)
Jan 20 09:51:03 np0005588919 systemd-udevd[268545]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.491 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d89c056b-1a51-448a-b0ae-47cab0376593]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.494 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c0273420-b0e1-4859-867b-a1a25f431dc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 NetworkManager[49104]: <info>  [1768920663.5153] device (tap3379e2b3-f0): carrier: link connected
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.521 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[96df139a-a8bf-412d-8b87-94b33b37e1c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.535 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[255e83a4-bd58-4998-953a-123518e5e720]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3379e2b3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:86:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570244, 'reachable_time': 28706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268573, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.548 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[461c1fb8-213d-47a4-9e4b-f03dc0dfcdf4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef1:86fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 570244, 'tstamp': 570244}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 268574, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.562 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[67e65a24-4540-47fb-90ab-11035345ea65]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3379e2b3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:86:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570244, 'reachable_time': 28706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 268575, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.590 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[13e5976e-7b9e-4971-bfd6-019277bf0eba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.652 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc85f52-87e1-4981-8fad-12c206acf211]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.654 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3379e2b3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.654 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.654 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3379e2b3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:03 np0005588919 kernel: tap3379e2b3-f0: entered promiscuous mode
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.656 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:03 np0005588919 NetworkManager[49104]: <info>  [1768920663.6580] manager: (tap3379e2b3-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.659 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3379e2b3-f0, col_values=(('external_ids', {'iface-id': 'b32ddf23-a8dd-4e6d-a410-ccb24b214d35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.660 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:03 np0005588919 ovn_controller[130490]: 2026-01-20T14:51:03Z|00433|binding|INFO|Releasing lport b32ddf23-a8dd-4e6d-a410-ccb24b214d35 from this chassis (sb_readonly=0)
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.675 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.676 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.677 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e125e7cb-ee19-46cc-bc7d-b36dc9573c70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.678 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-3379e2b3-ffb2-4391-969b-c9dc51bfbe25
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.pid.haproxy
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 3379e2b3-ffb2-4391-969b-c9dc51bfbe25
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:51:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:03.679 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'env', 'PROCESS_TAG=haproxy-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3379e2b3-ffb2-4391-969b-c9dc51bfbe25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:51:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:03.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.823 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for d5167284-086d-4b37-98b0-3853baabf418 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.823 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920663.8228931, d5167284-086d-4b37-98b0-3853baabf418 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.823 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.826 225859 DEBUG nova.compute.manager [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.829 225859 INFO nova.virt.libvirt.driver [-] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance running successfully.#033[00m
Jan 20 09:51:03 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.831 225859 DEBUG nova.virt.libvirt.guest [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.831 225859 DEBUG nova.virt.libvirt.driver [None req-2880dd92-9035-4a66-9e1e-40a92f2e0283 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.851 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.853 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.922 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.923 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920663.8260856, d5167284-086d-4b37-98b0-3853baabf418 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.923 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] VM Started (Lifecycle Event)#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.951 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:03 np0005588919 nova_compute[225855]: 2026-01-20 14:51:03.954 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:04 np0005588919 podman[268650]: 2026-01-20 14:51:04.025892657 +0000 UTC m=+0.024780560 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:51:04 np0005588919 podman[268650]: 2026-01-20 14:51:04.14428829 +0000 UTC m=+0.143176173 container create a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:51:04 np0005588919 systemd[1]: Started libpod-conmon-a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e.scope.
Jan 20 09:51:04 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:51:04 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b33755f1bd34c133af6276ef5124932271c289130919c6a5f4cceaadd0c9ef0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:51:04 np0005588919 podman[268650]: 2026-01-20 14:51:04.43050272 +0000 UTC m=+0.429390593 container init a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:51:04 np0005588919 podman[268650]: 2026-01-20 14:51:04.436474999 +0000 UTC m=+0.435362862 container start a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:51:04 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[268665]: [NOTICE]   (268670) : New worker (268672) forked
Jan 20 09:51:04 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[268665]: [NOTICE]   (268670) : Loading success.
Jan 20 09:51:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:04.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.610 225859 DEBUG nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.610 225859 DEBUG oslo_concurrency.lockutils [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.610 225859 DEBUG oslo_concurrency.lockutils [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.610 225859 DEBUG oslo_concurrency.lockutils [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.611 225859 DEBUG nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] No waiting events found dispatching network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.611 225859 WARNING nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received unexpected event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.611 225859 DEBUG nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.611 225859 DEBUG oslo_concurrency.lockutils [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.611 225859 DEBUG oslo_concurrency.lockutils [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.612 225859 DEBUG oslo_concurrency.lockutils [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.612 225859 DEBUG nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] No waiting events found dispatching network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.612 225859 WARNING nova.compute.manager [req-c32b7f8d-9983-4b88-a00a-911e052a9a02 req-697eba0c-977a-4371-9d04-c0d632ca86ac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received unexpected event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 for instance with vm_state resized and task_state None.#033[00m
Jan 20 09:51:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:05.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.859 225859 DEBUG oslo_concurrency.lockutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.860 225859 DEBUG oslo_concurrency.lockutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.860 225859 DEBUG nova.compute.manager [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Going to confirm migration 16 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 20 09:51:05 np0005588919 nova_compute[225855]: 2026-01-20 14:51:05.865 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:06 np0005588919 nova_compute[225855]: 2026-01-20 14:51:06.048 225859 DEBUG oslo_concurrency.lockutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:06 np0005588919 nova_compute[225855]: 2026-01-20 14:51:06.048 225859 DEBUG oslo_concurrency.lockutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquired lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:06 np0005588919 nova_compute[225855]: 2026-01-20 14:51:06.048 225859 DEBUG nova.network.neutron [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:51:06 np0005588919 nova_compute[225855]: 2026-01-20 14:51:06.048 225859 DEBUG nova.objects.instance [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'info_cache' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:06.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:07 np0005588919 podman[268682]: 2026-01-20 14:51:07.071697674 +0000 UTC m=+0.122156590 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 20 09:51:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:07.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:07 np0005588919 nova_compute[225855]: 2026-01-20 14:51:07.845 225859 DEBUG nova.network.neutron [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updating instance_info_cache with network_info: [{"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:07 np0005588919 nova_compute[225855]: 2026-01-20 14:51:07.871 225859 DEBUG oslo_concurrency.lockutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Releasing lock "refresh_cache-d5167284-086d-4b37-98b0-3853baabf418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:07 np0005588919 nova_compute[225855]: 2026-01-20 14:51:07.871 225859 DEBUG nova.objects.instance [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'migration_context' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:07 np0005588919 nova_compute[225855]: 2026-01-20 14:51:07.945 225859 DEBUG nova.storage.rbd_utils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] removing snapshot(nova-resize) on rbd image(d5167284-086d-4b37-98b0-3853baabf418_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:51:08 np0005588919 nova_compute[225855]: 2026-01-20 14:51:08.216 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:08.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e256 e256: 3 total, 3 up, 3 in
Jan 20 09:51:09 np0005588919 nova_compute[225855]: 2026-01-20 14:51:09.029 225859 DEBUG oslo_concurrency.lockutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:09 np0005588919 nova_compute[225855]: 2026-01-20 14:51:09.030 225859 DEBUG oslo_concurrency.lockutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:09 np0005588919 nova_compute[225855]: 2026-01-20 14:51:09.165 225859 DEBUG oslo_concurrency.processutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2382876353' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:09 np0005588919 nova_compute[225855]: 2026-01-20 14:51:09.641 225859 DEBUG oslo_concurrency.processutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:09 np0005588919 nova_compute[225855]: 2026-01-20 14:51:09.648 225859 DEBUG nova.compute.provider_tree [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:51:09 np0005588919 nova_compute[225855]: 2026-01-20 14:51:09.665 225859 DEBUG nova.scheduler.client.report [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:51:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:09.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:09 np0005588919 nova_compute[225855]: 2026-01-20 14:51:09.718 225859 DEBUG oslo_concurrency.lockutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:09 np0005588919 nova_compute[225855]: 2026-01-20 14:51:09.837 225859 INFO nova.scheduler.client.report [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Deleted allocation for migration 135cd3df-9888-44b9-a278-94d967ab2b1d#033[00m
Jan 20 09:51:09 np0005588919 nova_compute[225855]: 2026-01-20 14:51:09.906 225859 DEBUG oslo_concurrency.lockutils [None req-a37ea1ed-39f6-433d-9760-570995ae8099 a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 4.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:10 np0005588919 nova_compute[225855]: 2026-01-20 14:51:10.867 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:10.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:51:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:11.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:51:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:12.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.118 225859 DEBUG oslo_concurrency.lockutils [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.119 225859 DEBUG oslo_concurrency.lockutils [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.119 225859 DEBUG oslo_concurrency.lockutils [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.119 225859 DEBUG oslo_concurrency.lockutils [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.119 225859 DEBUG oslo_concurrency.lockutils [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.120 225859 INFO nova.compute.manager [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Terminating instance#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.121 225859 DEBUG nova.compute.manager [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:51:13 np0005588919 kernel: tap86cabae0-85 (unregistering): left promiscuous mode
Jan 20 09:51:13 np0005588919 NetworkManager[49104]: <info>  [1768920673.1648] device (tap86cabae0-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:51:13 np0005588919 ovn_controller[130490]: 2026-01-20T14:51:13Z|00434|binding|INFO|Releasing lport 86cabae0-8599-4330-b71c-91eb2e6b76d8 from this chassis (sb_readonly=0)
Jan 20 09:51:13 np0005588919 ovn_controller[130490]: 2026-01-20T14:51:13Z|00435|binding|INFO|Setting lport 86cabae0-8599-4330-b71c-91eb2e6b76d8 down in Southbound
Jan 20 09:51:13 np0005588919 ovn_controller[130490]: 2026-01-20T14:51:13Z|00436|binding|INFO|Removing iface tap86cabae0-85 ovn-installed in OVS
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.176 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.182 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:05:34 10.100.0.3'], port_security=['fa:16:3e:30:05:34 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd5167284-086d-4b37-98b0-3853baabf418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acb30fbc0e3749e390d7f867060b5a2a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '19fab802-7db8-4c89-8f8e-8dcfc14d4627', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0e287ba-f88b-46f5-bb7f-3cc2a74be88e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=86cabae0-8599-4330-b71c-91eb2e6b76d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.183 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 86cabae0-8599-4330-b71c-91eb2e6b76d8 in datapath 3379e2b3-ffb2-4391-969b-c9dc51bfbe25 unbound from our chassis#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.184 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3379e2b3-ffb2-4391-969b-c9dc51bfbe25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.185 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7efb5647-a214-4716-9897-2e7b8252e6c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.186 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 namespace which is not needed anymore#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.196 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.217 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:13 np0005588919 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Jan 20 09:51:13 np0005588919 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000006d.scope: Consumed 9.934s CPU time.
Jan 20 09:51:13 np0005588919 systemd-machined[194361]: Machine qemu-50-instance-0000006d terminated.
Jan 20 09:51:13 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[268665]: [NOTICE]   (268670) : haproxy version is 2.8.14-c23fe91
Jan 20 09:51:13 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[268665]: [NOTICE]   (268670) : path to executable is /usr/sbin/haproxy
Jan 20 09:51:13 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[268665]: [WARNING]  (268670) : Exiting Master process...
Jan 20 09:51:13 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[268665]: [WARNING]  (268670) : Exiting Master process...
Jan 20 09:51:13 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[268665]: [ALERT]    (268670) : Current worker (268672) exited with code 143 (Terminated)
Jan 20 09:51:13 np0005588919 neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25[268665]: [WARNING]  (268670) : All workers exited. Exiting... (0)
Jan 20 09:51:13 np0005588919 systemd[1]: libpod-a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e.scope: Deactivated successfully.
Jan 20 09:51:13 np0005588919 podman[268843]: 2026-01-20 14:51:13.304541574 +0000 UTC m=+0.041512463 container died a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 09:51:13 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e-userdata-shm.mount: Deactivated successfully.
Jan 20 09:51:13 np0005588919 systemd[1]: var-lib-containers-storage-overlay-9b33755f1bd34c133af6276ef5124932271c289130919c6a5f4cceaadd0c9ef0-merged.mount: Deactivated successfully.
Jan 20 09:51:13 np0005588919 podman[268843]: 2026-01-20 14:51:13.338510033 +0000 UTC m=+0.075480922 container cleanup a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.341 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:13 np0005588919 systemd[1]: libpod-conmon-a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e.scope: Deactivated successfully.
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.346 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.352 225859 INFO nova.virt.libvirt.driver [-] [instance: d5167284-086d-4b37-98b0-3853baabf418] Instance destroyed successfully.#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.353 225859 DEBUG nova.objects.instance [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lazy-loading 'resources' on Instance uuid d5167284-086d-4b37-98b0-3853baabf418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.366 225859 DEBUG nova.virt.libvirt.vif [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:50:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1906505493',display_name='tempest-ServerDiskConfigTestJSON-server-1906505493',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1906505493',id=109,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:51:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acb30fbc0e3749e390d7f867060b5a2a',ramdisk_id='',reservation_id='r-5s3il9ii',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1806346246',owner_user_name='tempest-ServerDiskConfigTestJSON-1806346246-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:51:09Z,user_data=None,user_id='a1bd93d04cc4468abe1d5c61f5144191',uuid=d5167284-086d-4b37-98b0-3853baabf418,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.366 225859 DEBUG nova.network.os_vif_util [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converting VIF {"id": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "address": "fa:16:3e:30:05:34", "network": {"id": "3379e2b3-ffb2-4391-969b-c9dc51bfbe25", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1112843240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acb30fbc0e3749e390d7f867060b5a2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86cabae0-85", "ovs_interfaceid": "86cabae0-8599-4330-b71c-91eb2e6b76d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.367 225859 DEBUG nova.network.os_vif_util [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.367 225859 DEBUG os_vif [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.368 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.369 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86cabae0-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.370 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.371 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.373 225859 INFO os_vif [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:05:34,bridge_name='br-int',has_traffic_filtering=True,id=86cabae0-8599-4330-b71c-91eb2e6b76d8,network=Network(3379e2b3-ffb2-4391-969b-c9dc51bfbe25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86cabae0-85')#033[00m
Jan 20 09:51:13 np0005588919 podman[268878]: 2026-01-20 14:51:13.401028938 +0000 UTC m=+0.041655727 container remove a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.406 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[01b53d45-c31f-4039-9015-5f76f3a586d4]: (4, ('Tue Jan 20 02:51:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 (a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e)\na5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e\nTue Jan 20 02:51:13 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 (a5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e)\na5869eca616e5cacd5c07a52572d7cd125c31406909fb3cffd818b208b05825e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.407 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[df590090-4500-4b4f-8e53-f49e5d51c238]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.408 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3379e2b3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:13 np0005588919 kernel: tap3379e2b3-f0: left promiscuous mode
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.426 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.430 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[20d1292c-c6fd-401a-b041-d4e2f4c19ef7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.458 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c80827-b216-4851-817c-f933b7c3a1ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.460 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b53c2c-daa6-412d-b638-ecbedd6500b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.475 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5858ab84-f210-4b56-afa8-6d2bd6d5fa93]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570237, 'reachable_time': 35426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268915, 'error': None, 'target': 'ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.478 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3379e2b3-ffb2-4391-969b-c9dc51bfbe25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:51:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:13.478 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa06931-635c-4ec8-bc87-025b64eb1371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:51:13 np0005588919 systemd[1]: run-netns-ovnmeta\x2d3379e2b3\x2dffb2\x2d4391\x2d969b\x2dc9dc51bfbe25.mount: Deactivated successfully.
Jan 20 09:51:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:51:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4280913832' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:51:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:51:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4280913832' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:51:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:13.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.938 225859 INFO nova.virt.libvirt.driver [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Deleting instance files /var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418_del#033[00m
Jan 20 09:51:13 np0005588919 nova_compute[225855]: 2026-01-20 14:51:13.939 225859 INFO nova.virt.libvirt.driver [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Deletion of /var/lib/nova/instances/d5167284-086d-4b37-98b0-3853baabf418_del complete#033[00m
Jan 20 09:51:14 np0005588919 nova_compute[225855]: 2026-01-20 14:51:14.003 225859 INFO nova.compute.manager [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Took 0.88 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:51:14 np0005588919 nova_compute[225855]: 2026-01-20 14:51:14.003 225859 DEBUG oslo.service.loopingcall [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:51:14 np0005588919 nova_compute[225855]: 2026-01-20 14:51:14.004 225859 DEBUG nova.compute.manager [-] [instance: d5167284-086d-4b37-98b0-3853baabf418] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:51:14 np0005588919 nova_compute[225855]: 2026-01-20 14:51:14.004 225859 DEBUG nova.network.neutron [-] [instance: d5167284-086d-4b37-98b0-3853baabf418] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:51:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:14.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:14.944 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:51:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:14.945 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:51:14 np0005588919 nova_compute[225855]: 2026-01-20 14:51:14.945 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.269 225859 DEBUG nova.network.neutron [-] [instance: d5167284-086d-4b37-98b0-3853baabf418] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.333 225859 INFO nova.compute.manager [-] [instance: d5167284-086d-4b37-98b0-3853baabf418] Took 1.33 seconds to deallocate network for instance.#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.561 225859 DEBUG nova.compute.manager [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-vif-unplugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.561 225859 DEBUG oslo_concurrency.lockutils [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.561 225859 DEBUG oslo_concurrency.lockutils [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.562 225859 DEBUG oslo_concurrency.lockutils [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.562 225859 DEBUG nova.compute.manager [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] No waiting events found dispatching network-vif-unplugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.562 225859 DEBUG nova.compute.manager [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-vif-unplugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.562 225859 DEBUG nova.compute.manager [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.563 225859 DEBUG oslo_concurrency.lockutils [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "d5167284-086d-4b37-98b0-3853baabf418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.563 225859 DEBUG oslo_concurrency.lockutils [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.563 225859 DEBUG oslo_concurrency.lockutils [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.564 225859 DEBUG nova.compute.manager [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] No waiting events found dispatching network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.564 225859 WARNING nova.compute.manager [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received unexpected event network-vif-plugged-86cabae0-8599-4330-b71c-91eb2e6b76d8 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.564 225859 DEBUG nova.compute.manager [req-b834e40f-4cf2-4316-ab73-cd0de4c34931 req-49f7b545-3f1f-4a9a-88ad-109a52132d73 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: d5167284-086d-4b37-98b0-3853baabf418] Received event network-vif-deleted-86cabae0-8599-4330-b71c-91eb2e6b76d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.565 225859 DEBUG oslo_concurrency.lockutils [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.566 225859 DEBUG oslo_concurrency.lockutils [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.571 225859 DEBUG oslo_concurrency.lockutils [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.635 225859 INFO nova.scheduler.client.report [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Deleted allocations for instance d5167284-086d-4b37-98b0-3853baabf418#033[00m
Jan 20 09:51:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:15.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.738 225859 DEBUG oslo_concurrency.lockutils [None req-ae481e4d-076c-4b49-a909-514aa77b1f1f a1bd93d04cc4468abe1d5c61f5144191 acb30fbc0e3749e390d7f867060b5a2a - - default default] Lock "d5167284-086d-4b37-98b0-3853baabf418" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:15 np0005588919 nova_compute[225855]: 2026-01-20 14:51:15.916 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:16.411 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:16.412 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:16.412 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:16.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:17.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:18 np0005588919 nova_compute[225855]: 2026-01-20 14:51:18.415 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:18.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:19 np0005588919 podman[268921]: 2026-01-20 14:51:19.004618174 +0000 UTC m=+0.054832759 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:51:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e257 e257: 3 total, 3 up, 3 in
Jan 20 09:51:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:51:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:19.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:51:20 np0005588919 nova_compute[225855]: 2026-01-20 14:51:20.920 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:20.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:21.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:22.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:23 np0005588919 nova_compute[225855]: 2026-01-20 14:51:23.418 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:23.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:51:24.947 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:24.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:25 np0005588919 nova_compute[225855]: 2026-01-20 14:51:25.243 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588919 nova_compute[225855]: 2026-01-20 14:51:25.561 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:25.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:25 np0005588919 nova_compute[225855]: 2026-01-20 14:51:25.922 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e258 e258: 3 total, 3 up, 3 in
Jan 20 09:51:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:27.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:27.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e259 e259: 3 total, 3 up, 3 in
Jan 20 09:51:28 np0005588919 nova_compute[225855]: 2026-01-20 14:51:28.351 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920673.3507633, d5167284-086d-4b37-98b0-3853baabf418 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:28 np0005588919 nova_compute[225855]: 2026-01-20 14:51:28.351 225859 INFO nova.compute.manager [-] [instance: d5167284-086d-4b37-98b0-3853baabf418] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:51:28 np0005588919 nova_compute[225855]: 2026-01-20 14:51:28.380 225859 DEBUG nova.compute.manager [None req-95393680-3740-411d-b52c-e2f8bf3af950 - - - - - -] [instance: d5167284-086d-4b37-98b0-3853baabf418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:28 np0005588919 nova_compute[225855]: 2026-01-20 14:51:28.478 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:29.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e260 e260: 3 total, 3 up, 3 in
Jan 20 09:51:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:51:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:29.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:51:30 np0005588919 nova_compute[225855]: 2026-01-20 14:51:30.923 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:31.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:31.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:33.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:33 np0005588919 nova_compute[225855]: 2026-01-20 14:51:33.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:33 np0005588919 nova_compute[225855]: 2026-01-20 14:51:33.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:51:33 np0005588919 nova_compute[225855]: 2026-01-20 14:51:33.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:51:33 np0005588919 nova_compute[225855]: 2026-01-20 14:51:33.358 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:51:33 np0005588919 nova_compute[225855]: 2026-01-20 14:51:33.359 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:33 np0005588919 nova_compute[225855]: 2026-01-20 14:51:33.359 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:33 np0005588919 nova_compute[225855]: 2026-01-20 14:51:33.480 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:51:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:33.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:51:34 np0005588919 nova_compute[225855]: 2026-01-20 14:51:34.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:34 np0005588919 nova_compute[225855]: 2026-01-20 14:51:34.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:51:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:35.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:35.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:35 np0005588919 nova_compute[225855]: 2026-01-20 14:51:35.925 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:37.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:37 np0005588919 nova_compute[225855]: 2026-01-20 14:51:37.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:37.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:38 np0005588919 podman[269002]: 2026-01-20 14:51:38.036211969 +0000 UTC m=+0.080756601 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Jan 20 09:51:38 np0005588919 nova_compute[225855]: 2026-01-20 14:51:38.481 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:39.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.368 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.369 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e261 e261: 3 total, 3 up, 3 in
Jan 20 09:51:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:39.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1849351552' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.803 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.957 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.958 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4447MB free_disk=20.91362762451172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.959 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:39 np0005588919 nova_compute[225855]: 2026-01-20 14:51:39.959 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:40 np0005588919 nova_compute[225855]: 2026-01-20 14:51:40.039 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:51:40 np0005588919 nova_compute[225855]: 2026-01-20 14:51:40.039 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:51:40 np0005588919 nova_compute[225855]: 2026-01-20 14:51:40.065 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:40 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1423700390' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:40 np0005588919 nova_compute[225855]: 2026-01-20 14:51:40.512 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:40 np0005588919 nova_compute[225855]: 2026-01-20 14:51:40.517 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:51:40 np0005588919 nova_compute[225855]: 2026-01-20 14:51:40.542 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:51:40 np0005588919 nova_compute[225855]: 2026-01-20 14:51:40.567 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:51:40 np0005588919 nova_compute[225855]: 2026-01-20 14:51:40.568 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:40 np0005588919 nova_compute[225855]: 2026-01-20 14:51:40.927 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:41.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:41.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.122 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Acquiring lock "932fd680-9aa0-49b4-9915-fa55104aaad7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.123 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "932fd680-9aa0-49b4-9915-fa55104aaad7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.144 225859 DEBUG nova.compute.manager [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.265 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.266 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.277 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.277 225859 INFO nova.compute.claims [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.403 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.570 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3514914750' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.829 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.837 225859 DEBUG nova.compute.provider_tree [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.867 225859 DEBUG nova.scheduler.client.report [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.901 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.902 225859 DEBUG nova.compute.manager [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.965 225859 DEBUG nova.compute.manager [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 20 09:51:42 np0005588919 nova_compute[225855]: 2026-01-20 14:51:42.982 225859 INFO nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.003 225859 DEBUG nova.compute.manager [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:51:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:43.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.105 225859 DEBUG nova.compute.manager [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.106 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.107 225859 INFO nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Creating image(s)#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.138 225859 DEBUG nova.storage.rbd_utils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] rbd image 932fd680-9aa0-49b4-9915-fa55104aaad7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.169 225859 DEBUG nova.storage.rbd_utils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] rbd image 932fd680-9aa0-49b4-9915-fa55104aaad7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.199 225859 DEBUG nova.storage.rbd_utils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] rbd image 932fd680-9aa0-49b4-9915-fa55104aaad7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.204 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.287 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.288 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.290 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.291 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.317 225859 DEBUG nova.storage.rbd_utils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] rbd image 932fd680-9aa0-49b4-9915-fa55104aaad7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.321 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 932fd680-9aa0-49b4-9915-fa55104aaad7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.357 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:43 np0005588919 nova_compute[225855]: 2026-01-20 14:51:43.547 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:43.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.082 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 932fd680-9aa0-49b4-9915-fa55104aaad7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.761s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.152 225859 DEBUG nova.storage.rbd_utils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] resizing rbd image 932fd680-9aa0-49b4-9915-fa55104aaad7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.245 225859 DEBUG nova.objects.instance [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lazy-loading 'migration_context' on Instance uuid 932fd680-9aa0-49b4-9915-fa55104aaad7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.276 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.276 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Ensure instance console log exists: /var/lib/nova/instances/932fd680-9aa0-49b4-9915-fa55104aaad7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.277 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.277 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.277 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.279 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.282 225859 WARNING nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.297 225859 DEBUG nova.virt.libvirt.host [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.298 225859 DEBUG nova.virt.libvirt.host [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.301 225859 DEBUG nova.virt.libvirt.host [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.301 225859 DEBUG nova.virt.libvirt.host [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.303 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.303 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.303 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.304 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.304 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.304 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.304 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.304 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.305 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.305 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.305 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.305 225859 DEBUG nova.virt.hardware [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.308 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:51:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:44 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/297072441' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.758 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.785 225859 DEBUG nova.storage.rbd_utils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] rbd image 932fd680-9aa0-49b4-9915-fa55104aaad7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:44 np0005588919 nova_compute[225855]: 2026-01-20 14:51:44.789 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:45.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:45 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/664632545' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:45 np0005588919 nova_compute[225855]: 2026-01-20 14:51:45.270 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:45 np0005588919 nova_compute[225855]: 2026-01-20 14:51:45.272 225859 DEBUG nova.objects.instance [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 932fd680-9aa0-49b4-9915-fa55104aaad7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:45 np0005588919 nova_compute[225855]: 2026-01-20 14:51:45.333 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <uuid>932fd680-9aa0-49b4-9915-fa55104aaad7</uuid>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <name>instance-00000072</name>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerShowV247Test-server-597934545</nova:name>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:51:44</nova:creationTime>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <nova:user uuid="cdcdce94e7354b3bafb34285408888b9">tempest-ServerShowV247Test-1508434892-project-member</nova:user>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <nova:project uuid="ecfc3366b9194864a3f15ce0114b5ee3">tempest-ServerShowV247Test-1508434892</nova:project>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <entry name="serial">932fd680-9aa0-49b4-9915-fa55104aaad7</entry>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <entry name="uuid">932fd680-9aa0-49b4-9915-fa55104aaad7</entry>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/932fd680-9aa0-49b4-9915-fa55104aaad7_disk">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/932fd680-9aa0-49b4-9915-fa55104aaad7_disk.config">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/932fd680-9aa0-49b4-9915-fa55104aaad7/console.log" append="off"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:51:45 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:51:45 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:51:45 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:51:45 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:51:45 np0005588919 nova_compute[225855]: 2026-01-20 14:51:45.400 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:45 np0005588919 nova_compute[225855]: 2026-01-20 14:51:45.400 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:45 np0005588919 nova_compute[225855]: 2026-01-20 14:51:45.401 225859 INFO nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Using config drive#033[00m
Jan 20 09:51:45 np0005588919 nova_compute[225855]: 2026-01-20 14:51:45.425 225859 DEBUG nova.storage.rbd_utils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] rbd image 932fd680-9aa0-49b4-9915-fa55104aaad7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:45.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:45 np0005588919 nova_compute[225855]: 2026-01-20 14:51:45.910 225859 INFO nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Creating config drive at /var/lib/nova/instances/932fd680-9aa0-49b4-9915-fa55104aaad7/disk.config#033[00m
Jan 20 09:51:45 np0005588919 nova_compute[225855]: 2026-01-20 14:51:45.915 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/932fd680-9aa0-49b4-9915-fa55104aaad7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc9kczawb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:45 np0005588919 nova_compute[225855]: 2026-01-20 14:51:45.941 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:46 np0005588919 nova_compute[225855]: 2026-01-20 14:51:46.051 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/932fd680-9aa0-49b4-9915-fa55104aaad7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc9kczawb" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:46 np0005588919 nova_compute[225855]: 2026-01-20 14:51:46.079 225859 DEBUG nova.storage.rbd_utils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] rbd image 932fd680-9aa0-49b4-9915-fa55104aaad7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:46 np0005588919 nova_compute[225855]: 2026-01-20 14:51:46.082 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/932fd680-9aa0-49b4-9915-fa55104aaad7/disk.config 932fd680-9aa0-49b4-9915-fa55104aaad7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:46 np0005588919 nova_compute[225855]: 2026-01-20 14:51:46.241 225859 DEBUG oslo_concurrency.processutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/932fd680-9aa0-49b4-9915-fa55104aaad7/disk.config 932fd680-9aa0-49b4-9915-fa55104aaad7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:46 np0005588919 nova_compute[225855]: 2026-01-20 14:51:46.242 225859 INFO nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Deleting local config drive /var/lib/nova/instances/932fd680-9aa0-49b4-9915-fa55104aaad7/disk.config because it was imported into RBD.#033[00m
Jan 20 09:51:46 np0005588919 systemd-machined[194361]: New machine qemu-51-instance-00000072.
Jan 20 09:51:46 np0005588919 systemd[1]: Started Virtual Machine qemu-51-instance-00000072.
Jan 20 09:51:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:47.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.073 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920707.0728478, 932fd680-9aa0-49b4-9915-fa55104aaad7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.074 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.077 225859 DEBUG nova.compute.manager [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.077 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.080 225859 INFO nova.virt.libvirt.driver [-] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Instance spawned successfully.#033[00m
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.080 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.102 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.107 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.111 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.111 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.112 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.113 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.113 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.114 225859 DEBUG nova.virt.libvirt.driver [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.166 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.167 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920707.073806, 932fd680-9aa0-49b4-9915-fa55104aaad7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.167 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] VM Started (Lifecycle Event)
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.242 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.244 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.258 225859 INFO nova.compute.manager [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Took 4.15 seconds to spawn the instance on the hypervisor.
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.259 225859 DEBUG nova.compute.manager [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.286 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.336 225859 INFO nova.compute.manager [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Took 5.14 seconds to build instance.
Jan 20 09:51:47 np0005588919 nova_compute[225855]: 2026-01-20 14:51:47.353 225859 DEBUG oslo_concurrency.lockutils [None req-43830f2f-2208-45ef-bcaf-0627d14e2cd6 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "932fd680-9aa0-49b4-9915-fa55104aaad7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:51:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:47.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:48 np0005588919 nova_compute[225855]: 2026-01-20 14:51:48.550 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:51:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:49.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:49.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:50 np0005588919 podman[269444]: 2026-01-20 14:51:50.002633844 +0000 UTC m=+0.048703796 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #88. Immutable memtables: 0.
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.023809) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 53] Flushing memtable with next log file: 88
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710023896, "job": 53, "event": "flush_started", "num_memtables": 1, "num_entries": 2497, "num_deletes": 258, "total_data_size": 5544581, "memory_usage": 5603312, "flush_reason": "Manual Compaction"}
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 53] Level-0 flush table #89: started
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710061396, "cf_name": "default", "job": 53, "event": "table_file_creation", "file_number": 89, "file_size": 3643008, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44416, "largest_seqno": 46908, "table_properties": {"data_size": 3632724, "index_size": 6522, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 22189, "raw_average_key_size": 21, "raw_value_size": 3611937, "raw_average_value_size": 3430, "num_data_blocks": 281, "num_entries": 1053, "num_filter_entries": 1053, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920523, "oldest_key_time": 1768920523, "file_creation_time": 1768920710, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 89, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 53] Flush lasted 37636 microseconds, and 8233 cpu microseconds.
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.061441) [db/flush_job.cc:967] [default] [JOB 53] Level-0 flush table #89: 3643008 bytes OK
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.061465) [db/memtable_list.cc:519] [default] Level-0 commit table #89 started
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.063487) [db/memtable_list.cc:722] [default] Level-0 commit table #89: memtable #1 done
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.063502) EVENT_LOG_v1 {"time_micros": 1768920710063498, "job": 53, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.063521) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 53] Try to delete WAL files size 5533444, prev total WAL file size 5533444, number of live WAL files 2.
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000085.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.064643) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033373635' seq:72057594037927935, type:22 .. '7061786F730034303137' seq:0, type:0; will stop at (end)
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 54] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 53 Base level 0, inputs: [89(3557KB)], [87(8750KB)]
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710064673, "job": 54, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [89], "files_L6": [87], "score": -1, "input_data_size": 12603927, "oldest_snapshot_seqno": -1}
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 54] Generated table #90: 7212 keys, 10795138 bytes, temperature: kUnknown
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710185933, "cf_name": "default", "job": 54, "event": "table_file_creation", "file_number": 90, "file_size": 10795138, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10747224, "index_size": 28794, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18053, "raw_key_size": 185910, "raw_average_key_size": 25, "raw_value_size": 10618578, "raw_average_value_size": 1472, "num_data_blocks": 1139, "num_entries": 7212, "num_filter_entries": 7212, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768920710, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 90, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.186189) [db/compaction/compaction_job.cc:1663] [default] [JOB 54] Compacted 1@0 + 1@6 files to L6 => 10795138 bytes
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.187931) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 103.9 rd, 89.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 8.5 +0.0 blob) out(10.3 +0.0 blob), read-write-amplify(6.4) write-amplify(3.0) OK, records in: 7743, records dropped: 531 output_compression: NoCompression
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.187977) EVENT_LOG_v1 {"time_micros": 1768920710187960, "job": 54, "event": "compaction_finished", "compaction_time_micros": 121357, "compaction_time_cpu_micros": 30256, "output_level": 6, "num_output_files": 1, "total_output_size": 10795138, "num_input_records": 7743, "num_output_records": 7212, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000089.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710188791, "job": 54, "event": "table_file_deletion", "file_number": 89}
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000087.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920710190237, "job": 54, "event": "table_file_deletion", "file_number": 87}
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.064578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.190300) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.190305) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.190307) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.190308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:51:50.190309) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:51:50 np0005588919 nova_compute[225855]: 2026-01-20 14:51:50.931 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:51:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:51.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:51.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:52 np0005588919 podman[269685]: 2026-01-20 14:51:52.828080819 +0000 UTC m=+0.063682979 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 20 09:51:52 np0005588919 podman[269685]: 2026-01-20 14:51:52.923233155 +0000 UTC m=+0.158835295 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 09:51:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:53.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:53 np0005588919 podman[269842]: 2026-01-20 14:51:53.472564704 +0000 UTC m=+0.059091709 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:51:53 np0005588919 podman[269842]: 2026-01-20 14:51:53.483151283 +0000 UTC m=+0.069678288 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 09:51:53 np0005588919 nova_compute[225855]: 2026-01-20 14:51:53.552 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:51:53 np0005588919 podman[269909]: 2026-01-20 14:51:53.67468956 +0000 UTC m=+0.048770158 container exec e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, name=keepalived, description=keepalived for Ceph, release=1793, vcs-type=git)
Jan 20 09:51:53 np0005588919 podman[269909]: 2026-01-20 14:51:53.689165269 +0000 UTC m=+0.063245867 container exec_died e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, distribution-scope=public, io.buildah.version=1.28.2)
Jan 20 09:51:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:51:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:53.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.119 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.120 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.143 225859 DEBUG nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.221 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.222 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.227 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.227 225859 INFO nova.compute.claims [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Claim successful on node compute-1.ctlplane.example.com
Jan 20 09:51:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.382 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:51:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:51:54 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/995231318' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.844 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.853 225859 DEBUG nova.compute.provider_tree [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.872 225859 DEBUG nova.scheduler.client.report [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.902 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.903 225859 DEBUG nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.951 225859 DEBUG nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.952 225859 DEBUG nova.network.neutron [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.970 225859 INFO nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 09:51:54 np0005588919 nova_compute[225855]: 2026-01-20 14:51:54.986 225859 DEBUG nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:51:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:55.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.079 225859 DEBUG nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.081 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.082 225859 INFO nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Creating image(s)#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.121 225859 DEBUG nova.storage.rbd_utils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.160 225859 DEBUG nova.storage.rbd_utils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.190 225859 DEBUG nova.storage.rbd_utils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.194 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.221 225859 DEBUG nova.policy [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd85d286ce6224326a0f4a15a06afbfea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.257 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.258 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.259 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.259 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.289 225859 DEBUG nova.storage.rbd_utils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.292 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 23ea4537-f03f-46de-881f-b979e232a3b9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.569 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 23ea4537-f03f-46de-881f-b979e232a3b9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.277s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.641 225859 DEBUG nova.storage.rbd_utils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] resizing rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.748 225859 DEBUG nova.objects.instance [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'migration_context' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:55.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.767 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.768 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Ensure instance console log exists: /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.769 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.769 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.770 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:55 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:51:55 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:51:55 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:51:55 np0005588919 nova_compute[225855]: 2026-01-20 14:51:55.932 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:56 np0005588919 nova_compute[225855]: 2026-01-20 14:51:56.061 225859 DEBUG nova.network.neutron [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Successfully created port: 234381ea-07b1-41fe-b3c1-be97ce6a3b64 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:51:56 np0005588919 nova_compute[225855]: 2026-01-20 14:51:56.948 225859 DEBUG nova.network.neutron [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Successfully updated port: 234381ea-07b1-41fe-b3c1-be97ce6a3b64 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:51:56 np0005588919 nova_compute[225855]: 2026-01-20 14:51:56.961 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:56 np0005588919 nova_compute[225855]: 2026-01-20 14:51:56.961 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquired lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:56 np0005588919 nova_compute[225855]: 2026-01-20 14:51:56.962 225859 DEBUG nova.network.neutron [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:51:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:57.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:57 np0005588919 nova_compute[225855]: 2026-01-20 14:51:57.072 225859 DEBUG nova.compute.manager [req-7d5244c6-5f1c-44a0-8712-98cf17b16a1e req-411d5867-42e1-44dc-a564-3d5df9652432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-changed-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:51:57 np0005588919 nova_compute[225855]: 2026-01-20 14:51:57.072 225859 DEBUG nova.compute.manager [req-7d5244c6-5f1c-44a0-8712-98cf17b16a1e req-411d5867-42e1-44dc-a564-3d5df9652432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Refreshing instance network info cache due to event network-changed-234381ea-07b1-41fe-b3c1-be97ce6a3b64. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:51:57 np0005588919 nova_compute[225855]: 2026-01-20 14:51:57.073 225859 DEBUG oslo_concurrency.lockutils [req-7d5244c6-5f1c-44a0-8712-98cf17b16a1e req-411d5867-42e1-44dc-a564-3d5df9652432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:51:57 np0005588919 nova_compute[225855]: 2026-01-20 14:51:57.164 225859 DEBUG nova.network.neutron [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:51:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:51:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:51:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:57.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.301 225859 DEBUG nova.network.neutron [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Updating instance_info_cache with network_info: [{"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.329 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Releasing lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.329 225859 DEBUG nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Instance network_info: |[{"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.329 225859 DEBUG oslo_concurrency.lockutils [req-7d5244c6-5f1c-44a0-8712-98cf17b16a1e req-411d5867-42e1-44dc-a564-3d5df9652432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.330 225859 DEBUG nova.network.neutron [req-7d5244c6-5f1c-44a0-8712-98cf17b16a1e req-411d5867-42e1-44dc-a564-3d5df9652432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Refreshing network info cache for port 234381ea-07b1-41fe-b3c1-be97ce6a3b64 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.332 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Start _get_guest_xml network_info=[{"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.336 225859 WARNING nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.341 225859 DEBUG nova.virt.libvirt.host [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.343 225859 DEBUG nova.virt.libvirt.host [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.351 225859 DEBUG nova.virt.libvirt.host [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.352 225859 DEBUG nova.virt.libvirt.host [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.354 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.355 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.355 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.356 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.356 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.357 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.357 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.358 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.358 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.358 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.359 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.359 225859 DEBUG nova.virt.hardware [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.364 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.556 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:51:58 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1996025638' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.848 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.874 225859 DEBUG nova.storage.rbd_utils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:58 np0005588919 nova_compute[225855]: 2026-01-20 14:51:58.880 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:51:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:51:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:51:59.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.369 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.371 225859 DEBUG nova.virt.libvirt.vif [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:51:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-828759404',display_name='tempest-ServerStableDeviceRescueTest-server-828759404',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-828759404',id=117,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-ztpzn050',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='te
mpest-ServerStableDeviceRescueTest-129078052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:51:55Z,user_data=None,user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=23ea4537-f03f-46de-881f-b979e232a3b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.371 225859 DEBUG nova.network.os_vif_util [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.372 225859 DEBUG nova.network.os_vif_util [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:55:3c,bridge_name='br-int',has_traffic_filtering=True,id=234381ea-07b1-41fe-b3c1-be97ce6a3b64,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234381ea-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.373 225859 DEBUG nova.objects.instance [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'pci_devices' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.391 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <uuid>23ea4537-f03f-46de-881f-b979e232a3b9</uuid>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <name>instance-00000075</name>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-828759404</nova:name>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:51:58</nova:creationTime>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <nova:user uuid="d85d286ce6224326a0f4a15a06afbfea">tempest-ServerStableDeviceRescueTest-129078052-project-member</nova:user>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <nova:project uuid="0a29915e0dd2403fbd7b7e847696b00a">tempest-ServerStableDeviceRescueTest-129078052</nova:project>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <nova:port uuid="234381ea-07b1-41fe-b3c1-be97ce6a3b64">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <entry name="serial">23ea4537-f03f-46de-881f-b979e232a3b9</entry>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <entry name="uuid">23ea4537-f03f-46de-881f-b979e232a3b9</entry>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/23ea4537-f03f-46de-881f-b979e232a3b9_disk">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/23ea4537-f03f-46de-881f-b979e232a3b9_disk.config">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:b5:55:3c"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <target dev="tap234381ea-07"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/console.log" append="off"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:51:59 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:51:59 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:51:59 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:51:59 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.392 225859 DEBUG nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Preparing to wait for external event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.393 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.393 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.393 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.394 225859 DEBUG nova.virt.libvirt.vif [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:51:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-828759404',display_name='tempest-ServerStableDeviceRescueTest-server-828759404',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-828759404',id=117,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-ztpzn050',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_use
r_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:51:55Z,user_data=None,user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=23ea4537-f03f-46de-881f-b979e232a3b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.394 225859 DEBUG nova.network.os_vif_util [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.395 225859 DEBUG nova.network.os_vif_util [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:55:3c,bridge_name='br-int',has_traffic_filtering=True,id=234381ea-07b1-41fe-b3c1-be97ce6a3b64,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234381ea-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.395 225859 DEBUG os_vif [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:55:3c,bridge_name='br-int',has_traffic_filtering=True,id=234381ea-07b1-41fe-b3c1-be97ce6a3b64,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234381ea-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.396 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.397 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.397 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.403 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.404 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap234381ea-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.404 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap234381ea-07, col_values=(('external_ids', {'iface-id': '234381ea-07b1-41fe-b3c1-be97ce6a3b64', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b5:55:3c', 'vm-uuid': '23ea4537-f03f-46de-881f-b979e232a3b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.407 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:59 np0005588919 NetworkManager[49104]: <info>  [1768920719.4083] manager: (tap234381ea-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.416 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.416 225859 INFO os_vif [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:55:3c,bridge_name='br-int',has_traffic_filtering=True,id=234381ea-07b1-41fe-b3c1-be97ce6a3b64,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234381ea-07')#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.473 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.474 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.474 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No VIF found with MAC fa:16:3e:b5:55:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.474 225859 INFO nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Using config drive#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.500 225859 DEBUG nova.storage.rbd_utils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.581 225859 DEBUG nova.network.neutron [req-7d5244c6-5f1c-44a0-8712-98cf17b16a1e req-411d5867-42e1-44dc-a564-3d5df9652432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Updated VIF entry in instance network info cache for port 234381ea-07b1-41fe-b3c1-be97ce6a3b64. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.582 225859 DEBUG nova.network.neutron [req-7d5244c6-5f1c-44a0-8712-98cf17b16a1e req-411d5867-42e1-44dc-a564-3d5df9652432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Updating instance_info_cache with network_info: [{"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.605 225859 DEBUG oslo_concurrency.lockutils [req-7d5244c6-5f1c-44a0-8712-98cf17b16a1e req-411d5867-42e1-44dc-a564-3d5df9652432 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:51:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:51:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:51:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:51:59.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.912 225859 INFO nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Creating config drive at /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config#033[00m
Jan 20 09:51:59 np0005588919 nova_compute[225855]: 2026-01-20 14:51:59.916 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjxbz3kh0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.049 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjxbz3kh0" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.076 225859 DEBUG nova.storage.rbd_utils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.079 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config 23ea4537-f03f-46de-881f-b979e232a3b9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.261 225859 DEBUG oslo_concurrency.processutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config 23ea4537-f03f-46de-881f-b979e232a3b9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.262 225859 INFO nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Deleting local config drive /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config because it was imported into RBD.#033[00m
Jan 20 09:52:00 np0005588919 kernel: tap234381ea-07: entered promiscuous mode
Jan 20 09:52:00 np0005588919 NetworkManager[49104]: <info>  [1768920720.3008] manager: (tap234381ea-07): new Tun device (/org/freedesktop/NetworkManager/Devices/189)
Jan 20 09:52:00 np0005588919 systemd-udevd[270395]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.361 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:00 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:00Z|00437|binding|INFO|Claiming lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 for this chassis.
Jan 20 09:52:00 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:00Z|00438|binding|INFO|234381ea-07b1-41fe-b3c1-be97ce6a3b64: Claiming fa:16:3e:b5:55:3c 10.100.0.14
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.367 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:00 np0005588919 NetworkManager[49104]: <info>  [1768920720.3727] device (tap234381ea-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:52:00 np0005588919 NetworkManager[49104]: <info>  [1768920720.3746] device (tap234381ea-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.374 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:55:3c 10.100.0.14'], port_security=['fa:16:3e:b5:55:3c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '23ea4537-f03f-46de-881f-b979e232a3b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=234381ea-07b1-41fe-b3c1-be97ce6a3b64) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.375 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 234381ea-07b1-41fe-b3c1-be97ce6a3b64 in datapath 79184781-1f23-4584-87de-08e262242488 bound to our chassis#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.377 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.387 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[34ac9ce1-e8be-40a7-a3a3-7aa9628213b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.388 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap79184781-11 in ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:52:00 np0005588919 systemd-machined[194361]: New machine qemu-52-instance-00000075.
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.390 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap79184781-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.390 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f9c049d9-6562-4414-80cf-d6cd64eafaa3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.391 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[90874d05-c1b9-43cc-8a63-855a5c897673]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.401 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[2d25b206-a839-45a6-bff0-e6af170d56a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 systemd[1]: Started Virtual Machine qemu-52-instance-00000075.
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.429 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e50bc71c-f0b9-4897-928c-981849774e5d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:00Z|00439|binding|INFO|Setting lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 ovn-installed in OVS
Jan 20 09:52:00 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:00Z|00440|binding|INFO|Setting lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 up in Southbound
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.444 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.463 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[1f8a7b9d-0bc0-4135-a35e-3b88583a7d97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 NetworkManager[49104]: <info>  [1768920720.4688] manager: (tap79184781-10): new Veth device (/org/freedesktop/NetworkManager/Devices/190)
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.468 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[755ecd3e-d89a-4652-9793-ec88224d6ac0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.498 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f71c5555-c884-4dd6-ae3c-8881d817a96e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.505 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b46c5ffb-4b9f-4f0a-b8e7-6edb3160305d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 NetworkManager[49104]: <info>  [1768920720.5268] device (tap79184781-10): carrier: link connected
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.531 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[df82f266-e0e9-4eed-b071-a47743285f60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.549 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b6be8f08-ed82-472d-b92f-ee18a70c20c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575945, 'reachable_time': 32082, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270431, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.567 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a841b0f4-8c19-4c52-8b90-d5c07fc32e32]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:7c2a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 575945, 'tstamp': 575945}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270432, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.586 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b6447f6c-5be7-44c7-840d-ff608edca17a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575945, 'reachable_time': 32082, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 270433, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.620 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[907d3690-11de-4e4a-940a-9c349c394bb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.684 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[74f13395-bc5a-4464-9b47-75f6930bfbbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.685 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.686 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.686 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.687 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:00 np0005588919 NetworkManager[49104]: <info>  [1768920720.6883] manager: (tap79184781-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Jan 20 09:52:00 np0005588919 kernel: tap79184781-10: entered promiscuous mode
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.690 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.693 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.694 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:52:00 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:00Z|00441|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.698 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.699 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.700 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c4964c62-ef64-45d9-bc1a-b9404a7a8c9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.701 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-79184781-1f23-4584-87de-08e262242488
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 79184781-1f23-4584-87de-08e262242488
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 20 09:52:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:00.701 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'env', 'PROCESS_TAG=haproxy-79184781-1f23-4584-87de-08e262242488', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/79184781-1f23-4584-87de-08e262242488.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.729 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:52:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:52:00 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:52:00 np0005588919 nova_compute[225855]: 2026-01-20 14:52:00.934 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:52:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:01.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:01 np0005588919 podman[270512]: 2026-01-20 14:52:01.121636866 +0000 UTC m=+0.052413831 container create 60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Jan 20 09:52:01 np0005588919 systemd[1]: Started libpod-conmon-60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341.scope.
Jan 20 09:52:01 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:52:01 np0005588919 podman[270512]: 2026-01-20 14:52:01.090080235 +0000 UTC m=+0.020857220 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:52:01 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4d48cd3dd5c0d4dc71eeb194f1e389fb38ec773128a339e2e0b3111348c8895/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:52:01 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #49. Immutable memtables: 6.
Jan 20 09:52:01 np0005588919 podman[270512]: 2026-01-20 14:52:01.201733987 +0000 UTC m=+0.132510982 container init 60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 20 09:52:01 np0005588919 podman[270512]: 2026-01-20 14:52:01.207662805 +0000 UTC m=+0.138439770 container start 60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:52:01 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[270527]: [NOTICE]   (270531) : New worker (270533) forked
Jan 20 09:52:01 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[270527]: [NOTICE]   (270531) : Loading success.
Jan 20 09:52:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:01.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.031 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920722.0307524, 23ea4537-f03f-46de-881f-b979e232a3b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.032 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] VM Started (Lifecycle Event)
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.063 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.069 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920722.0310616, 23ea4537-f03f-46de-881f-b979e232a3b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.069 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] VM Paused (Lifecycle Event)
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.087 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.093 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.114 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:52:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.745 225859 DEBUG nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.746 225859 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.746 225859 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.747 225859 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.747 225859 DEBUG nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Processing event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.747 225859 DEBUG nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.747 225859 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.748 225859 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.748 225859 DEBUG oslo_concurrency.lockutils [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.748 225859 DEBUG nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.748 225859 WARNING nova.compute.manager [req-b3bf3c1c-1fed-4f57-ba57-b10398f2bec4 req-b67be893-9ebb-4432-9147-ae90394bf5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received unexpected event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with vm_state building and task_state spawning.
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.749 225859 DEBUG nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.752 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920722.7522752, 23ea4537-f03f-46de-881f-b979e232a3b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.752 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] VM Resumed (Lifecycle Event)
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.754 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.757 225859 INFO nova.virt.libvirt.driver [-] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Instance spawned successfully.
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.758 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.775 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.781 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.785 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.785 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.785 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.786 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.786 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.786 225859 DEBUG nova.virt.libvirt.driver [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.823 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.860 225859 INFO nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Took 7.78 seconds to spawn the instance on the hypervisor.
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.860 225859 DEBUG nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:52:02 np0005588919 nova_compute[225855]: 2026-01-20 14:52:02.955 225859 INFO nova.compute.manager [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Took 8.77 seconds to build instance.
Jan 20 09:52:03 np0005588919 nova_compute[225855]: 2026-01-20 14:52:03.019 225859 DEBUG oslo_concurrency.lockutils [None req-dfa82bcd-d307-4023-be3c-6e308041cbda d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:52:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:03.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:03.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:04 np0005588919 nova_compute[225855]: 2026-01-20 14:52:04.408 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:52:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:05.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:05 np0005588919 nova_compute[225855]: 2026-01-20 14:52:05.637 225859 DEBUG nova.compute.manager [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 09:52:05 np0005588919 nova_compute[225855]: 2026-01-20 14:52:05.689 225859 INFO nova.compute.manager [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] instance snapshotting
Jan 20 09:52:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:05.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:05 np0005588919 nova_compute[225855]: 2026-01-20 14:52:05.939 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:52:05 np0005588919 nova_compute[225855]: 2026-01-20 14:52:05.954 225859 INFO nova.virt.libvirt.driver [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Beginning live snapshot process
Jan 20 09:52:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:52:06 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2460387741' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:52:06 np0005588919 nova_compute[225855]: 2026-01-20 14:52:06.234 225859 DEBUG nova.virt.libvirt.imagebackend [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Jan 20 09:52:06 np0005588919 nova_compute[225855]: 2026-01-20 14:52:06.511 225859 DEBUG nova.storage.rbd_utils [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] creating snapshot(783df5f942d44e81a853b6c48ec72869) on rbd image(23ea4537-f03f-46de-881f-b979e232a3b9_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 20 09:52:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e262 e262: 3 total, 3 up, 3 in
Jan 20 09:52:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:07.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:07 np0005588919 nova_compute[225855]: 2026-01-20 14:52:07.092 225859 DEBUG nova.storage.rbd_utils [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] cloning vms/23ea4537-f03f-46de-881f-b979e232a3b9_disk@783df5f942d44e81a853b6c48ec72869 to images/a1d1cbcb-c6a5-49c9-8868-06c3872a40d2 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Jan 20 09:52:07 np0005588919 nova_compute[225855]: 2026-01-20 14:52:07.191 225859 DEBUG nova.storage.rbd_utils [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] flattening images/a1d1cbcb-c6a5-49c9-8868-06c3872a40d2 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Jan 20 09:52:07 np0005588919 nova_compute[225855]: 2026-01-20 14:52:07.440 225859 DEBUG nova.storage.rbd_utils [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] removing snapshot(783df5f942d44e81a853b6c48ec72869) on rbd image(23ea4537-f03f-46de-881f-b979e232a3b9_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Jan 20 09:52:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:07.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e263 e263: 3 total, 3 up, 3 in
Jan 20 09:52:08 np0005588919 nova_compute[225855]: 2026-01-20 14:52:08.292 225859 DEBUG nova.storage.rbd_utils [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] creating snapshot(snap) on rbd image(a1d1cbcb-c6a5-49c9-8868-06c3872a40d2) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:52:09 np0005588919 podman[270729]: 2026-01-20 14:52:09.030826832 +0000 UTC m=+0.073400754 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:52:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:09.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e264 e264: 3 total, 3 up, 3 in
Jan 20 09:52:09 np0005588919 nova_compute[225855]: 2026-01-20 14:52:09.409 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:09.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:10 np0005588919 nova_compute[225855]: 2026-01-20 14:52:10.941 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:11.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:11.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:12 np0005588919 nova_compute[225855]: 2026-01-20 14:52:12.277 225859 INFO nova.virt.libvirt.driver [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Snapshot image upload complete#033[00m
Jan 20 09:52:12 np0005588919 nova_compute[225855]: 2026-01-20 14:52:12.278 225859 INFO nova.compute.manager [None req-98302ea4-660c-495b-a273-3923f15a0b09 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Took 6.59 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 20 09:52:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:12.810 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:52:12 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:12.811 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:52:12 np0005588919 nova_compute[225855]: 2026-01-20 14:52:12.812 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:13.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:13.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:14 np0005588919 nova_compute[225855]: 2026-01-20 14:52:14.411 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e265 e265: 3 total, 3 up, 3 in
Jan 20 09:52:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:15.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:15 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:15Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b5:55:3c 10.100.0.14
Jan 20 09:52:15 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:15Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b5:55:3c 10.100.0.14
Jan 20 09:52:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:15.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:15 np0005588919 nova_compute[225855]: 2026-01-20 14:52:15.941 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:16.411 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:16.412 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:16.412 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:17.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:17 np0005588919 nova_compute[225855]: 2026-01-20 14:52:17.175 225859 INFO nova.compute.manager [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Rescuing#033[00m
Jan 20 09:52:17 np0005588919 nova_compute[225855]: 2026-01-20 14:52:17.175 225859 DEBUG oslo_concurrency.lockutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:52:17 np0005588919 nova_compute[225855]: 2026-01-20 14:52:17.175 225859 DEBUG oslo_concurrency.lockutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquired lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:52:17 np0005588919 nova_compute[225855]: 2026-01-20 14:52:17.176 225859 DEBUG nova.network.neutron [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:52:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:17.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:19.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:19 np0005588919 nova_compute[225855]: 2026-01-20 14:52:19.413 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:19.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.325 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Acquiring lock "932fd680-9aa0-49b4-9915-fa55104aaad7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.326 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "932fd680-9aa0-49b4-9915-fa55104aaad7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.326 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Acquiring lock "932fd680-9aa0-49b4-9915-fa55104aaad7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.326 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "932fd680-9aa0-49b4-9915-fa55104aaad7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.327 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "932fd680-9aa0-49b4-9915-fa55104aaad7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.328 225859 INFO nova.compute.manager [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Terminating instance#033[00m
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.329 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Acquiring lock "refresh_cache-932fd680-9aa0-49b4-9915-fa55104aaad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.329 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Acquired lock "refresh_cache-932fd680-9aa0-49b4-9915-fa55104aaad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.329 225859 DEBUG nova.network.neutron [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.800 225859 DEBUG nova.network.neutron [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:52:20 np0005588919 nova_compute[225855]: 2026-01-20 14:52:20.943 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:21 np0005588919 podman[270813]: 2026-01-20 14:52:21.008007108 +0000 UTC m=+0.050027874 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:52:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:21.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:21.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:22.813 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:23.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:23 np0005588919 nova_compute[225855]: 2026-01-20 14:52:23.104 225859 DEBUG nova.network.neutron [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:52:23 np0005588919 nova_compute[225855]: 2026-01-20 14:52:23.128 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Releasing lock "refresh_cache-932fd680-9aa0-49b4-9915-fa55104aaad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:52:23 np0005588919 nova_compute[225855]: 2026-01-20 14:52:23.128 225859 DEBUG nova.compute.manager [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:52:23 np0005588919 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000072.scope: Deactivated successfully.
Jan 20 09:52:23 np0005588919 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000072.scope: Consumed 14.404s CPU time.
Jan 20 09:52:23 np0005588919 systemd-machined[194361]: Machine qemu-51-instance-00000072 terminated.
Jan 20 09:52:23 np0005588919 nova_compute[225855]: 2026-01-20 14:52:23.217 225859 DEBUG nova.network.neutron [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Updating instance_info_cache with network_info: [{"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:52:23 np0005588919 nova_compute[225855]: 2026-01-20 14:52:23.243 225859 DEBUG oslo_concurrency.lockutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Releasing lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:52:23 np0005588919 nova_compute[225855]: 2026-01-20 14:52:23.352 225859 INFO nova.virt.libvirt.driver [-] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Instance destroyed successfully.#033[00m
Jan 20 09:52:23 np0005588919 nova_compute[225855]: 2026-01-20 14:52:23.353 225859 DEBUG nova.objects.instance [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lazy-loading 'resources' on Instance uuid 932fd680-9aa0-49b4-9915-fa55104aaad7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:23 np0005588919 nova_compute[225855]: 2026-01-20 14:52:23.777 225859 INFO nova.virt.libvirt.driver [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Deleting instance files /var/lib/nova/instances/932fd680-9aa0-49b4-9915-fa55104aaad7_del#033[00m
Jan 20 09:52:23 np0005588919 nova_compute[225855]: 2026-01-20 14:52:23.779 225859 INFO nova.virt.libvirt.driver [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Deletion of /var/lib/nova/instances/932fd680-9aa0-49b4-9915-fa55104aaad7_del complete#033[00m
Jan 20 09:52:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:23.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:24 np0005588919 nova_compute[225855]: 2026-01-20 14:52:24.415 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:25 np0005588919 nova_compute[225855]: 2026-01-20 14:52:25.069 225859 INFO nova.compute.manager [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Took 1.94 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:52:25 np0005588919 nova_compute[225855]: 2026-01-20 14:52:25.069 225859 DEBUG oslo.service.loopingcall [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:52:25 np0005588919 nova_compute[225855]: 2026-01-20 14:52:25.069 225859 DEBUG nova.compute.manager [-] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:52:25 np0005588919 nova_compute[225855]: 2026-01-20 14:52:25.070 225859 DEBUG nova.network.neutron [-] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:52:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:25.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:25 np0005588919 nova_compute[225855]: 2026-01-20 14:52:25.457 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:52:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:52:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:25.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:52:25 np0005588919 nova_compute[225855]: 2026-01-20 14:52:25.945 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:27.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:27 np0005588919 nova_compute[225855]: 2026-01-20 14:52:27.281 225859 DEBUG nova.network.neutron [-] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:52:27 np0005588919 nova_compute[225855]: 2026-01-20 14:52:27.368 225859 DEBUG nova.network.neutron [-] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:52:27 np0005588919 nova_compute[225855]: 2026-01-20 14:52:27.395 225859 INFO nova.compute.manager [-] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Took 2.32 seconds to deallocate network for instance.#033[00m
Jan 20 09:52:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:27 np0005588919 nova_compute[225855]: 2026-01-20 14:52:27.501 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:27 np0005588919 nova_compute[225855]: 2026-01-20 14:52:27.502 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:27 np0005588919 nova_compute[225855]: 2026-01-20 14:52:27.682 225859 DEBUG oslo_concurrency.processutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:27 np0005588919 kernel: tap234381ea-07 (unregistering): left promiscuous mode
Jan 20 09:52:27 np0005588919 NetworkManager[49104]: <info>  [1768920747.7382] device (tap234381ea-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:52:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:27Z|00442|binding|INFO|Releasing lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 from this chassis (sb_readonly=0)
Jan 20 09:52:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:27Z|00443|binding|INFO|Setting lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 down in Southbound
Jan 20 09:52:27 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:27Z|00444|binding|INFO|Removing iface tap234381ea-07 ovn-installed in OVS
Jan 20 09:52:27 np0005588919 nova_compute[225855]: 2026-01-20 14:52:27.747 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:27 np0005588919 nova_compute[225855]: 2026-01-20 14:52:27.763 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:27.767 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:55:3c 10.100.0.14'], port_security=['fa:16:3e:b5:55:3c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '23ea4537-f03f-46de-881f-b979e232a3b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=234381ea-07b1-41fe-b3c1-be97ce6a3b64) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:52:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:27.768 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 234381ea-07b1-41fe-b3c1-be97ce6a3b64 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:52:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:27.770 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79184781-1f23-4584-87de-08e262242488, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:52:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:27.774 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b36381-fdc2-4163-a2fe-88c01f0eef0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:27.774 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace which is not needed anymore#033[00m
Jan 20 09:52:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:52:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:27.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:52:27 np0005588919 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000075.scope: Deactivated successfully.
Jan 20 09:52:27 np0005588919 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000075.scope: Consumed 15.258s CPU time.
Jan 20 09:52:27 np0005588919 systemd-machined[194361]: Machine qemu-52-instance-00000075 terminated.
Jan 20 09:52:27 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[270527]: [NOTICE]   (270531) : haproxy version is 2.8.14-c23fe91
Jan 20 09:52:27 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[270527]: [NOTICE]   (270531) : path to executable is /usr/sbin/haproxy
Jan 20 09:52:27 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[270527]: [WARNING]  (270531) : Exiting Master process...
Jan 20 09:52:27 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[270527]: [ALERT]    (270531) : Current worker (270533) exited with code 143 (Terminated)
Jan 20 09:52:27 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[270527]: [WARNING]  (270531) : All workers exited. Exiting... (0)
Jan 20 09:52:27 np0005588919 systemd[1]: libpod-60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341.scope: Deactivated successfully.
Jan 20 09:52:27 np0005588919 podman[270895]: 2026-01-20 14:52:27.903434554 +0000 UTC m=+0.046016110 container died 60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:52:27 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341-userdata-shm.mount: Deactivated successfully.
Jan 20 09:52:27 np0005588919 systemd[1]: var-lib-containers-storage-overlay-f4d48cd3dd5c0d4dc71eeb194f1e389fb38ec773128a339e2e0b3111348c8895-merged.mount: Deactivated successfully.
Jan 20 09:52:27 np0005588919 podman[270895]: 2026-01-20 14:52:27.953451626 +0000 UTC m=+0.096033182 container cleanup 60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 09:52:27 np0005588919 systemd[1]: libpod-conmon-60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341.scope: Deactivated successfully.
Jan 20 09:52:27 np0005588919 nova_compute[225855]: 2026-01-20 14:52:27.971 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:27 np0005588919 nova_compute[225855]: 2026-01-20 14:52:27.977 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:28 np0005588919 podman[270933]: 2026-01-20 14:52:28.024649376 +0000 UTC m=+0.048961643 container remove 60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:52:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:28.030 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b38290e0-0985-437a-b355-c65d6374e8a6]: (4, ('Tue Jan 20 02:52:27 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341)\n60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341\nTue Jan 20 02:52:27 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341)\n60e161a7f3185d9e79dabaf678e5dc136ea2a7d8acb23d5437f931e610606341\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:28.032 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfa018f-b3fe-479b-9da2-768e68d59303]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:28.032 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:28 np0005588919 nova_compute[225855]: 2026-01-20 14:52:28.034 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:28 np0005588919 kernel: tap79184781-10: left promiscuous mode
Jan 20 09:52:28 np0005588919 nova_compute[225855]: 2026-01-20 14:52:28.051 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:28.054 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2ef3c2-0a29-437c-ad7a-a9bbf08b3306]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:28.080 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ab088bd7-e269-4b1a-8e50-76d42b0a0b1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:28.081 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5a3fa512-2402-4a93-8d11-47a96e49ec9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:28.099 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f071c2d6-1622-4378-b342-1be660670dc3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575938, 'reachable_time': 37445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270961, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:28 np0005588919 systemd[1]: run-netns-ovnmeta\x2d79184781\x2d1f23\x2d4584\x2d87de\x2d08e262242488.mount: Deactivated successfully.
Jan 20 09:52:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:28.102 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-79184781-1f23-4584-87de-08e262242488 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:52:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:28.102 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[04119278-7ad5-4eca-8825-7b7f4cc7172d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:52:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3273996106' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:52:28 np0005588919 nova_compute[225855]: 2026-01-20 14:52:28.158 225859 DEBUG oslo_concurrency.processutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:28 np0005588919 nova_compute[225855]: 2026-01-20 14:52:28.163 225859 DEBUG nova.compute.provider_tree [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:52:28 np0005588919 nova_compute[225855]: 2026-01-20 14:52:28.476 225859 INFO nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:52:28 np0005588919 nova_compute[225855]: 2026-01-20 14:52:28.483 225859 INFO nova.virt.libvirt.driver [-] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Instance destroyed successfully.#033[00m
Jan 20 09:52:28 np0005588919 nova_compute[225855]: 2026-01-20 14:52:28.484 225859 DEBUG nova.objects.instance [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'numa_topology' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:29.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:29 np0005588919 nova_compute[225855]: 2026-01-20 14:52:29.417 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:29 np0005588919 nova_compute[225855]: 2026-01-20 14:52:29.718 225859 DEBUG nova.scheduler.client.report [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:52:29 np0005588919 nova_compute[225855]: 2026-01-20 14:52:29.726 225859 INFO nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Attempting a stable device rescue#033[00m
Jan 20 09:52:29 np0005588919 nova_compute[225855]: 2026-01-20 14:52:29.744 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:29.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:29 np0005588919 nova_compute[225855]: 2026-01-20 14:52:29.793 225859 INFO nova.scheduler.client.report [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Deleted allocations for instance 932fd680-9aa0-49b4-9915-fa55104aaad7#033[00m
Jan 20 09:52:29 np0005588919 nova_compute[225855]: 2026-01-20 14:52:29.884 225859 DEBUG oslo_concurrency.lockutils [None req-f87f7e56-fd46-4e2f-ae00-24f38def3109 cdcdce94e7354b3bafb34285408888b9 ecfc3366b9194864a3f15ce0114b5ee3 - - default default] Lock "932fd680-9aa0-49b4-9915-fa55104aaad7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.097 225859 DEBUG nova.compute.manager [req-0b3af9a4-1892-417e-be6d-926cde5a5e9d req-453b1863-f94d-4eb7-847f-67fb8523eed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-unplugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.098 225859 DEBUG oslo_concurrency.lockutils [req-0b3af9a4-1892-417e-be6d-926cde5a5e9d req-453b1863-f94d-4eb7-847f-67fb8523eed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.098 225859 DEBUG oslo_concurrency.lockutils [req-0b3af9a4-1892-417e-be6d-926cde5a5e9d req-453b1863-f94d-4eb7-847f-67fb8523eed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.098 225859 DEBUG oslo_concurrency.lockutils [req-0b3af9a4-1892-417e-be6d-926cde5a5e9d req-453b1863-f94d-4eb7-847f-67fb8523eed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.099 225859 DEBUG nova.compute.manager [req-0b3af9a4-1892-417e-be6d-926cde5a5e9d req-453b1863-f94d-4eb7-847f-67fb8523eed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-unplugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.099 225859 WARNING nova.compute.manager [req-0b3af9a4-1892-417e-be6d-926cde5a5e9d req-453b1863-f94d-4eb7-847f-67fb8523eed7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received unexpected event network-vif-unplugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.243 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.248 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.248 225859 INFO nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Creating image(s)#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.275 225859 DEBUG nova.storage.rbd_utils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.278 225859 DEBUG nova.objects.instance [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:30 np0005588919 nova_compute[225855]: 2026-01-20 14:52:30.947 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:31.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:31 np0005588919 nova_compute[225855]: 2026-01-20 14:52:31.266 225859 DEBUG nova.storage.rbd_utils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:52:31 np0005588919 nova_compute[225855]: 2026-01-20 14:52:31.295 225859 DEBUG nova.storage.rbd_utils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:52:31 np0005588919 nova_compute[225855]: 2026-01-20 14:52:31.298 225859 DEBUG oslo_concurrency.lockutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "a92690a6b5c5730c73a0f5ee421c950863bba099" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:31 np0005588919 nova_compute[225855]: 2026-01-20 14:52:31.299 225859 DEBUG oslo_concurrency.lockutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "a92690a6b5c5730c73a0f5ee421c950863bba099" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:31.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.086 225859 DEBUG nova.virt.libvirt.imagebackend [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/a1d1cbcb-c6a5-49c9-8868-06c3872a40d2/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/a1d1cbcb-c6a5-49c9-8868-06c3872a40d2/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.152 225859 DEBUG nova.virt.libvirt.imagebackend [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/a1d1cbcb-c6a5-49c9-8868-06c3872a40d2/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.153 225859 DEBUG nova.storage.rbd_utils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] cloning images/a1d1cbcb-c6a5-49c9-8868-06c3872a40d2@snap to None/23ea4537-f03f-46de-881f-b979e232a3b9_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.234 225859 DEBUG nova.compute.manager [req-6a91aa19-29fb-49c0-bb7a-224473e4046b req-5f0216ab-efc3-436e-9828-0734421de22e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.235 225859 DEBUG oslo_concurrency.lockutils [req-6a91aa19-29fb-49c0-bb7a-224473e4046b req-5f0216ab-efc3-436e-9828-0734421de22e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.235 225859 DEBUG oslo_concurrency.lockutils [req-6a91aa19-29fb-49c0-bb7a-224473e4046b req-5f0216ab-efc3-436e-9828-0734421de22e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.235 225859 DEBUG oslo_concurrency.lockutils [req-6a91aa19-29fb-49c0-bb7a-224473e4046b req-5f0216ab-efc3-436e-9828-0734421de22e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.236 225859 DEBUG nova.compute.manager [req-6a91aa19-29fb-49c0-bb7a-224473e4046b req-5f0216ab-efc3-436e-9828-0734421de22e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.236 225859 WARNING nova.compute.manager [req-6a91aa19-29fb-49c0-bb7a-224473e4046b req-5f0216ab-efc3-436e-9828-0734421de22e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received unexpected event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.247 225859 DEBUG oslo_concurrency.lockutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "a92690a6b5c5730c73a0f5ee421c950863bba099" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.948s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.291 225859 DEBUG nova.objects.instance [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'migration_context' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.306 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.308 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Start _get_guest_xml network_info=[{"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "vif_mac": "fa:16:3e:b5:55:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a1d1cbcb-c6a5-49c9-8868-06c3872a40d2', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.308 225859 DEBUG nova.objects.instance [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'resources' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.326 225859 WARNING nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.330 225859 DEBUG nova.virt.libvirt.host [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.331 225859 DEBUG nova.virt.libvirt.host [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.333 225859 DEBUG nova.virt.libvirt.host [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.334 225859 DEBUG nova.virt.libvirt.host [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.334 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.335 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.335 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.335 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.335 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.336 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.336 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.336 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.336 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.337 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.337 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.337 225859 DEBUG nova.virt.hardware [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.337 225859 DEBUG nova.objects.instance [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'vcpu_model' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.357 225859 DEBUG oslo_concurrency.processutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:52:32 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/861012501' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.806 225859 DEBUG oslo_concurrency.processutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:32 np0005588919 nova_compute[225855]: 2026-01-20 14:52:32.854 225859 DEBUG oslo_concurrency.processutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:52:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:33.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:52:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:52:33 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3561049918' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.294 225859 DEBUG oslo_concurrency.processutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.297 225859 DEBUG oslo_concurrency.processutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:52:33 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2828053923' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.716 225859 DEBUG oslo_concurrency.processutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.719 225859 DEBUG nova.virt.libvirt.vif [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-828759404',display_name='tempest-ServerStableDeviceRescueTest-server-828759404',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-828759404',id=117,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:52:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-ztpzn050',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_m
odel='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:52:12Z,user_data=None,user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=23ea4537-f03f-46de-881f-b979e232a3b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "vif_mac": "fa:16:3e:b5:55:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.719 225859 DEBUG nova.network.os_vif_util [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "vif_mac": "fa:16:3e:b5:55:3c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.720 225859 DEBUG nova.network.os_vif_util [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b5:55:3c,bridge_name='br-int',has_traffic_filtering=True,id=234381ea-07b1-41fe-b3c1-be97ce6a3b64,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234381ea-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.722 225859 DEBUG nova.objects.instance [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'pci_devices' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.758 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <uuid>23ea4537-f03f-46de-881f-b979e232a3b9</uuid>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <name>instance-00000075</name>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-828759404</nova:name>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:52:32</nova:creationTime>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <nova:user uuid="d85d286ce6224326a0f4a15a06afbfea">tempest-ServerStableDeviceRescueTest-129078052-project-member</nova:user>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <nova:project uuid="0a29915e0dd2403fbd7b7e847696b00a">tempest-ServerStableDeviceRescueTest-129078052</nova:project>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <nova:port uuid="234381ea-07b1-41fe-b3c1-be97ce6a3b64">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <entry name="serial">23ea4537-f03f-46de-881f-b979e232a3b9</entry>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <entry name="uuid">23ea4537-f03f-46de-881f-b979e232a3b9</entry>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/23ea4537-f03f-46de-881f-b979e232a3b9_disk">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/23ea4537-f03f-46de-881f-b979e232a3b9_disk.config">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/23ea4537-f03f-46de-881f-b979e232a3b9_disk.rescue">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <target dev="vdb" bus="virtio"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <boot order="1"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:b5:55:3c"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <target dev="tap234381ea-07"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/console.log" append="off"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:52:33 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:52:33 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:52:33 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:52:33 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.772 225859 INFO nova.virt.libvirt.driver [-] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Instance destroyed successfully.#033[00m
Jan 20 09:52:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:33.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.848 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.849 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.849 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.850 225859 DEBUG nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No VIF found with MAC fa:16:3e:b5:55:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.851 225859 INFO nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Using config drive#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.893 225859 DEBUG nova.storage.rbd_utils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:52:33 np0005588919 nova_compute[225855]: 2026-01-20 14:52:33.983 225859 DEBUG nova.objects.instance [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'ec2_ids' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:34 np0005588919 nova_compute[225855]: 2026-01-20 14:52:34.020 225859 DEBUG nova.objects.instance [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'keypairs' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:34 np0005588919 nova_compute[225855]: 2026-01-20 14:52:34.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:34 np0005588919 nova_compute[225855]: 2026-01-20 14:52:34.420 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:35.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.187 225859 INFO nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Creating config drive at /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config.rescue#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.195 225859 DEBUG oslo_concurrency.processutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkwe5xshx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.594 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.595 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.595 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.598 225859 DEBUG oslo_concurrency.processutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkwe5xshx" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.637 225859 DEBUG nova.storage.rbd_utils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image 23ea4537-f03f-46de-881f-b979e232a3b9_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.640 225859 DEBUG oslo_concurrency.processutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config.rescue 23ea4537-f03f-46de-881f-b979e232a3b9_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:35.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.810 225859 DEBUG oslo_concurrency.processutils [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config.rescue 23ea4537-f03f-46de-881f-b979e232a3b9_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.812 225859 INFO nova.virt.libvirt.driver [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Deleting local config drive /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9/disk.config.rescue because it was imported into RBD.#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.817 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.818 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.818 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.818 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:35 np0005588919 kernel: tap234381ea-07: entered promiscuous mode
Jan 20 09:52:35 np0005588919 NetworkManager[49104]: <info>  [1768920755.8822] manager: (tap234381ea-07): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.882 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:35Z|00445|binding|INFO|Claiming lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 for this chassis.
Jan 20 09:52:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:35Z|00446|binding|INFO|234381ea-07b1-41fe-b3c1-be97ce6a3b64: Claiming fa:16:3e:b5:55:3c 10.100.0.14
Jan 20 09:52:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:35Z|00447|binding|INFO|Setting lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 ovn-installed in OVS
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.901 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.904 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:35 np0005588919 systemd-udevd[271313]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:52:35 np0005588919 systemd-machined[194361]: New machine qemu-53-instance-00000075.
Jan 20 09:52:35 np0005588919 NetworkManager[49104]: <info>  [1768920755.9217] device (tap234381ea-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:52:35 np0005588919 NetworkManager[49104]: <info>  [1768920755.9223] device (tap234381ea-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:52:35 np0005588919 systemd[1]: Started Virtual Machine qemu-53-instance-00000075.
Jan 20 09:52:35 np0005588919 nova_compute[225855]: 2026-01-20 14:52:35.948 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:35.971 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:55:3c 10.100.0.14'], port_security=['fa:16:3e:b5:55:3c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '23ea4537-f03f-46de-881f-b979e232a3b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=234381ea-07b1-41fe-b3c1-be97ce6a3b64) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:52:35 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:35Z|00448|binding|INFO|Setting lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 up in Southbound
Jan 20 09:52:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:35.972 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 234381ea-07b1-41fe-b3c1-be97ce6a3b64 in datapath 79184781-1f23-4584-87de-08e262242488 bound to our chassis#033[00m
Jan 20 09:52:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:35.973 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:52:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:35.984 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[26eac545-e443-4b63-a110-111b9f0991f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:35.985 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap79184781-11 in ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:52:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:35.987 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap79184781-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:52:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:35.987 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a013a5f2-0907-4cd1-85fb-08562a7b2051]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:35.988 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c67c91b7-4d4c-4c10-b6d8-3f49adacd3cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.007 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff99268-c7e8-46a3-81da-3161d9a871b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.021 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c26a7436-1922-421c-9ab3-e27006e584a3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.049 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd2fce2-22d3-48b7-9feb-901a41f2d20d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.053 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c83666cc-462c-4fad-92b7-8171804b7c2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 NetworkManager[49104]: <info>  [1768920756.0543] manager: (tap79184781-10): new Veth device (/org/freedesktop/NetworkManager/Devices/193)
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.095 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6eab4527-51c1-46e3-8287-5f396f6dca1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.098 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[614466c6-d02c-4b1a-b4f1-15775b0c026e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 NetworkManager[49104]: <info>  [1768920756.1298] device (tap79184781-10): carrier: link connected
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.135 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[cce8b75d-004f-4faf-861c-d38a3c1af98c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.158 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c113a54c-8cfd-4115-aa83-647275da955d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 579505, 'reachable_time': 30090, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271348, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.176 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[655cec9d-6b64-43b4-966f-691cc71f4099]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:7c2a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 579505, 'tstamp': 579505}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271349, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.194 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0ef578-8502-4c4f-befc-a3b92a2fd52d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 579505, 'reachable_time': 30090, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271350, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.228 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4470f9-9174-4d99-8d00-0269aa13123c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.305 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e094836a-e2b7-4474-acf4-1c41a78f75e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.307 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.307 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.308 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.310 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:36 np0005588919 NetworkManager[49104]: <info>  [1768920756.3104] manager: (tap79184781-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Jan 20 09:52:36 np0005588919 kernel: tap79184781-10: entered promiscuous mode
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.314 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.314 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.315 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:36Z|00449|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.348 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.349 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.350 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.351 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1adae803-555d-4d16-8b4c-07ec44a7f03c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.352 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-79184781-1f23-4584-87de-08e262242488
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 79184781-1f23-4584-87de-08e262242488
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:52:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:36.353 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'env', 'PROCESS_TAG=haproxy-79184781-1f23-4584-87de-08e262242488', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/79184781-1f23-4584-87de-08e262242488.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.540 225859 DEBUG nova.compute.manager [req-a5be19e0-0fad-443f-96a6-efb5a4aa0cef req-64bc2af3-95da-40e9-ad1d-dcdc8d49031e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.541 225859 DEBUG oslo_concurrency.lockutils [req-a5be19e0-0fad-443f-96a6-efb5a4aa0cef req-64bc2af3-95da-40e9-ad1d-dcdc8d49031e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.541 225859 DEBUG oslo_concurrency.lockutils [req-a5be19e0-0fad-443f-96a6-efb5a4aa0cef req-64bc2af3-95da-40e9-ad1d-dcdc8d49031e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.541 225859 DEBUG oslo_concurrency.lockutils [req-a5be19e0-0fad-443f-96a6-efb5a4aa0cef req-64bc2af3-95da-40e9-ad1d-dcdc8d49031e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.542 225859 DEBUG nova.compute.manager [req-a5be19e0-0fad-443f-96a6-efb5a4aa0cef req-64bc2af3-95da-40e9-ad1d-dcdc8d49031e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.542 225859 WARNING nova.compute.manager [req-a5be19e0-0fad-443f-96a6-efb5a4aa0cef req-64bc2af3-95da-40e9-ad1d-dcdc8d49031e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received unexpected event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.664 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 23ea4537-f03f-46de-881f-b979e232a3b9 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.665 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920756.663731, 23ea4537-f03f-46de-881f-b979e232a3b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.665 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.669 225859 DEBUG nova.compute.manager [None req-863cf6e0-1eda-4be3-ad66-8b6c9d7954d5 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.693 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.696 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:52:36 np0005588919 podman[271439]: 2026-01-20 14:52:36.717527275 +0000 UTC m=+0.060328955 container create b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:52:36 np0005588919 systemd[1]: Started libpod-conmon-b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef.scope.
Jan 20 09:52:36 np0005588919 podman[271439]: 2026-01-20 14:52:36.689182224 +0000 UTC m=+0.031983914 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:52:36 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:52:36 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ecd8403a0105356438fdeb735c64fc58258b386b8c6a85324ab7dbc102be6b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:52:36 np0005588919 podman[271439]: 2026-01-20 14:52:36.808924895 +0000 UTC m=+0.151726585 container init b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 09:52:36 np0005588919 podman[271439]: 2026-01-20 14:52:36.819055631 +0000 UTC m=+0.161857301 container start b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:52:36 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271452]: [NOTICE]   (271456) : New worker (271458) forked
Jan 20 09:52:36 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271452]: [NOTICE]   (271456) : Loading success.
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.901 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.902 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920756.664075, 23ea4537-f03f-46de-881f-b979e232a3b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.902 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] VM Started (Lifecycle Event)#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.945 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:52:36 np0005588919 nova_compute[225855]: 2026-01-20 14:52:36.950 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:52:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:37.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e266 e266: 3 total, 3 up, 3 in
Jan 20 09:52:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:37.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.351 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920743.3496437, 932fd680-9aa0-49b4-9915-fa55104aaad7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.352 225859 INFO nova.compute.manager [-] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.682 225859 DEBUG nova.compute.manager [None req-95f49cd8-9434-4d4c-904b-f8bc42496fcc - - - - - -] [instance: 932fd680-9aa0-49b4-9915-fa55104aaad7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.713 225859 DEBUG nova.compute.manager [req-f80bafd7-5204-4400-b3e5-a924f48c3943 req-3f494ac1-9172-4ae8-9ee4-7bedd9f7afb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.714 225859 DEBUG oslo_concurrency.lockutils [req-f80bafd7-5204-4400-b3e5-a924f48c3943 req-3f494ac1-9172-4ae8-9ee4-7bedd9f7afb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.715 225859 DEBUG oslo_concurrency.lockutils [req-f80bafd7-5204-4400-b3e5-a924f48c3943 req-3f494ac1-9172-4ae8-9ee4-7bedd9f7afb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.715 225859 DEBUG oslo_concurrency.lockutils [req-f80bafd7-5204-4400-b3e5-a924f48c3943 req-3f494ac1-9172-4ae8-9ee4-7bedd9f7afb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.716 225859 DEBUG nova.compute.manager [req-f80bafd7-5204-4400-b3e5-a924f48c3943 req-3f494ac1-9172-4ae8-9ee4-7bedd9f7afb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.716 225859 WARNING nova.compute.manager [req-f80bafd7-5204-4400-b3e5-a924f48c3943 req-3f494ac1-9172-4ae8-9ee4-7bedd9f7afb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received unexpected event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 09:52:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e267 e267: 3 total, 3 up, 3 in
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.749 225859 INFO nova.compute.manager [None req-042e7a60-6768-4229-8936-c6493f807f55 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Unrescuing#033[00m
Jan 20 09:52:38 np0005588919 nova_compute[225855]: 2026-01-20 14:52:38.750 225859 DEBUG oslo_concurrency.lockutils [None req-042e7a60-6768-4229-8936-c6493f807f55 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:52:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:39.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:39 np0005588919 nova_compute[225855]: 2026-01-20 14:52:39.127 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Updating instance_info_cache with network_info: [{"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:52:39 np0005588919 nova_compute[225855]: 2026-01-20 14:52:39.422 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e268 e268: 3 total, 3 up, 3 in
Jan 20 09:52:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:39.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:40 np0005588919 podman[271469]: 2026-01-20 14:52:40.074062393 +0000 UTC m=+0.099661784 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:52:40 np0005588919 nova_compute[225855]: 2026-01-20 14:52:40.597 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:52:40 np0005588919 nova_compute[225855]: 2026-01-20 14:52:40.597 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:52:40 np0005588919 nova_compute[225855]: 2026-01-20 14:52:40.598 225859 DEBUG oslo_concurrency.lockutils [None req-042e7a60-6768-4229-8936-c6493f807f55 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquired lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:52:40 np0005588919 nova_compute[225855]: 2026-01-20 14:52:40.598 225859 DEBUG nova.network.neutron [None req-042e7a60-6768-4229-8936-c6493f807f55 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:52:40 np0005588919 nova_compute[225855]: 2026-01-20 14:52:40.599 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:40 np0005588919 nova_compute[225855]: 2026-01-20 14:52:40.600 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:40 np0005588919 nova_compute[225855]: 2026-01-20 14:52:40.600 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:52:40 np0005588919 nova_compute[225855]: 2026-01-20 14:52:40.951 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:41.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:41 np0005588919 nova_compute[225855]: 2026-01-20 14:52:41.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:41 np0005588919 nova_compute[225855]: 2026-01-20 14:52:41.402 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:41 np0005588919 nova_compute[225855]: 2026-01-20 14:52:41.403 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:41 np0005588919 nova_compute[225855]: 2026-01-20 14:52:41.404 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:41 np0005588919 nova_compute[225855]: 2026-01-20 14:52:41.404 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:52:41 np0005588919 nova_compute[225855]: 2026-01-20 14:52:41.405 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:41.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:52:41 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/383931634' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:52:41 np0005588919 nova_compute[225855]: 2026-01-20 14:52:41.859 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:41 np0005588919 nova_compute[225855]: 2026-01-20 14:52:41.939 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000075 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:52:41 np0005588919 nova_compute[225855]: 2026-01-20 14:52:41.939 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000075 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:52:41 np0005588919 nova_compute[225855]: 2026-01-20 14:52:41.940 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000075 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:52:42 np0005588919 nova_compute[225855]: 2026-01-20 14:52:42.095 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:52:42 np0005588919 nova_compute[225855]: 2026-01-20 14:52:42.097 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4219MB free_disk=20.806049346923828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:52:42 np0005588919 nova_compute[225855]: 2026-01-20 14:52:42.097 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:42 np0005588919 nova_compute[225855]: 2026-01-20 14:52:42.098 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:42 np0005588919 nova_compute[225855]: 2026-01-20 14:52:42.698 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 23ea4537-f03f-46de-881f-b979e232a3b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:52:42 np0005588919 nova_compute[225855]: 2026-01-20 14:52:42.698 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:52:42 np0005588919 nova_compute[225855]: 2026-01-20 14:52:42.698 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:52:42 np0005588919 nova_compute[225855]: 2026-01-20 14:52:42.753 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:52:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:43.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:52:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2181614927' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:52:43 np0005588919 nova_compute[225855]: 2026-01-20 14:52:43.248 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:52:43 np0005588919 nova_compute[225855]: 2026-01-20 14:52:43.256 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:52:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:43.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:43 np0005588919 nova_compute[225855]: 2026-01-20 14:52:43.851 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.210 225859 DEBUG nova.network.neutron [None req-042e7a60-6768-4229-8936-c6493f807f55 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Updating instance_info_cache with network_info: [{"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.248 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.248 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.348 225859 DEBUG oslo_concurrency.lockutils [None req-042e7a60-6768-4229-8936-c6493f807f55 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Releasing lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.350 225859 DEBUG nova.objects.instance [None req-042e7a60-6768-4229-8936-c6493f807f55 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'flavor' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.472 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:44 np0005588919 kernel: tap234381ea-07 (unregistering): left promiscuous mode
Jan 20 09:52:44 np0005588919 NetworkManager[49104]: <info>  [1768920764.5318] device (tap234381ea-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:52:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:44Z|00450|binding|INFO|Releasing lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 from this chassis (sb_readonly=0)
Jan 20 09:52:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:44Z|00451|binding|INFO|Setting lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 down in Southbound
Jan 20 09:52:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:44Z|00452|binding|INFO|Removing iface tap234381ea-07 ovn-installed in OVS
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.537 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.540 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.570 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:44 np0005588919 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000075.scope: Deactivated successfully.
Jan 20 09:52:44 np0005588919 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000075.scope: Consumed 8.917s CPU time.
Jan 20 09:52:44 np0005588919 systemd-machined[194361]: Machine qemu-53-instance-00000075 terminated.
Jan 20 09:52:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e269 e269: 3 total, 3 up, 3 in
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.657 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:55:3c 10.100.0.14'], port_security=['fa:16:3e:b5:55:3c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '23ea4537-f03f-46de-881f-b979e232a3b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=234381ea-07b1-41fe-b3c1-be97ce6a3b64) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.659 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 234381ea-07b1-41fe-b3c1-be97ce6a3b64 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.660 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79184781-1f23-4584-87de-08e262242488, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.661 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8c5e3e18-ea31-4021-ab25-ef3395d451da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.662 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace which is not needed anymore#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.714 225859 INFO nova.virt.libvirt.driver [-] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Instance destroyed successfully.#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.714 225859 DEBUG nova.objects.instance [None req-042e7a60-6768-4229-8936-c6493f807f55 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'numa_topology' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:52:44 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271452]: [NOTICE]   (271456) : haproxy version is 2.8.14-c23fe91
Jan 20 09:52:44 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271452]: [NOTICE]   (271456) : path to executable is /usr/sbin/haproxy
Jan 20 09:52:44 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271452]: [WARNING]  (271456) : Exiting Master process...
Jan 20 09:52:44 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271452]: [ALERT]    (271456) : Current worker (271458) exited with code 143 (Terminated)
Jan 20 09:52:44 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271452]: [WARNING]  (271456) : All workers exited. Exiting... (0)
Jan 20 09:52:44 np0005588919 systemd[1]: libpod-b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef.scope: Deactivated successfully.
Jan 20 09:52:44 np0005588919 podman[271578]: 2026-01-20 14:52:44.797113681 +0000 UTC m=+0.043756046 container died b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:52:44 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef-userdata-shm.mount: Deactivated successfully.
Jan 20 09:52:44 np0005588919 systemd[1]: var-lib-containers-storage-overlay-7ecd8403a0105356438fdeb735c64fc58258b386b8c6a85324ab7dbc102be6b3-merged.mount: Deactivated successfully.
Jan 20 09:52:44 np0005588919 podman[271578]: 2026-01-20 14:52:44.830289907 +0000 UTC m=+0.076932272 container cleanup b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:52:44 np0005588919 systemd[1]: libpod-conmon-b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef.scope: Deactivated successfully.
Jan 20 09:52:44 np0005588919 kernel: tap234381ea-07: entered promiscuous mode
Jan 20 09:52:44 np0005588919 NetworkManager[49104]: <info>  [1768920764.8750] manager: (tap234381ea-07): new Tun device (/org/freedesktop/NetworkManager/Devices/195)
Jan 20 09:52:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:44Z|00453|binding|INFO|Claiming lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 for this chassis.
Jan 20 09:52:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:44Z|00454|binding|INFO|234381ea-07b1-41fe-b3c1-be97ce6a3b64: Claiming fa:16:3e:b5:55:3c 10.100.0.14
Jan 20 09:52:44 np0005588919 systemd-udevd[271547]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.878 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:44 np0005588919 NetworkManager[49104]: <info>  [1768920764.8879] device (tap234381ea-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:52:44 np0005588919 NetworkManager[49104]: <info>  [1768920764.8883] device (tap234381ea-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.891 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:55:3c 10.100.0.14'], port_security=['fa:16:3e:b5:55:3c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '23ea4537-f03f-46de-881f-b979e232a3b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=234381ea-07b1-41fe-b3c1-be97ce6a3b64) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:52:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:44Z|00455|binding|INFO|Setting lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 ovn-installed in OVS
Jan 20 09:52:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:44Z|00456|binding|INFO|Setting lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 up in Southbound
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.893 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.896 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:44 np0005588919 podman[271611]: 2026-01-20 14:52:44.904890353 +0000 UTC m=+0.052590225 container remove b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 20 09:52:44 np0005588919 systemd-machined[194361]: New machine qemu-54-instance-00000075.
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.910 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac255d3-77fe-4128-9c53-93af5b3fcb66]: (4, ('Tue Jan 20 02:52:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef)\nb9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef\nTue Jan 20 02:52:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (b9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef)\nb9869b2334cb22ebb41a8e59d6adfb7d39a9957a4d3680921081b2143268cfef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.912 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[16ecedec-4979-48f5-8d08-4b9774cd0b2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.913 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.914 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:44 np0005588919 kernel: tap79184781-10: left promiscuous mode
Jan 20 09:52:44 np0005588919 systemd[1]: Started Virtual Machine qemu-54-instance-00000075.
Jan 20 09:52:44 np0005588919 nova_compute[225855]: 2026-01-20 14:52:44.929 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.931 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5cac4136-fabf-4b93-8bc3-3cd90596423d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.946 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd31fe1-0039-4317-8787-adc08aae5f8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.947 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e82fd6dd-3a57-4313-8732-ee956b93479d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.963 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[98a05aab-0ad0-4ba3-b6b5-42a00b2ae4db]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 579497, 'reachable_time': 34960, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271642, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 systemd[1]: run-netns-ovnmeta\x2d79184781\x2d1f23\x2d4584\x2d87de\x2d08e262242488.mount: Deactivated successfully.
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.968 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-79184781-1f23-4584-87de-08e262242488 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.968 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[ea3f0333-69d2-4179-a5d1-80859c9acd18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.969 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 234381ea-07b1-41fe-b3c1-be97ce6a3b64 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.970 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.981 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ced326-9b9c-438b-b0c4-cc2c7413ef02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.982 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap79184781-11 in ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.984 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap79184781-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.984 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[087020f5-724d-44e8-a327-e4eb0ca7da24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.985 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[36329abc-127a-4215-ba02-599e2ae86b6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:44.998 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[7af3aa7b-054a-445f-bdc8-37eca34673c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.021 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd6c477-aa13-4588-b0bd-9ae9cabd34f1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.047 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[7c33c1b3-8840-4d61-9db7-90bc97b8a097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 NetworkManager[49104]: <info>  [1768920765.0591] manager: (tap79184781-10): new Veth device (/org/freedesktop/NetworkManager/Devices/196)
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.057 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef6aa22-ccef-419e-a0ea-59461d5c3c9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.102 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f150d79d-8dc2-407c-93bc-0de6c0095adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.104 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[545c8857-c176-457f-a76a-b12c01e0312f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:45.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:45 np0005588919 NetworkManager[49104]: <info>  [1768920765.1280] device (tap79184781-10): carrier: link connected
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.131 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f9dd0193-67f2-49d9-ac45-388f11b86169]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.147 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe19de1-e704-43fe-ace7-aa652aef860b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580405, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271667, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.159 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6b26d786-636c-4574-8006-ec2dbf127a79]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:7c2a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580405, 'tstamp': 580405}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271668, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.174 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c6017c9e-90c1-49c0-a46c-5dcac2966874]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580405, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271669, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.201 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[78ed86b0-74b1-400e-b4bd-5f01aaee34d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.249 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf70d61-5590-42b7-8805-612fc3324458]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.250 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.251 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.251 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.253 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:45 np0005588919 NetworkManager[49104]: <info>  [1768920765.2541] manager: (tap79184781-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Jan 20 09:52:45 np0005588919 kernel: tap79184781-10: entered promiscuous mode
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.256 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.260 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.261 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:45 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:45Z|00457|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.262 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.266 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.270 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6564ad-06e5-4445-b76d-7b6c6ab73380]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.271 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-79184781-1f23-4584-87de-08e262242488
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/79184781-1f23-4584-87de-08e262242488.pid.haproxy
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 79184781-1f23-4584-87de-08e262242488
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:52:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:52:45.272 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'env', 'PROCESS_TAG=haproxy-79184781-1f23-4584-87de-08e262242488', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/79184781-1f23-4584-87de-08e262242488.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.278 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.340 225859 DEBUG nova.compute.manager [req-c21db4e3-f0da-4001-a396-94bfa961cef0 req-aaa19aa8-e747-468f-b852-1e15d1a5ace3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-unplugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.341 225859 DEBUG oslo_concurrency.lockutils [req-c21db4e3-f0da-4001-a396-94bfa961cef0 req-aaa19aa8-e747-468f-b852-1e15d1a5ace3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.341 225859 DEBUG oslo_concurrency.lockutils [req-c21db4e3-f0da-4001-a396-94bfa961cef0 req-aaa19aa8-e747-468f-b852-1e15d1a5ace3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.342 225859 DEBUG oslo_concurrency.lockutils [req-c21db4e3-f0da-4001-a396-94bfa961cef0 req-aaa19aa8-e747-468f-b852-1e15d1a5ace3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.343 225859 DEBUG nova.compute.manager [req-c21db4e3-f0da-4001-a396-94bfa961cef0 req-aaa19aa8-e747-468f-b852-1e15d1a5ace3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-unplugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.344 225859 WARNING nova.compute.manager [req-c21db4e3-f0da-4001-a396-94bfa961cef0 req-aaa19aa8-e747-468f-b852-1e15d1a5ace3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received unexpected event network-vif-unplugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.574 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 23ea4537-f03f-46de-881f-b979e232a3b9 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.576 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920765.5737062, 23ea4537-f03f-46de-881f-b979e232a3b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.576 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:52:45 np0005588919 podman[271762]: 2026-01-20 14:52:45.704271011 +0000 UTC m=+0.054608263 container create 8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.714 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.717 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:52:45 np0005588919 systemd[1]: Started libpod-conmon-8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f.scope.
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.743 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.744 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920765.5749042, 23ea4537-f03f-46de-881f-b979e232a3b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.744 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] VM Started (Lifecycle Event)#033[00m
Jan 20 09:52:45 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:52:45 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31b783ff1c4e23d189e5eb541d582f8638c64e811963a516b7ed73c8d8eafb4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:52:45 np0005588919 podman[271762]: 2026-01-20 14:52:45.679637036 +0000 UTC m=+0.029974298 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:52:45 np0005588919 podman[271762]: 2026-01-20 14:52:45.782991234 +0000 UTC m=+0.133328516 container init 8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 09:52:45 np0005588919 podman[271762]: 2026-01-20 14:52:45.788660254 +0000 UTC m=+0.138997506 container start 8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 09:52:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:45.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:45 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271777]: [NOTICE]   (271782) : New worker (271784) forked
Jan 20 09:52:45 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271777]: [NOTICE]   (271782) : Loading success.
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.886 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.891 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.952 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.972 225859 DEBUG nova.compute.manager [None req-042e7a60-6768-4229-8936-c6493f807f55 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:52:45 np0005588919 nova_compute[225855]: 2026-01-20 14:52:45.974 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 09:52:46 np0005588919 nova_compute[225855]: 2026-01-20 14:52:46.250 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:46 np0005588919 nova_compute[225855]: 2026-01-20 14:52:46.251 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:46 np0005588919 nova_compute[225855]: 2026-01-20 14:52:46.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:47.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:47.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.877 225859 DEBUG nova.compute.manager [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.878 225859 DEBUG oslo_concurrency.lockutils [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.878 225859 DEBUG oslo_concurrency.lockutils [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.878 225859 DEBUG oslo_concurrency.lockutils [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.878 225859 DEBUG nova.compute.manager [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.879 225859 WARNING nova.compute.manager [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received unexpected event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.879 225859 DEBUG nova.compute.manager [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.879 225859 DEBUG oslo_concurrency.lockutils [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.879 225859 DEBUG oslo_concurrency.lockutils [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.879 225859 DEBUG oslo_concurrency.lockutils [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.880 225859 DEBUG nova.compute.manager [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.880 225859 WARNING nova.compute.manager [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received unexpected event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.880 225859 DEBUG nova.compute.manager [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.880 225859 DEBUG oslo_concurrency.lockutils [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.880 225859 DEBUG oslo_concurrency.lockutils [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.881 225859 DEBUG oslo_concurrency.lockutils [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.881 225859 DEBUG nova.compute.manager [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:52:47 np0005588919 nova_compute[225855]: 2026-01-20 14:52:47.881 225859 WARNING nova.compute.manager [req-26785ab1-815a-413e-92c5-3cf6d3d4c841 req-28c7d92a-22ef-49d9-be30-901d640262d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received unexpected event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:52:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:49.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:49 np0005588919 nova_compute[225855]: 2026-01-20 14:52:49.475 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:49.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e270 e270: 3 total, 3 up, 3 in
Jan 20 09:52:50 np0005588919 nova_compute[225855]: 2026-01-20 14:52:50.959 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e271 e271: 3 total, 3 up, 3 in
Jan 20 09:52:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:51.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:51 np0005588919 podman[271819]: 2026-01-20 14:52:51.171882437 +0000 UTC m=+0.061006583 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 09:52:51 np0005588919 nova_compute[225855]: 2026-01-20 14:52:51.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:52:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:51.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e272 e272: 3 total, 3 up, 3 in
Jan 20 09:52:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:53.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:52:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:53.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:52:54 np0005588919 nova_compute[225855]: 2026-01-20 14:52:54.477 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:55.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:55.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:55 np0005588919 nova_compute[225855]: 2026-01-20 14:52:55.963 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:57.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:52:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:57.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:57 np0005588919 ovn_controller[130490]: 2026-01-20T14:52:57Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b5:55:3c 10.100.0.14
Jan 20 09:52:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:52:59.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:52:59 np0005588919 nova_compute[225855]: 2026-01-20 14:52:59.510 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:52:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e273 e273: 3 total, 3 up, 3 in
Jan 20 09:52:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:52:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:52:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:52:59.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:00 np0005588919 nova_compute[225855]: 2026-01-20 14:53:00.556 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:00 np0005588919 nova_compute[225855]: 2026-01-20 14:53:00.557 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:00 np0005588919 nova_compute[225855]: 2026-01-20 14:53:00.588 225859 DEBUG nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:53:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e274 e274: 3 total, 3 up, 3 in
Jan 20 09:53:00 np0005588919 nova_compute[225855]: 2026-01-20 14:53:00.801 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:00 np0005588919 nova_compute[225855]: 2026-01-20 14:53:00.801 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:00 np0005588919 nova_compute[225855]: 2026-01-20 14:53:00.809 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:53:00 np0005588919 nova_compute[225855]: 2026-01-20 14:53:00.809 225859 INFO nova.compute.claims [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:53:00 np0005588919 nova_compute[225855]: 2026-01-20 14:53:00.965 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:01 np0005588919 nova_compute[225855]: 2026-01-20 14:53:01.119 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:01.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:53:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4058498692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:53:01 np0005588919 nova_compute[225855]: 2026-01-20 14:53:01.557 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:01 np0005588919 nova_compute[225855]: 2026-01-20 14:53:01.565 225859 DEBUG nova.compute.provider_tree [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:53:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e275 e275: 3 total, 3 up, 3 in
Jan 20 09:53:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:01.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e276 e276: 3 total, 3 up, 3 in
Jan 20 09:53:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:53:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:53:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:53:02 np0005588919 nova_compute[225855]: 2026-01-20 14:53:02.766 225859 DEBUG nova.scheduler.client.report [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:53:02 np0005588919 nova_compute[225855]: 2026-01-20 14:53:02.893 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:02 np0005588919 nova_compute[225855]: 2026-01-20 14:53:02.894 225859 DEBUG nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.049 225859 DEBUG nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.049 225859 DEBUG nova.network.neutron [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.107 225859 INFO nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:53:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:03.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.141 225859 DEBUG nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.336 225859 DEBUG nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.338 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.339 225859 INFO nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Creating image(s)#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.372 225859 DEBUG nova.storage.rbd_utils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.400 225859 DEBUG nova.storage.rbd_utils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.427 225859 DEBUG nova.storage.rbd_utils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.432 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.467 225859 DEBUG nova.policy [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd85d286ce6224326a0f4a15a06afbfea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.525 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.526 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.526 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.527 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.558 225859 DEBUG nova.storage.rbd_utils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.562 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 baada610-f563-4c97-89a9-56eba792c352_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:03.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.862 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 baada610-f563-4c97-89a9-56eba792c352_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.299s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:03 np0005588919 nova_compute[225855]: 2026-01-20 14:53:03.936 225859 DEBUG nova.storage.rbd_utils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] resizing rbd image baada610-f563-4c97-89a9-56eba792c352_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:53:04 np0005588919 nova_compute[225855]: 2026-01-20 14:53:04.055 225859 DEBUG nova.objects.instance [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'migration_context' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:04 np0005588919 nova_compute[225855]: 2026-01-20 14:53:04.071 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:53:04 np0005588919 nova_compute[225855]: 2026-01-20 14:53:04.071 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Ensure instance console log exists: /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:53:04 np0005588919 nova_compute[225855]: 2026-01-20 14:53:04.071 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:04 np0005588919 nova_compute[225855]: 2026-01-20 14:53:04.072 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:04 np0005588919 nova_compute[225855]: 2026-01-20 14:53:04.072 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:04 np0005588919 nova_compute[225855]: 2026-01-20 14:53:04.513 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:04 np0005588919 nova_compute[225855]: 2026-01-20 14:53:04.891 225859 DEBUG nova.network.neutron [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Successfully created port: a3156414-5a96-462d-974e-a57c9cd8e9c8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:53:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:05.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:05.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:05 np0005588919 nova_compute[225855]: 2026-01-20 14:53:05.958 225859 DEBUG nova.network.neutron [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Successfully updated port: a3156414-5a96-462d-974e-a57c9cd8e9c8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:53:05 np0005588919 nova_compute[225855]: 2026-01-20 14:53:05.968 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:05 np0005588919 nova_compute[225855]: 2026-01-20 14:53:05.984 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:05 np0005588919 nova_compute[225855]: 2026-01-20 14:53:05.984 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquired lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:05 np0005588919 nova_compute[225855]: 2026-01-20 14:53:05.985 225859 DEBUG nova.network.neutron [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:53:06 np0005588919 nova_compute[225855]: 2026-01-20 14:53:06.195 225859 DEBUG nova.compute.manager [req-0b7503d7-3150-4015-9248-34691eb4bb30 req-43095be4-e873-4bf7-aad7-00d41ef180df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-changed-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:06 np0005588919 nova_compute[225855]: 2026-01-20 14:53:06.195 225859 DEBUG nova.compute.manager [req-0b7503d7-3150-4015-9248-34691eb4bb30 req-43095be4-e873-4bf7-aad7-00d41ef180df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Refreshing instance network info cache due to event network-changed-a3156414-5a96-462d-974e-a57c9cd8e9c8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:53:06 np0005588919 nova_compute[225855]: 2026-01-20 14:53:06.195 225859 DEBUG oslo_concurrency.lockutils [req-0b7503d7-3150-4015-9248-34691eb4bb30 req-43095be4-e873-4bf7-aad7-00d41ef180df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:06 np0005588919 nova_compute[225855]: 2026-01-20 14:53:06.837 225859 DEBUG nova.network.neutron [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:53:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e277 e277: 3 total, 3 up, 3 in
Jan 20 09:53:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:07.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:07.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.112 225859 DEBUG nova.network.neutron [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updating instance_info_cache with network_info: [{"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.174 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Releasing lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.174 225859 DEBUG nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Instance network_info: |[{"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.174 225859 DEBUG oslo_concurrency.lockutils [req-0b7503d7-3150-4015-9248-34691eb4bb30 req-43095be4-e873-4bf7-aad7-00d41ef180df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.175 225859 DEBUG nova.network.neutron [req-0b7503d7-3150-4015-9248-34691eb4bb30 req-43095be4-e873-4bf7-aad7-00d41ef180df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Refreshing network info cache for port a3156414-5a96-462d-974e-a57c9cd8e9c8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.177 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Start _get_guest_xml network_info=[{"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.182 225859 WARNING nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.204 225859 DEBUG nova.virt.libvirt.host [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.205 225859 DEBUG nova.virt.libvirt.host [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.211 225859 DEBUG nova.virt.libvirt.host [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.212 225859 DEBUG nova.virt.libvirt.host [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.213 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.213 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.214 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.214 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.214 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.214 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.215 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.215 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.215 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.215 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.216 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.216 225859 DEBUG nova.virt.hardware [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.218 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:53:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2491785817' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.698 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.725 225859 DEBUG nova.storage.rbd_utils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:08 np0005588919 nova_compute[225855]: 2026-01-20 14:53:08.728 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:53:09 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:53:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:09.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:53:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3601927553' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.184 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.185 225859 DEBUG nova.virt.libvirt.vif [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:52:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-185388239',display_name='tempest-ServerStableDeviceRescueTest-server-185388239',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-185388239',id=119,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIfbHk5SMnBEAUVZEhoLPfdB2qCay341zK720hYW5qflxdgcEr+fHp9C3kAgJFmqON8wn8DkPxW0WmihyCLPTK7Iiiy5VDiRJ7U/0O7hlyzm17ZWhCVdPfXSugKxmeVL3w==',key_name='tempest-keypair-647310408',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-d9074tfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:53:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=baada610-f563-4c97-89a9-56eba792c352,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.186 225859 DEBUG nova.network.os_vif_util [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.186 225859 DEBUG nova.network.os_vif_util [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:93:82,bridge_name='br-int',has_traffic_filtering=True,id=a3156414-5a96-462d-974e-a57c9cd8e9c8,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3156414-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.188 225859 DEBUG nova.objects.instance [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'pci_devices' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.415 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <uuid>baada610-f563-4c97-89a9-56eba792c352</uuid>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <name>instance-00000077</name>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-185388239</nova:name>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:53:08</nova:creationTime>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <nova:user uuid="d85d286ce6224326a0f4a15a06afbfea">tempest-ServerStableDeviceRescueTest-129078052-project-member</nova:user>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <nova:project uuid="0a29915e0dd2403fbd7b7e847696b00a">tempest-ServerStableDeviceRescueTest-129078052</nova:project>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <nova:port uuid="a3156414-5a96-462d-974e-a57c9cd8e9c8">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <entry name="serial">baada610-f563-4c97-89a9-56eba792c352</entry>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <entry name="uuid">baada610-f563-4c97-89a9-56eba792c352</entry>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/baada610-f563-4c97-89a9-56eba792c352_disk">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/baada610-f563-4c97-89a9-56eba792c352_disk.config">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:9e:93:82"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <target dev="tapa3156414-5a"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/console.log" append="off"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:53:09 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:53:09 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:53:09 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:53:09 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.416 225859 DEBUG nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Preparing to wait for external event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.417 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.417 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.417 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.418 225859 DEBUG nova.virt.libvirt.vif [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:52:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-185388239',display_name='tempest-ServerStableDeviceRescueTest-server-185388239',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-185388239',id=119,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIfbHk5SMnBEAUVZEhoLPfdB2qCay341zK720hYW5qflxdgcEr+fHp9C3kAgJFmqON8wn8DkPxW0WmihyCLPTK7Iiiy5VDiRJ7U/0O7hlyzm17ZWhCVdPfXSugKxmeVL3w==',key_name='tempest-keypair-647310408',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-d9074tfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:53:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=baada610-f563-4c97-89a9-56eba792c352,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.419 225859 DEBUG nova.network.os_vif_util [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.419 225859 DEBUG nova.network.os_vif_util [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:93:82,bridge_name='br-int',has_traffic_filtering=True,id=a3156414-5a96-462d-974e-a57c9cd8e9c8,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3156414-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.420 225859 DEBUG os_vif [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:93:82,bridge_name='br-int',has_traffic_filtering=True,id=a3156414-5a96-462d-974e-a57c9cd8e9c8,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3156414-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.420 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.421 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.421 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.426 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.426 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3156414-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.427 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa3156414-5a, col_values=(('external_ids', {'iface-id': 'a3156414-5a96-462d-974e-a57c9cd8e9c8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:93:82', 'vm-uuid': 'baada610-f563-4c97-89a9-56eba792c352'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.428 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:09 np0005588919 NetworkManager[49104]: <info>  [1768920789.4295] manager: (tapa3156414-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.433 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.437 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.438 225859 INFO os_vif [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:93:82,bridge_name='br-int',has_traffic_filtering=True,id=a3156414-5a96-462d-974e-a57c9cd8e9c8,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3156414-5a')#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.500 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.500 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.500 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No VIF found with MAC fa:16:3e:9e:93:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.501 225859 INFO nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Using config drive#033[00m
Jan 20 09:53:09 np0005588919 nova_compute[225855]: 2026-01-20 14:53:09.532 225859 DEBUG nova.storage.rbd_utils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e278 e278: 3 total, 3 up, 3 in
Jan 20 09:53:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:09.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.174 225859 INFO nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Creating config drive at /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config#033[00m
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.181 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzd2gtkh0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.319 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzd2gtkh0" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.347 225859 DEBUG nova.storage.rbd_utils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.350 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config baada610-f563-4c97-89a9-56eba792c352_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.533 225859 DEBUG oslo_concurrency.processutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config baada610-f563-4c97-89a9-56eba792c352_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.535 225859 INFO nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Deleting local config drive /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config because it was imported into RBD.#033[00m
Jan 20 09:53:10 np0005588919 NetworkManager[49104]: <info>  [1768920790.5776] manager: (tapa3156414-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/199)
Jan 20 09:53:10 np0005588919 kernel: tapa3156414-5a: entered promiscuous mode
Jan 20 09:53:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:10Z|00458|binding|INFO|Claiming lport a3156414-5a96-462d-974e-a57c9cd8e9c8 for this chassis.
Jan 20 09:53:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:10Z|00459|binding|INFO|a3156414-5a96-462d-974e-a57c9cd8e9c8: Claiming fa:16:3e:9e:93:82 10.100.0.3
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.579 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.600 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:93:82 10.100.0.3'], port_security=['fa:16:3e:9e:93:82 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'baada610-f563-4c97-89a9-56eba792c352', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '34326e47-c07e-48d1-9283-c1c5634fdc52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=a3156414-5a96-462d-974e-a57c9cd8e9c8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.602 140354 INFO neutron.agent.ovn.metadata.agent [-] Port a3156414-5a96-462d-974e-a57c9cd8e9c8 in datapath 79184781-1f23-4584-87de-08e262242488 bound to our chassis#033[00m
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.604 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:53:10 np0005588919 systemd-machined[194361]: New machine qemu-55-instance-00000077.
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.619 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c167eb3f-6dd9-4bb7-9155-fba8f9f9a059]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:10Z|00460|binding|INFO|Setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 up in Southbound
Jan 20 09:53:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:10Z|00461|binding|INFO|Setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 ovn-installed in OVS
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.621 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.625 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:10 np0005588919 systemd[1]: Started Virtual Machine qemu-55-instance-00000077.
Jan 20 09:53:10 np0005588919 systemd-udevd[272395]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.651 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[511be42f-671e-45d3-b01f-3ec34d1d31da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e279 e279: 3 total, 3 up, 3 in
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.654 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5972c449-3dd9-4e4f-9e99-59479d97e9de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:10 np0005588919 NetworkManager[49104]: <info>  [1768920790.6618] device (tapa3156414-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:53:10 np0005588919 NetworkManager[49104]: <info>  [1768920790.6633] device (tapa3156414-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.684 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1ac4bf-a621-49de-a58d-8431931e0ea1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.700 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed81a70-9d8d-4d7e-8b96-98b87b4bf2cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580405, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272419, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.713 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[33e94a7e-0ec3-4a7b-8df5-1984b2024af3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580414, 'tstamp': 580414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272423, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580417, 'tstamp': 580417}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272423, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.715 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.716 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.717 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.717 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.717 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.718 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:10.718 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:10 np0005588919 podman[272379]: 2026-01-20 14:53:10.726962298 +0000 UTC m=+0.118605509 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:53:10 np0005588919 nova_compute[225855]: 2026-01-20 14:53:10.970 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.002 225859 DEBUG nova.network.neutron [req-0b7503d7-3150-4015-9248-34691eb4bb30 req-43095be4-e873-4bf7-aad7-00d41ef180df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updated VIF entry in instance network info cache for port a3156414-5a96-462d-974e-a57c9cd8e9c8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.002 225859 DEBUG nova.network.neutron [req-0b7503d7-3150-4015-9248-34691eb4bb30 req-43095be4-e873-4bf7-aad7-00d41ef180df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updating instance_info_cache with network_info: [{"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.006 225859 DEBUG nova.compute.manager [req-a4198ce5-ec65-448c-9b74-c76d44c044d7 req-b2d1a642-60cd-46f7-8b31-2b6b4a013218 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.006 225859 DEBUG oslo_concurrency.lockutils [req-a4198ce5-ec65-448c-9b74-c76d44c044d7 req-b2d1a642-60cd-46f7-8b31-2b6b4a013218 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.007 225859 DEBUG oslo_concurrency.lockutils [req-a4198ce5-ec65-448c-9b74-c76d44c044d7 req-b2d1a642-60cd-46f7-8b31-2b6b4a013218 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.007 225859 DEBUG oslo_concurrency.lockutils [req-a4198ce5-ec65-448c-9b74-c76d44c044d7 req-b2d1a642-60cd-46f7-8b31-2b6b4a013218 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.007 225859 DEBUG nova.compute.manager [req-a4198ce5-ec65-448c-9b74-c76d44c044d7 req-b2d1a642-60cd-46f7-8b31-2b6b4a013218 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Processing event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.038 225859 DEBUG oslo_concurrency.lockutils [req-0b7503d7-3150-4015-9248-34691eb4bb30 req-43095be4-e873-4bf7-aad7-00d41ef180df 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.068 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920791.0682774, baada610-f563-4c97-89a9-56eba792c352 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.069 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] VM Started (Lifecycle Event)#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.073 225859 DEBUG nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.077 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.080 225859 INFO nova.virt.libvirt.driver [-] [instance: baada610-f563-4c97-89a9-56eba792c352] Instance spawned successfully.#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.080 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.143 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:11.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.148 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.148 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.149 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.149 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.150 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.150 225859 DEBUG nova.virt.libvirt.driver [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.154 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.294 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.294 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920791.0697346, baada610-f563-4c97-89a9-56eba792c352 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.294 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.325 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.329 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920791.0762372, baada610-f563-4c97-89a9-56eba792c352 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.330 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.361 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.365 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.470 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.558 225859 INFO nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Took 8.22 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.558 225859 DEBUG nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:11 np0005588919 nova_compute[225855]: 2026-01-20 14:53:11.702 225859 INFO nova.compute.manager [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Took 10.96 seconds to build instance.#033[00m
Jan 20 09:53:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:11.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:12 np0005588919 nova_compute[225855]: 2026-01-20 14:53:12.009 225859 DEBUG oslo_concurrency.lockutils [None req-b1b59a96-b618-4ebc-b642-4b40ab846a49 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e280 e280: 3 total, 3 up, 3 in
Jan 20 09:53:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:13.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:13 np0005588919 nova_compute[225855]: 2026-01-20 14:53:13.156 225859 DEBUG nova.compute.manager [req-54389736-f714-4176-ab5e-35f2958b4369 req-114b4779-60d6-44d8-b611-7283c38e8c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:13 np0005588919 nova_compute[225855]: 2026-01-20 14:53:13.156 225859 DEBUG oslo_concurrency.lockutils [req-54389736-f714-4176-ab5e-35f2958b4369 req-114b4779-60d6-44d8-b611-7283c38e8c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:13 np0005588919 nova_compute[225855]: 2026-01-20 14:53:13.157 225859 DEBUG oslo_concurrency.lockutils [req-54389736-f714-4176-ab5e-35f2958b4369 req-114b4779-60d6-44d8-b611-7283c38e8c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:13 np0005588919 nova_compute[225855]: 2026-01-20 14:53:13.157 225859 DEBUG oslo_concurrency.lockutils [req-54389736-f714-4176-ab5e-35f2958b4369 req-114b4779-60d6-44d8-b611-7283c38e8c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:13 np0005588919 nova_compute[225855]: 2026-01-20 14:53:13.157 225859 DEBUG nova.compute.manager [req-54389736-f714-4176-ab5e-35f2958b4369 req-114b4779-60d6-44d8-b611-7283c38e8c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:53:13 np0005588919 nova_compute[225855]: 2026-01-20 14:53:13.157 225859 WARNING nova.compute.manager [req-54389736-f714-4176-ab5e-35f2958b4369 req-114b4779-60d6-44d8-b611-7283c38e8c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received unexpected event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:53:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:13.728 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:13 np0005588919 nova_compute[225855]: 2026-01-20 14:53:13.729 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:13.730 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:53:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:13.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:14 np0005588919 nova_compute[225855]: 2026-01-20 14:53:14.469 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e281 e281: 3 total, 3 up, 3 in
Jan 20 09:53:14 np0005588919 nova_compute[225855]: 2026-01-20 14:53:14.679 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:14 np0005588919 NetworkManager[49104]: <info>  [1768920794.6799] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Jan 20 09:53:14 np0005588919 NetworkManager[49104]: <info>  [1768920794.6808] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Jan 20 09:53:14 np0005588919 nova_compute[225855]: 2026-01-20 14:53:14.861 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:14 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:14Z|00462|binding|INFO|Releasing lport b033e9e6-9781-4424-a20f-7b48a14e2c80 from this chassis (sb_readonly=0)
Jan 20 09:53:14 np0005588919 nova_compute[225855]: 2026-01-20 14:53:14.879 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:15.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:15 np0005588919 nova_compute[225855]: 2026-01-20 14:53:15.189 225859 DEBUG nova.compute.manager [req-a5a7be6b-8a88-4801-b46b-7998864a745a req-e3d1a0ca-02ab-4def-a695-1748cb1e0e51 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-changed-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:15 np0005588919 nova_compute[225855]: 2026-01-20 14:53:15.190 225859 DEBUG nova.compute.manager [req-a5a7be6b-8a88-4801-b46b-7998864a745a req-e3d1a0ca-02ab-4def-a695-1748cb1e0e51 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Refreshing instance network info cache due to event network-changed-a3156414-5a96-462d-974e-a57c9cd8e9c8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:53:15 np0005588919 nova_compute[225855]: 2026-01-20 14:53:15.190 225859 DEBUG oslo_concurrency.lockutils [req-a5a7be6b-8a88-4801-b46b-7998864a745a req-e3d1a0ca-02ab-4def-a695-1748cb1e0e51 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:15 np0005588919 nova_compute[225855]: 2026-01-20 14:53:15.190 225859 DEBUG oslo_concurrency.lockutils [req-a5a7be6b-8a88-4801-b46b-7998864a745a req-e3d1a0ca-02ab-4def-a695-1748cb1e0e51 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:15 np0005588919 nova_compute[225855]: 2026-01-20 14:53:15.190 225859 DEBUG nova.network.neutron [req-a5a7be6b-8a88-4801-b46b-7998864a745a req-e3d1a0ca-02ab-4def-a695-1748cb1e0e51 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Refreshing network info cache for port a3156414-5a96-462d-974e-a57c9cd8e9c8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:53:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:15.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:15 np0005588919 nova_compute[225855]: 2026-01-20 14:53:15.973 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:16.413 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:16.413 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:16.414 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:17.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:17 np0005588919 nova_compute[225855]: 2026-01-20 14:53:17.404 225859 DEBUG nova.network.neutron [req-a5a7be6b-8a88-4801-b46b-7998864a745a req-e3d1a0ca-02ab-4def-a695-1748cb1e0e51 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updated VIF entry in instance network info cache for port a3156414-5a96-462d-974e-a57c9cd8e9c8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:53:17 np0005588919 nova_compute[225855]: 2026-01-20 14:53:17.406 225859 DEBUG nova.network.neutron [req-a5a7be6b-8a88-4801-b46b-7998864a745a req-e3d1a0ca-02ab-4def-a695-1748cb1e0e51 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updating instance_info_cache with network_info: [{"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:17 np0005588919 nova_compute[225855]: 2026-01-20 14:53:17.430 225859 DEBUG oslo_concurrency.lockutils [req-a5a7be6b-8a88-4801-b46b-7998864a745a req-e3d1a0ca-02ab-4def-a695-1748cb1e0e51 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:17.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:18.732 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:19.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:19 np0005588919 nova_compute[225855]: 2026-01-20 14:53:19.470 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e282 e282: 3 total, 3 up, 3 in
Jan 20 09:53:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:19.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:20 np0005588919 nova_compute[225855]: 2026-01-20 14:53:20.975 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:21.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:21.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:22 np0005588919 podman[272524]: 2026-01-20 14:53:22.003155467 +0000 UTC m=+0.047687787 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:53:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:23.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:23 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:23Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9e:93:82 10.100.0.3
Jan 20 09:53:23 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:23Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9e:93:82 10.100.0.3
Jan 20 09:53:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:23.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:24 np0005588919 nova_compute[225855]: 2026-01-20 14:53:24.473 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:25.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:25.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:25 np0005588919 nova_compute[225855]: 2026-01-20 14:53:25.977 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:27.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:27.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:29.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:29 np0005588919 nova_compute[225855]: 2026-01-20 14:53:29.474 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:29.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:30 np0005588919 nova_compute[225855]: 2026-01-20 14:53:30.981 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:31.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:31 np0005588919 nova_compute[225855]: 2026-01-20 14:53:31.274 225859 DEBUG nova.compute.manager [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:31 np0005588919 nova_compute[225855]: 2026-01-20 14:53:31.324 225859 INFO nova.compute.manager [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] instance snapshotting#033[00m
Jan 20 09:53:31 np0005588919 nova_compute[225855]: 2026-01-20 14:53:31.642 225859 INFO nova.virt.libvirt.driver [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Beginning live snapshot process#033[00m
Jan 20 09:53:31 np0005588919 nova_compute[225855]: 2026-01-20 14:53:31.782 225859 DEBUG nova.virt.libvirt.imagebackend [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:53:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:31.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:31 np0005588919 nova_compute[225855]: 2026-01-20 14:53:31.994 225859 DEBUG nova.storage.rbd_utils [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] creating snapshot(ff1d44216f744f44807da7676144e1fc) on rbd image(baada610-f563-4c97-89a9-56eba792c352_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:53:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e283 e283: 3 total, 3 up, 3 in
Jan 20 09:53:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:33.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:33 np0005588919 nova_compute[225855]: 2026-01-20 14:53:33.214 225859 DEBUG nova.storage.rbd_utils [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] cloning vms/baada610-f563-4c97-89a9-56eba792c352_disk@ff1d44216f744f44807da7676144e1fc to images/132a812e-f4a2-4a8b-813d-1df62e09798a clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:53:33 np0005588919 nova_compute[225855]: 2026-01-20 14:53:33.332 225859 DEBUG nova.storage.rbd_utils [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] flattening images/132a812e-f4a2-4a8b-813d-1df62e09798a flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:53:33 np0005588919 nova_compute[225855]: 2026-01-20 14:53:33.738 225859 DEBUG nova.storage.rbd_utils [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] removing snapshot(ff1d44216f744f44807da7676144e1fc) on rbd image(baada610-f563-4c97-89a9-56eba792c352_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:53:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:33.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e284 e284: 3 total, 3 up, 3 in
Jan 20 09:53:34 np0005588919 nova_compute[225855]: 2026-01-20 14:53:34.236 225859 DEBUG nova.storage.rbd_utils [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] creating snapshot(snap) on rbd image(132a812e-f4a2-4a8b-813d-1df62e09798a) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:53:34 np0005588919 nova_compute[225855]: 2026-01-20 14:53:34.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:34 np0005588919 nova_compute[225855]: 2026-01-20 14:53:34.476 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:35.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e285 e285: 3 total, 3 up, 3 in
Jan 20 09:53:35 np0005588919 nova_compute[225855]: 2026-01-20 14:53:35.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:35.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:35 np0005588919 nova_compute[225855]: 2026-01-20 14:53:35.983 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:36 np0005588919 nova_compute[225855]: 2026-01-20 14:53:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:36 np0005588919 nova_compute[225855]: 2026-01-20 14:53:36.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:53:36 np0005588919 nova_compute[225855]: 2026-01-20 14:53:36.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:36 np0005588919 nova_compute[225855]: 2026-01-20 14:53:36.652 225859 INFO nova.virt.libvirt.driver [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Snapshot image upload complete#033[00m
Jan 20 09:53:36 np0005588919 nova_compute[225855]: 2026-01-20 14:53:36.652 225859 INFO nova.compute.manager [None req-57c441db-e1fb-4444-9764-df5914c151b4 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Took 5.33 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 20 09:53:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:37.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:37 np0005588919 nova_compute[225855]: 2026-01-20 14:53:37.375 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:37 np0005588919 nova_compute[225855]: 2026-01-20 14:53:37.375 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:53:37 np0005588919 nova_compute[225855]: 2026-01-20 14:53:37.376 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:53:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:37.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:38 np0005588919 nova_compute[225855]: 2026-01-20 14:53:38.009 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:38 np0005588919 nova_compute[225855]: 2026-01-20 14:53:38.010 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:38 np0005588919 nova_compute[225855]: 2026-01-20 14:53:38.010 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:53:38 np0005588919 nova_compute[225855]: 2026-01-20 14:53:38.010 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:39.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:39 np0005588919 nova_compute[225855]: 2026-01-20 14:53:39.478 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:39.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:40 np0005588919 nova_compute[225855]: 2026-01-20 14:53:40.793 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Updating instance_info_cache with network_info: [{"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:40 np0005588919 nova_compute[225855]: 2026-01-20 14:53:40.813 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-23ea4537-f03f-46de-881f-b979e232a3b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:40 np0005588919 nova_compute[225855]: 2026-01-20 14:53:40.814 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:53:40 np0005588919 nova_compute[225855]: 2026-01-20 14:53:40.815 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:40 np0005588919 nova_compute[225855]: 2026-01-20 14:53:40.815 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:40 np0005588919 nova_compute[225855]: 2026-01-20 14:53:40.815 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:53:40 np0005588919 nova_compute[225855]: 2026-01-20 14:53:40.831 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:53:40 np0005588919 nova_compute[225855]: 2026-01-20 14:53:40.985 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:41 np0005588919 podman[272744]: 2026-01-20 14:53:41.038727019 +0000 UTC m=+0.078555938 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.127 225859 DEBUG oslo_concurrency.lockutils [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.128 225859 DEBUG oslo_concurrency.lockutils [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.145 225859 DEBUG nova.objects.instance [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'flavor' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.185 225859 DEBUG oslo_concurrency.lockutils [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:41.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.437 225859 DEBUG oslo_concurrency.lockutils [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.438 225859 DEBUG oslo_concurrency.lockutils [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.438 225859 INFO nova.compute.manager [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Attaching volume 5f6a803f-d232-4e97-9965-ece0139e0fda to /dev/vdb#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.609 225859 DEBUG os_brick.utils [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.610 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.632 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.632 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[7d42ef2c-c0c6-42b6-8e4f-1a7778427568]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.633 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.640 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.640 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f4d919-689f-44f9-820b-c366bcf5312d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.642 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.650 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.650 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[42ba5721-f637-4239-ab0c-7931812b2f78]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.651 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[568cafb2-f4bd-4a0c-bcdc-92764a36f1c7]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.652 225859 DEBUG oslo_concurrency.processutils [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.675 225859 DEBUG oslo_concurrency.processutils [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "nvme version" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.677 225859 DEBUG os_brick.initiator.connectors.lightos [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.677 225859 DEBUG os_brick.initiator.connectors.lightos [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.677 225859 DEBUG os_brick.initiator.connectors.lightos [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.678 225859 DEBUG os_brick.utils [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] <== get_connector_properties: return (67ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:53:41 np0005588919 nova_compute[225855]: 2026-01-20 14:53:41.678 225859 DEBUG nova.virt.block_device [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updating existing volume attachment record: d9b595f1-88ea-494e-bc6e-d959c2d6b8eb _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:53:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:41.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:42 np0005588919 nova_compute[225855]: 2026-01-20 14:53:42.439 225859 DEBUG nova.objects.instance [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'flavor' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:42 np0005588919 nova_compute[225855]: 2026-01-20 14:53:42.461 225859 DEBUG nova.virt.libvirt.driver [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Attempting to attach volume 5f6a803f-d232-4e97-9965-ece0139e0fda with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 09:53:42 np0005588919 nova_compute[225855]: 2026-01-20 14:53:42.463 225859 DEBUG nova.virt.libvirt.guest [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:53:42 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:53:42 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-5f6a803f-d232-4e97-9965-ece0139e0fda">
Jan 20 09:53:42 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:42 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:42 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:42 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:53:42 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 09:53:42 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:42 np0005588919 nova_compute[225855]:  </auth>
Jan 20 09:53:42 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:53:42 np0005588919 nova_compute[225855]:  <serial>5f6a803f-d232-4e97-9965-ece0139e0fda</serial>
Jan 20 09:53:42 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:53:42 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:53:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:42 np0005588919 nova_compute[225855]: 2026-01-20 14:53:42.588 225859 DEBUG nova.virt.libvirt.driver [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:42 np0005588919 nova_compute[225855]: 2026-01-20 14:53:42.588 225859 DEBUG nova.virt.libvirt.driver [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:42 np0005588919 nova_compute[225855]: 2026-01-20 14:53:42.589 225859 DEBUG nova.virt.libvirt.driver [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:42 np0005588919 nova_compute[225855]: 2026-01-20 14:53:42.589 225859 DEBUG nova.virt.libvirt.driver [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No VIF found with MAC fa:16:3e:9e:93:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:53:42 np0005588919 nova_compute[225855]: 2026-01-20 14:53:42.781 225859 DEBUG oslo_concurrency.lockutils [None req-42a91944-3838-4633-a431-acd3dada7db3 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:43.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.356 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.356 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.373 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.374 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.374 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.374 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.374 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:53:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2478177324' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.804 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:43.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.876 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000077 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.876 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000077 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.877 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000077 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.879 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000075 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:53:43 np0005588919 nova_compute[225855]: 2026-01-20 14:53:43.879 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000075 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.040 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.041 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3957MB free_disk=20.693958282470703GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.041 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.042 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.419 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 23ea4537-f03f-46de-881f-b979e232a3b9 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.420 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance baada610-f563-4c97-89a9-56eba792c352 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.420 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.420 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.480 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.494 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.511 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.511 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.554 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.587 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.652 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e286 e286: 3 total, 3 up, 3 in
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.794 225859 INFO nova.compute.manager [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Rescuing#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.794 225859 DEBUG oslo_concurrency.lockutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.794 225859 DEBUG oslo_concurrency.lockutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquired lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:44 np0005588919 nova_compute[225855]: 2026-01-20 14:53:44.795 225859 DEBUG nova.network.neutron [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:53:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:53:45 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1022852382' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:53:45 np0005588919 nova_compute[225855]: 2026-01-20 14:53:45.101 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:45 np0005588919 nova_compute[225855]: 2026-01-20 14:53:45.106 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:53:45 np0005588919 nova_compute[225855]: 2026-01-20 14:53:45.122 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:53:45 np0005588919 nova_compute[225855]: 2026-01-20 14:53:45.144 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:53:45 np0005588919 nova_compute[225855]: 2026-01-20 14:53:45.144 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:45.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:45.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:45 np0005588919 nova_compute[225855]: 2026-01-20 14:53:45.987 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:46 np0005588919 nova_compute[225855]: 2026-01-20 14:53:46.225 225859 DEBUG nova.network.neutron [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updating instance_info_cache with network_info: [{"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:46 np0005588919 nova_compute[225855]: 2026-01-20 14:53:46.248 225859 DEBUG oslo_concurrency.lockutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Releasing lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:46 np0005588919 nova_compute[225855]: 2026-01-20 14:53:46.678 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:53:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:47.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:47.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:48 np0005588919 kernel: tapa3156414-5a (unregistering): left promiscuous mode
Jan 20 09:53:48 np0005588919 NetworkManager[49104]: <info>  [1768920828.9583] device (tapa3156414-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:53:48 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:48Z|00463|binding|INFO|Releasing lport a3156414-5a96-462d-974e-a57c9cd8e9c8 from this chassis (sb_readonly=0)
Jan 20 09:53:48 np0005588919 nova_compute[225855]: 2026-01-20 14:53:48.967 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:48 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:48Z|00464|binding|INFO|Setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 down in Southbound
Jan 20 09:53:48 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:48Z|00465|binding|INFO|Removing iface tapa3156414-5a ovn-installed in OVS
Jan 20 09:53:48 np0005588919 nova_compute[225855]: 2026-01-20 14:53:48.970 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:48.974 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:93:82 10.100.0.3'], port_security=['fa:16:3e:9e:93:82 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'baada610-f563-4c97-89a9-56eba792c352', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '34326e47-c07e-48d1-9283-c1c5634fdc52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=a3156414-5a96-462d-974e-a57c9cd8e9c8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:48.976 140354 INFO neutron.agent.ovn.metadata.agent [-] Port a3156414-5a96-462d-974e-a57c9cd8e9c8 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:53:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:48.978 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:53:48 np0005588919 nova_compute[225855]: 2026-01-20 14:53:48.984 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:48.996 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4b2390b2-7f6b-4ef0-a74f-3bf53ef1a6b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:49.020 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[af593159-1fa6-4165-9bb9-a74796117e2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:49.023 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5b69457d-7c51-4c0f-a14a-76358079c366]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:49 np0005588919 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000077.scope: Deactivated successfully.
Jan 20 09:53:49 np0005588919 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000077.scope: Consumed 14.199s CPU time.
Jan 20 09:53:49 np0005588919 systemd-machined[194361]: Machine qemu-55-instance-00000077 terminated.
Jan 20 09:53:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:49.049 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f1207d-fea6-41a1-903d-1c1d431a75f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:49.063 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8cdb1e0f-1a7b-49a9-9f4f-44c2adeb9566]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580405, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272857, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:49.077 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[24a131fd-593e-4912-84f3-fa8d24155991]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580414, 'tstamp': 580414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272858, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580417, 'tstamp': 580417}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272858, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:49.078 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.079 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.084 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:49.084 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:49.084 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:49.085 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:49.085 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e287 e287: 3 total, 3 up, 3 in
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.123 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.124 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.183 225859 DEBUG nova.compute.manager [req-5a7132ee-511d-4c70-bad0-c5da7ae62a33 req-5906cafd-9810-45c5-b126-bc8598783283 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-unplugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.184 225859 DEBUG oslo_concurrency.lockutils [req-5a7132ee-511d-4c70-bad0-c5da7ae62a33 req-5906cafd-9810-45c5-b126-bc8598783283 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.184 225859 DEBUG oslo_concurrency.lockutils [req-5a7132ee-511d-4c70-bad0-c5da7ae62a33 req-5906cafd-9810-45c5-b126-bc8598783283 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.184 225859 DEBUG oslo_concurrency.lockutils [req-5a7132ee-511d-4c70-bad0-c5da7ae62a33 req-5906cafd-9810-45c5-b126-bc8598783283 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.184 225859 DEBUG nova.compute.manager [req-5a7132ee-511d-4c70-bad0-c5da7ae62a33 req-5906cafd-9810-45c5-b126-bc8598783283 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-unplugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.185 225859 WARNING nova.compute.manager [req-5a7132ee-511d-4c70-bad0-c5da7ae62a33 req-5906cafd-9810-45c5-b126-bc8598783283 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received unexpected event network-vif-unplugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.188 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.194 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:49.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.483 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.693 225859 INFO nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.698 225859 INFO nova.virt.libvirt.driver [-] [instance: baada610-f563-4c97-89a9-56eba792c352] Instance destroyed successfully.#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.698 225859 DEBUG nova.objects.instance [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'numa_topology' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.713 225859 INFO nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Attempting a stable device rescue#033[00m
Jan 20 09:53:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:49.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.938 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.943 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.944 225859 INFO nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Creating image(s)#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.970 225859 DEBUG nova.storage.rbd_utils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:49 np0005588919 nova_compute[225855]: 2026-01-20 14:53:49.974 225859 DEBUG nova.objects.instance [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'trusted_certs' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.010 225859 DEBUG nova.storage.rbd_utils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.038 225859 DEBUG nova.storage.rbd_utils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.043 225859 DEBUG oslo_concurrency.lockutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "bbd525c47a2c08c6db0d918bbd4125fe578740ee" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.044 225859 DEBUG oslo_concurrency.lockutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "bbd525c47a2c08c6db0d918bbd4125fe578740ee" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e288 e288: 3 total, 3 up, 3 in
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.264 225859 DEBUG nova.virt.libvirt.imagebackend [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/132a812e-f4a2-4a8b-813d-1df62e09798a/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/132a812e-f4a2-4a8b-813d-1df62e09798a/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.328 225859 DEBUG nova.virt.libvirt.imagebackend [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/132a812e-f4a2-4a8b-813d-1df62e09798a/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.329 225859 DEBUG nova.storage.rbd_utils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] cloning images/132a812e-f4a2-4a8b-813d-1df62e09798a@snap to None/baada610-f563-4c97-89a9-56eba792c352_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.454 225859 DEBUG oslo_concurrency.lockutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "bbd525c47a2c08c6db0d918bbd4125fe578740ee" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.505 225859 DEBUG nova.objects.instance [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'migration_context' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.521 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.524 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Start _get_guest_xml network_info=[{"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "vif_mac": "fa:16:3e:9e:93:82"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '132a812e-f4a2-4a8b-813d-1df62e09798a', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-5f6a803f-d232-4e97-9965-ece0139e0fda', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '5f6a803f-d232-4e97-9965-ece0139e0fda', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'baada610-f563-4c97-89a9-56eba792c352', 'attached_at': '', 'detached_at': '', 'volume_id': '5f6a803f-d232-4e97-9965-ece0139e0fda', 'serial': '5f6a803f-d232-4e97-9965-ece0139e0fda'}, 'guest_format': None, 'boot_index': None, 'mount_device': '/dev/vdb', 'attachment_id': 'd9b595f1-88ea-494e-bc6e-d959c2d6b8eb', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.524 225859 DEBUG nova.objects.instance [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'resources' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.547 225859 WARNING nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.553 225859 DEBUG nova.virt.libvirt.host [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.554 225859 DEBUG nova.virt.libvirt.host [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.558 225859 DEBUG nova.virt.libvirt.host [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.559 225859 DEBUG nova.virt.libvirt.host [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.560 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.560 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.561 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.561 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.561 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.561 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.562 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.562 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.562 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.562 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.563 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.563 225859 DEBUG nova.virt.hardware [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.563 225859 DEBUG nova.objects.instance [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'vcpu_model' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.580 225859 DEBUG oslo_concurrency.processutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:50 np0005588919 nova_compute[225855]: 2026-01-20 14:53:50.989 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:53:51 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2500595433' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:53:51 np0005588919 nova_compute[225855]: 2026-01-20 14:53:51.112 225859 DEBUG oslo_concurrency.processutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e289 e289: 3 total, 3 up, 3 in
Jan 20 09:53:51 np0005588919 nova_compute[225855]: 2026-01-20 14:53:51.153 225859 DEBUG oslo_concurrency.processutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:51.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:51 np0005588919 nova_compute[225855]: 2026-01-20 14:53:51.350 225859 DEBUG nova.compute.manager [req-3ee7a85b-bfae-4355-b170-83735e42285b req-12f5da0a-088d-4e20-a73c-f375ee777225 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:51 np0005588919 nova_compute[225855]: 2026-01-20 14:53:51.351 225859 DEBUG oslo_concurrency.lockutils [req-3ee7a85b-bfae-4355-b170-83735e42285b req-12f5da0a-088d-4e20-a73c-f375ee777225 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:51 np0005588919 nova_compute[225855]: 2026-01-20 14:53:51.351 225859 DEBUG oslo_concurrency.lockutils [req-3ee7a85b-bfae-4355-b170-83735e42285b req-12f5da0a-088d-4e20-a73c-f375ee777225 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:51 np0005588919 nova_compute[225855]: 2026-01-20 14:53:51.351 225859 DEBUG oslo_concurrency.lockutils [req-3ee7a85b-bfae-4355-b170-83735e42285b req-12f5da0a-088d-4e20-a73c-f375ee777225 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:51 np0005588919 nova_compute[225855]: 2026-01-20 14:53:51.352 225859 DEBUG nova.compute.manager [req-3ee7a85b-bfae-4355-b170-83735e42285b req-12f5da0a-088d-4e20-a73c-f375ee777225 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:53:51 np0005588919 nova_compute[225855]: 2026-01-20 14:53:51.352 225859 WARNING nova.compute.manager [req-3ee7a85b-bfae-4355-b170-83735e42285b req-12f5da0a-088d-4e20-a73c-f375ee777225 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received unexpected event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:53:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:53:51 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3735164888' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:53:51 np0005588919 nova_compute[225855]: 2026-01-20 14:53:51.617 225859 DEBUG oslo_concurrency.processutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:51 np0005588919 nova_compute[225855]: 2026-01-20 14:53:51.646 225859 DEBUG oslo_concurrency.processutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:51.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:53:52 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2119220250' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.105 225859 DEBUG oslo_concurrency.processutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.108 225859 DEBUG nova.virt.libvirt.vif [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:52:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-185388239',display_name='tempest-ServerStableDeviceRescueTest-server-185388239',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-185388239',id=119,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIfbHk5SMnBEAUVZEhoLPfdB2qCay341zK720hYW5qflxdgcEr+fHp9C3kAgJFmqON8wn8DkPxW0WmihyCLPTK7Iiiy5VDiRJ7U/0O7hlyzm17ZWhCVdPfXSugKxmeVL3w==',key_name='tempest-keypair-647310408',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:53:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-d9074tfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:53:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=baada610-f563-4c97-89a9-56eba792c352,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "vif_mac": "fa:16:3e:9e:93:82"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.109 225859 DEBUG nova.network.os_vif_util [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "vif_mac": "fa:16:3e:9e:93:82"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.110 225859 DEBUG nova.network.os_vif_util [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9e:93:82,bridge_name='br-int',has_traffic_filtering=True,id=a3156414-5a96-462d-974e-a57c9cd8e9c8,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3156414-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.111 225859 DEBUG nova.objects.instance [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'pci_devices' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.126 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <uuid>baada610-f563-4c97-89a9-56eba792c352</uuid>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <name>instance-00000077</name>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-185388239</nova:name>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:53:50</nova:creationTime>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <nova:user uuid="d85d286ce6224326a0f4a15a06afbfea">tempest-ServerStableDeviceRescueTest-129078052-project-member</nova:user>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <nova:project uuid="0a29915e0dd2403fbd7b7e847696b00a">tempest-ServerStableDeviceRescueTest-129078052</nova:project>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <nova:port uuid="a3156414-5a96-462d-974e-a57c9cd8e9c8">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <entry name="serial">baada610-f563-4c97-89a9-56eba792c352</entry>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <entry name="uuid">baada610-f563-4c97-89a9-56eba792c352</entry>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/baada610-f563-4c97-89a9-56eba792c352_disk">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/baada610-f563-4c97-89a9-56eba792c352_disk.config">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-5f6a803f-d232-4e97-9965-ece0139e0fda">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <target dev="vdb" bus="virtio"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <serial>5f6a803f-d232-4e97-9965-ece0139e0fda</serial>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/baada610-f563-4c97-89a9-56eba792c352_disk.rescue">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <target dev="vdc" bus="virtio"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <boot order="1"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:9e:93:82"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <target dev="tapa3156414-5a"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/console.log" append="off"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:53:52 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:53:52 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:53:52 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:53:52 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.135 225859 INFO nova.virt.libvirt.driver [-] [instance: baada610-f563-4c97-89a9-56eba792c352] Instance destroyed successfully.#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.217 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.217 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.218 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.218 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.218 225859 DEBUG nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] No VIF found with MAC fa:16:3e:9e:93:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.218 225859 INFO nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Using config drive#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.252 225859 DEBUG nova.storage.rbd_utils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:52 np0005588919 podman[273147]: 2026-01-20 14:53:52.279756306 +0000 UTC m=+0.103557784 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.281 225859 DEBUG nova.objects.instance [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'ec2_ids' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:52 np0005588919 nova_compute[225855]: 2026-01-20 14:53:52.310 225859 DEBUG nova.objects.instance [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'keypairs' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:53.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.397 225859 INFO nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Creating config drive at /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config.rescue#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.403 225859 DEBUG oslo_concurrency.processutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1540c4je execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.523 225859 DEBUG nova.compute.manager [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Stashing vm_state: stopped _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.543 225859 DEBUG oslo_concurrency.processutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1540c4je" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.574 225859 DEBUG nova.storage.rbd_utils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] rbd image baada610-f563-4c97-89a9-56eba792c352_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.579 225859 DEBUG oslo_concurrency.processutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config.rescue baada610-f563-4c97-89a9-56eba792c352_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.648 225859 DEBUG oslo_concurrency.lockutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.649 225859 DEBUG oslo_concurrency.lockutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.696 225859 DEBUG nova.objects.instance [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.714 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.714 225859 INFO nova.compute.claims [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.714 225859 DEBUG nova.objects.instance [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'resources' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.725 225859 DEBUG nova.objects.instance [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.746 225859 DEBUG oslo_concurrency.processutils [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config.rescue baada610-f563-4c97-89a9-56eba792c352_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.167s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.747 225859 INFO nova.virt.libvirt.driver [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Deleting local config drive /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352/disk.config.rescue because it was imported into RBD.#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.770 225859 INFO nova.compute.resource_tracker [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Updating resource usage from migration 0c07b1b8-dd18-4f00-acbf-59c21d8f4a60#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.771 225859 DEBUG nova.compute.resource_tracker [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Starting to track incoming migration 0c07b1b8-dd18-4f00-acbf-59c21d8f4a60 with flavor 30c26a27-d918-46d8-a512-4ef3b4ce5955 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 09:53:53 np0005588919 kernel: tapa3156414-5a: entered promiscuous mode
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.804 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:53 np0005588919 NetworkManager[49104]: <info>  [1768920833.8070] manager: (tapa3156414-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/202)
Jan 20 09:53:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:53Z|00466|binding|INFO|Claiming lport a3156414-5a96-462d-974e-a57c9cd8e9c8 for this chassis.
Jan 20 09:53:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:53Z|00467|binding|INFO|a3156414-5a96-462d-974e-a57c9cd8e9c8: Claiming fa:16:3e:9e:93:82 10.100.0.3
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.812 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:93:82 10.100.0.3'], port_security=['fa:16:3e:9e:93:82 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'baada610-f563-4c97-89a9-56eba792c352', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '34326e47-c07e-48d1-9283-c1c5634fdc52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=a3156414-5a96-462d-974e-a57c9cd8e9c8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.813 140354 INFO neutron.agent.ovn.metadata.agent [-] Port a3156414-5a96-462d-974e-a57c9cd8e9c8 in datapath 79184781-1f23-4584-87de-08e262242488 bound to our chassis#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.815 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.824 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:53Z|00468|binding|INFO|Setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 ovn-installed in OVS
Jan 20 09:53:53 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:53Z|00469|binding|INFO|Setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 up in Southbound
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.827 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.840 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[22142fac-ddc7-4561-9209-9e4f3dd318c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:53 np0005588919 systemd-machined[194361]: New machine qemu-56-instance-00000077.
Jan 20 09:53:53 np0005588919 systemd-udevd[273243]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:53:53 np0005588919 NetworkManager[49104]: <info>  [1768920833.8722] device (tapa3156414-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:53:53 np0005588919 systemd[1]: Started Virtual Machine qemu-56-instance-00000077.
Jan 20 09:53:53 np0005588919 NetworkManager[49104]: <info>  [1768920833.8730] device (tapa3156414-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:53:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:53.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.880 225859 DEBUG oslo_concurrency.processutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.881 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[0b65bb6e-a859-4966-a314-fa3b55f49577]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.884 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d88a4d44-1457-4e89-a3ae-441335f49495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.918 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6c55ac96-97d1-4480-9230-70acdc5e5e73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.942 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d846da68-88c4-4302-998b-89a6de413fb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 10, 'rx_bytes': 700, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 10, 'rx_bytes': 700, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580405, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273255, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.963 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c9079d19-b34a-462e-ae91-f574e35b361b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580414, 'tstamp': 580414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273257, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580417, 'tstamp': 580417}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273257, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.965 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.969 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:53 np0005588919 nova_compute[225855]: 2026-01-20 14:53:53.969 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.970 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.970 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:53.971 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.059 225859 DEBUG nova.compute.manager [req-c3076cef-b50f-4260-bd78-2aabd5bf4d86 req-feb13a46-a55b-44a9-8a15-b687a30645f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.059 225859 DEBUG oslo_concurrency.lockutils [req-c3076cef-b50f-4260-bd78-2aabd5bf4d86 req-feb13a46-a55b-44a9-8a15-b687a30645f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.060 225859 DEBUG oslo_concurrency.lockutils [req-c3076cef-b50f-4260-bd78-2aabd5bf4d86 req-feb13a46-a55b-44a9-8a15-b687a30645f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.060 225859 DEBUG oslo_concurrency.lockutils [req-c3076cef-b50f-4260-bd78-2aabd5bf4d86 req-feb13a46-a55b-44a9-8a15-b687a30645f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.061 225859 DEBUG nova.compute.manager [req-c3076cef-b50f-4260-bd78-2aabd5bf4d86 req-feb13a46-a55b-44a9-8a15-b687a30645f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.061 225859 WARNING nova.compute.manager [req-c3076cef-b50f-4260-bd78-2aabd5bf4d86 req-feb13a46-a55b-44a9-8a15-b687a30645f6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received unexpected event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:53:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:53:54 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/613345111' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.335 225859 DEBUG oslo_concurrency.processutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.344 225859 DEBUG nova.compute.provider_tree [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.353 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for baada610-f563-4c97-89a9-56eba792c352 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.354 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920834.3524303, baada610-f563-4c97-89a9-56eba792c352 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.354 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.359 225859 DEBUG nova.compute.manager [None req-154f5cfe-1d4e-4261-b272-cf70d3757d76 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.376 225859 DEBUG nova.scheduler.client.report [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.381 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.386 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.402 225859 DEBUG oslo_concurrency.lockutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.402 225859 INFO nova.compute.manager [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Migrating#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.411 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.412 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920834.3528485, baada610-f563-4c97-89a9-56eba792c352 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.412 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] VM Started (Lifecycle Event)#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.446 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.452 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:53:54 np0005588919 nova_compute[225855]: 2026-01-20 14:53:54.484 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:53:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:55.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:53:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:55.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:55 np0005588919 nova_compute[225855]: 2026-01-20 14:53:55.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:56 np0005588919 nova_compute[225855]: 2026-01-20 14:53:56.179 225859 DEBUG nova.compute.manager [req-7bad2a47-4454-44e1-bea1-bf7c226506f4 req-ea90f2b2-e58f-49bc-91b3-87fec2c247a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:56 np0005588919 nova_compute[225855]: 2026-01-20 14:53:56.180 225859 DEBUG oslo_concurrency.lockutils [req-7bad2a47-4454-44e1-bea1-bf7c226506f4 req-ea90f2b2-e58f-49bc-91b3-87fec2c247a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:56 np0005588919 nova_compute[225855]: 2026-01-20 14:53:56.181 225859 DEBUG oslo_concurrency.lockutils [req-7bad2a47-4454-44e1-bea1-bf7c226506f4 req-ea90f2b2-e58f-49bc-91b3-87fec2c247a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:56 np0005588919 nova_compute[225855]: 2026-01-20 14:53:56.181 225859 DEBUG oslo_concurrency.lockutils [req-7bad2a47-4454-44e1-bea1-bf7c226506f4 req-ea90f2b2-e58f-49bc-91b3-87fec2c247a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:56 np0005588919 nova_compute[225855]: 2026-01-20 14:53:56.182 225859 DEBUG nova.compute.manager [req-7bad2a47-4454-44e1-bea1-bf7c226506f4 req-ea90f2b2-e58f-49bc-91b3-87fec2c247a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:53:56 np0005588919 nova_compute[225855]: 2026-01-20 14:53:56.183 225859 WARNING nova.compute.manager [req-7bad2a47-4454-44e1-bea1-bf7c226506f4 req-ea90f2b2-e58f-49bc-91b3-87fec2c247a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received unexpected event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 09:53:56 np0005588919 nova_compute[225855]: 2026-01-20 14:53:56.191 225859 INFO nova.compute.manager [None req-17d1ade6-ce2c-4ceb-86ef-fdd82f9c9feb d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Unrescuing#033[00m
Jan 20 09:53:56 np0005588919 nova_compute[225855]: 2026-01-20 14:53:56.192 225859 DEBUG oslo_concurrency.lockutils [None req-17d1ade6-ce2c-4ceb-86ef-fdd82f9c9feb d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:56 np0005588919 nova_compute[225855]: 2026-01-20 14:53:56.192 225859 DEBUG oslo_concurrency.lockutils [None req-17d1ade6-ce2c-4ceb-86ef-fdd82f9c9feb d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquired lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:56 np0005588919 nova_compute[225855]: 2026-01-20 14:53:56.192 225859 DEBUG nova.network.neutron [None req-17d1ade6-ce2c-4ceb-86ef-fdd82f9c9feb d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:53:56 np0005588919 systemd-logind[783]: New session 66 of user nova.
Jan 20 09:53:56 np0005588919 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 09:53:56 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 09:53:56 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 09:53:56 np0005588919 systemd[1]: Starting User Manager for UID 42436...
Jan 20 09:53:56 np0005588919 systemd[273362]: Queued start job for default target Main User Target.
Jan 20 09:53:56 np0005588919 systemd[273362]: Created slice User Application Slice.
Jan 20 09:53:56 np0005588919 systemd[273362]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:53:56 np0005588919 systemd[273362]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 09:53:56 np0005588919 systemd[273362]: Reached target Paths.
Jan 20 09:53:56 np0005588919 systemd[273362]: Reached target Timers.
Jan 20 09:53:56 np0005588919 systemd[273362]: Starting D-Bus User Message Bus Socket...
Jan 20 09:53:56 np0005588919 systemd[273362]: Starting Create User's Volatile Files and Directories...
Jan 20 09:53:56 np0005588919 systemd[273362]: Finished Create User's Volatile Files and Directories.
Jan 20 09:53:56 np0005588919 systemd[273362]: Listening on D-Bus User Message Bus Socket.
Jan 20 09:53:56 np0005588919 systemd[273362]: Reached target Sockets.
Jan 20 09:53:56 np0005588919 systemd[273362]: Reached target Basic System.
Jan 20 09:53:56 np0005588919 systemd[273362]: Reached target Main User Target.
Jan 20 09:53:56 np0005588919 systemd[273362]: Startup finished in 169ms.
Jan 20 09:53:56 np0005588919 systemd[1]: Started User Manager for UID 42436.
Jan 20 09:53:56 np0005588919 systemd[1]: Started Session 66 of User nova.
Jan 20 09:53:56 np0005588919 systemd[1]: session-66.scope: Deactivated successfully.
Jan 20 09:53:56 np0005588919 systemd-logind[783]: Session 66 logged out. Waiting for processes to exit.
Jan 20 09:53:56 np0005588919 systemd-logind[783]: Removed session 66.
Jan 20 09:53:56 np0005588919 systemd-logind[783]: New session 68 of user nova.
Jan 20 09:53:56 np0005588919 systemd[1]: Started Session 68 of User nova.
Jan 20 09:53:56 np0005588919 systemd[1]: session-68.scope: Deactivated successfully.
Jan 20 09:53:56 np0005588919 systemd-logind[783]: Session 68 logged out. Waiting for processes to exit.
Jan 20 09:53:56 np0005588919 systemd-logind[783]: Removed session 68.
Jan 20 09:53:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:53:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:57.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:53:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:53:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:57.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.029 225859 INFO nova.network.neutron [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Updating port 87b0cab5-af2f-4440-8f58-840860a23f68 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.046 225859 DEBUG nova.network.neutron [None req-17d1ade6-ce2c-4ceb-86ef-fdd82f9c9feb d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updating instance_info_cache with network_info: [{"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.066 225859 DEBUG oslo_concurrency.lockutils [None req-17d1ade6-ce2c-4ceb-86ef-fdd82f9c9feb d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Releasing lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.068 225859 DEBUG nova.objects.instance [None req-17d1ade6-ce2c-4ceb-86ef-fdd82f9c9feb d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'flavor' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:58 np0005588919 kernel: tapa3156414-5a (unregistering): left promiscuous mode
Jan 20 09:53:58 np0005588919 NetworkManager[49104]: <info>  [1768920838.1586] device (tapa3156414-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.168 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00470|binding|INFO|Releasing lport a3156414-5a96-462d-974e-a57c9cd8e9c8 from this chassis (sb_readonly=0)
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00471|binding|INFO|Setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 down in Southbound
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00472|binding|INFO|Removing iface tapa3156414-5a ovn-installed in OVS
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.173 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.179 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:93:82 10.100.0.3'], port_security=['fa:16:3e:9e:93:82 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'baada610-f563-4c97-89a9-56eba792c352', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '34326e47-c07e-48d1-9283-c1c5634fdc52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=a3156414-5a96-462d-974e-a57c9cd8e9c8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.180 140354 INFO neutron.agent.ovn.metadata.agent [-] Port a3156414-5a96-462d-974e-a57c9cd8e9c8 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.182 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.188 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.204 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[86d6caac-b776-43bb-b2ac-cb65018bd1dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000077.scope: Deactivated successfully.
Jan 20 09:53:58 np0005588919 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000077.scope: Consumed 4.546s CPU time.
Jan 20 09:53:58 np0005588919 systemd-machined[194361]: Machine qemu-56-instance-00000077 terminated.
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.242 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[75ee5a2d-464e-4db0-b05f-708405d9d9cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.245 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[628b6aa4-c31c-40d5-8097-a70e59371362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.281 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[804ce5d9-4e7d-456d-8ea0-bbd1b6b56777]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.297 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[67bad1d1-ab03-40af-8456-5d235cc69f88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 12, 'rx_bytes': 700, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 12, 'rx_bytes': 700, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580405, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273398, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.312 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6198531e-c6b4-4eb4-afec-041b397182b8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580414, 'tstamp': 580414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273399, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580417, 'tstamp': 580417}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273399, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 kernel: tapa3156414-5a: entered promiscuous mode
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.314 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:58 np0005588919 kernel: tapa3156414-5a (unregistering): left promiscuous mode
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.315 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00473|binding|INFO|Claiming lport a3156414-5a96-462d-974e-a57c9cd8e9c8 for this chassis.
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00474|binding|INFO|a3156414-5a96-462d-974e-a57c9cd8e9c8: Claiming fa:16:3e:9e:93:82 10.100.0.3
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.319 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.333 225859 INFO nova.virt.libvirt.driver [-] [instance: baada610-f563-4c97-89a9-56eba792c352] Instance destroyed successfully.#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.333 225859 DEBUG nova.objects.instance [None req-17d1ade6-ce2c-4ceb-86ef-fdd82f9c9feb d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'numa_topology' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00475|binding|INFO|Setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 ovn-installed in OVS
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00476|if_status|INFO|Dropped 2 log messages in last 510 seconds (most recently, 510 seconds ago) due to excessive rate
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00477|if_status|INFO|Not setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 down as sb is readonly
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.339 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.342 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.343 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.343 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.344 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00478|binding|INFO|Releasing lport a3156414-5a96-462d-974e-a57c9cd8e9c8 from this chassis (sb_readonly=0)
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.578 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:93:82 10.100.0.3'], port_security=['fa:16:3e:9e:93:82 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'baada610-f563-4c97-89a9-56eba792c352', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '34326e47-c07e-48d1-9283-c1c5634fdc52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=a3156414-5a96-462d-974e-a57c9cd8e9c8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.579 225859 DEBUG nova.compute.manager [req-5be064b3-afcb-4cf0-93a1-124f121e3640 req-5f591b7a-4bff-44db-a009-e1682b3492d5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-unplugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.579 225859 DEBUG oslo_concurrency.lockutils [req-5be064b3-afcb-4cf0-93a1-124f121e3640 req-5f591b7a-4bff-44db-a009-e1682b3492d5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.580 225859 DEBUG oslo_concurrency.lockutils [req-5be064b3-afcb-4cf0-93a1-124f121e3640 req-5f591b7a-4bff-44db-a009-e1682b3492d5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.580 140354 INFO neutron.agent.ovn.metadata.agent [-] Port a3156414-5a96-462d-974e-a57c9cd8e9c8 in datapath 79184781-1f23-4584-87de-08e262242488 bound to our chassis#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.580 225859 DEBUG oslo_concurrency.lockutils [req-5be064b3-afcb-4cf0-93a1-124f121e3640 req-5f591b7a-4bff-44db-a009-e1682b3492d5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.580 225859 DEBUG nova.compute.manager [req-5be064b3-afcb-4cf0-93a1-124f121e3640 req-5f591b7a-4bff-44db-a009-e1682b3492d5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-unplugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.580 225859 WARNING nova.compute.manager [req-5be064b3-afcb-4cf0-93a1-124f121e3640 req-5f591b7a-4bff-44db-a009-e1682b3492d5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received unexpected event network-vif-unplugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.582 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.586 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:93:82 10.100.0.3'], port_security=['fa:16:3e:9e:93:82 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'baada610-f563-4c97-89a9-56eba792c352', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '34326e47-c07e-48d1-9283-c1c5634fdc52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=a3156414-5a96-462d-974e-a57c9cd8e9c8) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.592 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.596 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[db283db5-3229-486e-9b71-1e218f0610f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.630 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae8356a-572e-4416-b62d-433fcd46720c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.634 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8cbc59e1-1246-464e-860c-3303918d9c3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.669 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b57f2895-1224-499e-871b-9a0456a0d772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 kernel: tapa3156414-5a: entered promiscuous mode
Jan 20 09:53:58 np0005588919 systemd-udevd[273389]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:53:58 np0005588919 NetworkManager[49104]: <info>  [1768920838.6971] manager: (tapa3156414-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/203)
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00479|binding|INFO|Claiming lport a3156414-5a96-462d-974e-a57c9cd8e9c8 for this chassis.
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00480|binding|INFO|a3156414-5a96-462d-974e-a57c9cd8e9c8: Claiming fa:16:3e:9e:93:82 10.100.0.3
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.697 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5bc2f5-385c-4042-b0b0-0f586b71fadd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 14, 'rx_bytes': 700, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 14, 'rx_bytes': 700, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580405, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273420, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.697 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00481|binding|INFO|Removing lport a3156414-5a96-462d-974e-a57c9cd8e9c8 ovn-installed in OVS
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.699 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.703 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:93:82 10.100.0.3'], port_security=['fa:16:3e:9e:93:82 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'baada610-f563-4c97-89a9-56eba792c352', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '34326e47-c07e-48d1-9283-c1c5634fdc52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=a3156414-5a96-462d-974e-a57c9cd8e9c8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:53:58 np0005588919 NetworkManager[49104]: <info>  [1768920838.7062] device (tapa3156414-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:53:58 np0005588919 NetworkManager[49104]: <info>  [1768920838.7074] device (tapa3156414-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00482|binding|INFO|Setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 ovn-installed in OVS
Jan 20 09:53:58 np0005588919 ovn_controller[130490]: 2026-01-20T14:53:58Z|00483|binding|INFO|Setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 up in Southbound
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.717 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.717 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2b524f4e-452e-45c2-8712-8bd7a428fef7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580414, 'tstamp': 580414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273426, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580417, 'tstamp': 580417}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273426, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.718 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.720 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.721 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.723 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 systemd-machined[194361]: New machine qemu-57-instance-00000077.
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.726 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.727 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.728 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.728 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.730 140354 INFO neutron.agent.ovn.metadata.agent [-] Port a3156414-5a96-462d-974e-a57c9cd8e9c8 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.732 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:53:58 np0005588919 systemd[1]: Started Virtual Machine qemu-57-instance-00000077.
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.748 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[84943a53-8a07-4859-8d50-3238a7b92e2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.773 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[05143511-b8e1-430f-a1bf-26d14dc06de6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.778 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd727da-3f6b-49eb-959f-9d3042fbc1b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.806 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5789022f-7f89-4632-a04d-be77ae80686c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.822 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3799457a-3fa2-48f7-b752-385470106bf1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 16, 'rx_bytes': 700, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 16, 'rx_bytes': 700, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580405, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273440, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.837 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[76acbc39-5b53-4443-8001-46c507adf1e9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580414, 'tstamp': 580414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273442, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580417, 'tstamp': 580417}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273442, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.840 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:58 np0005588919 nova_compute[225855]: 2026-01-20 14:53:58.841 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.843 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.843 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.844 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.844 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.845 140354 INFO neutron.agent.ovn.metadata.agent [-] Port a3156414-5a96-462d-974e-a57c9cd8e9c8 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.847 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.862 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3d0fdb79-68e1-424f-ad07-682123c0ec1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.889 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6ddba788-afe1-40a6-8938-b89b5c3eb3e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.893 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b3df50-83f0-40bc-b9d6-d8762cce2b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.925 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1694ac-0df4-4643-b3e4-c9d761d9ee40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.945 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[14b4a2b3-2fa1-4caa-8c04-a876518f4b3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 18, 'rx_bytes': 700, 'tx_bytes': 944, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 18, 'rx_bytes': 700, 'tx_bytes': 944, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580405, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273455, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.966 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[db0e5fca-a37a-498d-a8b7-39dce1176100]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580414, 'tstamp': 580414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273465, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580417, 'tstamp': 580417}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273465, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:53:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:58.968 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.122 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.123 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:59.125 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:59.125 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:59.126 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:53:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:53:59.126 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:53:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:53:59.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.312 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for baada610-f563-4c97-89a9-56eba792c352 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.313 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920839.3113894, baada610-f563-4c97-89a9-56eba792c352 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.313 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.336 225859 DEBUG oslo_concurrency.lockutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "refresh_cache-7f5cfffe-c1dc-4b00-844e-0fb35b340f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.337 225859 DEBUG oslo_concurrency.lockutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquired lock "refresh_cache-7f5cfffe-c1dc-4b00-844e-0fb35b340f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.337 225859 DEBUG nova.network.neutron [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.348 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.352 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.369 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.370 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920839.3117814, baada610-f563-4c97-89a9-56eba792c352 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.370 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] VM Started (Lifecycle Event)#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.399 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.402 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.422 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.437 225859 DEBUG nova.compute.manager [req-3cf1c8fb-7ccd-4908-8ac4-22779402973f req-537fae91-771c-49c0-89be-305c6a3f71f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Received event network-changed-87b0cab5-af2f-4440-8f58-840860a23f68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.438 225859 DEBUG nova.compute.manager [req-3cf1c8fb-7ccd-4908-8ac4-22779402973f req-537fae91-771c-49c0-89be-305c6a3f71f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Refreshing instance network info cache due to event network-changed-87b0cab5-af2f-4440-8f58-840860a23f68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.438 225859 DEBUG oslo_concurrency.lockutils [req-3cf1c8fb-7ccd-4908-8ac4-22779402973f req-537fae91-771c-49c0-89be-305c6a3f71f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-7f5cfffe-c1dc-4b00-844e-0fb35b340f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.486 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:53:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e290 e290: 3 total, 3 up, 3 in
Jan 20 09:53:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:53:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:53:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:53:59.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.936 225859 DEBUG nova.compute.manager [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-changed-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.937 225859 DEBUG nova.compute.manager [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Refreshing instance network info cache due to event network-changed-a3156414-5a96-462d-974e-a57c9cd8e9c8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.937 225859 DEBUG oslo_concurrency.lockutils [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.938 225859 DEBUG oslo_concurrency.lockutils [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:53:59 np0005588919 nova_compute[225855]: 2026-01-20 14:53:59.938 225859 DEBUG nova.network.neutron [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Refreshing network info cache for port a3156414-5a96-462d-974e-a57c9cd8e9c8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.110 225859 DEBUG nova.compute.manager [None req-17d1ade6-ce2c-4ceb-86ef-fdd82f9c9feb d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.691 225859 DEBUG nova.compute.manager [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.692 225859 DEBUG oslo_concurrency.lockutils [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.692 225859 DEBUG oslo_concurrency.lockutils [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.693 225859 DEBUG oslo_concurrency.lockutils [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.693 225859 DEBUG nova.compute.manager [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.693 225859 WARNING nova.compute.manager [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received unexpected event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.693 225859 DEBUG nova.compute.manager [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.694 225859 DEBUG oslo_concurrency.lockutils [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.694 225859 DEBUG oslo_concurrency.lockutils [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.694 225859 DEBUG oslo_concurrency.lockutils [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.695 225859 DEBUG nova.compute.manager [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.695 225859 WARNING nova.compute.manager [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received unexpected event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.695 225859 DEBUG nova.compute.manager [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.695 225859 DEBUG oslo_concurrency.lockutils [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.696 225859 DEBUG oslo_concurrency.lockutils [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.696 225859 DEBUG oslo_concurrency.lockutils [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.696 225859 DEBUG nova.compute.manager [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.697 225859 WARNING nova.compute.manager [req-fdedb619-36f7-409b-abb1-269c8359c78f req-77621fa7-e140-476a-820a-32294f004028 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received unexpected event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:54:00 np0005588919 nova_compute[225855]: 2026-01-20 14:54:00.995 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.072 225859 DEBUG nova.network.neutron [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Updating instance_info_cache with network_info: [{"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.099 225859 DEBUG oslo_concurrency.lockutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Releasing lock "refresh_cache-7f5cfffe-c1dc-4b00-844e-0fb35b340f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.104 225859 DEBUG oslo_concurrency.lockutils [req-3cf1c8fb-7ccd-4908-8ac4-22779402973f req-537fae91-771c-49c0-89be-305c6a3f71f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-7f5cfffe-c1dc-4b00-844e-0fb35b340f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.105 225859 DEBUG nova.network.neutron [req-3cf1c8fb-7ccd-4908-8ac4-22779402973f req-537fae91-771c-49c0-89be-305c6a3f71f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Refreshing network info cache for port 87b0cab5-af2f-4440-8f58-840860a23f68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:54:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:01.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.249 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 09:54:01 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:01Z|00484|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.253 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.254 225859 INFO nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Creating image(s)#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.322 225859 DEBUG nova.storage.rbd_utils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] creating snapshot(nova-resize) on rbd image(7f5cfffe-c1dc-4b00-844e-0fb35b340f44_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:54:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e291 e291: 3 total, 3 up, 3 in
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.748 225859 DEBUG nova.objects.instance [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:01.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.881 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.882 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Ensure instance console log exists: /var/lib/nova/instances/7f5cfffe-c1dc-4b00-844e-0fb35b340f44/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.882 225859 DEBUG oslo_concurrency.lockutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.882 225859 DEBUG oslo_concurrency.lockutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.883 225859 DEBUG oslo_concurrency.lockutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.887 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Start _get_guest_xml network_info=[{"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1445030024-network", "vif_mac": "fa:16:3e:2b:79:1b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.890 225859 WARNING nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.895 225859 DEBUG nova.virt.libvirt.host [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.896 225859 DEBUG nova.virt.libvirt.host [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.900 225859 DEBUG nova.virt.libvirt.host [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.901 225859 DEBUG nova.virt.libvirt.host [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.903 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.904 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.904 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.905 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.905 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.905 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.905 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.905 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.906 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.906 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.907 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.907 225859 DEBUG nova.virt.hardware [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.908 225859 DEBUG nova.objects.instance [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:01 np0005588919 nova_compute[225855]: 2026-01-20 14:54:01.924 225859 DEBUG oslo_concurrency.processutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3505285473' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.094 225859 DEBUG nova.network.neutron [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updated VIF entry in instance network info cache for port a3156414-5a96-462d-974e-a57c9cd8e9c8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.094 225859 DEBUG nova.network.neutron [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updating instance_info_cache with network_info: [{"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.119 225859 DEBUG oslo_concurrency.lockutils [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.120 225859 DEBUG nova.compute.manager [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-changed-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.120 225859 DEBUG nova.compute.manager [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Refreshing instance network info cache due to event network-changed-a3156414-5a96-462d-974e-a57c9cd8e9c8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.121 225859 DEBUG oslo_concurrency.lockutils [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.121 225859 DEBUG oslo_concurrency.lockutils [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.121 225859 DEBUG nova.network.neutron [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Refreshing network info cache for port a3156414-5a96-462d-974e-a57c9cd8e9c8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:54:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2647610533' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.363 225859 DEBUG oslo_concurrency.processutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.403 225859 DEBUG oslo_concurrency.processutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.739 225859 DEBUG nova.network.neutron [req-3cf1c8fb-7ccd-4908-8ac4-22779402973f req-537fae91-771c-49c0-89be-305c6a3f71f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Updated VIF entry in instance network info cache for port 87b0cab5-af2f-4440-8f58-840860a23f68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.740 225859 DEBUG nova.network.neutron [req-3cf1c8fb-7ccd-4908-8ac4-22779402973f req-537fae91-771c-49c0-89be-305c6a3f71f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Updating instance_info_cache with network_info: [{"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.757 225859 DEBUG oslo_concurrency.lockutils [req-3cf1c8fb-7ccd-4908-8ac4-22779402973f req-537fae91-771c-49c0-89be-305c6a3f71f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-7f5cfffe-c1dc-4b00-844e-0fb35b340f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:54:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1618823478' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.929 225859 DEBUG oslo_concurrency.processutils [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.931 225859 DEBUG nova.virt.libvirt.vif [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1654627482',display_name='tempest-ServerActionsTestOtherB-server-1654627482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1654627482',id=118,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO99uJ9+FwgjxRb/9u+f3Mj9/VKSDM+OKd66Ygsg8lEO+7bGpDEQrC5BIaSV+Na5YF+3DqUwLNmAYSN9IkTSGbRPw5y8813A+KsiNHebrpnZ7oReyT+5/zNQYafCHVAfGA==',key_name='tempest-keypair-302882914',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:52:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-2ulk0sfq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:53:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=7f5cfffe-c1dc-4b00-844e-0fb35b340f44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1445030024-network", "vif_mac": "fa:16:3e:2b:79:1b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.931 225859 DEBUG nova.network.os_vif_util [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1445030024-network", "vif_mac": "fa:16:3e:2b:79:1b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.932 225859 DEBUG nova.network.os_vif_util [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.936 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <uuid>7f5cfffe-c1dc-4b00-844e-0fb35b340f44</uuid>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <name>instance-00000076</name>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <memory>196608</memory>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestOtherB-server-1654627482</nova:name>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:54:01</nova:creationTime>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.micro">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <nova:memory>192</nova:memory>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <nova:user uuid="215db37373dc4ae5a75cbd6866f471da">tempest-ServerActionsTestOtherB-1136521362-project-member</nova:user>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <nova:project uuid="b3b1b7f5b4f84b5abbc401eb577c85c0">tempest-ServerActionsTestOtherB-1136521362</nova:project>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <nova:port uuid="87b0cab5-af2f-4440-8f58-840860a23f68">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <entry name="serial">7f5cfffe-c1dc-4b00-844e-0fb35b340f44</entry>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <entry name="uuid">7f5cfffe-c1dc-4b00-844e-0fb35b340f44</entry>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7f5cfffe-c1dc-4b00-844e-0fb35b340f44_disk">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7f5cfffe-c1dc-4b00-844e-0fb35b340f44_disk.config">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:2b:79:1b"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <target dev="tap87b0cab5-af"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/7f5cfffe-c1dc-4b00-844e-0fb35b340f44/console.log" append="off"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:54:02 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:54:02 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:54:02 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:54:02 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.943 225859 DEBUG nova.virt.libvirt.vif [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1654627482',display_name='tempest-ServerActionsTestOtherB-server-1654627482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1654627482',id=118,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO99uJ9+FwgjxRb/9u+f3Mj9/VKSDM+OKd66Ygsg8lEO+7bGpDEQrC5BIaSV+Na5YF+3DqUwLNmAYSN9IkTSGbRPw5y8813A+KsiNHebrpnZ7oReyT+5/zNQYafCHVAfGA==',key_name='tempest-keypair-302882914',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:52:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-2ulk0sfq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:53:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=7f5cfffe-c1dc-4b00-844e-0fb35b340f44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1445030024-network", "vif_mac": "fa:16:3e:2b:79:1b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.943 225859 DEBUG nova.network.os_vif_util [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1445030024-network", "vif_mac": "fa:16:3e:2b:79:1b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.944 225859 DEBUG nova.network.os_vif_util [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.944 225859 DEBUG os_vif [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.945 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.946 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.946 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.949 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.949 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87b0cab5-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.949 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap87b0cab5-af, col_values=(('external_ids', {'iface-id': '87b0cab5-af2f-4440-8f58-840860a23f68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:79:1b', 'vm-uuid': '7f5cfffe-c1dc-4b00-844e-0fb35b340f44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.951 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:02 np0005588919 NetworkManager[49104]: <info>  [1768920842.9530] manager: (tap87b0cab5-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.953 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.958 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:02 np0005588919 nova_compute[225855]: 2026-01-20 14:54:02.959 225859 INFO os_vif [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af')#033[00m
Jan 20 09:54:03 np0005588919 nova_compute[225855]: 2026-01-20 14:54:03.030 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:54:03 np0005588919 nova_compute[225855]: 2026-01-20 14:54:03.030 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:54:03 np0005588919 nova_compute[225855]: 2026-01-20 14:54:03.030 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] No VIF found with MAC fa:16:3e:2b:79:1b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:54:03 np0005588919 nova_compute[225855]: 2026-01-20 14:54:03.031 225859 INFO nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Using config drive#033[00m
Jan 20 09:54:03 np0005588919 nova_compute[225855]: 2026-01-20 14:54:03.059 225859 DEBUG nova.compute.manager [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:54:03 np0005588919 nova_compute[225855]: 2026-01-20 14:54:03.060 225859 DEBUG nova.virt.libvirt.driver [None req-469b4f3f-ba11-47de-b608-d3c23b30ada2 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 09:54:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:03.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:03 np0005588919 nova_compute[225855]: 2026-01-20 14:54:03.639 225859 DEBUG nova.network.neutron [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updated VIF entry in instance network info cache for port a3156414-5a96-462d-974e-a57c9cd8e9c8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:54:03 np0005588919 nova_compute[225855]: 2026-01-20 14:54:03.640 225859 DEBUG nova.network.neutron [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Updating instance_info_cache with network_info: [{"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:03 np0005588919 nova_compute[225855]: 2026-01-20 14:54:03.657 225859 DEBUG oslo_concurrency.lockutils [req-1e42ad7b-2695-4540-a7a9-7a729e69d132 req-15cc6379-4caf-4b2a-9ff1-99f169566279 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-baada610-f563-4c97-89a9-56eba792c352" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:54:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:03.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:05.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:05.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:06 np0005588919 nova_compute[225855]: 2026-01-20 14:54:05.997 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:07 np0005588919 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 09:54:07 np0005588919 systemd[273362]: Activating special unit Exit the Session...
Jan 20 09:54:07 np0005588919 systemd[273362]: Stopped target Main User Target.
Jan 20 09:54:07 np0005588919 systemd[273362]: Stopped target Basic System.
Jan 20 09:54:07 np0005588919 systemd[273362]: Stopped target Paths.
Jan 20 09:54:07 np0005588919 systemd[273362]: Stopped target Sockets.
Jan 20 09:54:07 np0005588919 systemd[273362]: Stopped target Timers.
Jan 20 09:54:07 np0005588919 systemd[273362]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 09:54:07 np0005588919 systemd[273362]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 09:54:07 np0005588919 systemd[273362]: Closed D-Bus User Message Bus Socket.
Jan 20 09:54:07 np0005588919 systemd[273362]: Stopped Create User's Volatile Files and Directories.
Jan 20 09:54:07 np0005588919 systemd[273362]: Removed slice User Application Slice.
Jan 20 09:54:07 np0005588919 systemd[273362]: Reached target Shutdown.
Jan 20 09:54:07 np0005588919 systemd[273362]: Finished Exit the Session.
Jan 20 09:54:07 np0005588919 systemd[273362]: Reached target Exit the Session.
Jan 20 09:54:07 np0005588919 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 09:54:07 np0005588919 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 09:54:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:07.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:07 np0005588919 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 09:54:07 np0005588919 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 09:54:07 np0005588919 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 09:54:07 np0005588919 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 09:54:07 np0005588919 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 09:54:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e292 e292: 3 total, 3 up, 3 in
Jan 20 09:54:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:07.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:07 np0005588919 nova_compute[225855]: 2026-01-20 14:54:07.952 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:09.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #91. Immutable memtables: 0.
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.740662) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 55] Flushing memtable with next log file: 91
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849740772, "job": 55, "event": "flush_started", "num_memtables": 1, "num_entries": 2156, "num_deletes": 265, "total_data_size": 4584616, "memory_usage": 4651520, "flush_reason": "Manual Compaction"}
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 55] Level-0 flush table #92: started
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849773221, "cf_name": "default", "job": 55, "event": "table_file_creation", "file_number": 92, "file_size": 2997055, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 46914, "largest_seqno": 49064, "table_properties": {"data_size": 2987998, "index_size": 5615, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19297, "raw_average_key_size": 20, "raw_value_size": 2969678, "raw_average_value_size": 3203, "num_data_blocks": 241, "num_entries": 927, "num_filter_entries": 927, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920710, "oldest_key_time": 1768920710, "file_creation_time": 1768920849, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 92, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 55] Flush lasted 32621 microseconds, and 12490 cpu microseconds.
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.773284) [db/flush_job.cc:967] [default] [JOB 55] Level-0 flush table #92: 2997055 bytes OK
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.773309) [db/memtable_list.cc:519] [default] Level-0 commit table #92 started
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.775399) [db/memtable_list.cc:722] [default] Level-0 commit table #92: memtable #1 done
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.775420) EVENT_LOG_v1 {"time_micros": 1768920849775413, "job": 55, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.775446) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 55] Try to delete WAL files size 4574827, prev total WAL file size 4574827, number of live WAL files 2.
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000088.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.777333) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031353133' seq:72057594037927935, type:22 .. '6C6F676D0031373635' seq:0, type:0; will stop at (end)
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 56] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 55 Base level 0, inputs: [92(2926KB)], [90(10MB)]
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849777389, "job": 56, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [92], "files_L6": [90], "score": -1, "input_data_size": 13792193, "oldest_snapshot_seqno": -1}
Jan 20 09:54:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:09.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 56] Generated table #93: 7596 keys, 13633097 bytes, temperature: kUnknown
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849926788, "cf_name": "default", "job": 56, "event": "table_file_creation", "file_number": 93, "file_size": 13633097, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13579214, "index_size": 33803, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19013, "raw_key_size": 195234, "raw_average_key_size": 25, "raw_value_size": 13440509, "raw_average_value_size": 1769, "num_data_blocks": 1347, "num_entries": 7596, "num_filter_entries": 7596, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768920849, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 93, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.927192) [db/compaction/compaction_job.cc:1663] [default] [JOB 56] Compacted 1@0 + 1@6 files to L6 => 13633097 bytes
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.929291) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 92.2 rd, 91.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 10.3 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(9.2) write-amplify(4.5) OK, records in: 8139, records dropped: 543 output_compression: NoCompression
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.929314) EVENT_LOG_v1 {"time_micros": 1768920849929302, "job": 56, "event": "compaction_finished", "compaction_time_micros": 149610, "compaction_time_cpu_micros": 50104, "output_level": 6, "num_output_files": 1, "total_output_size": 13633097, "num_input_records": 8139, "num_output_records": 7596, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000092.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849930076, "job": 56, "event": "table_file_deletion", "file_number": 92}
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000090.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920849932797, "job": 56, "event": "table_file_deletion", "file_number": 90}
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.777165) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.932851) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.932883) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.932885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.932887) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:09.932889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:54:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:54:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:54:11 np0005588919 nova_compute[225855]: 2026-01-20 14:54:11.000 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:11.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:11 np0005588919 podman[273852]: 2026-01-20 14:54:11.837077532 +0000 UTC m=+0.098053789 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 09:54:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:11.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e292 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:12 np0005588919 nova_compute[225855]: 2026-01-20 14:54:12.634 225859 DEBUG nova.objects.instance [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'flavor' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:12 np0005588919 nova_compute[225855]: 2026-01-20 14:54:12.660 225859 DEBUG oslo_concurrency.lockutils [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "refresh_cache-7f5cfffe-c1dc-4b00-844e-0fb35b340f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:54:12 np0005588919 nova_compute[225855]: 2026-01-20 14:54:12.660 225859 DEBUG oslo_concurrency.lockutils [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquired lock "refresh_cache-7f5cfffe-c1dc-4b00-844e-0fb35b340f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:54:12 np0005588919 nova_compute[225855]: 2026-01-20 14:54:12.661 225859 DEBUG nova.network.neutron [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:54:12 np0005588919 nova_compute[225855]: 2026-01-20 14:54:12.661 225859 DEBUG nova.objects.instance [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'info_cache' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:12 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:12Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9e:93:82 10.100.0.3
Jan 20 09:54:12 np0005588919 nova_compute[225855]: 2026-01-20 14:54:12.956 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:13.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:13.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e293 e293: 3 total, 3 up, 3 in
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.685 225859 DEBUG nova.network.neutron [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Updating instance_info_cache with network_info: [{"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e294 e294: 3 total, 3 up, 3 in
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.741 225859 DEBUG oslo_concurrency.lockutils [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Releasing lock "refresh_cache-7f5cfffe-c1dc-4b00-844e-0fb35b340f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.790 225859 INFO nova.virt.libvirt.driver [-] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Instance destroyed successfully.#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.791 225859 DEBUG nova.objects.instance [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.808 225859 DEBUG nova.objects.instance [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'resources' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.831 225859 DEBUG nova.virt.libvirt.vif [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1654627482',display_name='tempest-ServerActionsTestOtherB-server-1654627482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1654627482',id=118,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO99uJ9+FwgjxRb/9u+f3Mj9/VKSDM+OKd66Ygsg8lEO+7bGpDEQrC5BIaSV+Na5YF+3DqUwLNmAYSN9IkTSGbRPw5y8813A+KsiNHebrpnZ7oReyT+5/zNQYafCHVAfGA==',key_name='tempest-keypair-302882914',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:54:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-2ulk0sfq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:54:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=7f5cfffe-c1dc-4b00-844e-0fb35b340f44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.832 225859 DEBUG nova.network.os_vif_util [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.833 225859 DEBUG nova.network.os_vif_util [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.833 225859 DEBUG os_vif [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.835 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.836 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87b0cab5-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.837 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.839 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.840 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.843 225859 INFO os_vif [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af')#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.849 225859 DEBUG nova.virt.libvirt.driver [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Start _get_guest_xml network_info=[{"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.854 225859 WARNING nova.virt.libvirt.driver [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.860 225859 DEBUG nova.virt.libvirt.host [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.860 225859 DEBUG nova.virt.libvirt.host [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.863 225859 DEBUG nova.virt.libvirt.host [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.864 225859 DEBUG nova.virt.libvirt.host [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.864 225859 DEBUG nova.virt.libvirt.driver [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.865 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.865 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.865 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.865 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.866 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.866 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.866 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.866 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.867 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.867 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.867 225859 DEBUG nova.virt.hardware [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.867 225859 DEBUG nova.objects.instance [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:14 np0005588919 nova_compute[225855]: 2026-01-20 14:54:14.881 225859 DEBUG oslo_concurrency.processutils [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:15.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/474916819' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.352 225859 DEBUG oslo_concurrency.processutils [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.398 225859 DEBUG oslo_concurrency.processutils [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4265517374' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.836 225859 DEBUG oslo_concurrency.processutils [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.840 225859 DEBUG nova.virt.libvirt.vif [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1654627482',display_name='tempest-ServerActionsTestOtherB-server-1654627482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1654627482',id=118,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO99uJ9+FwgjxRb/9u+f3Mj9/VKSDM+OKd66Ygsg8lEO+7bGpDEQrC5BIaSV+Na5YF+3DqUwLNmAYSN9IkTSGbRPw5y8813A+KsiNHebrpnZ7oReyT+5/zNQYafCHVAfGA==',key_name='tempest-keypair-302882914',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:54:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-2ulk0sfq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:54:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=7f5cfffe-c1dc-4b00-844e-0fb35b340f44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.841 225859 DEBUG nova.network.os_vif_util [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.842 225859 DEBUG nova.network.os_vif_util [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.845 225859 DEBUG nova.objects.instance [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:15.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.933 225859 DEBUG nova.virt.libvirt.driver [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <uuid>7f5cfffe-c1dc-4b00-844e-0fb35b340f44</uuid>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <name>instance-00000076</name>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <memory>196608</memory>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerActionsTestOtherB-server-1654627482</nova:name>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:54:14</nova:creationTime>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.micro">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <nova:memory>192</nova:memory>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <nova:user uuid="215db37373dc4ae5a75cbd6866f471da">tempest-ServerActionsTestOtherB-1136521362-project-member</nova:user>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <nova:project uuid="b3b1b7f5b4f84b5abbc401eb577c85c0">tempest-ServerActionsTestOtherB-1136521362</nova:project>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <nova:port uuid="87b0cab5-af2f-4440-8f58-840860a23f68">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <entry name="serial">7f5cfffe-c1dc-4b00-844e-0fb35b340f44</entry>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <entry name="uuid">7f5cfffe-c1dc-4b00-844e-0fb35b340f44</entry>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7f5cfffe-c1dc-4b00-844e-0fb35b340f44_disk">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7f5cfffe-c1dc-4b00-844e-0fb35b340f44_disk.config">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:2b:79:1b"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <target dev="tap87b0cab5-af"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/7f5cfffe-c1dc-4b00-844e-0fb35b340f44/console.log" append="off"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <input type="keyboard" bus="usb"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:54:15 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:54:15 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:54:15 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:54:15 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.935 225859 DEBUG nova.virt.libvirt.driver [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] skipping disk for instance-00000076 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.935 225859 DEBUG nova.virt.libvirt.driver [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] skipping disk for instance-00000076 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.936 225859 DEBUG nova.virt.libvirt.vif [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1654627482',display_name='tempest-ServerActionsTestOtherB-server-1654627482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1654627482',id=118,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO99uJ9+FwgjxRb/9u+f3Mj9/VKSDM+OKd66Ygsg8lEO+7bGpDEQrC5BIaSV+Na5YF+3DqUwLNmAYSN9IkTSGbRPw5y8813A+KsiNHebrpnZ7oReyT+5/zNQYafCHVAfGA==',key_name='tempest-keypair-302882914',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:54:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-2ulk0sfq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:54:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=7f5cfffe-c1dc-4b00-844e-0fb35b340f44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.937 225859 DEBUG nova.network.os_vif_util [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.938 225859 DEBUG nova.network.os_vif_util [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.939 225859 DEBUG os_vif [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.939 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.940 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.940 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.944 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.945 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87b0cab5-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.946 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap87b0cab5-af, col_values=(('external_ids', {'iface-id': '87b0cab5-af2f-4440-8f58-840860a23f68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:79:1b', 'vm-uuid': '7f5cfffe-c1dc-4b00-844e-0fb35b340f44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.947 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:15 np0005588919 NetworkManager[49104]: <info>  [1768920855.9485] manager: (tap87b0cab5-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.949 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.956 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:15 np0005588919 nova_compute[225855]: 2026-01-20 14:54:15.957 225859 INFO os_vif [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af')#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.001 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:16 np0005588919 kernel: tap87b0cab5-af: entered promiscuous mode
Jan 20 09:54:16 np0005588919 NetworkManager[49104]: <info>  [1768920856.0191] manager: (tap87b0cab5-af): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Jan 20 09:54:16 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:16Z|00485|binding|INFO|Claiming lport 87b0cab5-af2f-4440-8f58-840860a23f68 for this chassis.
Jan 20 09:54:16 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:16Z|00486|binding|INFO|87b0cab5-af2f-4440-8f58-840860a23f68: Claiming fa:16:3e:2b:79:1b 10.100.0.9
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.020 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.027 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:79:1b 10.100.0.9'], port_security=['fa:16:3e:2b:79:1b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7f5cfffe-c1dc-4b00-844e-0fb35b340f44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3b1b7f5b4f84b5abbc401eb577c85c0', 'neutron:revision_number': '7', 'neutron:security_group_ids': '8b11f3fb-2601-4eca-a1b6-838549d7750c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3273589e-5585-406c-9611-87f758b0e521, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=87b0cab5-af2f-4440-8f58-840860a23f68) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.028 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 87b0cab5-af2f-4440-8f58-840860a23f68 in datapath 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce bound to our chassis#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.030 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce#033[00m
Jan 20 09:54:16 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:16Z|00487|binding|INFO|Setting lport 87b0cab5-af2f-4440-8f58-840860a23f68 ovn-installed in OVS
Jan 20 09:54:16 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:16Z|00488|binding|INFO|Setting lport 87b0cab5-af2f-4440-8f58-840860a23f68 up in Southbound
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.041 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.042 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[958721f0-c9ba-4353-9501-417fe2b3eeb5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.043 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41a1a3fe-f1 in ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.044 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.045 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41a1a3fe-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.045 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[46350ddf-8847-457f-8183-496d1c49d5e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.046 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[006b67ab-eda9-408f-aec6-ca1a9cd515ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 systemd-udevd[273982]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:54:16 np0005588919 systemd-machined[194361]: New machine qemu-58-instance-00000076.
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.057 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[a6178877-9d60-4ee1-a876-ef67b643f231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 systemd[1]: Started Virtual Machine qemu-58-instance-00000076.
Jan 20 09:54:16 np0005588919 NetworkManager[49104]: <info>  [1768920856.0680] device (tap87b0cab5-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:54:16 np0005588919 NetworkManager[49104]: <info>  [1768920856.0689] device (tap87b0cab5-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.071 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[01f63726-547e-436d-b1f7-07ca1c00a171]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.099 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab78bd2-5f3c-4fbb-bbf2-246a441a6ec3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.105 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[340a5090-9530-4919-8e81-d60f2166e7c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 NetworkManager[49104]: <info>  [1768920856.1065] manager: (tap41a1a3fe-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/207)
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.132 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ed067912-d3cc-4610-8e2d-5185435f0fcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.137 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[18a8f79f-1a2e-42a4-a2db-8c9523041839]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 NetworkManager[49104]: <info>  [1768920856.1576] device (tap41a1a3fe-f0): carrier: link connected
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.162 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[eca41d98-0f66-45a2-868c-a6dc16b7064c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.179 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a24041b6-fd92-4c74-afb6-5eb979e7d59e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41a1a3fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:1f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589508, 'reachable_time': 17134, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274014, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.195 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fd08d5ae-f710-4051-be79-286107c2cd86]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:1fb5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589508, 'tstamp': 589508}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274015, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.221 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9ddbbbec-7e4c-4fe7-933c-98fbc16728a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41a1a3fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:1f:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589508, 'reachable_time': 17134, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 274016, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.264 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[94e78398-dd6f-471a-b3c8-78b2c95ac3b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.352 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ae3c5e-98a7-40dc-91b6-bf37bad70919]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.354 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41a1a3fe-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.354 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.355 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41a1a3fe-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.357 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:16 np0005588919 kernel: tap41a1a3fe-f0: entered promiscuous mode
Jan 20 09:54:16 np0005588919 NetworkManager[49104]: <info>  [1768920856.3578] manager: (tap41a1a3fe-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/208)
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.359 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.361 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41a1a3fe-f0, col_values=(('external_ids', {'iface-id': '3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.363 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:16 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:16Z|00489|binding|INFO|Releasing lport 3fa2df7b-42b2-4a3b-a33b-ab37b5d6aef3 from this chassis (sb_readonly=0)
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.385 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.386 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.387 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[27612061-3bed-4d2b-a0cc-a86e440cbdcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.389 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce.pid.haproxy
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.391 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'env', 'PROCESS_TAG=haproxy-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.414 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.415 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:16.415 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.524 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920856.524169, 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.525 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.527 225859 DEBUG nova.compute.manager [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.530 225859 INFO nova.virt.libvirt.driver [-] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Instance rebooted successfully.#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.531 225859 DEBUG nova.compute.manager [None req-1a6d3efd-f2d4-427c-8626-c07b97fcb723 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.564 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.567 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.596 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920856.524804, 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.597 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] VM Started (Lifecycle Event)#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.617 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.621 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:54:16 np0005588919 podman[274140]: 2026-01-20 14:54:16.794799333 +0000 UTC m=+0.074084442 container create ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:54:16 np0005588919 systemd[1]: Started libpod-conmon-ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6.scope.
Jan 20 09:54:16 np0005588919 podman[274140]: 2026-01-20 14:54:16.749553886 +0000 UTC m=+0.028839025 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:54:16 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:54:16 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaa4d503d18710d2bbff4fcb5c47904c975a3c0af22d0db4d6302ccf63c41716/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:54:16 np0005588919 podman[274140]: 2026-01-20 14:54:16.902460193 +0000 UTC m=+0.181745322 container init ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:54:16 np0005588919 podman[274140]: 2026-01-20 14:54:16.908266027 +0000 UTC m=+0.187551136 container start ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:54:16 np0005588919 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[274155]: [NOTICE]   (274159) : New worker (274161) forked
Jan 20 09:54:16 np0005588919 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[274155]: [NOTICE]   (274159) : Loading success.
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.988 225859 DEBUG nova.compute.manager [req-c2f495b2-12e5-4414-8bea-67d7b69a81dd req-e282b3e3-488a-4d54-96ff-639172249b94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Received event network-vif-plugged-87b0cab5-af2f-4440-8f58-840860a23f68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.993 225859 DEBUG oslo_concurrency.lockutils [req-c2f495b2-12e5-4414-8bea-67d7b69a81dd req-e282b3e3-488a-4d54-96ff-639172249b94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.994 225859 DEBUG oslo_concurrency.lockutils [req-c2f495b2-12e5-4414-8bea-67d7b69a81dd req-e282b3e3-488a-4d54-96ff-639172249b94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.994 225859 DEBUG oslo_concurrency.lockutils [req-c2f495b2-12e5-4414-8bea-67d7b69a81dd req-e282b3e3-488a-4d54-96ff-639172249b94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.995 225859 DEBUG nova.compute.manager [req-c2f495b2-12e5-4414-8bea-67d7b69a81dd req-e282b3e3-488a-4d54-96ff-639172249b94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] No waiting events found dispatching network-vif-plugged-87b0cab5-af2f-4440-8f58-840860a23f68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:16 np0005588919 nova_compute[225855]: 2026-01-20 14:54:16.995 225859 WARNING nova.compute.manager [req-c2f495b2-12e5-4414-8bea-67d7b69a81dd req-e282b3e3-488a-4d54-96ff-639172249b94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Received unexpected event network-vif-plugged-87b0cab5-af2f-4440-8f58-840860a23f68 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:54:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:54:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:54:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:17.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:17.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.103 225859 DEBUG nova.compute.manager [req-f6f8a8fe-2ec5-4e65-8b30-f233f36f55ea req-14c10c01-2471-48fc-8ee5-82c33b5f21b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Received event network-vif-plugged-87b0cab5-af2f-4440-8f58-840860a23f68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.104 225859 DEBUG oslo_concurrency.lockutils [req-f6f8a8fe-2ec5-4e65-8b30-f233f36f55ea req-14c10c01-2471-48fc-8ee5-82c33b5f21b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.104 225859 DEBUG oslo_concurrency.lockutils [req-f6f8a8fe-2ec5-4e65-8b30-f233f36f55ea req-14c10c01-2471-48fc-8ee5-82c33b5f21b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.104 225859 DEBUG oslo_concurrency.lockutils [req-f6f8a8fe-2ec5-4e65-8b30-f233f36f55ea req-14c10c01-2471-48fc-8ee5-82c33b5f21b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.104 225859 DEBUG nova.compute.manager [req-f6f8a8fe-2ec5-4e65-8b30-f233f36f55ea req-14c10c01-2471-48fc-8ee5-82c33b5f21b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] No waiting events found dispatching network-vif-plugged-87b0cab5-af2f-4440-8f58-840860a23f68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.104 225859 WARNING nova.compute.manager [req-f6f8a8fe-2ec5-4e65-8b30-f233f36f55ea req-14c10c01-2471-48fc-8ee5-82c33b5f21b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Received unexpected event network-vif-plugged-87b0cab5-af2f-4440-8f58-840860a23f68 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:54:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:19.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.375 225859 DEBUG oslo_concurrency.lockutils [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.376 225859 DEBUG oslo_concurrency.lockutils [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.376 225859 DEBUG oslo_concurrency.lockutils [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.376 225859 DEBUG oslo_concurrency.lockutils [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.377 225859 DEBUG oslo_concurrency.lockutils [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.378 225859 INFO nova.compute.manager [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Terminating instance#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.378 225859 DEBUG nova.compute.manager [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:54:19 np0005588919 kernel: tap87b0cab5-af (unregistering): left promiscuous mode
Jan 20 09:54:19 np0005588919 NetworkManager[49104]: <info>  [1768920859.4373] device (tap87b0cab5-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:54:19 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:19Z|00490|binding|INFO|Releasing lport 87b0cab5-af2f-4440-8f58-840860a23f68 from this chassis (sb_readonly=0)
Jan 20 09:54:19 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:19Z|00491|binding|INFO|Setting lport 87b0cab5-af2f-4440-8f58-840860a23f68 down in Southbound
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.446 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:19 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:19Z|00492|binding|INFO|Removing iface tap87b0cab5-af ovn-installed in OVS
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.448 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.452 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:79:1b 10.100.0.9'], port_security=['fa:16:3e:2b:79:1b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7f5cfffe-c1dc-4b00-844e-0fb35b340f44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3b1b7f5b4f84b5abbc401eb577c85c0', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8b11f3fb-2601-4eca-a1b6-838549d7750c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3273589e-5585-406c-9611-87f758b0e521, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=87b0cab5-af2f-4440-8f58-840860a23f68) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.453 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 87b0cab5-af2f-4440-8f58-840860a23f68 in datapath 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce unbound from our chassis#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.455 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.456 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a4da95fc-b9c5-49b1-a375-7ec6f447ed13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.456 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce namespace which is not needed anymore#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.462 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:19 np0005588919 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000076.scope: Deactivated successfully.
Jan 20 09:54:19 np0005588919 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000076.scope: Consumed 3.461s CPU time.
Jan 20 09:54:19 np0005588919 systemd-machined[194361]: Machine qemu-58-instance-00000076 terminated.
Jan 20 09:54:19 np0005588919 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[274155]: [NOTICE]   (274159) : haproxy version is 2.8.14-c23fe91
Jan 20 09:54:19 np0005588919 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[274155]: [NOTICE]   (274159) : path to executable is /usr/sbin/haproxy
Jan 20 09:54:19 np0005588919 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[274155]: [WARNING]  (274159) : Exiting Master process...
Jan 20 09:54:19 np0005588919 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[274155]: [WARNING]  (274159) : Exiting Master process...
Jan 20 09:54:19 np0005588919 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[274155]: [ALERT]    (274159) : Current worker (274161) exited with code 143 (Terminated)
Jan 20 09:54:19 np0005588919 neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce[274155]: [WARNING]  (274159) : All workers exited. Exiting... (0)
Jan 20 09:54:19 np0005588919 systemd[1]: libpod-ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6.scope: Deactivated successfully.
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.612 225859 INFO nova.virt.libvirt.driver [-] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Instance destroyed successfully.#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.612 225859 DEBUG nova.objects.instance [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lazy-loading 'resources' on Instance uuid 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:19 np0005588919 podman[274194]: 2026-01-20 14:54:19.613066596 +0000 UTC m=+0.049763746 container died ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.630 225859 DEBUG nova.virt.libvirt.vif [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1654627482',display_name='tempest-ServerActionsTestOtherB-server-1654627482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1654627482',id=118,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO99uJ9+FwgjxRb/9u+f3Mj9/VKSDM+OKd66Ygsg8lEO+7bGpDEQrC5BIaSV+Na5YF+3DqUwLNmAYSN9IkTSGbRPw5y8813A+KsiNHebrpnZ7oReyT+5/zNQYafCHVAfGA==',key_name='tempest-keypair-302882914',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:54:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b3b1b7f5b4f84b5abbc401eb577c85c0',ramdisk_id='',reservation_id='r-2ulk0sfq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1136521362',owner_user_name='tempest-ServerActionsTestOtherB-1136521362-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:54:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='215db37373dc4ae5a75cbd6866f471da',uuid=7f5cfffe-c1dc-4b00-844e-0fb35b340f44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.630 225859 DEBUG nova.network.os_vif_util [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converting VIF {"id": "87b0cab5-af2f-4440-8f58-840860a23f68", "address": "fa:16:3e:2b:79:1b", "network": {"id": "41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1445030024-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3b1b7f5b4f84b5abbc401eb577c85c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b0cab5-af", "ovs_interfaceid": "87b0cab5-af2f-4440-8f58-840860a23f68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.631 225859 DEBUG nova.network.os_vif_util [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.631 225859 DEBUG os_vif [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.633 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.634 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87b0cab5-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.635 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.637 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.640 225859 INFO os_vif [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:79:1b,bridge_name='br-int',has_traffic_filtering=True,id=87b0cab5-af2f-4440-8f58-840860a23f68,network=Network(41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b0cab5-af')#033[00m
Jan 20 09:54:19 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6-userdata-shm.mount: Deactivated successfully.
Jan 20 09:54:19 np0005588919 systemd[1]: var-lib-containers-storage-overlay-eaa4d503d18710d2bbff4fcb5c47904c975a3c0af22d0db4d6302ccf63c41716-merged.mount: Deactivated successfully.
Jan 20 09:54:19 np0005588919 podman[274194]: 2026-01-20 14:54:19.663449308 +0000 UTC m=+0.100146458 container cleanup ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:54:19 np0005588919 systemd[1]: libpod-conmon-ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6.scope: Deactivated successfully.
Jan 20 09:54:19 np0005588919 podman[274251]: 2026-01-20 14:54:19.72264817 +0000 UTC m=+0.039268320 container remove ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.729 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f44c2b58-cd80-460f-baa6-e63ea2fb9544]: (4, ('Tue Jan 20 02:54:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce (ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6)\nae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6\nTue Jan 20 02:54:19 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce (ae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6)\nae11b6978c1a5aa6f308a011c4a65dc64e867e3720c9cf1fb815e78bfe604ca6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.730 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[34279aa6-2064-4b57-baf0-57e5617ed82c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.731 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41a1a3fe-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:19 np0005588919 kernel: tap41a1a3fe-f0: left promiscuous mode
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.734 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.749 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.752 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[47b49516-7391-4e3f-aa82-715c86eeb103]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.771 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b3bb2dd5-2149-4a1b-88fe-a8a5edaf99fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.772 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5d9395c1-749c-4dcc-b84b-b901761a6d0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.787 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dc40f105-0fe0-4189-a3f4-b91ac052f1c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589502, 'reachable_time': 29432, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274270, 'error': None, 'target': 'ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.790 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41a1a3fe-f6f8-4375-9b0f-a4d4bb269cce deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:54:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:19.790 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3ca517-fbf9-4d7f-bfe5-023be5259935]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:19 np0005588919 systemd[1]: run-netns-ovnmeta\x2d41a1a3fe\x2df6f8\x2d4375\x2d9b0f\x2da4d4bb269cce.mount: Deactivated successfully.
Jan 20 09:54:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:19.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.992 225859 INFO nova.virt.libvirt.driver [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Deleting instance files /var/lib/nova/instances/7f5cfffe-c1dc-4b00-844e-0fb35b340f44_del#033[00m
Jan 20 09:54:19 np0005588919 nova_compute[225855]: 2026-01-20 14:54:19.992 225859 INFO nova.virt.libvirt.driver [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Deletion of /var/lib/nova/instances/7f5cfffe-c1dc-4b00-844e-0fb35b340f44_del complete#033[00m
Jan 20 09:54:20 np0005588919 nova_compute[225855]: 2026-01-20 14:54:20.070 225859 INFO nova.compute.manager [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Took 0.69 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:54:20 np0005588919 nova_compute[225855]: 2026-01-20 14:54:20.071 225859 DEBUG oslo.service.loopingcall [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:54:20 np0005588919 nova_compute[225855]: 2026-01-20 14:54:20.071 225859 DEBUG nova.compute.manager [-] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:54:20 np0005588919 nova_compute[225855]: 2026-01-20 14:54:20.071 225859 DEBUG nova.network.neutron [-] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.004 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.227 225859 DEBUG nova.compute.manager [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Received event network-vif-unplugged-87b0cab5-af2f-4440-8f58-840860a23f68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.227 225859 DEBUG oslo_concurrency.lockutils [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.228 225859 DEBUG oslo_concurrency.lockutils [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.228 225859 DEBUG oslo_concurrency.lockutils [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.228 225859 DEBUG nova.compute.manager [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] No waiting events found dispatching network-vif-unplugged-87b0cab5-af2f-4440-8f58-840860a23f68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.228 225859 DEBUG nova.compute.manager [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Received event network-vif-unplugged-87b0cab5-af2f-4440-8f58-840860a23f68 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.229 225859 DEBUG nova.compute.manager [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Received event network-vif-plugged-87b0cab5-af2f-4440-8f58-840860a23f68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.229 225859 DEBUG oslo_concurrency.lockutils [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.229 225859 DEBUG oslo_concurrency.lockutils [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.229 225859 DEBUG oslo_concurrency.lockutils [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.229 225859 DEBUG nova.compute.manager [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] No waiting events found dispatching network-vif-plugged-87b0cab5-af2f-4440-8f58-840860a23f68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:21 np0005588919 nova_compute[225855]: 2026-01-20 14:54:21.230 225859 WARNING nova.compute.manager [req-b8f65186-16bf-4916-b8dc-beedcc7cf471 req-4088d856-e670-4a1f-a597-18bfc264fe5b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Received unexpected event network-vif-plugged-87b0cab5-af2f-4440-8f58-840860a23f68 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:54:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:21.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:21.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:22 np0005588919 nova_compute[225855]: 2026-01-20 14:54:22.391 225859 DEBUG nova.network.neutron [-] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:22 np0005588919 nova_compute[225855]: 2026-01-20 14:54:22.410 225859 INFO nova.compute.manager [-] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Took 2.34 seconds to deallocate network for instance.#033[00m
Jan 20 09:54:22 np0005588919 nova_compute[225855]: 2026-01-20 14:54:22.462 225859 DEBUG oslo_concurrency.lockutils [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:22 np0005588919 nova_compute[225855]: 2026-01-20 14:54:22.463 225859 DEBUG oslo_concurrency.lockutils [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:22 np0005588919 nova_compute[225855]: 2026-01-20 14:54:22.468 225859 DEBUG oslo_concurrency.lockutils [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:22 np0005588919 nova_compute[225855]: 2026-01-20 14:54:22.502 225859 INFO nova.scheduler.client.report [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Deleted allocations for instance 7f5cfffe-c1dc-4b00-844e-0fb35b340f44#033[00m
Jan 20 09:54:22 np0005588919 nova_compute[225855]: 2026-01-20 14:54:22.593 225859 DEBUG oslo_concurrency.lockutils [None req-a3b2f09e-4532-4621-bb52-2dd21ee746cc 215db37373dc4ae5a75cbd6866f471da b3b1b7f5b4f84b5abbc401eb577c85c0 - - default default] Lock "7f5cfffe-c1dc-4b00-844e-0fb35b340f44" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:23 np0005588919 podman[274273]: 2026-01-20 14:54:23.029826614 +0000 UTC m=+0.080287887 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:54:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:23.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:23 np0005588919 nova_compute[225855]: 2026-01-20 14:54:23.313 225859 DEBUG nova.compute.manager [req-93243fb2-0718-4658-a305-dc5f4bfb7e48 req-0ac5592a-5584-48c5-8da2-56cd28b0106a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Received event network-vif-deleted-87b0cab5-af2f-4440-8f58-840860a23f68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:54:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:23.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:54:24 np0005588919 nova_compute[225855]: 2026-01-20 14:54:24.636 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e295 e295: 3 total, 3 up, 3 in
Jan 20 09:54:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:25.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:25.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:26 np0005588919 nova_compute[225855]: 2026-01-20 14:54:26.007 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:27.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:27.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:29.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:29 np0005588919 nova_compute[225855]: 2026-01-20 14:54:29.638 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:29.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:30 np0005588919 nova_compute[225855]: 2026-01-20 14:54:30.744 225859 DEBUG oslo_concurrency.lockutils [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:30 np0005588919 nova_compute[225855]: 2026-01-20 14:54:30.745 225859 DEBUG oslo_concurrency.lockutils [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:30 np0005588919 nova_compute[225855]: 2026-01-20 14:54:30.793 225859 INFO nova.compute.manager [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Detaching volume 5f6a803f-d232-4e97-9965-ece0139e0fda#033[00m
Jan 20 09:54:30 np0005588919 nova_compute[225855]: 2026-01-20 14:54:30.931 225859 INFO nova.virt.block_device [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Attempting to driver detach volume 5f6a803f-d232-4e97-9965-ece0139e0fda from mountpoint /dev/vdb#033[00m
Jan 20 09:54:30 np0005588919 nova_compute[225855]: 2026-01-20 14:54:30.944 225859 DEBUG nova.virt.libvirt.driver [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Attempting to detach device vdb from instance baada610-f563-4c97-89a9-56eba792c352 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:54:30 np0005588919 nova_compute[225855]: 2026-01-20 14:54:30.945 225859 DEBUG nova.virt.libvirt.guest [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-5f6a803f-d232-4e97-9965-ece0139e0fda">
Jan 20 09:54:30 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  <serial>5f6a803f-d232-4e97-9965-ece0139e0fda</serial>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:54:30 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:54:30 np0005588919 nova_compute[225855]: 2026-01-20 14:54:30.976 225859 INFO nova.virt.libvirt.driver [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Successfully detached device vdb from instance baada610-f563-4c97-89a9-56eba792c352 from the persistent domain config.#033[00m
Jan 20 09:54:30 np0005588919 nova_compute[225855]: 2026-01-20 14:54:30.977 225859 DEBUG nova.virt.libvirt.driver [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance baada610-f563-4c97-89a9-56eba792c352 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:54:30 np0005588919 nova_compute[225855]: 2026-01-20 14:54:30.978 225859 DEBUG nova.virt.libvirt.guest [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-5f6a803f-d232-4e97-9965-ece0139e0fda">
Jan 20 09:54:30 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  <serial>5f6a803f-d232-4e97-9965-ece0139e0fda</serial>
Jan 20 09:54:30 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:54:30 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:54:30 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:54:31 np0005588919 nova_compute[225855]: 2026-01-20 14:54:31.009 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:31 np0005588919 nova_compute[225855]: 2026-01-20 14:54:31.102 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768920871.1023562, baada610-f563-4c97-89a9-56eba792c352 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:54:31 np0005588919 nova_compute[225855]: 2026-01-20 14:54:31.105 225859 DEBUG nova.virt.libvirt.driver [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance baada610-f563-4c97-89a9-56eba792c352 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:54:31 np0005588919 nova_compute[225855]: 2026-01-20 14:54:31.108 225859 INFO nova.virt.libvirt.driver [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Successfully detached device vdb from instance baada610-f563-4c97-89a9-56eba792c352 from the live domain config.#033[00m
Jan 20 09:54:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:54:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:31.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:54:31 np0005588919 nova_compute[225855]: 2026-01-20 14:54:31.361 225859 DEBUG nova.objects.instance [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'flavor' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:31 np0005588919 nova_compute[225855]: 2026-01-20 14:54:31.398 225859 DEBUG oslo_concurrency.lockutils [None req-99cc8d1c-f01c-478b-a905-fe262b6711a1 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:31.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:32 np0005588919 nova_compute[225855]: 2026-01-20 14:54:32.088 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:32.090 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:54:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:32.091 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:54:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e296 e296: 3 total, 3 up, 3 in
Jan 20 09:54:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:33.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.420 225859 DEBUG oslo_concurrency.lockutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.421 225859 DEBUG oslo_concurrency.lockutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.421 225859 DEBUG oslo_concurrency.lockutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.422 225859 DEBUG oslo_concurrency.lockutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.422 225859 DEBUG oslo_concurrency.lockutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.423 225859 INFO nova.compute.manager [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Terminating instance#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.424 225859 DEBUG nova.compute.manager [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:54:33 np0005588919 kernel: tapa3156414-5a (unregistering): left promiscuous mode
Jan 20 09:54:33 np0005588919 NetworkManager[49104]: <info>  [1768920873.4843] device (tapa3156414-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:54:33 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:33Z|00493|binding|INFO|Releasing lport a3156414-5a96-462d-974e-a57c9cd8e9c8 from this chassis (sb_readonly=0)
Jan 20 09:54:33 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:33Z|00494|binding|INFO|Setting lport a3156414-5a96-462d-974e-a57c9cd8e9c8 down in Southbound
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.491 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:33 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:33Z|00495|binding|INFO|Removing iface tapa3156414-5a ovn-installed in OVS
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.493 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.497 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:93:82 10.100.0.3'], port_security=['fa:16:3e:9e:93:82 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'baada610-f563-4c97-89a9-56eba792c352', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '34326e47-c07e-48d1-9283-c1c5634fdc52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=a3156414-5a96-462d-974e-a57c9cd8e9c8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.498 140354 INFO neutron.agent.ovn.metadata.agent [-] Port a3156414-5a96-462d-974e-a57c9cd8e9c8 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.500 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 79184781-1f23-4584-87de-08e262242488#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.507 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.516 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2dd75bfa-13db-455b-a901-afdf6b7db496]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.540 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[71cba883-287a-49da-85a3-7404fc610648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.543 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8fe4eb73-9871-4f3a-881b-d6ef4e2abfb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:33 np0005588919 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000077.scope: Deactivated successfully.
Jan 20 09:54:33 np0005588919 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000077.scope: Consumed 14.552s CPU time.
Jan 20 09:54:33 np0005588919 systemd-machined[194361]: Machine qemu-57-instance-00000077 terminated.
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.575 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[811da5f0-f7f9-4e0a-9288-49906ac68906]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.590 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c49d17df-f81d-4a67-b98a-51dfe19cc1bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap79184781-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:7c:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 20, 'rx_bytes': 784, 'tx_bytes': 1028, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 20, 'rx_bytes': 784, 'tx_bytes': 1028, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580405, 'reachable_time': 23111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274361, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.604 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[497b7695-a3e2-4be7-8be8-81ea51549c47]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580414, 'tstamp': 580414}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274362, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap79184781-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580417, 'tstamp': 580417}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274362, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.606 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.607 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.611 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.612 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79184781-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.613 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.613 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap79184781-10, col_values=(('external_ids', {'iface-id': 'b033e9e6-9781-4424-a20f-7b48a14e2c80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:33.613 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.642 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.647 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.660 225859 INFO nova.virt.libvirt.driver [-] [instance: baada610-f563-4c97-89a9-56eba792c352] Instance destroyed successfully.#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.661 225859 DEBUG nova.objects.instance [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'resources' on Instance uuid baada610-f563-4c97-89a9-56eba792c352 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.687 225859 DEBUG nova.virt.libvirt.vif [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:52:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-185388239',display_name='tempest-ServerStableDeviceRescueTest-server-185388239',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-185388239',id=119,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIfbHk5SMnBEAUVZEhoLPfdB2qCay341zK720hYW5qflxdgcEr+fHp9C3kAgJFmqON8wn8DkPxW0WmihyCLPTK7Iiiy5VDiRJ7U/0O7hlyzm17ZWhCVdPfXSugKxmeVL3w==',key_name='tempest-keypair-647310408',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:53:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-d9074tfl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:54:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=baada610-f563-4c97-89a9-56eba792c352,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.688 225859 DEBUG nova.network.os_vif_util [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "address": "fa:16:3e:9e:93:82", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3156414-5a", "ovs_interfaceid": "a3156414-5a96-462d-974e-a57c9cd8e9c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.688 225859 DEBUG nova.network.os_vif_util [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9e:93:82,bridge_name='br-int',has_traffic_filtering=True,id=a3156414-5a96-462d-974e-a57c9cd8e9c8,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3156414-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.688 225859 DEBUG os_vif [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9e:93:82,bridge_name='br-int',has_traffic_filtering=True,id=a3156414-5a96-462d-974e-a57c9cd8e9c8,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3156414-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.689 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.690 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3156414-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.691 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.693 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.695 225859 INFO os_vif [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9e:93:82,bridge_name='br-int',has_traffic_filtering=True,id=a3156414-5a96-462d-974e-a57c9cd8e9c8,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3156414-5a')#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.724 225859 DEBUG nova.compute.manager [req-fd293b95-32ff-41d2-9d47-2f769ff01830 req-11317e23-ea25-48bb-adde-c4efbcfbc12d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-unplugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.725 225859 DEBUG oslo_concurrency.lockutils [req-fd293b95-32ff-41d2-9d47-2f769ff01830 req-11317e23-ea25-48bb-adde-c4efbcfbc12d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.725 225859 DEBUG oslo_concurrency.lockutils [req-fd293b95-32ff-41d2-9d47-2f769ff01830 req-11317e23-ea25-48bb-adde-c4efbcfbc12d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.725 225859 DEBUG oslo_concurrency.lockutils [req-fd293b95-32ff-41d2-9d47-2f769ff01830 req-11317e23-ea25-48bb-adde-c4efbcfbc12d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.726 225859 DEBUG nova.compute.manager [req-fd293b95-32ff-41d2-9d47-2f769ff01830 req-11317e23-ea25-48bb-adde-c4efbcfbc12d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-unplugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:33 np0005588919 nova_compute[225855]: 2026-01-20 14:54:33.726 225859 DEBUG nova.compute.manager [req-fd293b95-32ff-41d2-9d47-2f769ff01830 req-11317e23-ea25-48bb-adde-c4efbcfbc12d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-unplugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:54:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:33.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:34 np0005588919 nova_compute[225855]: 2026-01-20 14:54:34.611 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920859.6104903, 7f5cfffe-c1dc-4b00-844e-0fb35b340f44 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:34 np0005588919 nova_compute[225855]: 2026-01-20 14:54:34.611 225859 INFO nova.compute.manager [-] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:54:34 np0005588919 nova_compute[225855]: 2026-01-20 14:54:34.641 225859 DEBUG nova.compute.manager [None req-cadf0971-3b18-4893-be0e-a7e7d3917b2e - - - - - -] [instance: 7f5cfffe-c1dc-4b00-844e-0fb35b340f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:34 np0005588919 nova_compute[225855]: 2026-01-20 14:54:34.842 225859 INFO nova.virt.libvirt.driver [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Deleting instance files /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352_del#033[00m
Jan 20 09:54:34 np0005588919 nova_compute[225855]: 2026-01-20 14:54:34.843 225859 INFO nova.virt.libvirt.driver [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Deletion of /var/lib/nova/instances/baada610-f563-4c97-89a9-56eba792c352_del complete#033[00m
Jan 20 09:54:34 np0005588919 nova_compute[225855]: 2026-01-20 14:54:34.907 225859 INFO nova.compute.manager [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Took 1.48 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:54:34 np0005588919 nova_compute[225855]: 2026-01-20 14:54:34.907 225859 DEBUG oslo.service.loopingcall [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:54:34 np0005588919 nova_compute[225855]: 2026-01-20 14:54:34.908 225859 DEBUG nova.compute.manager [-] [instance: baada610-f563-4c97-89a9-56eba792c352] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:54:34 np0005588919 nova_compute[225855]: 2026-01-20 14:54:34.908 225859 DEBUG nova.network.neutron [-] [instance: baada610-f563-4c97-89a9-56eba792c352] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:54:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:35.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.358 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.833 225859 DEBUG nova.compute.manager [req-251f362a-c134-423d-9f27-84c52c55eee7 req-39d92fff-67ca-4b3d-9f99-e94bd4c812ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.833 225859 DEBUG oslo_concurrency.lockutils [req-251f362a-c134-423d-9f27-84c52c55eee7 req-39d92fff-67ca-4b3d-9f99-e94bd4c812ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "baada610-f563-4c97-89a9-56eba792c352-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.835 225859 DEBUG oslo_concurrency.lockutils [req-251f362a-c134-423d-9f27-84c52c55eee7 req-39d92fff-67ca-4b3d-9f99-e94bd4c812ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.835 225859 DEBUG oslo_concurrency.lockutils [req-251f362a-c134-423d-9f27-84c52c55eee7 req-39d92fff-67ca-4b3d-9f99-e94bd4c812ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.836 225859 DEBUG nova.compute.manager [req-251f362a-c134-423d-9f27-84c52c55eee7 req-39d92fff-67ca-4b3d-9f99-e94bd4c812ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] No waiting events found dispatching network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.836 225859 WARNING nova.compute.manager [req-251f362a-c134-423d-9f27-84c52c55eee7 req-39d92fff-67ca-4b3d-9f99-e94bd4c812ca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received unexpected event network-vif-plugged-a3156414-5a96-462d-974e-a57c9cd8e9c8 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.838 225859 DEBUG nova.network.neutron [-] [instance: baada610-f563-4c97-89a9-56eba792c352] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.861 225859 INFO nova.compute.manager [-] [instance: baada610-f563-4c97-89a9-56eba792c352] Took 0.95 seconds to deallocate network for instance.#033[00m
Jan 20 09:54:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:35.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.920 225859 DEBUG oslo_concurrency.lockutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.920 225859 DEBUG oslo_concurrency.lockutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.923 225859 DEBUG nova.compute.manager [req-6d5fd803-0264-45d4-a928-544ee3377e22 req-e065ff20-b221-4bf6-afba-954dd9e53c95 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: baada610-f563-4c97-89a9-56eba792c352] Received event network-vif-deleted-a3156414-5a96-462d-974e-a57c9cd8e9c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:35 np0005588919 nova_compute[225855]: 2026-01-20 14:54:35.999 225859 DEBUG oslo_concurrency.processutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:36 np0005588919 nova_compute[225855]: 2026-01-20 14:54:36.032 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:36 np0005588919 nova_compute[225855]: 2026-01-20 14:54:36.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2926708069' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:36 np0005588919 nova_compute[225855]: 2026-01-20 14:54:36.454 225859 DEBUG oslo_concurrency.processutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:36 np0005588919 nova_compute[225855]: 2026-01-20 14:54:36.464 225859 DEBUG nova.compute.provider_tree [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:54:36 np0005588919 nova_compute[225855]: 2026-01-20 14:54:36.484 225859 DEBUG nova.scheduler.client.report [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:54:36 np0005588919 nova_compute[225855]: 2026-01-20 14:54:36.517 225859 DEBUG oslo_concurrency.lockutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:36 np0005588919 nova_compute[225855]: 2026-01-20 14:54:36.562 225859 INFO nova.scheduler.client.report [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Deleted allocations for instance baada610-f563-4c97-89a9-56eba792c352#033[00m
Jan 20 09:54:36 np0005588919 nova_compute[225855]: 2026-01-20 14:54:36.632 225859 DEBUG oslo_concurrency.lockutils [None req-3f3df2db-d1c2-4077-aa71-3352e0002504 d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "baada610-f563-4c97-89a9-56eba792c352" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:37.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:37 np0005588919 nova_compute[225855]: 2026-01-20 14:54:37.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:37 np0005588919 nova_compute[225855]: 2026-01-20 14:54:37.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:54:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:37.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:38 np0005588919 nova_compute[225855]: 2026-01-20 14:54:38.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:38 np0005588919 nova_compute[225855]: 2026-01-20 14:54:38.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:54:38 np0005588919 nova_compute[225855]: 2026-01-20 14:54:38.366 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:54:38 np0005588919 nova_compute[225855]: 2026-01-20 14:54:38.691 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:39.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e297 e297: 3 total, 3 up, 3 in
Jan 20 09:54:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:39.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:40 np0005588919 nova_compute[225855]: 2026-01-20 14:54:40.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e298 e298: 3 total, 3 up, 3 in
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #94. Immutable memtables: 0.
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.733262) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 57] Flushing memtable with next log file: 94
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880733295, "job": 57, "event": "flush_started", "num_memtables": 1, "num_entries": 673, "num_deletes": 255, "total_data_size": 1044116, "memory_usage": 1058016, "flush_reason": "Manual Compaction"}
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 57] Level-0 flush table #95: started
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880742964, "cf_name": "default", "job": 57, "event": "table_file_creation", "file_number": 95, "file_size": 687669, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49069, "largest_seqno": 49737, "table_properties": {"data_size": 684234, "index_size": 1279, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8344, "raw_average_key_size": 20, "raw_value_size": 677213, "raw_average_value_size": 1631, "num_data_blocks": 55, "num_entries": 415, "num_filter_entries": 415, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920850, "oldest_key_time": 1768920850, "file_creation_time": 1768920880, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 95, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 57] Flush lasted 9736 microseconds, and 2819 cpu microseconds.
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.742998) [db/flush_job.cc:967] [default] [JOB 57] Level-0 flush table #95: 687669 bytes OK
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.743016) [db/memtable_list.cc:519] [default] Level-0 commit table #95 started
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.747612) [db/memtable_list.cc:722] [default] Level-0 commit table #95: memtable #1 done
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.747635) EVENT_LOG_v1 {"time_micros": 1768920880747628, "job": 57, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.747656) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 57] Try to delete WAL files size 1040362, prev total WAL file size 1040362, number of live WAL files 2.
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000091.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.748361) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034303136' seq:72057594037927935, type:22 .. '7061786F730034323638' seq:0, type:0; will stop at (end)
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 58] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 57 Base level 0, inputs: [95(671KB)], [93(13MB)]
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880748424, "job": 58, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [95], "files_L6": [93], "score": -1, "input_data_size": 14320766, "oldest_snapshot_seqno": -1}
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 58] Generated table #96: 7487 keys, 12472866 bytes, temperature: kUnknown
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880920780, "cf_name": "default", "job": 58, "event": "table_file_creation", "file_number": 96, "file_size": 12472866, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12420540, "index_size": 32502, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18757, "raw_key_size": 193754, "raw_average_key_size": 25, "raw_value_size": 12284471, "raw_average_value_size": 1640, "num_data_blocks": 1286, "num_entries": 7487, "num_filter_entries": 7487, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768920880, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 96, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.921061) [db/compaction/compaction_job.cc:1663] [default] [JOB 58] Compacted 1@0 + 1@6 files to L6 => 12472866 bytes
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.923095) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 83.0 rd, 72.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 13.0 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(39.0) write-amplify(18.1) OK, records in: 8011, records dropped: 524 output_compression: NoCompression
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.923135) EVENT_LOG_v1 {"time_micros": 1768920880923120, "job": 58, "event": "compaction_finished", "compaction_time_micros": 172476, "compaction_time_cpu_micros": 29016, "output_level": 6, "num_output_files": 1, "total_output_size": 12472866, "num_input_records": 8011, "num_output_records": 7487, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000095.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880923546, "job": 58, "event": "table_file_deletion", "file_number": 95}
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000093.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920880925813, "job": 58, "event": "table_file_deletion", "file_number": 93}
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.748183) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.925886) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.925893) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.925896) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.925899) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:54:40.925902) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.035 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:41.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.552 225859 DEBUG oslo_concurrency.lockutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.553 225859 DEBUG oslo_concurrency.lockutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.553 225859 DEBUG oslo_concurrency.lockutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.553 225859 DEBUG oslo_concurrency.lockutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.553 225859 DEBUG oslo_concurrency.lockutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.554 225859 INFO nova.compute.manager [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Terminating instance#033[00m
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.555 225859 DEBUG nova.compute.manager [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:54:41 np0005588919 kernel: tap234381ea-07 (unregistering): left promiscuous mode
Jan 20 09:54:41 np0005588919 NetworkManager[49104]: <info>  [1768920881.8917] device (tap234381ea-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:54:41 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:41Z|00496|binding|INFO|Releasing lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 from this chassis (sb_readonly=0)
Jan 20 09:54:41 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:41Z|00497|binding|INFO|Setting lport 234381ea-07b1-41fe-b3c1-be97ce6a3b64 down in Southbound
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.904 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:41 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:41Z|00498|binding|INFO|Removing iface tap234381ea-07 ovn-installed in OVS
Jan 20 09:54:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:41.910 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:55:3c 10.100.0.14'], port_security=['fa:16:3e:b5:55:3c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '23ea4537-f03f-46de-881f-b979e232a3b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79184781-1f23-4584-87de-08e262242488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a29915e0dd2403fbd7b7e847696b00a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '30ec24b7-15ba-4aeb-9785-539071729f77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b73ab05-b29f-401a-84a5-ea1a96103f33, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=234381ea-07b1-41fe-b3c1-be97ce6a3b64) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:54:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:41.911 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 234381ea-07b1-41fe-b3c1-be97ce6a3b64 in datapath 79184781-1f23-4584-87de-08e262242488 unbound from our chassis#033[00m
Jan 20 09:54:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:41.912 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 79184781-1f23-4584-87de-08e262242488, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:54:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:41.913 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[926ab9d8-3bf8-41e6-bbff-8af2cbb80c04]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:41.914 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-79184781-1f23-4584-87de-08e262242488 namespace which is not needed anymore#033[00m
Jan 20 09:54:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:54:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:41.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.934 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:41 np0005588919 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000075.scope: Deactivated successfully.
Jan 20 09:54:41 np0005588919 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000075.scope: Consumed 17.674s CPU time.
Jan 20 09:54:41 np0005588919 systemd-machined[194361]: Machine qemu-54-instance-00000075 terminated.
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.992 225859 INFO nova.virt.libvirt.driver [-] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Instance destroyed successfully.#033[00m
Jan 20 09:54:41 np0005588919 nova_compute[225855]: 2026-01-20 14:54:41.993 225859 DEBUG nova.objects.instance [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lazy-loading 'resources' on Instance uuid 23ea4537-f03f-46de-881f-b979e232a3b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.010 225859 DEBUG nova.virt.libvirt.vif [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:51:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-828759404',display_name='tempest-ServerStableDeviceRescueTest-server-828759404',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-828759404',id=117,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:52:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a29915e0dd2403fbd7b7e847696b00a',ramdisk_id='',reservation_id='r-ztpzn050',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-129078052',owner_user_name='tempest-ServerStableDeviceRescueTest-129078052-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:52:46Z,user_data=None,user_id='d85d286ce6224326a0f4a15a06afbfea',uuid=23ea4537-f03f-46de-881f-b979e232a3b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.010 225859 DEBUG nova.network.os_vif_util [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converting VIF {"id": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "address": "fa:16:3e:b5:55:3c", "network": {"id": "79184781-1f23-4584-87de-08e262242488", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-165460946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a29915e0dd2403fbd7b7e847696b00a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap234381ea-07", "ovs_interfaceid": "234381ea-07b1-41fe-b3c1-be97ce6a3b64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.011 225859 DEBUG nova.network.os_vif_util [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b5:55:3c,bridge_name='br-int',has_traffic_filtering=True,id=234381ea-07b1-41fe-b3c1-be97ce6a3b64,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234381ea-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.012 225859 DEBUG os_vif [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b5:55:3c,bridge_name='br-int',has_traffic_filtering=True,id=234381ea-07b1-41fe-b3c1-be97ce6a3b64,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234381ea-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.013 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.013 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap234381ea-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.016 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.018 225859 INFO os_vif [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b5:55:3c,bridge_name='br-int',has_traffic_filtering=True,id=234381ea-07b1-41fe-b3c1-be97ce6a3b64,network=Network(79184781-1f23-4584-87de-08e262242488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap234381ea-07')#033[00m
Jan 20 09:54:42 np0005588919 podman[274420]: 2026-01-20 14:54:42.021621054 +0000 UTC m=+0.103108481 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 20 09:54:42 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271777]: [NOTICE]   (271782) : haproxy version is 2.8.14-c23fe91
Jan 20 09:54:42 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271777]: [NOTICE]   (271782) : path to executable is /usr/sbin/haproxy
Jan 20 09:54:42 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271777]: [WARNING]  (271782) : Exiting Master process...
Jan 20 09:54:42 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271777]: [ALERT]    (271782) : Current worker (271784) exited with code 143 (Terminated)
Jan 20 09:54:42 np0005588919 neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488[271777]: [WARNING]  (271782) : All workers exited. Exiting... (0)
Jan 20 09:54:42 np0005588919 systemd[1]: libpod-8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f.scope: Deactivated successfully.
Jan 20 09:54:42 np0005588919 podman[274480]: 2026-01-20 14:54:42.065075971 +0000 UTC m=+0.047386789 container died 8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 09:54:42 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f-userdata-shm.mount: Deactivated successfully.
Jan 20 09:54:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:42.093 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:42 np0005588919 systemd[1]: var-lib-containers-storage-overlay-31b783ff1c4e23d189e5eb541d582f8638c64e811963a516b7ed73c8d8eafb4a-merged.mount: Deactivated successfully.
Jan 20 09:54:42 np0005588919 podman[274480]: 2026-01-20 14:54:42.116439931 +0000 UTC m=+0.098750749 container cleanup 8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:54:42 np0005588919 systemd[1]: libpod-conmon-8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f.scope: Deactivated successfully.
Jan 20 09:54:42 np0005588919 podman[274529]: 2026-01-20 14:54:42.194490175 +0000 UTC m=+0.056544607 container remove 8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:54:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:42.201 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b626fc-d490-4496-9656-e6a64239e5e4]: (4, ('Tue Jan 20 02:54:42 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f)\n8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f\nTue Jan 20 02:54:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-79184781-1f23-4584-87de-08e262242488 (8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f)\n8659aeb836f14239190951253b93a3be94364fbe5edb0e5964485303d9ef695f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:42.203 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[87da468b-1f92-4fd1-a45c-110e8170ff98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:42.204 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79184781-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.206 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:42 np0005588919 kernel: tap79184781-10: left promiscuous mode
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.223 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:42.226 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d46a20ab-ded7-4dfc-aa32-90e87eefd2e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:42.240 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eefc6a25-33b6-42b2-8b74-fd46957483d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:42.241 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9dbb4a93-f1e5-4c7d-9c8d-7ebf1b08a49a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:42.255 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fd0f49cb-44c0-41f9-9eef-3499e27884d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580397, 'reachable_time': 26164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274545, 'error': None, 'target': 'ovnmeta-79184781-1f23-4584-87de-08e262242488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:42 np0005588919 systemd[1]: run-netns-ovnmeta\x2d79184781\x2d1f23\x2d4584\x2d87de\x2d08e262242488.mount: Deactivated successfully.
Jan 20 09:54:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:42.258 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-79184781-1f23-4584-87de-08e262242488 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:54:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:42.259 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[ff579ff8-de19-4932-aeff-c4d67b7d6ae5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:54:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3510984213' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:54:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:54:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3510984213' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.412 225859 INFO nova.virt.libvirt.driver [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Deleting instance files /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9_del#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.413 225859 INFO nova.virt.libvirt.driver [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Deletion of /var/lib/nova/instances/23ea4537-f03f-46de-881f-b979e232a3b9_del complete#033[00m
Jan 20 09:54:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.786 225859 DEBUG nova.compute.manager [req-c7ca7af7-4363-4e44-8137-859a60cb36a7 req-608ed09f-e9a1-4846-8b0d-5351fd567649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-unplugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.786 225859 DEBUG oslo_concurrency.lockutils [req-c7ca7af7-4363-4e44-8137-859a60cb36a7 req-608ed09f-e9a1-4846-8b0d-5351fd567649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.787 225859 DEBUG oslo_concurrency.lockutils [req-c7ca7af7-4363-4e44-8137-859a60cb36a7 req-608ed09f-e9a1-4846-8b0d-5351fd567649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.787 225859 DEBUG oslo_concurrency.lockutils [req-c7ca7af7-4363-4e44-8137-859a60cb36a7 req-608ed09f-e9a1-4846-8b0d-5351fd567649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.787 225859 DEBUG nova.compute.manager [req-c7ca7af7-4363-4e44-8137-859a60cb36a7 req-608ed09f-e9a1-4846-8b0d-5351fd567649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-unplugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.787 225859 DEBUG nova.compute.manager [req-c7ca7af7-4363-4e44-8137-859a60cb36a7 req-608ed09f-e9a1-4846-8b0d-5351fd567649 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-unplugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.824 225859 INFO nova.compute.manager [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Took 1.27 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.824 225859 DEBUG oslo.service.loopingcall [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.825 225859 DEBUG nova.compute.manager [-] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:54:42 np0005588919 nova_compute[225855]: 2026-01-20 14:54:42.825 225859 DEBUG nova.network.neutron [-] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:54:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:43.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:43 np0005588919 nova_compute[225855]: 2026-01-20 14:54:43.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:43.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.116 225859 DEBUG nova.network.neutron [-] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.147 225859 INFO nova.compute.manager [-] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Took 1.32 seconds to deallocate network for instance.#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.222 225859 DEBUG oslo_concurrency.lockutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.223 225859 DEBUG oslo_concurrency.lockutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.325 225859 DEBUG oslo_concurrency.processutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.348 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.373 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.376 225859 DEBUG nova.compute.manager [req-b1259cbe-5b70-4c4b-877c-95383d195375 req-406ac6f4-8918-4594-a38b-a333ac629e5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-deleted-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:44 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/510416413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.755 225859 DEBUG oslo_concurrency.processutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.763 225859 DEBUG nova.compute.provider_tree [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.783 225859 DEBUG nova.scheduler.client.report [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.805 225859 DEBUG oslo_concurrency.lockutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.807 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.808 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.808 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.808 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.855 225859 INFO nova.scheduler.client.report [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Deleted allocations for instance 23ea4537-f03f-46de-881f-b979e232a3b9#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.886 225859 DEBUG nova.compute.manager [req-ce3ee162-8fa6-459d-92a4-9b5dc933c7b9 req-66295c60-cafc-4918-9ddd-b2cae0cf971a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.886 225859 DEBUG oslo_concurrency.lockutils [req-ce3ee162-8fa6-459d-92a4-9b5dc933c7b9 req-66295c60-cafc-4918-9ddd-b2cae0cf971a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.887 225859 DEBUG oslo_concurrency.lockutils [req-ce3ee162-8fa6-459d-92a4-9b5dc933c7b9 req-66295c60-cafc-4918-9ddd-b2cae0cf971a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.887 225859 DEBUG oslo_concurrency.lockutils [req-ce3ee162-8fa6-459d-92a4-9b5dc933c7b9 req-66295c60-cafc-4918-9ddd-b2cae0cf971a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.887 225859 DEBUG nova.compute.manager [req-ce3ee162-8fa6-459d-92a4-9b5dc933c7b9 req-66295c60-cafc-4918-9ddd-b2cae0cf971a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] No waiting events found dispatching network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.887 225859 WARNING nova.compute.manager [req-ce3ee162-8fa6-459d-92a4-9b5dc933c7b9 req-66295c60-cafc-4918-9ddd-b2cae0cf971a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Received unexpected event network-vif-plugged-234381ea-07b1-41fe-b3c1-be97ce6a3b64 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:54:44 np0005588919 nova_compute[225855]: 2026-01-20 14:54:44.925 225859 DEBUG oslo_concurrency.lockutils [None req-b4971f7c-3726-4f85-af64-620f9ac7eebf d85d286ce6224326a0f4a15a06afbfea 0a29915e0dd2403fbd7b7e847696b00a - - default default] Lock "23ea4537-f03f-46de-881f-b979e232a3b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:45 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2105245927' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:45 np0005588919 nova_compute[225855]: 2026-01-20 14:54:45.225 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:45.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:45 np0005588919 nova_compute[225855]: 2026-01-20 14:54:45.388 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:54:45 np0005588919 nova_compute[225855]: 2026-01-20 14:54:45.390 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4334MB free_disk=20.78514862060547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:54:45 np0005588919 nova_compute[225855]: 2026-01-20 14:54:45.391 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:45 np0005588919 nova_compute[225855]: 2026-01-20 14:54:45.391 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:45 np0005588919 nova_compute[225855]: 2026-01-20 14:54:45.460 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:54:45 np0005588919 nova_compute[225855]: 2026-01-20 14:54:45.461 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:54:45 np0005588919 nova_compute[225855]: 2026-01-20 14:54:45.475 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:45 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/802906552' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:45 np0005588919 nova_compute[225855]: 2026-01-20 14:54:45.906 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:45 np0005588919 nova_compute[225855]: 2026-01-20 14:54:45.912 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:54:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:54:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:45.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:54:46 np0005588919 nova_compute[225855]: 2026-01-20 14:54:46.037 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:46 np0005588919 nova_compute[225855]: 2026-01-20 14:54:46.164 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:54:46 np0005588919 nova_compute[225855]: 2026-01-20 14:54:46.309 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:54:46 np0005588919 nova_compute[225855]: 2026-01-20 14:54:46.310 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.919s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:47 np0005588919 nova_compute[225855]: 2026-01-20 14:54:47.015 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:47.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:47 np0005588919 nova_compute[225855]: 2026-01-20 14:54:47.382 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:47 np0005588919 nova_compute[225855]: 2026-01-20 14:54:47.613 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e299 e299: 3 total, 3 up, 3 in
Jan 20 09:54:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:47.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.229 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.229 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.244 225859 DEBUG nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.353 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.353 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.360 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.361 225859 INFO nova.compute.claims [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.482 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.659 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920873.6579804, baada610-f563-4c97-89a9-56eba792c352 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.660 225859 INFO nova.compute.manager [-] [instance: baada610-f563-4c97-89a9-56eba792c352] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:54:48 np0005588919 nova_compute[225855]: 2026-01-20 14:54:48.684 225859 DEBUG nova.compute.manager [None req-4781a04f-4c5e-4083-8f9b-d9b314b836d4 - - - - - -] [instance: baada610-f563-4c97-89a9-56eba792c352] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:54:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4106358672' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.001 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.009 225859 DEBUG nova.compute.provider_tree [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.028 225859 DEBUG nova.scheduler.client.report [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.050 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.051 225859 DEBUG nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.102 225859 DEBUG nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.103 225859 DEBUG nova.network.neutron [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.145 225859 INFO nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.173 225859 DEBUG nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:54:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:49.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.313 225859 DEBUG nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.315 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.315 225859 INFO nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Creating image(s)#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.346 225859 DEBUG nova.storage.rbd_utils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.376 225859 DEBUG nova.storage.rbd_utils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.405 225859 DEBUG nova.storage.rbd_utils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.409 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.495 225859 DEBUG nova.policy [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34eb73f628994c11801d447148d5f142', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1e83af992c94112a965575784639d77', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.499 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.499 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.500 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.500 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.527 225859 DEBUG nova.storage.rbd_utils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.531 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e300 e300: 3 total, 3 up, 3 in
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.816 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.286s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:49 np0005588919 nova_compute[225855]: 2026-01-20 14:54:49.922 225859 DEBUG nova.storage.rbd_utils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] resizing rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:54:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:49.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:50 np0005588919 nova_compute[225855]: 2026-01-20 14:54:50.067 225859 DEBUG nova.objects.instance [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'migration_context' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:50 np0005588919 nova_compute[225855]: 2026-01-20 14:54:50.082 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:54:50 np0005588919 nova_compute[225855]: 2026-01-20 14:54:50.083 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Ensure instance console log exists: /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:54:50 np0005588919 nova_compute[225855]: 2026-01-20 14:54:50.083 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:50 np0005588919 nova_compute[225855]: 2026-01-20 14:54:50.084 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:50 np0005588919 nova_compute[225855]: 2026-01-20 14:54:50.084 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:50 np0005588919 nova_compute[225855]: 2026-01-20 14:54:50.303 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:50 np0005588919 nova_compute[225855]: 2026-01-20 14:54:50.303 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:50 np0005588919 nova_compute[225855]: 2026-01-20 14:54:50.497 225859 DEBUG nova.network.neutron [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Successfully created port: e084df8c-a73e-4535-bcf7-de8adbafa9ae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:54:51 np0005588919 nova_compute[225855]: 2026-01-20 14:54:51.038 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:51.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:51 np0005588919 nova_compute[225855]: 2026-01-20 14:54:51.474 225859 DEBUG nova.network.neutron [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Successfully updated port: e084df8c-a73e-4535-bcf7-de8adbafa9ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:54:51 np0005588919 nova_compute[225855]: 2026-01-20 14:54:51.487 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:54:51 np0005588919 nova_compute[225855]: 2026-01-20 14:54:51.488 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquired lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:54:51 np0005588919 nova_compute[225855]: 2026-01-20 14:54:51.488 225859 DEBUG nova.network.neutron [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:54:51 np0005588919 nova_compute[225855]: 2026-01-20 14:54:51.610 225859 DEBUG nova.compute.manager [req-a0ae1716-053e-47ca-b495-eb9b88b1145b req-d0ad84e5-2db1-416b-804b-8cb2900ef5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-changed-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:51 np0005588919 nova_compute[225855]: 2026-01-20 14:54:51.611 225859 DEBUG nova.compute.manager [req-a0ae1716-053e-47ca-b495-eb9b88b1145b req-d0ad84e5-2db1-416b-804b-8cb2900ef5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Refreshing instance network info cache due to event network-changed-e084df8c-a73e-4535-bcf7-de8adbafa9ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:54:51 np0005588919 nova_compute[225855]: 2026-01-20 14:54:51.611 225859 DEBUG oslo_concurrency.lockutils [req-a0ae1716-053e-47ca-b495-eb9b88b1145b req-d0ad84e5-2db1-416b-804b-8cb2900ef5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:54:51 np0005588919 nova_compute[225855]: 2026-01-20 14:54:51.799 225859 DEBUG nova.network.neutron [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:54:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:51.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.017 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.697 225859 DEBUG nova.network.neutron [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updating instance_info_cache with network_info: [{"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.749 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Releasing lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.750 225859 DEBUG nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance network_info: |[{"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.750 225859 DEBUG oslo_concurrency.lockutils [req-a0ae1716-053e-47ca-b495-eb9b88b1145b req-d0ad84e5-2db1-416b-804b-8cb2900ef5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.750 225859 DEBUG nova.network.neutron [req-a0ae1716-053e-47ca-b495-eb9b88b1145b req-d0ad84e5-2db1-416b-804b-8cb2900ef5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Refreshing network info cache for port e084df8c-a73e-4535-bcf7-de8adbafa9ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.753 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Start _get_guest_xml network_info=[{"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.757 225859 WARNING nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.760 225859 DEBUG nova.virt.libvirt.host [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.761 225859 DEBUG nova.virt.libvirt.host [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.764 225859 DEBUG nova.virt.libvirt.host [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.764 225859 DEBUG nova.virt.libvirt.host [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.765 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.765 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.766 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.766 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.766 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.766 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.767 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.767 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.767 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.767 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.768 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.768 225859 DEBUG nova.virt.hardware [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:54:52 np0005588919 nova_compute[225855]: 2026-01-20 14:54:52.770 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e301 e301: 3 total, 3 up, 3 in
Jan 20 09:54:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/805911879' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.209 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.241 225859 DEBUG nova.storage.rbd_utils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.245 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:53.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:54:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/951443556' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.658 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.660 225859 DEBUG nova.virt.libvirt.vif [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:54:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-913712707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-913712707',id=124,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEiICz3SRHAbg35RWQTkKKcPH7nuzl556rWhnnJMKWCQeHRtCTrp3rh0Cew3QLmsFdOqe88XbxeaMKtgT6L6nfvjZZnoyEjqVogiPNh8/V6NYBD5v71aQZWpX0o+tqsUvg==',key_name='tempest-keypair-1388890452',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-n20fo39g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:54:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=3bec73f6-5255-44c0-8a10-a64c7e86c0c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.661 225859 DEBUG nova.network.os_vif_util [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.662 225859 DEBUG nova.network.os_vif_util [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.663 225859 DEBUG nova.objects.instance [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.685 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <uuid>3bec73f6-5255-44c0-8a10-a64c7e86c0c2</uuid>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <name>instance-0000007c</name>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-913712707</nova:name>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:54:52</nova:creationTime>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <nova:user uuid="34eb73f628994c11801d447148d5f142">tempest-AttachVolumeShelveTestJSON-896995479-project-member</nova:user>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <nova:project uuid="b1e83af992c94112a965575784639d77">tempest-AttachVolumeShelveTestJSON-896995479</nova:project>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <nova:port uuid="e084df8c-a73e-4535-bcf7-de8adbafa9ae">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <entry name="serial">3bec73f6-5255-44c0-8a10-a64c7e86c0c2</entry>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <entry name="uuid">3bec73f6-5255-44c0-8a10-a64c7e86c0c2</entry>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:89:8e:0f"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <target dev="tape084df8c-a7"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/console.log" append="off"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:54:53 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:54:53 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:54:53 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:54:53 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.686 225859 DEBUG nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Preparing to wait for external event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.687 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.687 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.687 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.688 225859 DEBUG nova.virt.libvirt.vif [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:54:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-913712707',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-913712707',id=124,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEiICz3SRHAbg35RWQTkKKcPH7nuzl556rWhnnJMKWCQeHRtCTrp3rh0Cew3QLmsFdOqe88XbxeaMKtgT6L6nfvjZZnoyEjqVogiPNh8/V6NYBD5v71aQZWpX0o+tqsUvg==',key_name='tempest-keypair-1388890452',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-n20fo39g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:54:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=3bec73f6-5255-44c0-8a10-a64c7e86c0c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.688 225859 DEBUG nova.network.os_vif_util [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.689 225859 DEBUG nova.network.os_vif_util [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.689 225859 DEBUG os_vif [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.690 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.690 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.691 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.693 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.693 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape084df8c-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.694 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape084df8c-a7, col_values=(('external_ids', {'iface-id': 'e084df8c-a73e-4535-bcf7-de8adbafa9ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:8e:0f', 'vm-uuid': '3bec73f6-5255-44c0-8a10-a64c7e86c0c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.695 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:53 np0005588919 NetworkManager[49104]: <info>  [1768920893.6963] manager: (tape084df8c-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.697 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.701 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.702 225859 INFO os_vif [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7')#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.743 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.743 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.743 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No VIF found with MAC fa:16:3e:89:8e:0f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.744 225859 INFO nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Using config drive#033[00m
Jan 20 09:54:53 np0005588919 nova_compute[225855]: 2026-01-20 14:54:53.768 225859 DEBUG nova.storage.rbd_utils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:53.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:54 np0005588919 podman[274940]: 2026-01-20 14:54:54.018555752 +0000 UTC m=+0.067291551 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.213 225859 INFO nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Creating config drive at /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.218 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsek0poxk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.242 225859 DEBUG nova.network.neutron [req-a0ae1716-053e-47ca-b495-eb9b88b1145b req-d0ad84e5-2db1-416b-804b-8cb2900ef5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updated VIF entry in instance network info cache for port e084df8c-a73e-4535-bcf7-de8adbafa9ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.243 225859 DEBUG nova.network.neutron [req-a0ae1716-053e-47ca-b495-eb9b88b1145b req-d0ad84e5-2db1-416b-804b-8cb2900ef5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updating instance_info_cache with network_info: [{"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.274 225859 DEBUG oslo_concurrency.lockutils [req-a0ae1716-053e-47ca-b495-eb9b88b1145b req-d0ad84e5-2db1-416b-804b-8cb2900ef5af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:54:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:55.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.349 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsek0poxk" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.382 225859 DEBUG nova.storage.rbd_utils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.385 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.533 225859 DEBUG oslo_concurrency.processutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.534 225859 INFO nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Deleting local config drive /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config because it was imported into RBD.#033[00m
Jan 20 09:54:55 np0005588919 virtqemud[225396]: End of file while reading data: Input/output error
Jan 20 09:54:55 np0005588919 kernel: tape084df8c-a7: entered promiscuous mode
Jan 20 09:54:55 np0005588919 NetworkManager[49104]: <info>  [1768920895.5853] manager: (tape084df8c-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/210)
Jan 20 09:54:55 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:55Z|00499|binding|INFO|Claiming lport e084df8c-a73e-4535-bcf7-de8adbafa9ae for this chassis.
Jan 20 09:54:55 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:55Z|00500|binding|INFO|e084df8c-a73e-4535-bcf7-de8adbafa9ae: Claiming fa:16:3e:89:8e:0f 10.100.0.14
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.586 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.596 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:8e:0f 10.100.0.14'], port_security=['fa:16:3e:89:8e:0f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3bec73f6-5255-44c0-8a10-a64c7e86c0c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1e83af992c94112a965575784639d77', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e88a960e-8540-4c69-934d-b6e1b91beb98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1dc6df7d-3e57-4779-8232-af1ccf413403, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=e084df8c-a73e-4535-bcf7-de8adbafa9ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.597 140354 INFO neutron.agent.ovn.metadata.agent [-] Port e084df8c-a73e-4535-bcf7-de8adbafa9ae in datapath e9589011-b728-4b79-9945-aa6c52dd0fc2 bound to our chassis#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.598 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e9589011-b728-4b79-9945-aa6c52dd0fc2#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.608 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9afb37f3-e8e8-4297-afbb-a26cb238fd60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.609 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape9589011-b1 in ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.611 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape9589011-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.611 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[37f023f9-b507-4187-8d19-b595205483e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.612 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c0a4a3d9-673c-46de-a494-4451daa149b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 systemd-udevd[275016]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:54:55 np0005588919 systemd-machined[194361]: New machine qemu-59-instance-0000007c.
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.625 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[65de640c-280e-473e-9bb2-b2f346a7877f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 NetworkManager[49104]: <info>  [1768920895.6356] device (tape084df8c-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:54:55 np0005588919 NetworkManager[49104]: <info>  [1768920895.6368] device (tape084df8c-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.649 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf1464b-79dc-4385-a33d-3504357c40e9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 systemd[1]: Started Virtual Machine qemu-59-instance-0000007c.
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.664 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.671 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:55 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:55Z|00501|binding|INFO|Setting lport e084df8c-a73e-4535-bcf7-de8adbafa9ae ovn-installed in OVS
Jan 20 09:54:55 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:55Z|00502|binding|INFO|Setting lport e084df8c-a73e-4535-bcf7-de8adbafa9ae up in Southbound
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.677 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.678 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[016674b5-8647-4c0f-8a60-c1b82fa6017c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.682 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8dcdd49a-c257-4b59-be90-a7b61f7567ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 systemd-udevd[275019]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:54:55 np0005588919 NetworkManager[49104]: <info>  [1768920895.6839] manager: (tape9589011-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/211)
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.710 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[20523d66-fd07-42ae-958c-924bc782af68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.713 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e06ee447-1b1b-4da3-9a1d-920365902c83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 NetworkManager[49104]: <info>  [1768920895.7319] device (tape9589011-b0): carrier: link connected
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.738 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e7683c38-385a-4e60-a3a8-6a893a6ea556]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.754 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1282c7db-9524-4d83-8c68-19f0f1cb537e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9589011-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:5a:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593465, 'reachable_time': 18563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275047, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.766 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[069d19b3-a3b1-48ed-9827-6fde3e6af81d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:5a14'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593465, 'tstamp': 593465}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275048, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.785 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[290b2ff3-aed3-440c-b0bd-344beb009f91]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9589011-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:5a:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593465, 'reachable_time': 18563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 275049, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.808 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2ad23ec9-5792-486b-a7d4-594fc2fdb527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.875 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3961ac24-8cf7-453f-8e7d-dae7b6c6a59c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.876 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9589011-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.877 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.877 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9589011-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.879 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:55 np0005588919 NetworkManager[49104]: <info>  [1768920895.8796] manager: (tape9589011-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Jan 20 09:54:55 np0005588919 kernel: tape9589011-b0: entered promiscuous mode
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.881 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.882 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape9589011-b0, col_values=(('external_ids', {'iface-id': '9ca9d06a-9365-4769-a2c4-7322625683ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.883 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:55 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:55Z|00503|binding|INFO|Releasing lport 9ca9d06a-9365-4769-a2c4-7322625683ac from this chassis (sb_readonly=0)
Jan 20 09:54:55 np0005588919 nova_compute[225855]: 2026-01-20 14:54:55.898 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.899 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.900 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eca5d3e5-dd44-4821-ad41-4456380c285b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.901 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-e9589011-b728-4b79-9945-aa6c52dd0fc2
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID e9589011-b728-4b79-9945-aa6c52dd0fc2
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:54:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:54:55.902 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'env', 'PROCESS_TAG=haproxy-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e9589011-b728-4b79-9945-aa6c52dd0fc2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:54:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:55.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.019 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920896.0183027, 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.019 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] VM Started (Lifecycle Event)#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.040 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.043 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.047 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920896.019713, 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.047 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.078 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.081 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.104 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:54:56 np0005588919 podman[275123]: 2026-01-20 14:54:56.357982826 +0000 UTC m=+0.090894627 container create 35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:54:56 np0005588919 podman[275123]: 2026-01-20 14:54:56.299709881 +0000 UTC m=+0.032621702 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:54:56 np0005588919 systemd[1]: Started libpod-conmon-35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6.scope.
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.454 225859 DEBUG nova.compute.manager [req-1173ce9e-d191-4d3a-8ae8-65115d4a950d req-f838c079-2d26-45ad-bafd-2b051183597f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.455 225859 DEBUG oslo_concurrency.lockutils [req-1173ce9e-d191-4d3a-8ae8-65115d4a950d req-f838c079-2d26-45ad-bafd-2b051183597f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.455 225859 DEBUG oslo_concurrency.lockutils [req-1173ce9e-d191-4d3a-8ae8-65115d4a950d req-f838c079-2d26-45ad-bafd-2b051183597f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.455 225859 DEBUG oslo_concurrency.lockutils [req-1173ce9e-d191-4d3a-8ae8-65115d4a950d req-f838c079-2d26-45ad-bafd-2b051183597f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.455 225859 DEBUG nova.compute.manager [req-1173ce9e-d191-4d3a-8ae8-65115d4a950d req-f838c079-2d26-45ad-bafd-2b051183597f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Processing event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.456 225859 DEBUG nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:54:56 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.459 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920896.4596813, 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.460 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.462 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:54:56 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bba5b462764fe5156f5cb57f1fc6d7aae20c07ad8940e8ed4d30f072b3f5c46d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.473 225859 INFO nova.virt.libvirt.driver [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance spawned successfully.#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.473 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.478 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.481 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.506 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.507 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.508 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.508 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.508 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.509 225859 DEBUG nova.virt.libvirt.driver [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:54:56 np0005588919 podman[275123]: 2026-01-20 14:54:56.517941572 +0000 UTC m=+0.250853403 container init 35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:54:56 np0005588919 podman[275123]: 2026-01-20 14:54:56.524121706 +0000 UTC m=+0.257033517 container start 35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:54:56 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[275139]: [NOTICE]   (275143) : New worker (275145) forked
Jan 20 09:54:56 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[275139]: [NOTICE]   (275143) : Loading success.
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.567 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.625 225859 INFO nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Took 7.31 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.625 225859 DEBUG nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.718 225859 INFO nova.compute.manager [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Took 8.38 seconds to build instance.#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.736 225859 DEBUG oslo_concurrency.lockutils [None req-0ec32ccc-7ae6-48ef-bfe0-8dad6c8c8aef 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.506s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.988 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920881.9873397, 23ea4537-f03f-46de-881f-b979e232a3b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:54:56 np0005588919 nova_compute[225855]: 2026-01-20 14:54:56.989 225859 INFO nova.compute.manager [-] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:54:57 np0005588919 nova_compute[225855]: 2026-01-20 14:54:57.013 225859 DEBUG nova.compute.manager [None req-b69ccc37-485c-4beb-9f26-21ec25ce8965 - - - - - -] [instance: 23ea4537-f03f-46de-881f-b979e232a3b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:54:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:57.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e301 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:54:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:57.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:58 np0005588919 nova_compute[225855]: 2026-01-20 14:54:58.698 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:59 np0005588919 nova_compute[225855]: 2026-01-20 14:54:59.150 225859 DEBUG nova.compute.manager [req-3a3c36b3-a35e-4e47-8aea-d75a8b644eff req-dc2f0173-9909-4d54-9a08-d5fc88f7c93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:54:59 np0005588919 nova_compute[225855]: 2026-01-20 14:54:59.150 225859 DEBUG oslo_concurrency.lockutils [req-3a3c36b3-a35e-4e47-8aea-d75a8b644eff req-dc2f0173-9909-4d54-9a08-d5fc88f7c93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:54:59 np0005588919 nova_compute[225855]: 2026-01-20 14:54:59.150 225859 DEBUG oslo_concurrency.lockutils [req-3a3c36b3-a35e-4e47-8aea-d75a8b644eff req-dc2f0173-9909-4d54-9a08-d5fc88f7c93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:54:59 np0005588919 nova_compute[225855]: 2026-01-20 14:54:59.150 225859 DEBUG oslo_concurrency.lockutils [req-3a3c36b3-a35e-4e47-8aea-d75a8b644eff req-dc2f0173-9909-4d54-9a08-d5fc88f7c93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:54:59 np0005588919 nova_compute[225855]: 2026-01-20 14:54:59.151 225859 DEBUG nova.compute.manager [req-3a3c36b3-a35e-4e47-8aea-d75a8b644eff req-dc2f0173-9909-4d54-9a08-d5fc88f7c93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] No waiting events found dispatching network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:54:59 np0005588919 nova_compute[225855]: 2026-01-20 14:54:59.151 225859 WARNING nova.compute.manager [req-3a3c36b3-a35e-4e47-8aea-d75a8b644eff req-dc2f0173-9909-4d54-9a08-d5fc88f7c93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received unexpected event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae for instance with vm_state active and task_state None.#033[00m
Jan 20 09:54:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:54:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:54:59.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:54:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e302 e302: 3 total, 3 up, 3 in
Jan 20 09:54:59 np0005588919 nova_compute[225855]: 2026-01-20 14:54:59.811 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:59 np0005588919 NetworkManager[49104]: <info>  [1768920899.8138] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/213)
Jan 20 09:54:59 np0005588919 NetworkManager[49104]: <info>  [1768920899.8148] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Jan 20 09:54:59 np0005588919 nova_compute[225855]: 2026-01-20 14:54:59.933 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:54:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:54:59Z|00504|binding|INFO|Releasing lport 9ca9d06a-9365-4769-a2c4-7322625683ac from this chassis (sb_readonly=0)
Jan 20 09:54:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:54:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:54:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:54:59.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:54:59 np0005588919 nova_compute[225855]: 2026-01-20 14:54:59.946 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:01 np0005588919 nova_compute[225855]: 2026-01-20 14:55:01.042 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:01.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:01 np0005588919 nova_compute[225855]: 2026-01-20 14:55:01.597 225859 DEBUG nova.compute.manager [req-8903823b-89fc-407d-8919-cfe9a67c1d38 req-15fc6e41-9dc4-45e1-92ae-799cb94a969f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-changed-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:01 np0005588919 nova_compute[225855]: 2026-01-20 14:55:01.598 225859 DEBUG nova.compute.manager [req-8903823b-89fc-407d-8919-cfe9a67c1d38 req-15fc6e41-9dc4-45e1-92ae-799cb94a969f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Refreshing instance network info cache due to event network-changed-e084df8c-a73e-4535-bcf7-de8adbafa9ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:55:01 np0005588919 nova_compute[225855]: 2026-01-20 14:55:01.598 225859 DEBUG oslo_concurrency.lockutils [req-8903823b-89fc-407d-8919-cfe9a67c1d38 req-15fc6e41-9dc4-45e1-92ae-799cb94a969f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:55:01 np0005588919 nova_compute[225855]: 2026-01-20 14:55:01.598 225859 DEBUG oslo_concurrency.lockutils [req-8903823b-89fc-407d-8919-cfe9a67c1d38 req-15fc6e41-9dc4-45e1-92ae-799cb94a969f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:55:01 np0005588919 nova_compute[225855]: 2026-01-20 14:55:01.598 225859 DEBUG nova.network.neutron [req-8903823b-89fc-407d-8919-cfe9a67c1d38 req-15fc6e41-9dc4-45e1-92ae-799cb94a969f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Refreshing network info cache for port e084df8c-a73e-4535-bcf7-de8adbafa9ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:55:01 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:01Z|00505|binding|INFO|Releasing lport 9ca9d06a-9365-4769-a2c4-7322625683ac from this chassis (sb_readonly=0)
Jan 20 09:55:01 np0005588919 nova_compute[225855]: 2026-01-20 14:55:01.788 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:01.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:03.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:03 np0005588919 nova_compute[225855]: 2026-01-20 14:55:03.701 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:55:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:03.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:55:05 np0005588919 nova_compute[225855]: 2026-01-20 14:55:05.175 225859 DEBUG nova.network.neutron [req-8903823b-89fc-407d-8919-cfe9a67c1d38 req-15fc6e41-9dc4-45e1-92ae-799cb94a969f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updated VIF entry in instance network info cache for port e084df8c-a73e-4535-bcf7-de8adbafa9ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:55:05 np0005588919 nova_compute[225855]: 2026-01-20 14:55:05.176 225859 DEBUG nova.network.neutron [req-8903823b-89fc-407d-8919-cfe9a67c1d38 req-15fc6e41-9dc4-45e1-92ae-799cb94a969f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updating instance_info_cache with network_info: [{"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:05 np0005588919 nova_compute[225855]: 2026-01-20 14:55:05.229 225859 DEBUG oslo_concurrency.lockutils [req-8903823b-89fc-407d-8919-cfe9a67c1d38 req-15fc6e41-9dc4-45e1-92ae-799cb94a969f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:55:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:05.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:05.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:06 np0005588919 nova_compute[225855]: 2026-01-20 14:55:06.044 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:07.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:07.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:08 np0005588919 nova_compute[225855]: 2026-01-20 14:55:08.704 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:09.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:09.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:10Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:89:8e:0f 10.100.0.14
Jan 20 09:55:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:10Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:89:8e:0f 10.100.0.14
Jan 20 09:55:11 np0005588919 nova_compute[225855]: 2026-01-20 14:55:11.046 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:11.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:11.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:12 np0005588919 podman[275188]: 2026-01-20 14:55:12.377886796 +0000 UTC m=+0.101162107 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:55:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:13.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:13 np0005588919 nova_compute[225855]: 2026-01-20 14:55:13.707 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:13.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:15.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:15 np0005588919 nova_compute[225855]: 2026-01-20 14:55:15.457 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:15.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:16 np0005588919 nova_compute[225855]: 2026-01-20 14:55:16.048 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:16.414 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:16.415 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:16.416 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:17.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:17.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:55:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:55:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:55:18 np0005588919 nova_compute[225855]: 2026-01-20 14:55:18.710 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:18 np0005588919 nova_compute[225855]: 2026-01-20 14:55:18.712 225859 DEBUG oslo_concurrency.lockutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:18 np0005588919 nova_compute[225855]: 2026-01-20 14:55:18.712 225859 DEBUG oslo_concurrency.lockutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:18 np0005588919 nova_compute[225855]: 2026-01-20 14:55:18.713 225859 INFO nova.compute.manager [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Shelving#033[00m
Jan 20 09:55:18 np0005588919 nova_compute[225855]: 2026-01-20 14:55:18.738 225859 DEBUG nova.virt.libvirt.driver [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:55:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:19.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:19.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:20 np0005588919 nova_compute[225855]: 2026-01-20 14:55:20.232 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:21 np0005588919 nova_compute[225855]: 2026-01-20 14:55:21.050 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:21 np0005588919 nova_compute[225855]: 2026-01-20 14:55:21.112 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:21.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:21 np0005588919 kernel: tape084df8c-a7 (unregistering): left promiscuous mode
Jan 20 09:55:21 np0005588919 NetworkManager[49104]: <info>  [1768920921.7351] device (tape084df8c-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:55:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:21Z|00506|binding|INFO|Releasing lport e084df8c-a73e-4535-bcf7-de8adbafa9ae from this chassis (sb_readonly=0)
Jan 20 09:55:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:21Z|00507|binding|INFO|Setting lport e084df8c-a73e-4535-bcf7-de8adbafa9ae down in Southbound
Jan 20 09:55:21 np0005588919 nova_compute[225855]: 2026-01-20 14:55:21.745 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:21 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:21Z|00508|binding|INFO|Removing iface tape084df8c-a7 ovn-installed in OVS
Jan 20 09:55:21 np0005588919 nova_compute[225855]: 2026-01-20 14:55:21.747 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:21 np0005588919 nova_compute[225855]: 2026-01-20 14:55:21.752 225859 INFO nova.virt.libvirt.driver [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 09:55:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:21.758 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:8e:0f 10.100.0.14'], port_security=['fa:16:3e:89:8e:0f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3bec73f6-5255-44c0-8a10-a64c7e86c0c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1e83af992c94112a965575784639d77', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e88a960e-8540-4c69-934d-b6e1b91beb98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.233'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1dc6df7d-3e57-4779-8232-af1ccf413403, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=e084df8c-a73e-4535-bcf7-de8adbafa9ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:55:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:21.759 140354 INFO neutron.agent.ovn.metadata.agent [-] Port e084df8c-a73e-4535-bcf7-de8adbafa9ae in datapath e9589011-b728-4b79-9945-aa6c52dd0fc2 unbound from our chassis#033[00m
Jan 20 09:55:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:21.760 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9589011-b728-4b79-9945-aa6c52dd0fc2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:55:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:21.762 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[20ff6c6c-2724-4580-a4d2-372999bfe7a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:21.762 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 namespace which is not needed anymore#033[00m
Jan 20 09:55:21 np0005588919 nova_compute[225855]: 2026-01-20 14:55:21.766 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:21 np0005588919 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Jan 20 09:55:21 np0005588919 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007c.scope: Consumed 14.362s CPU time.
Jan 20 09:55:21 np0005588919 systemd-machined[194361]: Machine qemu-59-instance-0000007c terminated.
Jan 20 09:55:21 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[275139]: [NOTICE]   (275143) : haproxy version is 2.8.14-c23fe91
Jan 20 09:55:21 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[275139]: [NOTICE]   (275143) : path to executable is /usr/sbin/haproxy
Jan 20 09:55:21 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[275139]: [WARNING]  (275143) : Exiting Master process...
Jan 20 09:55:21 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[275139]: [ALERT]    (275143) : Current worker (275145) exited with code 143 (Terminated)
Jan 20 09:55:21 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[275139]: [WARNING]  (275143) : All workers exited. Exiting... (0)
Jan 20 09:55:21 np0005588919 systemd[1]: libpod-35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6.scope: Deactivated successfully.
Jan 20 09:55:21 np0005588919 podman[275401]: 2026-01-20 14:55:21.902421473 +0000 UTC m=+0.056095885 container died 35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:55:21 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6-userdata-shm.mount: Deactivated successfully.
Jan 20 09:55:21 np0005588919 systemd[1]: var-lib-containers-storage-overlay-bba5b462764fe5156f5cb57f1fc6d7aae20c07ad8940e8ed4d30f072b3f5c46d-merged.mount: Deactivated successfully.
Jan 20 09:55:21 np0005588919 podman[275401]: 2026-01-20 14:55:21.942579586 +0000 UTC m=+0.096253998 container cleanup 35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 09:55:21 np0005588919 systemd[1]: libpod-conmon-35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6.scope: Deactivated successfully.
Jan 20 09:55:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:21.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:21 np0005588919 nova_compute[225855]: 2026-01-20 14:55:21.990 225859 INFO nova.virt.libvirt.driver [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance destroyed successfully.#033[00m
Jan 20 09:55:21 np0005588919 nova_compute[225855]: 2026-01-20 14:55:21.990 225859 DEBUG nova.objects.instance [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'numa_topology' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:22 np0005588919 podman[275429]: 2026-01-20 14:55:22.010488283 +0000 UTC m=+0.045592188 container remove 35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 09:55:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:22.016 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b18c7b36-9622-4ac2-8df2-783acdae388b]: (4, ('Tue Jan 20 02:55:21 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 (35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6)\n35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6\nTue Jan 20 02:55:21 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 (35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6)\n35c1b3425f1e0506111fa57ff509641c7dc2b69df752675b10a514f5d11a05d6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:22.018 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[41cb4918-0383-4728-959a-4832eda36722]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:22.019 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9589011-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.021 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:22 np0005588919 kernel: tape9589011-b0: left promiscuous mode
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.079 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:22.082 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[489d2aa3-0937-42df-ad9e-1d0e76eac0ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:22.100 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d400a3-818f-4a56-9a99-6f2e5e0e0846]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:22.102 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[53e5d719-d2f3-4096-ad03-5dc3bf03ab80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:22.118 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f74c4cd9-4167-40f7-a9d8-1554c7fed0f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593460, 'reachable_time': 16588, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275458, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:22.120 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:55:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:22.121 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[98c1bcec-525b-4cf6-b4df-9152004c68bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:22 np0005588919 systemd[1]: run-netns-ovnmeta\x2de9589011\x2db728\x2d4b79\x2d9945\x2daa6c52dd0fc2.mount: Deactivated successfully.
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.348 225859 INFO nova.virt.libvirt.driver [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Beginning cold snapshot process#033[00m
Jan 20 09:55:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e302 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.511 225859 DEBUG nova.virt.libvirt.imagebackend [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.574 225859 DEBUG nova.compute.manager [req-2347be6e-aacf-4810-a315-784977e43de6 req-f24e8006-a367-4c71-b5ec-1fe829586403 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-vif-unplugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.574 225859 DEBUG oslo_concurrency.lockutils [req-2347be6e-aacf-4810-a315-784977e43de6 req-f24e8006-a367-4c71-b5ec-1fe829586403 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.574 225859 DEBUG oslo_concurrency.lockutils [req-2347be6e-aacf-4810-a315-784977e43de6 req-f24e8006-a367-4c71-b5ec-1fe829586403 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.575 225859 DEBUG oslo_concurrency.lockutils [req-2347be6e-aacf-4810-a315-784977e43de6 req-f24e8006-a367-4c71-b5ec-1fe829586403 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.575 225859 DEBUG nova.compute.manager [req-2347be6e-aacf-4810-a315-784977e43de6 req-f24e8006-a367-4c71-b5ec-1fe829586403 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] No waiting events found dispatching network-vif-unplugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.575 225859 WARNING nova.compute.manager [req-2347be6e-aacf-4810-a315-784977e43de6 req-f24e8006-a367-4c71-b5ec-1fe829586403 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received unexpected event network-vif-unplugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 20 09:55:22 np0005588919 nova_compute[225855]: 2026-01-20 14:55:22.761 225859 DEBUG nova.storage.rbd_utils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] creating snapshot(20d7c18e7d794b51839e3145b4ba1f66) on rbd image(3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:55:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e303 e303: 3 total, 3 up, 3 in
Jan 20 09:55:23 np0005588919 nova_compute[225855]: 2026-01-20 14:55:23.186 225859 DEBUG nova.storage.rbd_utils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] cloning vms/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk@20d7c18e7d794b51839e3145b4ba1f66 to images/0f1d91a7-05af-4ed6-87af-1e03976e25f0 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:55:23 np0005588919 nova_compute[225855]: 2026-01-20 14:55:23.312 225859 DEBUG nova.storage.rbd_utils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] flattening images/0f1d91a7-05af-4ed6-87af-1e03976e25f0 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:55:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:23.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:23 np0005588919 nova_compute[225855]: 2026-01-20 14:55:23.704 225859 DEBUG nova.storage.rbd_utils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] removing snapshot(20d7c18e7d794b51839e3145b4ba1f66) on rbd image(3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 09:55:23 np0005588919 nova_compute[225855]: 2026-01-20 14:55:23.712 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:23.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e304 e304: 3 total, 3 up, 3 in
Jan 20 09:55:24 np0005588919 nova_compute[225855]: 2026-01-20 14:55:24.149 225859 DEBUG nova.storage.rbd_utils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] creating snapshot(snap) on rbd image(0f1d91a7-05af-4ed6-87af-1e03976e25f0) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 09:55:24 np0005588919 podman[275625]: 2026-01-20 14:55:24.500730595 +0000 UTC m=+0.077738616 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 20 09:55:24 np0005588919 nova_compute[225855]: 2026-01-20 14:55:24.751 225859 DEBUG nova.compute.manager [req-ddd81a93-6c68-4a1b-9fe0-c08001dad499 req-c97b90d9-daed-4102-8e4d-4623867a03ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:24 np0005588919 nova_compute[225855]: 2026-01-20 14:55:24.751 225859 DEBUG oslo_concurrency.lockutils [req-ddd81a93-6c68-4a1b-9fe0-c08001dad499 req-c97b90d9-daed-4102-8e4d-4623867a03ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:24 np0005588919 nova_compute[225855]: 2026-01-20 14:55:24.752 225859 DEBUG oslo_concurrency.lockutils [req-ddd81a93-6c68-4a1b-9fe0-c08001dad499 req-c97b90d9-daed-4102-8e4d-4623867a03ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:24 np0005588919 nova_compute[225855]: 2026-01-20 14:55:24.752 225859 DEBUG oslo_concurrency.lockutils [req-ddd81a93-6c68-4a1b-9fe0-c08001dad499 req-c97b90d9-daed-4102-8e4d-4623867a03ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:24 np0005588919 nova_compute[225855]: 2026-01-20 14:55:24.752 225859 DEBUG nova.compute.manager [req-ddd81a93-6c68-4a1b-9fe0-c08001dad499 req-c97b90d9-daed-4102-8e4d-4623867a03ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] No waiting events found dispatching network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:55:24 np0005588919 nova_compute[225855]: 2026-01-20 14:55:24.753 225859 WARNING nova.compute.manager [req-ddd81a93-6c68-4a1b-9fe0-c08001dad499 req-c97b90d9-daed-4102-8e4d-4623867a03ab 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received unexpected event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 20 09:55:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:55:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:55:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:25.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:25 np0005588919 nova_compute[225855]: 2026-01-20 14:55:25.475 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e305 e305: 3 total, 3 up, 3 in
Jan 20 09:55:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:25.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:26 np0005588919 nova_compute[225855]: 2026-01-20 14:55:26.051 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:27.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:27 np0005588919 nova_compute[225855]: 2026-01-20 14:55:27.719 225859 INFO nova.virt.libvirt.driver [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Snapshot image upload complete#033[00m
Jan 20 09:55:27 np0005588919 nova_compute[225855]: 2026-01-20 14:55:27.720 225859 DEBUG nova.compute.manager [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:27 np0005588919 nova_compute[225855]: 2026-01-20 14:55:27.792 225859 INFO nova.compute.manager [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Shelve offloading#033[00m
Jan 20 09:55:27 np0005588919 nova_compute[225855]: 2026-01-20 14:55:27.798 225859 INFO nova.virt.libvirt.driver [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance destroyed successfully.#033[00m
Jan 20 09:55:27 np0005588919 nova_compute[225855]: 2026-01-20 14:55:27.798 225859 DEBUG nova.compute.manager [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:27 np0005588919 nova_compute[225855]: 2026-01-20 14:55:27.800 225859 DEBUG oslo_concurrency.lockutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:55:27 np0005588919 nova_compute[225855]: 2026-01-20 14:55:27.800 225859 DEBUG oslo_concurrency.lockutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquired lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:55:27 np0005588919 nova_compute[225855]: 2026-01-20 14:55:27.800 225859 DEBUG nova.network.neutron [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:55:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:27.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:28 np0005588919 nova_compute[225855]: 2026-01-20 14:55:28.753 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:29.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:29 np0005588919 nova_compute[225855]: 2026-01-20 14:55:29.673 225859 DEBUG nova.network.neutron [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updating instance_info_cache with network_info: [{"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:29 np0005588919 nova_compute[225855]: 2026-01-20 14:55:29.702 225859 DEBUG oslo_concurrency.lockutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Releasing lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:55:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:29.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:31.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.802 225859 INFO nova.virt.libvirt.driver [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance destroyed successfully.#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.803 225859 DEBUG nova.objects.instance [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'resources' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.826 225859 DEBUG nova.virt.libvirt.vif [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:54:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-913712707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-913712707',id=124,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEiICz3SRHAbg35RWQTkKKcPH7nuzl556rWhnnJMKWCQeHRtCTrp3rh0Cew3QLmsFdOqe88XbxeaMKtgT6L6nfvjZZnoyEjqVogiPNh8/V6NYBD5v71aQZWpX0o+tqsUvg==',key_name='tempest-keypair-1388890452',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:54:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-n20fo39g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member',shelved_at='2026-01-20T14:55:27.720433',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='0f1d91a7-05af-4ed6-87af-1e03976e25f0'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:55:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=3bec73f6-5255-44c0-8a10-a64c7e86c0c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": 
"e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.826 225859 DEBUG nova.network.os_vif_util [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.827 225859 DEBUG nova.network.os_vif_util [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.827 225859 DEBUG os_vif [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.829 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.829 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape084df8c-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.830 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.831 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.833 225859 INFO os_vif [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7')#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.911 225859 DEBUG nova.compute.manager [req-35707b01-1d7d-48c6-8f04-277acfc50f18 req-693a5909-cf78-4bcf-98cf-2d919aba1f89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-changed-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.911 225859 DEBUG nova.compute.manager [req-35707b01-1d7d-48c6-8f04-277acfc50f18 req-693a5909-cf78-4bcf-98cf-2d919aba1f89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Refreshing instance network info cache due to event network-changed-e084df8c-a73e-4535-bcf7-de8adbafa9ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.912 225859 DEBUG oslo_concurrency.lockutils [req-35707b01-1d7d-48c6-8f04-277acfc50f18 req-693a5909-cf78-4bcf-98cf-2d919aba1f89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.912 225859 DEBUG oslo_concurrency.lockutils [req-35707b01-1d7d-48c6-8f04-277acfc50f18 req-693a5909-cf78-4bcf-98cf-2d919aba1f89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:55:31 np0005588919 nova_compute[225855]: 2026-01-20 14:55:31.912 225859 DEBUG nova.network.neutron [req-35707b01-1d7d-48c6-8f04-277acfc50f18 req-693a5909-cf78-4bcf-98cf-2d919aba1f89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Refreshing network info cache for port e084df8c-a73e-4535-bcf7-de8adbafa9ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:55:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:31.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:32 np0005588919 nova_compute[225855]: 2026-01-20 14:55:32.320 225859 INFO nova.virt.libvirt.driver [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Deleting instance files /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_del#033[00m
Jan 20 09:55:32 np0005588919 nova_compute[225855]: 2026-01-20 14:55:32.321 225859 INFO nova.virt.libvirt.driver [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Deletion of /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_del complete#033[00m
Jan 20 09:55:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:32 np0005588919 nova_compute[225855]: 2026-01-20 14:55:32.802 225859 INFO nova.scheduler.client.report [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Deleted allocations for instance 3bec73f6-5255-44c0-8a10-a64c7e86c0c2#033[00m
Jan 20 09:55:32 np0005588919 nova_compute[225855]: 2026-01-20 14:55:32.870 225859 DEBUG oslo_concurrency.lockutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:32 np0005588919 nova_compute[225855]: 2026-01-20 14:55:32.871 225859 DEBUG oslo_concurrency.lockutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:32 np0005588919 nova_compute[225855]: 2026-01-20 14:55:32.913 225859 DEBUG oslo_concurrency.processutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:55:33 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3124150087' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:55:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:33.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:33 np0005588919 nova_compute[225855]: 2026-01-20 14:55:33.356 225859 DEBUG oslo_concurrency.processutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:33 np0005588919 nova_compute[225855]: 2026-01-20 14:55:33.362 225859 DEBUG nova.compute.provider_tree [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:55:33 np0005588919 nova_compute[225855]: 2026-01-20 14:55:33.379 225859 DEBUG nova.scheduler.client.report [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:55:33 np0005588919 nova_compute[225855]: 2026-01-20 14:55:33.407 225859 DEBUG oslo_concurrency.lockutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:33 np0005588919 nova_compute[225855]: 2026-01-20 14:55:33.486 225859 DEBUG oslo_concurrency.lockutils [None req-d43ac6da-3dd9-44d7-8c73-72899d49ddd7 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 14.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:33 np0005588919 nova_compute[225855]: 2026-01-20 14:55:33.744 225859 DEBUG nova.network.neutron [req-35707b01-1d7d-48c6-8f04-277acfc50f18 req-693a5909-cf78-4bcf-98cf-2d919aba1f89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updated VIF entry in instance network info cache for port e084df8c-a73e-4535-bcf7-de8adbafa9ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:55:33 np0005588919 nova_compute[225855]: 2026-01-20 14:55:33.745 225859 DEBUG nova.network.neutron [req-35707b01-1d7d-48c6-8f04-277acfc50f18 req-693a5909-cf78-4bcf-98cf-2d919aba1f89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updating instance_info_cache with network_info: [{"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": null, "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tape084df8c-a7", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:33 np0005588919 nova_compute[225855]: 2026-01-20 14:55:33.813 225859 DEBUG oslo_concurrency.lockutils [req-35707b01-1d7d-48c6-8f04-277acfc50f18 req-693a5909-cf78-4bcf-98cf-2d919aba1f89 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:55:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:55:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:33.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:55:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e306 e306: 3 total, 3 up, 3 in
Jan 20 09:55:35 np0005588919 nova_compute[225855]: 2026-01-20 14:55:35.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:35.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:35.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:36.025 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:55:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:36.026 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:55:36 np0005588919 nova_compute[225855]: 2026-01-20 14:55:36.061 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:36 np0005588919 nova_compute[225855]: 2026-01-20 14:55:36.092 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:36 np0005588919 nova_compute[225855]: 2026-01-20 14:55:36.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:36 np0005588919 nova_compute[225855]: 2026-01-20 14:55:36.831 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:36 np0005588919 nova_compute[225855]: 2026-01-20 14:55:36.987 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920921.9858956, 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:36 np0005588919 nova_compute[225855]: 2026-01-20 14:55:36.988 225859 INFO nova.compute.manager [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:55:37 np0005588919 nova_compute[225855]: 2026-01-20 14:55:37.068 225859 DEBUG nova.compute.manager [None req-6ae1fa24-5580-4fb6-bdb3-45fc2a59bc57 - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:37 np0005588919 nova_compute[225855]: 2026-01-20 14:55:37.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:37 np0005588919 nova_compute[225855]: 2026-01-20 14:55:37.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:55:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:37.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:37.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.428 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.439 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.440 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.440 225859 INFO nova.compute.manager [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Unshelving#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.634 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.634 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.638 225859 DEBUG nova.objects.instance [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'pci_requests' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.652 225859 DEBUG nova.objects.instance [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'numa_topology' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.662 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.662 225859 INFO nova.compute.claims [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:55:38 np0005588919 nova_compute[225855]: 2026-01-20 14:55:38.876 225859 DEBUG oslo_concurrency.processutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:39.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:55:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/641791931' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:55:39 np0005588919 nova_compute[225855]: 2026-01-20 14:55:39.379 225859 DEBUG oslo_concurrency.processutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:39 np0005588919 nova_compute[225855]: 2026-01-20 14:55:39.387 225859 DEBUG nova.compute.provider_tree [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:55:39 np0005588919 nova_compute[225855]: 2026-01-20 14:55:39.410 225859 DEBUG nova.scheduler.client.report [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:55:39 np0005588919 nova_compute[225855]: 2026-01-20 14:55:39.456 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.822s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:39 np0005588919 nova_compute[225855]: 2026-01-20 14:55:39.959 225859 INFO nova.network.neutron [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updating port e084df8c-a73e-4535-bcf7-de8adbafa9ae with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 09:55:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:39.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:41 np0005588919 nova_compute[225855]: 2026-01-20 14:55:41.093 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:41 np0005588919 nova_compute[225855]: 2026-01-20 14:55:41.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:41.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:41 np0005588919 nova_compute[225855]: 2026-01-20 14:55:41.831 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:55:41 np0005588919 nova_compute[225855]: 2026-01-20 14:55:41.832 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquired lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:55:41 np0005588919 nova_compute[225855]: 2026-01-20 14:55:41.832 225859 DEBUG nova.network.neutron [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:55:41 np0005588919 nova_compute[225855]: 2026-01-20 14:55:41.835 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:41 np0005588919 nova_compute[225855]: 2026-01-20 14:55:41.979 225859 DEBUG nova.compute.manager [req-23f35e16-0c16-45cd-a5bf-fee625d47a4d req-a01a4132-b35a-47fa-93c4-6bfd9c1e5f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-changed-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:41 np0005588919 nova_compute[225855]: 2026-01-20 14:55:41.979 225859 DEBUG nova.compute.manager [req-23f35e16-0c16-45cd-a5bf-fee625d47a4d req-a01a4132-b35a-47fa-93c4-6bfd9c1e5f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Refreshing instance network info cache due to event network-changed-e084df8c-a73e-4535-bcf7-de8adbafa9ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:55:41 np0005588919 nova_compute[225855]: 2026-01-20 14:55:41.980 225859 DEBUG oslo_concurrency.lockutils [req-23f35e16-0c16-45cd-a5bf-fee625d47a4d req-a01a4132-b35a-47fa-93c4-6bfd9c1e5f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:55:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:41.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:43 np0005588919 podman[275792]: 2026-01-20 14:55:43.051623889 +0000 UTC m=+0.083387615 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:55:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:43.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:43 np0005588919 nova_compute[225855]: 2026-01-20 14:55:43.796 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:43 np0005588919 nova_compute[225855]: 2026-01-20 14:55:43.895 225859 DEBUG nova.network.neutron [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updating instance_info_cache with network_info: [{"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:43 np0005588919 nova_compute[225855]: 2026-01-20 14:55:43.920 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Releasing lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:55:43 np0005588919 nova_compute[225855]: 2026-01-20 14:55:43.922 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:55:43 np0005588919 nova_compute[225855]: 2026-01-20 14:55:43.922 225859 INFO nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Creating image(s)#033[00m
Jan 20 09:55:43 np0005588919 nova_compute[225855]: 2026-01-20 14:55:43.949 225859 DEBUG nova.storage.rbd_utils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:43 np0005588919 nova_compute[225855]: 2026-01-20 14:55:43.953 225859 DEBUG nova.objects.instance [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:43 np0005588919 nova_compute[225855]: 2026-01-20 14:55:43.954 225859 DEBUG oslo_concurrency.lockutils [req-23f35e16-0c16-45cd-a5bf-fee625d47a4d req-a01a4132-b35a-47fa-93c4-6bfd9c1e5f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:55:43 np0005588919 nova_compute[225855]: 2026-01-20 14:55:43.955 225859 DEBUG nova.network.neutron [req-23f35e16-0c16-45cd-a5bf-fee625d47a4d req-a01a4132-b35a-47fa-93c4-6bfd9c1e5f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Refreshing network info cache for port e084df8c-a73e-4535-bcf7-de8adbafa9ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:55:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:43.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:43 np0005588919 nova_compute[225855]: 2026-01-20 14:55:43.993 225859 DEBUG nova.storage.rbd_utils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:44 np0005588919 nova_compute[225855]: 2026-01-20 14:55:44.018 225859 DEBUG nova.storage.rbd_utils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:44 np0005588919 nova_compute[225855]: 2026-01-20 14:55:44.022 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "2f18fd61310b7a6e1fac51a6ca49bea435dc548b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:44 np0005588919 nova_compute[225855]: 2026-01-20 14:55:44.023 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "2f18fd61310b7a6e1fac51a6ca49bea435dc548b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:44 np0005588919 nova_compute[225855]: 2026-01-20 14:55:44.038 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:44 np0005588919 nova_compute[225855]: 2026-01-20 14:55:44.450 225859 DEBUG nova.virt.libvirt.imagebackend [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/0f1d91a7-05af-4ed6-87af-1e03976e25f0/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/0f1d91a7-05af-4ed6-87af-1e03976e25f0/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 09:55:44 np0005588919 nova_compute[225855]: 2026-01-20 14:55:44.516 225859 DEBUG nova.virt.libvirt.imagebackend [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/0f1d91a7-05af-4ed6-87af-1e03976e25f0/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 09:55:44 np0005588919 nova_compute[225855]: 2026-01-20 14:55:44.517 225859 DEBUG nova.storage.rbd_utils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] cloning images/0f1d91a7-05af-4ed6-87af-1e03976e25f0@snap to None/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 09:55:44 np0005588919 nova_compute[225855]: 2026-01-20 14:55:44.631 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "2f18fd61310b7a6e1fac51a6ca49bea435dc548b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:44 np0005588919 nova_compute[225855]: 2026-01-20 14:55:44.791 225859 DEBUG nova.objects.instance [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'migration_context' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:44 np0005588919 nova_compute[225855]: 2026-01-20 14:55:44.856 225859 DEBUG nova.storage.rbd_utils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] flattening vms/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.353 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Image rbd:vms/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.354 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.354 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Ensure instance console log exists: /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:55:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.355 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.355 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:45.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.355 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.357 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Start _get_guest_xml network_info=[{"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T14:55:18Z,direct_url=<?>,disk_format='raw',id=0f1d91a7-05af-4ed6-87af-1e03976e25f0,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-913712707-shelved',owner='b1e83af992c94112a965575784639d77',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T14:55:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.362 225859 WARNING nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.382 225859 DEBUG nova.virt.libvirt.host [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.383 225859 DEBUG nova.virt.libvirt.host [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.386 225859 DEBUG nova.virt.libvirt.host [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.387 225859 DEBUG nova.virt.libvirt.host [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.388 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.388 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-20T14:55:18Z,direct_url=<?>,disk_format='raw',id=0f1d91a7-05af-4ed6-87af-1e03976e25f0,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-913712707-shelved',owner='b1e83af992c94112a965575784639d77',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-20T14:55:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.389 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.389 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.389 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.390 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.390 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.390 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.390 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.391 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.391 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.391 225859 DEBUG nova.virt.hardware [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.391 225859 DEBUG nova.objects.instance [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.408 225859 DEBUG oslo_concurrency.processutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:55:45 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/387418848' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.863 225859 DEBUG oslo_concurrency.processutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.893 225859 DEBUG nova.storage.rbd_utils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:45 np0005588919 nova_compute[225855]: 2026-01-20 14:55:45.897 225859 DEBUG oslo_concurrency.processutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:45.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:46.027 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.095 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.122 225859 DEBUG nova.network.neutron [req-23f35e16-0c16-45cd-a5bf-fee625d47a4d req-a01a4132-b35a-47fa-93c4-6bfd9c1e5f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updated VIF entry in instance network info cache for port e084df8c-a73e-4535-bcf7-de8adbafa9ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.123 225859 DEBUG nova.network.neutron [req-23f35e16-0c16-45cd-a5bf-fee625d47a4d req-a01a4132-b35a-47fa-93c4-6bfd9c1e5f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updating instance_info_cache with network_info: [{"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.145 225859 DEBUG oslo_concurrency.lockutils [req-23f35e16-0c16-45cd-a5bf-fee625d47a4d req-a01a4132-b35a-47fa-93c4-6bfd9c1e5f56 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-3bec73f6-5255-44c0-8a10-a64c7e86c0c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:55:46 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1562014082' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.369 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.369 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.398 225859 DEBUG oslo_concurrency.processutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.401 225859 DEBUG nova.virt.libvirt.vif [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:54:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-913712707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-913712707',id=124,image_ref='0f1d91a7-05af-4ed6-87af-1e03976e25f0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1388890452',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:54:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-n20fo39g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member',shelved_at='2026-01-20T14:55:27.720433',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='0f1d91a7-05af-4ed6-87af-1e03976e25f0'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:55:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=3bec73f6-5255-44c0-8a10-a64c7e86c0c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.401 225859 DEBUG nova.network.os_vif_util [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.403 225859 DEBUG nova.network.os_vif_util [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.404 225859 DEBUG nova.objects.instance [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.428 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <uuid>3bec73f6-5255-44c0-8a10-a64c7e86c0c2</uuid>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <name>instance-0000007c</name>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-913712707</nova:name>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:55:45</nova:creationTime>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <nova:user uuid="34eb73f628994c11801d447148d5f142">tempest-AttachVolumeShelveTestJSON-896995479-project-member</nova:user>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <nova:project uuid="b1e83af992c94112a965575784639d77">tempest-AttachVolumeShelveTestJSON-896995479</nova:project>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="0f1d91a7-05af-4ed6-87af-1e03976e25f0"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <nova:port uuid="e084df8c-a73e-4535-bcf7-de8adbafa9ae">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <entry name="serial">3bec73f6-5255-44c0-8a10-a64c7e86c0c2</entry>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <entry name="uuid">3bec73f6-5255-44c0-8a10-a64c7e86c0c2</entry>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:89:8e:0f"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <target dev="tape084df8c-a7"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/console.log" append="off"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <input type="keyboard" bus="usb"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:55:46 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:55:46 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:55:46 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:55:46 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.430 225859 DEBUG nova.compute.manager [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Preparing to wait for external event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.431 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.431 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.431 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.432 225859 DEBUG nova.virt.libvirt.vif [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:54:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-913712707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-913712707',id=124,image_ref='0f1d91a7-05af-4ed6-87af-1e03976e25f0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1388890452',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:54:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-n20fo39g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member',shelved_at='2026-01-20T14:55:27.720433',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='0f1d91a7-05af-4ed6-87af-1e03976e25f0'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:55:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=3bec73f6-5255-44c0-8a10-a64c7e86c0c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.432 225859 DEBUG nova.network.os_vif_util [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.433 225859 DEBUG nova.network.os_vif_util [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.433 225859 DEBUG os_vif [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.434 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.434 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.435 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.437 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.438 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape084df8c-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.439 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape084df8c-a7, col_values=(('external_ids', {'iface-id': 'e084df8c-a73e-4535-bcf7-de8adbafa9ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:8e:0f', 'vm-uuid': '3bec73f6-5255-44c0-8a10-a64c7e86c0c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:46 np0005588919 NetworkManager[49104]: <info>  [1768920946.4411] manager: (tape084df8c-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.442 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.448 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.449 225859 INFO os_vif [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7')#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.526 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.526 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.527 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] No VIF found with MAC fa:16:3e:89:8e:0f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.528 225859 INFO nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Using config drive#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.556 225859 DEBUG nova.storage.rbd_utils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.578 225859 DEBUG nova.objects.instance [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.634 225859 DEBUG nova.objects.instance [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'keypairs' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:55:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:55:46 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/541677167' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.808 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.855 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000007c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.855 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000007c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.969 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.970 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4348MB free_disk=20.853878021240234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.971 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:46 np0005588919 nova_compute[225855]: 2026-01-20 14:55:46.971 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.013 225859 INFO nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Creating config drive at /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.018 225859 DEBUG oslo_concurrency.processutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcwgnu9oe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.071 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.071 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.072 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.108 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.149 225859 DEBUG oslo_concurrency.processutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcwgnu9oe" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.178 225859 DEBUG nova.storage.rbd_utils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] rbd image 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.182 225859 DEBUG oslo_concurrency.processutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.329 225859 DEBUG oslo_concurrency.processutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config 3bec73f6-5255-44c0-8a10-a64c7e86c0c2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.329 225859 INFO nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Deleting local config drive /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2/disk.config because it was imported into RBD.#033[00m
Jan 20 09:55:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:47.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:47 np0005588919 kernel: tape084df8c-a7: entered promiscuous mode
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.376 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 NetworkManager[49104]: <info>  [1768920947.3774] manager: (tape084df8c-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.380 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:47Z|00509|binding|INFO|Claiming lport e084df8c-a73e-4535-bcf7-de8adbafa9ae for this chassis.
Jan 20 09:55:47 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:47Z|00510|binding|INFO|e084df8c-a73e-4535-bcf7-de8adbafa9ae: Claiming fa:16:3e:89:8e:0f 10.100.0.14
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.386 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.391 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.397 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 NetworkManager[49104]: <info>  [1768920947.4001] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Jan 20 09:55:47 np0005588919 NetworkManager[49104]: <info>  [1768920947.4010] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.404 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:8e:0f 10.100.0.14'], port_security=['fa:16:3e:89:8e:0f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3bec73f6-5255-44c0-8a10-a64c7e86c0c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1e83af992c94112a965575784639d77', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'e88a960e-8540-4c69-934d-b6e1b91beb98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.233'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1dc6df7d-3e57-4779-8232-af1ccf413403, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=e084df8c-a73e-4535-bcf7-de8adbafa9ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.405 140354 INFO neutron.agent.ovn.metadata.agent [-] Port e084df8c-a73e-4535-bcf7-de8adbafa9ae in datapath e9589011-b728-4b79-9945-aa6c52dd0fc2 bound to our chassis#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.406 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e9589011-b728-4b79-9945-aa6c52dd0fc2#033[00m
Jan 20 09:55:47 np0005588919 systemd-machined[194361]: New machine qemu-60-instance-0000007c.
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.419 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6685728e-3146-4d43-a620-06302305ab80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.419 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape9589011-b1 in ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.421 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape9589011-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.421 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[69b19a9e-ed24-499e-829b-4be4d10b4c87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.422 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8c283884-b1d3-4d4f-8869-02e55c599bbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 systemd[1]: Started Virtual Machine qemu-60-instance-0000007c.
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.435 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[00055eed-e34a-42b1-8f5f-9fc4dfc14818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 systemd-udevd[276214]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.451 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0b4b0f9d-a493-4ea5-8574-7fe238833dfe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 NetworkManager[49104]: <info>  [1768920947.4617] device (tape084df8c-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:55:47 np0005588919 NetworkManager[49104]: <info>  [1768920947.4624] device (tape084df8c-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.481 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5365d447-da3c-4b4b-a454-e07aa52b4b61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:47 np0005588919 systemd-udevd[276217]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:55:47 np0005588919 NetworkManager[49104]: <info>  [1768920947.5005] manager: (tape9589011-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/219)
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.501 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1454503a-7911-4820-a01b-3a6ab9a315e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.528 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[810c2b8d-f481-43a6-85f1-da7241a420f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.531 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[bf40ccb2-b6aa-4323-8482-85c723a2a7bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:55:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2048446538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:55:47 np0005588919 NetworkManager[49104]: <info>  [1768920947.5514] device (tape9589011-b0): carrier: link connected
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.556 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[dc322a48-5634-4198-9924-af7662e796ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.563 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.568 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.571 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bce37ea1-377e-4086-9d49-e26ced334828]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9589011-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:5a:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598647, 'reachable_time': 22390, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276246, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.583 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.589 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.603 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.603 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[17d815b1-d11f-447e-a88d-9bd630308f27]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:5a14'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 598647, 'tstamp': 598647}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276247, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.613 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.614 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:47 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:47Z|00511|binding|INFO|Setting lport e084df8c-a73e-4535-bcf7-de8adbafa9ae ovn-installed in OVS
Jan 20 09:55:47 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:47Z|00512|binding|INFO|Setting lport e084df8c-a73e-4535-bcf7-de8adbafa9ae up in Southbound
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.616 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.619 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a10e8714-c6b5-4158-b594-ddd9d30fb028]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9589011-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:5a:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598647, 'reachable_time': 22390, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 276249, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.649 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1aba8f-7393-4650-80a1-9c4111667fec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.703 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b5fc96a2-b30b-4838-b4c8-f6c009fbf882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.705 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9589011-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.705 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.705 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9589011-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.707 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 kernel: tape9589011-b0: entered promiscuous mode
Jan 20 09:55:47 np0005588919 NetworkManager[49104]: <info>  [1768920947.7081] manager: (tape9589011-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.709 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.710 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape9589011-b0, col_values=(('external_ids', {'iface-id': '9ca9d06a-9365-4769-a2c4-7322625683ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.710 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 ovn_controller[130490]: 2026-01-20T14:55:47Z|00513|binding|INFO|Releasing lport 9ca9d06a-9365-4769-a2c4-7322625683ac from this chassis (sb_readonly=0)
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.726 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.727 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.728 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd566c6-0707-40d3-b537-1b7d8eba3600]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.729 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-e9589011-b728-4b79-9945-aa6c52dd0fc2
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/e9589011-b728-4b79-9945-aa6c52dd0fc2.pid.haproxy
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID e9589011-b728-4b79-9945-aa6c52dd0fc2
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:55:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:55:47.730 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'env', 'PROCESS_TAG=haproxy-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e9589011-b728-4b79-9945-aa6c52dd0fc2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.943 225859 DEBUG nova.compute.manager [req-74e89d76-f67f-4246-b948-64d9ecf34e5f req-9889af3c-bd24-4645-9463-73f65ea9adcd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.944 225859 DEBUG oslo_concurrency.lockutils [req-74e89d76-f67f-4246-b948-64d9ecf34e5f req-9889af3c-bd24-4645-9463-73f65ea9adcd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.944 225859 DEBUG oslo_concurrency.lockutils [req-74e89d76-f67f-4246-b948-64d9ecf34e5f req-9889af3c-bd24-4645-9463-73f65ea9adcd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.944 225859 DEBUG oslo_concurrency.lockutils [req-74e89d76-f67f-4246-b948-64d9ecf34e5f req-9889af3c-bd24-4645-9463-73f65ea9adcd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:47 np0005588919 nova_compute[225855]: 2026-01-20 14:55:47.945 225859 DEBUG nova.compute.manager [req-74e89d76-f67f-4246-b948-64d9ecf34e5f req-9889af3c-bd24-4645-9463-73f65ea9adcd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Processing event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:55:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:55:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:47.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:55:48 np0005588919 podman[276281]: 2026-01-20 14:55:48.110725674 +0000 UTC m=+0.054290214 container create a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 09:55:48 np0005588919 systemd[1]: Started libpod-conmon-a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605.scope.
Jan 20 09:55:48 np0005588919 podman[276281]: 2026-01-20 14:55:48.079142192 +0000 UTC m=+0.022706762 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:55:48 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:55:48 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b58477d32c8948a9385d790e9b6a62f6a46c01c4ace31c1af6ef64bffa12e02/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:55:48 np0005588919 podman[276281]: 2026-01-20 14:55:48.206030794 +0000 UTC m=+0.149595354 container init a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 20 09:55:48 np0005588919 podman[276281]: 2026-01-20 14:55:48.212307351 +0000 UTC m=+0.155871891 container start a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:55:48 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[276297]: [NOTICE]   (276301) : New worker (276303) forked
Jan 20 09:55:48 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[276297]: [NOTICE]   (276301) : Loading success.
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.409 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920948.4093874, 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.410 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] VM Started (Lifecycle Event)#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.412 225859 DEBUG nova.compute.manager [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.415 225859 DEBUG nova.virt.libvirt.driver [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.417 225859 INFO nova.virt.libvirt.driver [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance spawned successfully.#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.447 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.451 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.480 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.482 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920948.4095788, 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.482 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.501 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.504 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920948.4142823, 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.505 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.533 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.536 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:55:48 np0005588919 nova_compute[225855]: 2026-01-20 14:55:48.573 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:55:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:49.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e307 e307: 3 total, 3 up, 3 in
Jan 20 09:55:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:49.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:50 np0005588919 nova_compute[225855]: 2026-01-20 14:55:50.018 225859 DEBUG nova.compute.manager [req-a7c4c67e-75c5-4d75-8ade-4895cd5e75a9 req-b1ad84d0-df78-4c73-b790-b392f0a359a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:55:50 np0005588919 nova_compute[225855]: 2026-01-20 14:55:50.018 225859 DEBUG oslo_concurrency.lockutils [req-a7c4c67e-75c5-4d75-8ade-4895cd5e75a9 req-b1ad84d0-df78-4c73-b790-b392f0a359a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:55:50 np0005588919 nova_compute[225855]: 2026-01-20 14:55:50.019 225859 DEBUG oslo_concurrency.lockutils [req-a7c4c67e-75c5-4d75-8ade-4895cd5e75a9 req-b1ad84d0-df78-4c73-b790-b392f0a359a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:55:50 np0005588919 nova_compute[225855]: 2026-01-20 14:55:50.019 225859 DEBUG oslo_concurrency.lockutils [req-a7c4c67e-75c5-4d75-8ade-4895cd5e75a9 req-b1ad84d0-df78-4c73-b790-b392f0a359a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:50 np0005588919 nova_compute[225855]: 2026-01-20 14:55:50.019 225859 DEBUG nova.compute.manager [req-a7c4c67e-75c5-4d75-8ade-4895cd5e75a9 req-b1ad84d0-df78-4c73-b790-b392f0a359a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] No waiting events found dispatching network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:55:50 np0005588919 nova_compute[225855]: 2026-01-20 14:55:50.019 225859 WARNING nova.compute.manager [req-a7c4c67e-75c5-4d75-8ade-4895cd5e75a9 req-b1ad84d0-df78-4c73-b790-b392f0a359a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received unexpected event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Jan 20 09:55:50 np0005588919 nova_compute[225855]: 2026-01-20 14:55:50.609 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:50 np0005588919 nova_compute[225855]: 2026-01-20 14:55:50.610 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:55:50 np0005588919 nova_compute[225855]: 2026-01-20 14:55:50.785 225859 DEBUG nova.compute.manager [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:55:50 np0005588919 nova_compute[225855]: 2026-01-20 14:55:50.866 225859 DEBUG oslo_concurrency.lockutils [None req-2e8348fc-730c-43ea-954a-8f2243213414 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 12.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:55:51 np0005588919 nova_compute[225855]: 2026-01-20 14:55:51.142 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:51.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:51 np0005588919 nova_compute[225855]: 2026-01-20 14:55:51.441 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:51 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #50. Immutable memtables: 7.
Jan 20 09:55:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:51.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e307 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:53.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:53 np0005588919 nova_compute[225855]: 2026-01-20 14:55:53.399 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:53.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #97. Immutable memtables: 0.
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:54.936443) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 59] Flushing memtable with next log file: 97
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920954936476, "job": 59, "event": "flush_started", "num_memtables": 1, "num_entries": 1180, "num_deletes": 254, "total_data_size": 2204638, "memory_usage": 2254672, "flush_reason": "Manual Compaction"}
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 59] Level-0 flush table #98: started
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920954955710, "cf_name": "default", "job": 59, "event": "table_file_creation", "file_number": 98, "file_size": 977776, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49742, "largest_seqno": 50917, "table_properties": {"data_size": 973325, "index_size": 1911, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 12214, "raw_average_key_size": 21, "raw_value_size": 963598, "raw_average_value_size": 1702, "num_data_blocks": 83, "num_entries": 566, "num_filter_entries": 566, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920881, "oldest_key_time": 1768920881, "file_creation_time": 1768920954, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 98, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 59] Flush lasted 19311 microseconds, and 3354 cpu microseconds.
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:54.955753) [db/flush_job.cc:967] [default] [JOB 59] Level-0 flush table #98: 977776 bytes OK
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:54.955771) [db/memtable_list.cc:519] [default] Level-0 commit table #98 started
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:54.957188) [db/memtable_list.cc:722] [default] Level-0 commit table #98: memtable #1 done
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:54.957202) EVENT_LOG_v1 {"time_micros": 1768920954957198, "job": 59, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:54.957222) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 59] Try to delete WAL files size 2198836, prev total WAL file size 2198836, number of live WAL files 2.
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000094.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:54.957952) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031353035' seq:72057594037927935, type:22 .. '6D6772737461740031373536' seq:0, type:0; will stop at (end)
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 60] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 59 Base level 0, inputs: [98(954KB)], [96(11MB)]
Jan 20 09:55:54 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920954957983, "job": 60, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [98], "files_L6": [96], "score": -1, "input_data_size": 13450642, "oldest_snapshot_seqno": -1}
Jan 20 09:55:55 np0005588919 podman[276407]: 2026-01-20 14:55:55.036942399 +0000 UTC m=+0.080643098 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 60] Generated table #99: 7560 keys, 10086049 bytes, temperature: kUnknown
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920955057259, "cf_name": "default", "job": 60, "event": "table_file_creation", "file_number": 99, "file_size": 10086049, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10036810, "index_size": 29223, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18949, "raw_key_size": 195723, "raw_average_key_size": 25, "raw_value_size": 9903078, "raw_average_value_size": 1309, "num_data_blocks": 1149, "num_entries": 7560, "num_filter_entries": 7560, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768920954, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 99, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:55.057548) [db/compaction/compaction_job.cc:1663] [default] [JOB 60] Compacted 1@0 + 1@6 files to L6 => 10086049 bytes
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:55.058881) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 135.3 rd, 101.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 11.9 +0.0 blob) out(9.6 +0.0 blob), read-write-amplify(24.1) write-amplify(10.3) OK, records in: 8053, records dropped: 493 output_compression: NoCompression
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:55.058915) EVENT_LOG_v1 {"time_micros": 1768920955058891, "job": 60, "event": "compaction_finished", "compaction_time_micros": 99408, "compaction_time_cpu_micros": 25977, "output_level": 6, "num_output_files": 1, "total_output_size": 10086049, "num_input_records": 8053, "num_output_records": 7560, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000098.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920955059144, "job": 60, "event": "table_file_deletion", "file_number": 98}
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000096.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768920955060794, "job": 60, "event": "table_file_deletion", "file_number": 96}
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:54.957854) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:55.060881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:55.060885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:55.060887) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:55.060888) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:55:55.060890) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:55:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:55.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:55:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:55.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:55:56 np0005588919 nova_compute[225855]: 2026-01-20 14:55:56.144 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:56 np0005588919 nova_compute[225855]: 2026-01-20 14:55:56.443 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:57.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e307 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:55:57 np0005588919 nova_compute[225855]: 2026-01-20 14:55:57.806 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:55:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:55:57.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:55:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:55:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:55:59.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:55:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 e308: 3 total, 3 up, 3 in
Jan 20 09:56:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:00.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:00 np0005588919 nova_compute[225855]: 2026-01-20 14:56:00.633 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Acquiring lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:00 np0005588919 nova_compute[225855]: 2026-01-20 14:56:00.634 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:00 np0005588919 nova_compute[225855]: 2026-01-20 14:56:00.650 225859 DEBUG nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:56:00 np0005588919 nova_compute[225855]: 2026-01-20 14:56:00.726 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:00 np0005588919 nova_compute[225855]: 2026-01-20 14:56:00.726 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:00 np0005588919 nova_compute[225855]: 2026-01-20 14:56:00.737 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:56:00 np0005588919 nova_compute[225855]: 2026-01-20 14:56:00.737 225859 INFO nova.compute.claims [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:56:00 np0005588919 nova_compute[225855]: 2026-01-20 14:56:00.836 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.149 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/533248142' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.310 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.316 225859 DEBUG nova.compute.provider_tree [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.331 225859 DEBUG nova.scheduler.client.report [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.353 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.354 225859 DEBUG nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:56:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:01.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.391 225859 DEBUG nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.392 225859 DEBUG nova.network.neutron [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.407 225859 INFO nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.423 225859 DEBUG nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.445 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.513 225859 DEBUG nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.515 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.515 225859 INFO nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Creating image(s)#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.540 225859 DEBUG nova.storage.rbd_utils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] rbd image f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.566 225859 DEBUG nova.storage.rbd_utils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] rbd image f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.591 225859 DEBUG nova.storage.rbd_utils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] rbd image f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.595 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.659 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.660 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.660 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.661 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.689 225859 DEBUG nova.storage.rbd_utils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] rbd image f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.693 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.930 225859 DEBUG nova.policy [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f6f144f1d330427e82e84c891e9a8a89', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4be5b75b5dcb4eeea9759f7c4a779ffa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:56:01 np0005588919 nova_compute[225855]: 2026-01-20 14:56:01.997 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.304s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:02.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:02 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:02Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:89:8e:0f 10.100.0.14
Jan 20 09:56:02 np0005588919 nova_compute[225855]: 2026-01-20 14:56:02.075 225859 DEBUG nova.storage.rbd_utils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] resizing rbd image f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:56:02 np0005588919 nova_compute[225855]: 2026-01-20 14:56:02.175 225859 DEBUG nova.objects.instance [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lazy-loading 'migration_context' on Instance uuid f6f09d34-bc44-451f-98e2-1b0701aeab3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:02 np0005588919 nova_compute[225855]: 2026-01-20 14:56:02.193 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:56:02 np0005588919 nova_compute[225855]: 2026-01-20 14:56:02.194 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Ensure instance console log exists: /var/lib/nova/instances/f6f09d34-bc44-451f-98e2-1b0701aeab3a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:56:02 np0005588919 nova_compute[225855]: 2026-01-20 14:56:02.195 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:02 np0005588919 nova_compute[225855]: 2026-01-20 14:56:02.196 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:02 np0005588919 nova_compute[225855]: 2026-01-20 14:56:02.196 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:03 np0005588919 nova_compute[225855]: 2026-01-20 14:56:03.321 225859 DEBUG nova.network.neutron [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Successfully created port: 73ed9acf-a178-4d9c-98a3-25f22489d41d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:56:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:03.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:04.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:05 np0005588919 nova_compute[225855]: 2026-01-20 14:56:05.076 225859 DEBUG nova.network.neutron [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Successfully updated port: 73ed9acf-a178-4d9c-98a3-25f22489d41d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:56:05 np0005588919 nova_compute[225855]: 2026-01-20 14:56:05.093 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Acquiring lock "refresh_cache-f6f09d34-bc44-451f-98e2-1b0701aeab3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:05 np0005588919 nova_compute[225855]: 2026-01-20 14:56:05.094 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Acquired lock "refresh_cache-f6f09d34-bc44-451f-98e2-1b0701aeab3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:05 np0005588919 nova_compute[225855]: 2026-01-20 14:56:05.094 225859 DEBUG nova.network.neutron [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:56:05 np0005588919 nova_compute[225855]: 2026-01-20 14:56:05.157 225859 DEBUG nova.compute.manager [req-3ae2be60-b4e7-40f3-918e-bd3a0c0c2349 req-a467a7d9-f09e-4b60-8a85-939b9f66012e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Received event network-changed-73ed9acf-a178-4d9c-98a3-25f22489d41d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:05 np0005588919 nova_compute[225855]: 2026-01-20 14:56:05.158 225859 DEBUG nova.compute.manager [req-3ae2be60-b4e7-40f3-918e-bd3a0c0c2349 req-a467a7d9-f09e-4b60-8a85-939b9f66012e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Refreshing instance network info cache due to event network-changed-73ed9acf-a178-4d9c-98a3-25f22489d41d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:56:05 np0005588919 nova_compute[225855]: 2026-01-20 14:56:05.159 225859 DEBUG oslo_concurrency.lockutils [req-3ae2be60-b4e7-40f3-918e-bd3a0c0c2349 req-a467a7d9-f09e-4b60-8a85-939b9f66012e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f6f09d34-bc44-451f-98e2-1b0701aeab3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:05 np0005588919 nova_compute[225855]: 2026-01-20 14:56:05.215 225859 DEBUG nova.network.neutron [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:56:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:05.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:06.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.151 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.448 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.471 225859 DEBUG nova.network.neutron [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Updating instance_info_cache with network_info: [{"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "address": "fa:16:3e:a5:7a:7c", "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-645777946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4be5b75b5dcb4eeea9759f7c4a779ffa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ed9acf-a1", "ovs_interfaceid": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.494 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Releasing lock "refresh_cache-f6f09d34-bc44-451f-98e2-1b0701aeab3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.494 225859 DEBUG nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Instance network_info: |[{"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "address": "fa:16:3e:a5:7a:7c", "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-645777946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4be5b75b5dcb4eeea9759f7c4a779ffa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ed9acf-a1", "ovs_interfaceid": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.494 225859 DEBUG oslo_concurrency.lockutils [req-3ae2be60-b4e7-40f3-918e-bd3a0c0c2349 req-a467a7d9-f09e-4b60-8a85-939b9f66012e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f6f09d34-bc44-451f-98e2-1b0701aeab3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.494 225859 DEBUG nova.network.neutron [req-3ae2be60-b4e7-40f3-918e-bd3a0c0c2349 req-a467a7d9-f09e-4b60-8a85-939b9f66012e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Refreshing network info cache for port 73ed9acf-a178-4d9c-98a3-25f22489d41d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
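The `network_info` payload logged in the cache-update entries above is a JSON list of VIF dicts. A minimal sketch of extracting the port id, MAC, and fixed IPs from that shape (the helper name is hypothetical; field names are taken from the log entries above, trimmed to the relevant keys):

```python
import json

# A trimmed-down network_info payload in the same shape as the
# instance_info_cache entries logged above (one VIF dict per port).
network_info_json = '''
[{"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d",
  "address": "fa:16:3e:a5:7a:7c",
  "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da",
              "subnets": [{"cidr": "10.100.0.0/28",
                           "ips": [{"address": "10.100.0.7", "type": "fixed"}]}]},
  "type": "ovs", "devname": "tap73ed9acf-a1", "active": false}]
'''

def summarize_vifs(payload):
    """Return one (port_id, mac, [fixed ips]) tuple per VIF entry."""
    out = []
    for vif in json.loads(payload):
        ips = [ip["address"]
               for subnet in vif["network"]["subnets"]
               for ip in subnet["ips"]
               if ip.get("type") == "fixed"]
        out.append((vif["id"], vif["address"], ips))
    return out

print(summarize_vifs(network_info_json))
```

This is how port 73ed9acf-a1… resolves to fixed IP 10.100.0.7 on the 10.100.0.0/28 tempest network; note `"active": false` here, which is why nova later waits for the `network-vif-plugged` event.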
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.496 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Start _get_guest_xml network_info=[{"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "address": "fa:16:3e:a5:7a:7c", "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-645777946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4be5b75b5dcb4eeea9759f7c4a779ffa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ed9acf-a1", "ovs_interfaceid": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.502 225859 WARNING nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.505 225859 DEBUG nova.virt.libvirt.host [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.506 225859 DEBUG nova.virt.libvirt.host [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.511 225859 DEBUG nova.virt.libvirt.host [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.511 225859 DEBUG nova.virt.libvirt.host [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.512 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.512 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.513 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.513 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.513 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.513 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.514 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.514 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.514 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.514 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.514 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.515 225859 DEBUG nova.virt.hardware [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
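The topology walk above (flavor/image limits and prefs all 0:0:0, maxima 65536, one vCPU) settles on `VirtCPUTopology(cores=1,sockets=1,threads=1)`. A simplified paraphrase of the enumeration step, not nova's actual `_get_possible_cpu_topologies` implementation: candidate triples are those whose product equals the vCPU count, subject to the per-axis maxima:

```python
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals
    vcpus, mirroring the 'Build topologies ... Got N possible topologies'
    debug lines above (simplified sketch, not nova's code)."""
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topos.append(VirtCPUTopology(s, c, t))
    return topos

# With one vCPU and no flavor/image constraints there is exactly one
# candidate, which is what the "Got 1 possible topologies" line reports.
print(possible_topologies(1))
```

The single candidate for `m1.nano` (vcpus=1) is what ends up in the guest XML below as `<topology sockets="1" cores="1" threads="1"/>`.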
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.518 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:56:06 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3100921922' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:56:06 np0005588919 nova_compute[225855]: 2026-01-20 14:56:06.971 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.004 225859 DEBUG nova.storage.rbd_utils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] rbd image f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.008 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:07.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:56:07 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2754763687' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.440 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
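The `ceph mon dump` calls above are plain subprocesses run through oslo_concurrency's `processutils.execute`; nova's RBD backend uses the resulting monmap to fill in the `<host>` monitor entries of the disk XML. A standard-library sketch of the same call, with the `--id openstack` client and conf path taken from the log (actually running it assumes a reachable Ceph cluster and a valid keyring):

```python
import json
import subprocess

def build_mon_dump_cmd(client_id="openstack", conf="/etc/ceph/ceph.conf"):
    """Build the exact argv the log shows:
    ceph mon dump --format=json --id <client> --conf <conf>"""
    return ["ceph", "mon", "dump", "--format=json",
            "--id", client_id, "--conf", conf]

def ceph_mon_dump(client_id="openstack", conf="/etc/ceph/ceph.conf"):
    """Run the command and parse the JSON monmap.
    Requires a reachable Ceph cluster, so it is not invoked here."""
    result = subprocess.run(build_mon_dump_cmd(client_id, conf),
                            capture_output=True, text=True, check=True)
    return json.loads(result.stdout)

print(build_mon_dump_cmd())
```

Each invocation takes ~0.45 s in the log, and the dispatch shows up on the peon monitor's audit channel as `cmd=[{"prefix": "mon dump", "format": "json"}]`.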
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.442 225859 DEBUG nova.virt.libvirt.vif [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:55:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-505687879',display_name='tempest-ServerMetadataTestJSON-server-505687879',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-505687879',id=129,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4be5b75b5dcb4eeea9759f7c4a779ffa',ramdisk_id='',reservation_id='r-cbmgnji8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-599451381',owner_user_name='tempest-ServerMetadataTest
JSON-599451381-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:01Z,user_data=None,user_id='f6f144f1d330427e82e84c891e9a8a89',uuid=f6f09d34-bc44-451f-98e2-1b0701aeab3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "address": "fa:16:3e:a5:7a:7c", "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-645777946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4be5b75b5dcb4eeea9759f7c4a779ffa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ed9acf-a1", "ovs_interfaceid": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.443 225859 DEBUG nova.network.os_vif_util [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Converting VIF {"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "address": "fa:16:3e:a5:7a:7c", "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-645777946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4be5b75b5dcb4eeea9759f7c4a779ffa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ed9acf-a1", "ovs_interfaceid": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.443 225859 DEBUG nova.network.os_vif_util [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:7a:7c,bridge_name='br-int',has_traffic_filtering=True,id=73ed9acf-a178-4d9c-98a3-25f22489d41d,network=Network(d3dc1854-2a38-414a-a424-2ff753e5a7da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ed9acf-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.444 225859 DEBUG nova.objects.instance [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lazy-loading 'pci_devices' on Instance uuid f6f09d34-bc44-451f-98e2-1b0701aeab3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.463 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <uuid>f6f09d34-bc44-451f-98e2-1b0701aeab3a</uuid>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <name>instance-00000081</name>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerMetadataTestJSON-server-505687879</nova:name>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:56:06</nova:creationTime>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <nova:user uuid="f6f144f1d330427e82e84c891e9a8a89">tempest-ServerMetadataTestJSON-599451381-project-member</nova:user>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <nova:project uuid="4be5b75b5dcb4eeea9759f7c4a779ffa">tempest-ServerMetadataTestJSON-599451381</nova:project>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <nova:port uuid="73ed9acf-a178-4d9c-98a3-25f22489d41d">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <entry name="serial">f6f09d34-bc44-451f-98e2-1b0701aeab3a</entry>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <entry name="uuid">f6f09d34-bc44-451f-98e2-1b0701aeab3a</entry>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk.config">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:a5:7a:7c"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <target dev="tap73ed9acf-a1"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/f6f09d34-bc44-451f-98e2-1b0701aeab3a/console.log" append="off"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:56:07 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:56:07 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:56:07 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:56:07 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.464 225859 DEBUG nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Preparing to wait for external event network-vif-plugged-73ed9acf-a178-4d9c-98a3-25f22489d41d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.465 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Acquiring lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.465 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.465 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.466 225859 DEBUG nova.virt.libvirt.vif [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:55:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-505687879',display_name='tempest-ServerMetadataTestJSON-server-505687879',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-505687879',id=129,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4be5b75b5dcb4eeea9759f7c4a779ffa',ramdisk_id='',reservation_id='r-cbmgnji8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-599451381',owner_user_name='tempest-ServerMetadataTestJSON-599451381-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:01Z,user_data=None,user_id='f6f144f1d330427e82e84c891e9a8a89',uuid=f6f09d34-bc44-451f-98e2-1b0701aeab3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "address": "fa:16:3e:a5:7a:7c", "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-645777946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4be5b75b5dcb4eeea9759f7c4a779ffa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ed9acf-a1", "ovs_interfaceid": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.466 225859 DEBUG nova.network.os_vif_util [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Converting VIF {"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "address": "fa:16:3e:a5:7a:7c", "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-645777946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4be5b75b5dcb4eeea9759f7c4a779ffa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ed9acf-a1", "ovs_interfaceid": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.467 225859 DEBUG nova.network.os_vif_util [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:7a:7c,bridge_name='br-int',has_traffic_filtering=True,id=73ed9acf-a178-4d9c-98a3-25f22489d41d,network=Network(d3dc1854-2a38-414a-a424-2ff753e5a7da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ed9acf-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.467 225859 DEBUG os_vif [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:7a:7c,bridge_name='br-int',has_traffic_filtering=True,id=73ed9acf-a178-4d9c-98a3-25f22489d41d,network=Network(d3dc1854-2a38-414a-a424-2ff753e5a7da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ed9acf-a1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.468 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.468 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.469 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.471 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.471 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73ed9acf-a1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.472 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap73ed9acf-a1, col_values=(('external_ids', {'iface-id': '73ed9acf-a178-4d9c-98a3-25f22489d41d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:7a:7c', 'vm-uuid': 'f6f09d34-bc44-451f-98e2-1b0701aeab3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.473 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:07 np0005588919 NetworkManager[49104]: <info>  [1768920967.4747] manager: (tap73ed9acf-a1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.476 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.480 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.482 225859 INFO os_vif [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:7a:7c,bridge_name='br-int',has_traffic_filtering=True,id=73ed9acf-a178-4d9c-98a3-25f22489d41d,network=Network(d3dc1854-2a38-414a-a424-2ff753e5a7da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ed9acf-a1')#033[00m
Jan 20 09:56:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.588 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.589 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.589 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] No VIF found with MAC fa:16:3e:a5:7a:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.589 225859 INFO nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Using config drive#033[00m
Jan 20 09:56:07 np0005588919 nova_compute[225855]: 2026-01-20 14:56:07.617 225859 DEBUG nova.storage.rbd_utils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] rbd image f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:08.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.451 225859 INFO nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Creating config drive at /var/lib/nova/instances/f6f09d34-bc44-451f-98e2-1b0701aeab3a/disk.config#033[00m
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.456 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f6f09d34-bc44-451f-98e2-1b0701aeab3a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_kdmdtvt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.586 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f6f09d34-bc44-451f-98e2-1b0701aeab3a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_kdmdtvt" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.613 225859 DEBUG nova.storage.rbd_utils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] rbd image f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.617 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f6f09d34-bc44-451f-98e2-1b0701aeab3a/disk.config f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.644 225859 DEBUG nova.network.neutron [req-3ae2be60-b4e7-40f3-918e-bd3a0c0c2349 req-a467a7d9-f09e-4b60-8a85-939b9f66012e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Updated VIF entry in instance network info cache for port 73ed9acf-a178-4d9c-98a3-25f22489d41d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.645 225859 DEBUG nova.network.neutron [req-3ae2be60-b4e7-40f3-918e-bd3a0c0c2349 req-a467a7d9-f09e-4b60-8a85-939b9f66012e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Updating instance_info_cache with network_info: [{"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "address": "fa:16:3e:a5:7a:7c", "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-645777946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4be5b75b5dcb4eeea9759f7c4a779ffa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ed9acf-a1", "ovs_interfaceid": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.670 225859 DEBUG oslo_concurrency.lockutils [req-3ae2be60-b4e7-40f3-918e-bd3a0c0c2349 req-a467a7d9-f09e-4b60-8a85-939b9f66012e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f6f09d34-bc44-451f-98e2-1b0701aeab3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.770 225859 DEBUG oslo_concurrency.processutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f6f09d34-bc44-451f-98e2-1b0701aeab3a/disk.config f6f09d34-bc44-451f-98e2-1b0701aeab3a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.771 225859 INFO nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Deleting local config drive /var/lib/nova/instances/f6f09d34-bc44-451f-98e2-1b0701aeab3a/disk.config because it was imported into RBD.#033[00m
Jan 20 09:56:08 np0005588919 kernel: tap73ed9acf-a1: entered promiscuous mode
Jan 20 09:56:08 np0005588919 NetworkManager[49104]: <info>  [1768920968.8154] manager: (tap73ed9acf-a1): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.816 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:08Z|00514|binding|INFO|Claiming lport 73ed9acf-a178-4d9c-98a3-25f22489d41d for this chassis.
Jan 20 09:56:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:08Z|00515|binding|INFO|73ed9acf-a178-4d9c-98a3-25f22489d41d: Claiming fa:16:3e:a5:7a:7c 10.100.0.7
Jan 20 09:56:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:08Z|00516|binding|INFO|Setting lport 73ed9acf-a178-4d9c-98a3-25f22489d41d ovn-installed in OVS
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.835 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:08 np0005588919 nova_compute[225855]: 2026-01-20 14:56:08.838 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:08 np0005588919 systemd-udevd[276752]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:56:08 np0005588919 NetworkManager[49104]: <info>  [1768920968.8652] device (tap73ed9acf-a1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:56:08 np0005588919 NetworkManager[49104]: <info>  [1768920968.8657] device (tap73ed9acf-a1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:56:08 np0005588919 systemd-machined[194361]: New machine qemu-61-instance-00000081.
Jan 20 09:56:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:08Z|00517|binding|INFO|Setting lport 73ed9acf-a178-4d9c-98a3-25f22489d41d up in Southbound
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.894 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:7a:7c 10.100.0.7'], port_security=['fa:16:3e:a5:7a:7c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f6f09d34-bc44-451f-98e2-1b0701aeab3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3dc1854-2a38-414a-a424-2ff753e5a7da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4be5b75b5dcb4eeea9759f7c4a779ffa', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6171706d-94c4-4c43-b4b2-ef4cbdfdf97c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf74ec16-3ea2-4ca6-9e5e-52ec9c203b9d, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=73ed9acf-a178-4d9c-98a3-25f22489d41d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.895 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 73ed9acf-a178-4d9c-98a3-25f22489d41d in datapath d3dc1854-2a38-414a-a424-2ff753e5a7da bound to our chassis#033[00m
Jan 20 09:56:08 np0005588919 systemd[1]: Started Virtual Machine qemu-61-instance-00000081.
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.897 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d3dc1854-2a38-414a-a424-2ff753e5a7da#033[00m
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.909 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[13065628-197d-4fbf-ab05-58e78679a3d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.910 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd3dc1854-21 in ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.912 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd3dc1854-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.912 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3a92bb83-3a2e-45ed-891c-1bec5fd372df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.913 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[afc57135-ac02-47b4-b989-82c75fca11e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.931 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[40734ea2-2120-4f3c-96a6-aa37d31af157]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.960 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[190a11ea-e13b-40d0-9b13-3a9fb70168b9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.995 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0b39ba-b548-4e7b-bc10-656a4992469f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:08.999 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ae0ed3c9-43e5-479b-9c21-ded3f33fcde0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 NetworkManager[49104]: <info>  [1768920969.0006] manager: (tapd3dc1854-20): new Veth device (/org/freedesktop/NetworkManager/Devices/223)
Jan 20 09:56:09 np0005588919 systemd-udevd[276754]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.026 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[7e5ca48a-ddd7-465d-9c7c-6589625ea912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.028 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e9067d3e-c751-4662-9fd9-c68e30dd61f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 NetworkManager[49104]: <info>  [1768920969.0519] device (tapd3dc1854-20): carrier: link connected
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.057 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[7a93951d-6d46-4928-8736-6f5f5375f42a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.074 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8212c3b1-6556-4210-9c3e-de04accb3234]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd3dc1854-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:66:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600797, 'reachable_time': 22961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276788, 'error': None, 'target': 'ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.088 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e9cb04cb-5ab1-4cc1-a324-9c7594b5faa8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:66de'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600797, 'tstamp': 600797}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276789, 'error': None, 'target': 'ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.105 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[87768b0f-d99f-4979-a5ec-ee629c4625db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd3dc1854-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:66:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600797, 'reachable_time': 22961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 276790, 'error': None, 'target': 'ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.132 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[41da9b4e-c0f5-49c4-951b-3d01791b964f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.186 225859 DEBUG oslo_concurrency.lockutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.186 225859 DEBUG oslo_concurrency.lockutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.186 225859 DEBUG oslo_concurrency.lockutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.187 225859 DEBUG oslo_concurrency.lockutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.187 225859 DEBUG oslo_concurrency.lockutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.188 225859 INFO nova.compute.manager [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Terminating instance#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.190 225859 DEBUG nova.compute.manager [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.206 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d4a7ae-661f-4558-b3f7-52189b16e34d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.207 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3dc1854-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.207 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.207 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3dc1854-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:09 np0005588919 NetworkManager[49104]: <info>  [1768920969.2106] manager: (tapd3dc1854-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Jan 20 09:56:09 np0005588919 kernel: tapd3dc1854-20: entered promiscuous mode
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.212 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.212 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd3dc1854-20, col_values=(('external_ids', {'iface-id': '10b0432e-3a35-4d0d-ae91-89caad81d90f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:09 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:09Z|00518|binding|INFO|Releasing lport 10b0432e-3a35-4d0d-ae91-89caad81d90f from this chassis (sb_readonly=0)
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.230 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.230 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d3dc1854-2a38-414a-a424-2ff753e5a7da.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d3dc1854-2a38-414a-a424-2ff753e5a7da.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.231 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed5511a-d115-46ba-8e93-0edd23b3d94a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.232 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-d3dc1854-2a38-414a-a424-2ff753e5a7da
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/d3dc1854-2a38-414a-a424-2ff753e5a7da.pid.haproxy
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID d3dc1854-2a38-414a-a424-2ff753e5a7da
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.232 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da', 'env', 'PROCESS_TAG=haproxy-d3dc1854-2a38-414a-a424-2ff753e5a7da', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d3dc1854-2a38-414a-a424-2ff753e5a7da.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:56:09 np0005588919 kernel: tape084df8c-a7 (unregistering): left promiscuous mode
Jan 20 09:56:09 np0005588919 NetworkManager[49104]: <info>  [1768920969.2621] device (tape084df8c-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:56:09 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:09Z|00519|binding|INFO|Releasing lport e084df8c-a73e-4535-bcf7-de8adbafa9ae from this chassis (sb_readonly=0)
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.266 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:09 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:09Z|00520|binding|INFO|Setting lport e084df8c-a73e-4535-bcf7-de8adbafa9ae down in Southbound
Jan 20 09:56:09 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:09Z|00521|binding|INFO|Removing iface tape084df8c-a7 ovn-installed in OVS
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.270 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.276 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:8e:0f 10.100.0.14'], port_security=['fa:16:3e:89:8e:0f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3bec73f6-5255-44c0-8a10-a64c7e86c0c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1e83af992c94112a965575784639d77', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'e88a960e-8540-4c69-934d-b6e1b91beb98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.233', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1dc6df7d-3e57-4779-8232-af1ccf413403, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=e084df8c-a73e-4535-bcf7-de8adbafa9ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.294 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:09 np0005588919 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Jan 20 09:56:09 np0005588919 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000007c.scope: Consumed 14.051s CPU time.
Jan 20 09:56:09 np0005588919 systemd-machined[194361]: Machine qemu-60-instance-0000007c terminated.
Jan 20 09:56:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:09.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:09 np0005588919 NetworkManager[49104]: <info>  [1768920969.4104] manager: (tape084df8c-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/225)
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.446 225859 INFO nova.virt.libvirt.driver [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Instance destroyed successfully.#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.447 225859 DEBUG nova.objects.instance [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lazy-loading 'resources' on Instance uuid 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.461 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920969.4608223, f6f09d34-bc44-451f-98e2-1b0701aeab3a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.461 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] VM Started (Lifecycle Event)#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.467 225859 DEBUG nova.virt.libvirt.vif [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T14:54:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-913712707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-913712707',id=124,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEiICz3SRHAbg35RWQTkKKcPH7nuzl556rWhnnJMKWCQeHRtCTrp3rh0Cew3QLmsFdOqe88XbxeaMKtgT6L6nfvjZZnoyEjqVogiPNh8/V6NYBD5v71aQZWpX0o+tqsUvg==',key_name='tempest-keypair-1388890452',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:55:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b1e83af992c94112a965575784639d77',ramdisk_id='',reservation_id='r-n20fo39g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-896995479',owner_user_name='tempest-AttachVolumeShelveTestJSON-896995479-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:55:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='34eb73f628994c11801d447148d5f142',uuid=3bec73f6-5255-44c0-8a10-a64c7e86c0c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.467 225859 DEBUG nova.network.os_vif_util [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converting VIF {"id": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "address": "fa:16:3e:89:8e:0f", "network": {"id": "e9589011-b728-4b79-9945-aa6c52dd0fc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1143668360-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1e83af992c94112a965575784639d77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape084df8c-a7", "ovs_interfaceid": "e084df8c-a73e-4535-bcf7-de8adbafa9ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.468 225859 DEBUG nova.network.os_vif_util [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.468 225859 DEBUG os_vif [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.470 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.470 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape084df8c-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.474 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.476 225859 INFO os_vif [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:8e:0f,bridge_name='br-int',has_traffic_filtering=True,id=e084df8c-a73e-4535-bcf7-de8adbafa9ae,network=Network(e9589011-b728-4b79-9945-aa6c52dd0fc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape084df8c-a7')#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.527 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.532 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920969.4615161, f6f09d34-bc44-451f-98e2-1b0701aeab3a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.532 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.570 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.574 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:56:09 np0005588919 podman[276897]: 2026-01-20 14:56:09.610572349 +0000 UTC m=+0.049225921 container create 17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:56:09 np0005588919 systemd[1]: Started libpod-conmon-17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e.scope.
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.663 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:56:09 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:56:09 np0005588919 podman[276897]: 2026-01-20 14:56:09.583895756 +0000 UTC m=+0.022549358 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:56:09 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79772ae235c75facde7fa71bd7324d0c967aed4152a0bb0f69655c6f1f473ade/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:56:09 np0005588919 podman[276897]: 2026-01-20 14:56:09.699239752 +0000 UTC m=+0.137893374 container init 17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 09:56:09 np0005588919 podman[276897]: 2026-01-20 14:56:09.704939673 +0000 UTC m=+0.143593245 container start 17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 09:56:09 np0005588919 neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da[276915]: [NOTICE]   (276919) : New worker (276921) forked
Jan 20 09:56:09 np0005588919 neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da[276915]: [NOTICE]   (276919) : Loading success.
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.764 140354 INFO neutron.agent.ovn.metadata.agent [-] Port e084df8c-a73e-4535-bcf7-de8adbafa9ae in datapath e9589011-b728-4b79-9945-aa6c52dd0fc2 unbound from our chassis#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.766 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9589011-b728-4b79-9945-aa6c52dd0fc2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.767 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[19d52ec5-81d5-450b-be03-a1c324dd5ba3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:09.767 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 namespace which is not needed anymore#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.830 225859 INFO nova.virt.libvirt.driver [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Deleting instance files /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_del#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.831 225859 INFO nova.virt.libvirt.driver [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Deletion of /var/lib/nova/instances/3bec73f6-5255-44c0-8a10-a64c7e86c0c2_del complete#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.873 225859 INFO nova.compute.manager [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Took 0.68 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.875 225859 DEBUG oslo.service.loopingcall [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.875 225859 DEBUG nova.compute.manager [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:56:09 np0005588919 nova_compute[225855]: 2026-01-20 14:56:09.876 225859 DEBUG nova.network.neutron [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:56:09 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[276297]: [NOTICE]   (276301) : haproxy version is 2.8.14-c23fe91
Jan 20 09:56:09 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[276297]: [NOTICE]   (276301) : path to executable is /usr/sbin/haproxy
Jan 20 09:56:09 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[276297]: [WARNING]  (276301) : Exiting Master process...
Jan 20 09:56:09 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[276297]: [ALERT]    (276301) : Current worker (276303) exited with code 143 (Terminated)
Jan 20 09:56:09 np0005588919 neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2[276297]: [WARNING]  (276301) : All workers exited. Exiting... (0)
Jan 20 09:56:09 np0005588919 systemd[1]: libpod-a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605.scope: Deactivated successfully.
Jan 20 09:56:09 np0005588919 podman[276947]: 2026-01-20 14:56:09.899083624 +0000 UTC m=+0.048026087 container died a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:56:09 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605-userdata-shm.mount: Deactivated successfully.
Jan 20 09:56:09 np0005588919 systemd[1]: var-lib-containers-storage-overlay-4b58477d32c8948a9385d790e9b6a62f6a46c01c4ace31c1af6ef64bffa12e02-merged.mount: Deactivated successfully.
Jan 20 09:56:09 np0005588919 podman[276947]: 2026-01-20 14:56:09.934483243 +0000 UTC m=+0.083425706 container cleanup a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 09:56:09 np0005588919 systemd[1]: libpod-conmon-a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605.scope: Deactivated successfully.
Jan 20 09:56:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:10.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:10 np0005588919 podman[276978]: 2026-01-20 14:56:10.014560654 +0000 UTC m=+0.054110399 container remove a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:56:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:10.021 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0facab-b37c-4a76-8e39-5d5044c87692]: (4, ('Tue Jan 20 02:56:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 (a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605)\na8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605\nTue Jan 20 02:56:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 (a8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605)\na8130d817215800fca2757ee75ba5302132a8e9fb828d8b29771e62e00d0d605\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:10.023 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f5bf120e-93d9-4036-8a66-a2cb54e85db2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:10.024 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9589011-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:10 np0005588919 kernel: tape9589011-b0: left promiscuous mode
Jan 20 09:56:10 np0005588919 nova_compute[225855]: 2026-01-20 14:56:10.075 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:10 np0005588919 nova_compute[225855]: 2026-01-20 14:56:10.094 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:10.098 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa2f416-f4b7-4fd8-ab98-5022dc232fb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:10 np0005588919 nova_compute[225855]: 2026-01-20 14:56:10.108 225859 DEBUG nova.compute.manager [req-d2a7ed7b-f96d-40f6-9f59-e55da8679369 req-8128ee16-d2f8-437b-8d28-d8e337842f77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-vif-unplugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:10 np0005588919 nova_compute[225855]: 2026-01-20 14:56:10.108 225859 DEBUG oslo_concurrency.lockutils [req-d2a7ed7b-f96d-40f6-9f59-e55da8679369 req-8128ee16-d2f8-437b-8d28-d8e337842f77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:10 np0005588919 nova_compute[225855]: 2026-01-20 14:56:10.108 225859 DEBUG oslo_concurrency.lockutils [req-d2a7ed7b-f96d-40f6-9f59-e55da8679369 req-8128ee16-d2f8-437b-8d28-d8e337842f77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:10 np0005588919 nova_compute[225855]: 2026-01-20 14:56:10.108 225859 DEBUG oslo_concurrency.lockutils [req-d2a7ed7b-f96d-40f6-9f59-e55da8679369 req-8128ee16-d2f8-437b-8d28-d8e337842f77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:10 np0005588919 nova_compute[225855]: 2026-01-20 14:56:10.109 225859 DEBUG nova.compute.manager [req-d2a7ed7b-f96d-40f6-9f59-e55da8679369 req-8128ee16-d2f8-437b-8d28-d8e337842f77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] No waiting events found dispatching network-vif-unplugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:10 np0005588919 nova_compute[225855]: 2026-01-20 14:56:10.109 225859 DEBUG nova.compute.manager [req-d2a7ed7b-f96d-40f6-9f59-e55da8679369 req-8128ee16-d2f8-437b-8d28-d8e337842f77 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-vif-unplugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:56:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:10.114 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c94c045e-6059-4fa6-81b0-00dd47e01c8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:10.115 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[14d120b6-f798-4086-9bb9-cf31555e88ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:10.135 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d336ae9a-c3f3-48ad-813e-9afdb2c47c72]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598640, 'reachable_time': 25093, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276991, 'error': None, 'target': 'ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:10 np0005588919 systemd[1]: run-netns-ovnmeta\x2de9589011\x2db728\x2d4b79\x2d9945\x2daa6c52dd0fc2.mount: Deactivated successfully.
Jan 20 09:56:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:10.138 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e9589011-b728-4b79-9945-aa6c52dd0fc2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:56:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:10.138 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[4a8347ea-c614-4d81-8a6e-8916e6cc3c42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:11 np0005588919 nova_compute[225855]: 2026-01-20 14:56:11.192 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:11.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:12.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.241 225859 DEBUG nova.compute.manager [req-862ecea2-8ef5-4b05-8c98-6c78939b20b9 req-feeb1419-f667-44d3-8b2e-581a72a64428 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.242 225859 DEBUG oslo_concurrency.lockutils [req-862ecea2-8ef5-4b05-8c98-6c78939b20b9 req-feeb1419-f667-44d3-8b2e-581a72a64428 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.242 225859 DEBUG oslo_concurrency.lockutils [req-862ecea2-8ef5-4b05-8c98-6c78939b20b9 req-feeb1419-f667-44d3-8b2e-581a72a64428 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.242 225859 DEBUG oslo_concurrency.lockutils [req-862ecea2-8ef5-4b05-8c98-6c78939b20b9 req-feeb1419-f667-44d3-8b2e-581a72a64428 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.242 225859 DEBUG nova.compute.manager [req-862ecea2-8ef5-4b05-8c98-6c78939b20b9 req-feeb1419-f667-44d3-8b2e-581a72a64428 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] No waiting events found dispatching network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.243 225859 WARNING nova.compute.manager [req-862ecea2-8ef5-4b05-8c98-6c78939b20b9 req-feeb1419-f667-44d3-8b2e-581a72a64428 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received unexpected event network-vif-plugged-e084df8c-a73e-4535-bcf7-de8adbafa9ae for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.386 225859 DEBUG nova.network.neutron [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.403 225859 INFO nova.compute.manager [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Took 2.53 seconds to deallocate network for instance.#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.453 225859 DEBUG nova.compute.manager [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Received event network-vif-plugged-73ed9acf-a178-4d9c-98a3-25f22489d41d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.454 225859 DEBUG oslo_concurrency.lockutils [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.454 225859 DEBUG oslo_concurrency.lockutils [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.455 225859 DEBUG oslo_concurrency.lockutils [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.455 225859 DEBUG nova.compute.manager [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Processing event network-vif-plugged-73ed9acf-a178-4d9c-98a3-25f22489d41d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.455 225859 DEBUG nova.compute.manager [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Received event network-vif-plugged-73ed9acf-a178-4d9c-98a3-25f22489d41d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.456 225859 DEBUG oslo_concurrency.lockutils [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.456 225859 DEBUG oslo_concurrency.lockutils [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.456 225859 DEBUG oslo_concurrency.lockutils [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.457 225859 DEBUG nova.compute.manager [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] No waiting events found dispatching network-vif-plugged-73ed9acf-a178-4d9c-98a3-25f22489d41d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.457 225859 WARNING nova.compute.manager [req-35fb815a-9fba-461a-8f8a-d244a36878e2 req-a835dafa-1400-44d5-98fb-517427180ca2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Received unexpected event network-vif-plugged-73ed9acf-a178-4d9c-98a3-25f22489d41d for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.460 225859 DEBUG nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.461 225859 DEBUG oslo_concurrency.lockutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.462 225859 DEBUG oslo_concurrency.lockutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.468 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920972.4680414, f6f09d34-bc44-451f-98e2-1b0701aeab3a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.468 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.478 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.487 225859 INFO nova.virt.libvirt.driver [-] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Instance spawned successfully.#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.487 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:56:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.497 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.501 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.514 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.514 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.515 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.516 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.517 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.517 225859 DEBUG nova.virt.libvirt.driver [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.525 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.604 225859 DEBUG oslo_concurrency.processutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.634 225859 INFO nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Took 11.12 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:56:12 np0005588919 nova_compute[225855]: 2026-01-20 14:56:12.635 225859 DEBUG nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2494222947' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:13 np0005588919 nova_compute[225855]: 2026-01-20 14:56:13.022 225859 INFO nova.compute.manager [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Took 12.32 seconds to build instance.#033[00m
Jan 20 09:56:13 np0005588919 nova_compute[225855]: 2026-01-20 14:56:13.039 225859 DEBUG oslo_concurrency.lockutils [None req-f2371872-7b78-4e90-8e3f-234488572008 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:13 np0005588919 nova_compute[225855]: 2026-01-20 14:56:13.040 225859 DEBUG oslo_concurrency.processutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:13 np0005588919 nova_compute[225855]: 2026-01-20 14:56:13.050 225859 DEBUG nova.compute.provider_tree [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:56:13 np0005588919 nova_compute[225855]: 2026-01-20 14:56:13.069 225859 DEBUG nova.scheduler.client.report [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:56:13 np0005588919 nova_compute[225855]: 2026-01-20 14:56:13.090 225859 DEBUG oslo_concurrency.lockutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:13 np0005588919 nova_compute[225855]: 2026-01-20 14:56:13.129 225859 INFO nova.scheduler.client.report [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Deleted allocations for instance 3bec73f6-5255-44c0-8a10-a64c7e86c0c2#033[00m
Jan 20 09:56:13 np0005588919 nova_compute[225855]: 2026-01-20 14:56:13.364 225859 DEBUG oslo_concurrency.lockutils [None req-52e4d9fa-cf15-4d5d-9e86-eff0b02cb5ba 34eb73f628994c11801d447148d5f142 b1e83af992c94112a965575784639d77 - - default default] Lock "3bec73f6-5255-44c0-8a10-a64c7e86c0c2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:13.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:13 np0005588919 nova_compute[225855]: 2026-01-20 14:56:13.662 225859 DEBUG nova.compute.manager [req-9a9499e4-e538-4c97-bf2c-20a7dc379c98 req-19e2c7a1-b434-4c13-a2d8-2c4149d98f94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Received event network-vif-deleted-e084df8c-a73e-4535-bcf7-de8adbafa9ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:14.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:14 np0005588919 podman[277066]: 2026-01-20 14:56:14.065833875 +0000 UTC m=+0.103325208 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:56:14 np0005588919 nova_compute[225855]: 2026-01-20 14:56:14.471 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:15.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:16.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:16 np0005588919 nova_compute[225855]: 2026-01-20 14:56:16.232 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:16.415 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:16.415 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:16.416 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.057 225859 DEBUG oslo_concurrency.lockutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Acquiring lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.058 225859 DEBUG oslo_concurrency.lockutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.058 225859 DEBUG oslo_concurrency.lockutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Acquiring lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.059 225859 DEBUG oslo_concurrency.lockutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.059 225859 DEBUG oslo_concurrency.lockutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.060 225859 INFO nova.compute.manager [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Terminating instance#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.062 225859 DEBUG nova.compute.manager [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:56:17 np0005588919 kernel: tap73ed9acf-a1 (unregistering): left promiscuous mode
Jan 20 09:56:17 np0005588919 NetworkManager[49104]: <info>  [1768920977.1182] device (tap73ed9acf-a1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.125 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:17 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:17Z|00522|binding|INFO|Releasing lport 73ed9acf-a178-4d9c-98a3-25f22489d41d from this chassis (sb_readonly=0)
Jan 20 09:56:17 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:17Z|00523|binding|INFO|Setting lport 73ed9acf-a178-4d9c-98a3-25f22489d41d down in Southbound
Jan 20 09:56:17 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:17Z|00524|binding|INFO|Removing iface tap73ed9acf-a1 ovn-installed in OVS
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.133 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:7a:7c 10.100.0.7'], port_security=['fa:16:3e:a5:7a:7c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f6f09d34-bc44-451f-98e2-1b0701aeab3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3dc1854-2a38-414a-a424-2ff753e5a7da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4be5b75b5dcb4eeea9759f7c4a779ffa', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6171706d-94c4-4c43-b4b2-ef4cbdfdf97c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf74ec16-3ea2-4ca6-9e5e-52ec9c203b9d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=73ed9acf-a178-4d9c-98a3-25f22489d41d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.134 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 73ed9acf-a178-4d9c-98a3-25f22489d41d in datapath d3dc1854-2a38-414a-a424-2ff753e5a7da unbound from our chassis#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.135 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d3dc1854-2a38-414a-a424-2ff753e5a7da, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.136 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[60813d76-7797-4175-9f83-a8e621f205ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.137 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da namespace which is not needed anymore#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.145 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:17 np0005588919 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000081.scope: Deactivated successfully.
Jan 20 09:56:17 np0005588919 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000081.scope: Consumed 5.309s CPU time.
Jan 20 09:56:17 np0005588919 systemd-machined[194361]: Machine qemu-61-instance-00000081 terminated.
Jan 20 09:56:17 np0005588919 neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da[276915]: [NOTICE]   (276919) : haproxy version is 2.8.14-c23fe91
Jan 20 09:56:17 np0005588919 neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da[276915]: [NOTICE]   (276919) : path to executable is /usr/sbin/haproxy
Jan 20 09:56:17 np0005588919 neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da[276915]: [WARNING]  (276919) : Exiting Master process...
Jan 20 09:56:17 np0005588919 neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da[276915]: [ALERT]    (276919) : Current worker (276921) exited with code 143 (Terminated)
Jan 20 09:56:17 np0005588919 neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da[276915]: [WARNING]  (276919) : All workers exited. Exiting... (0)
Jan 20 09:56:17 np0005588919 systemd[1]: libpod-17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e.scope: Deactivated successfully.
Jan 20 09:56:17 np0005588919 podman[277118]: 2026-01-20 14:56:17.286823707 +0000 UTC m=+0.059829239 container died 17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.302 225859 INFO nova.virt.libvirt.driver [-] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Instance destroyed successfully.#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.303 225859 DEBUG nova.objects.instance [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lazy-loading 'resources' on Instance uuid f6f09d34-bc44-451f-98e2-1b0701aeab3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:17 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e-userdata-shm.mount: Deactivated successfully.
Jan 20 09:56:17 np0005588919 systemd[1]: var-lib-containers-storage-overlay-79772ae235c75facde7fa71bd7324d0c967aed4152a0bb0f69655c6f1f473ade-merged.mount: Deactivated successfully.
Jan 20 09:56:17 np0005588919 podman[277118]: 2026-01-20 14:56:17.335565623 +0000 UTC m=+0.108571135 container cleanup 17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 09:56:17 np0005588919 systemd[1]: libpod-conmon-17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e.scope: Deactivated successfully.
Jan 20 09:56:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:17.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.399 225859 DEBUG nova.virt.libvirt.vif [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:55:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-505687879',display_name='tempest-ServerMetadataTestJSON-server-505687879',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-505687879',id=129,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:56:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={key1='alt1',key2='value2',key3='value3'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4be5b75b5dcb4eeea9759f7c4a779ffa',ramdisk_id='',reservation_id='r-cbmgnji8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataTestJSON-599451381',owner_user_name='tempest-ServerMetadataTestJSON-599451381-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:56:16Z,user_data=None,user_id='f6f144f1d330427e82e84c891e9a8a89',uuid=f6f09d34-bc44-451f-98e2-1b0701aeab3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "address": "fa:16:3e:a5:7a:7c", "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-645777946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4be5b75b5dcb4eeea9759f7c4a779ffa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ed9acf-a1", "ovs_interfaceid": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.400 225859 DEBUG nova.network.os_vif_util [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Converting VIF {"id": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "address": "fa:16:3e:a5:7a:7c", "network": {"id": "d3dc1854-2a38-414a-a424-2ff753e5a7da", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-645777946-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4be5b75b5dcb4eeea9759f7c4a779ffa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ed9acf-a1", "ovs_interfaceid": "73ed9acf-a178-4d9c-98a3-25f22489d41d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.401 225859 DEBUG nova.network.os_vif_util [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:7a:7c,bridge_name='br-int',has_traffic_filtering=True,id=73ed9acf-a178-4d9c-98a3-25f22489d41d,network=Network(d3dc1854-2a38-414a-a424-2ff753e5a7da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ed9acf-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.401 225859 DEBUG os_vif [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:7a:7c,bridge_name='br-int',has_traffic_filtering=True,id=73ed9acf-a178-4d9c-98a3-25f22489d41d,network=Network(d3dc1854-2a38-414a-a424-2ff753e5a7da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ed9acf-a1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.403 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.403 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73ed9acf-a1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:17 np0005588919 podman[277160]: 2026-01-20 14:56:17.406426313 +0000 UTC m=+0.047948684 container remove 17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.431 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.433 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.435 225859 INFO os_vif [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:7a:7c,bridge_name='br-int',has_traffic_filtering=True,id=73ed9acf-a178-4d9c-98a3-25f22489d41d,network=Network(d3dc1854-2a38-414a-a424-2ff753e5a7da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ed9acf-a1')#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.436 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f8392c6b-3d4a-487a-8258-bce73d9cded1]: (4, ('Tue Jan 20 02:56:17 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da (17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e)\n17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e\nTue Jan 20 02:56:17 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da (17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e)\n17702d2dea4cfe79c035f6a1cbf056da9364208b58b99e46738aa45a65e6f66e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.439 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb9a3ce-6c3f-4f90-ba5e-cf32d29ce676]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.440 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3dc1854-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:17 np0005588919 kernel: tapd3dc1854-20: left promiscuous mode
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.458 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.463 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e93d2a-c9bd-411c-8999-1ef3bb980515]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.484 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[daa65bbf-c95f-4f41-be2f-3bc25352120b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.485 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb6c80d-1a96-4012-95d0-a57342d037e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.503 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4ff84a-41b0-4c4f-9a7f-81c9fb7b027a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600791, 'reachable_time': 17265, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277193, 'error': None, 'target': 'ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.505 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d3dc1854-2a38-414a-a424-2ff753e5a7da deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:56:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:17.506 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[7ed8882d-aa4a-49db-8583-aa83060fef30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:17 np0005588919 systemd[1]: run-netns-ovnmeta\x2dd3dc1854\x2d2a38\x2d414a\x2da424\x2d2ff753e5a7da.mount: Deactivated successfully.
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.827 225859 INFO nova.virt.libvirt.driver [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Deleting instance files /var/lib/nova/instances/f6f09d34-bc44-451f-98e2-1b0701aeab3a_del#033[00m
Jan 20 09:56:17 np0005588919 nova_compute[225855]: 2026-01-20 14:56:17.829 225859 INFO nova.virt.libvirt.driver [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Deletion of /var/lib/nova/instances/f6f09d34-bc44-451f-98e2-1b0701aeab3a_del complete#033[00m
Jan 20 09:56:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:18.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.030 225859 INFO nova.compute.manager [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Took 0.97 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.031 225859 DEBUG oslo.service.loopingcall [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.031 225859 DEBUG nova.compute.manager [-] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.032 225859 DEBUG nova.network.neutron [-] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.664 225859 DEBUG nova.compute.manager [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Received event network-vif-unplugged-73ed9acf-a178-4d9c-98a3-25f22489d41d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.664 225859 DEBUG oslo_concurrency.lockutils [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.665 225859 DEBUG oslo_concurrency.lockutils [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.665 225859 DEBUG oslo_concurrency.lockutils [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.665 225859 DEBUG nova.compute.manager [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] No waiting events found dispatching network-vif-unplugged-73ed9acf-a178-4d9c-98a3-25f22489d41d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.665 225859 DEBUG nova.compute.manager [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Received event network-vif-unplugged-73ed9acf-a178-4d9c-98a3-25f22489d41d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.665 225859 DEBUG nova.compute.manager [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Received event network-vif-plugged-73ed9acf-a178-4d9c-98a3-25f22489d41d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.666 225859 DEBUG oslo_concurrency.lockutils [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.666 225859 DEBUG oslo_concurrency.lockutils [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.666 225859 DEBUG oslo_concurrency.lockutils [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.666 225859 DEBUG nova.compute.manager [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] No waiting events found dispatching network-vif-plugged-73ed9acf-a178-4d9c-98a3-25f22489d41d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:18 np0005588919 nova_compute[225855]: 2026-01-20 14:56:18.666 225859 WARNING nova.compute.manager [req-0f4035ba-b83f-4459-867a-94982d716b05 req-7b02a496-dfb6-40b8-bb95-e62a3cd6b7c3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Received unexpected event network-vif-plugged-73ed9acf-a178-4d9c-98a3-25f22489d41d for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.109 225859 DEBUG nova.network.neutron [-] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.221 225859 DEBUG nova.compute.manager [req-81172894-9c40-473f-9415-1e65f32394b5 req-9b0e9c94-fa57-4734-94cd-5f50387e025f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Received event network-vif-deleted-73ed9acf-a178-4d9c-98a3-25f22489d41d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.222 225859 INFO nova.compute.manager [req-81172894-9c40-473f-9415-1e65f32394b5 req-9b0e9c94-fa57-4734-94cd-5f50387e025f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Neutron deleted interface 73ed9acf-a178-4d9c-98a3-25f22489d41d; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.222 225859 DEBUG nova.network.neutron [req-81172894-9c40-473f-9415-1e65f32394b5 req-9b0e9c94-fa57-4734-94cd-5f50387e025f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.244 225859 INFO nova.compute.manager [-] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Took 1.21 seconds to deallocate network for instance.#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.257 225859 DEBUG nova.compute.manager [req-81172894-9c40-473f-9415-1e65f32394b5 req-9b0e9c94-fa57-4734-94cd-5f50387e025f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Detach interface failed, port_id=73ed9acf-a178-4d9c-98a3-25f22489d41d, reason: Instance f6f09d34-bc44-451f-98e2-1b0701aeab3a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 09:56:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:19.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.412 225859 DEBUG oslo_concurrency.lockutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.413 225859 DEBUG oslo_concurrency.lockutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.472 225859 DEBUG oslo_concurrency.processutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:19 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3206897350' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.895 225859 DEBUG oslo_concurrency.processutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.900 225859 DEBUG nova.compute.provider_tree [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.918 225859 DEBUG nova.scheduler.client.report [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:56:19 np0005588919 nova_compute[225855]: 2026-01-20 14:56:19.980 225859 DEBUG oslo_concurrency.lockutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:20 np0005588919 nova_compute[225855]: 2026-01-20 14:56:20.012 225859 INFO nova.scheduler.client.report [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Deleted allocations for instance f6f09d34-bc44-451f-98e2-1b0701aeab3a#033[00m
Jan 20 09:56:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:20.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:20 np0005588919 nova_compute[225855]: 2026-01-20 14:56:20.082 225859 DEBUG oslo_concurrency.lockutils [None req-de8ab48f-f90f-41dc-bd39-b1222b6e5a67 f6f144f1d330427e82e84c891e9a8a89 4be5b75b5dcb4eeea9759f7c4a779ffa - - default default] Lock "f6f09d34-bc44-451f-98e2-1b0701aeab3a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.025s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:20 np0005588919 nova_compute[225855]: 2026-01-20 14:56:20.816 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:21 np0005588919 nova_compute[225855]: 2026-01-20 14:56:21.076 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:21 np0005588919 nova_compute[225855]: 2026-01-20 14:56:21.231 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:21.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:22.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:22 np0005588919 nova_compute[225855]: 2026-01-20 14:56:22.432 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:23.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:24.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:24 np0005588919 nova_compute[225855]: 2026-01-20 14:56:24.431 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920969.4307125, 3bec73f6-5255-44c0-8a10-a64c7e86c0c2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:24 np0005588919 nova_compute[225855]: 2026-01-20 14:56:24.432 225859 INFO nova.compute.manager [-] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:56:24 np0005588919 nova_compute[225855]: 2026-01-20 14:56:24.454 225859 DEBUG nova.compute.manager [None req-98c1837b-e595-4558-9b3f-be532e154f51 - - - - - -] [instance: 3bec73f6-5255-44c0-8a10-a64c7e86c0c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:25.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:56:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:56:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:56:26 np0005588919 podman[277355]: 2026-01-20 14:56:26.012148394 +0000 UTC m=+0.054754087 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 20 09:56:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:26.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:26 np0005588919 nova_compute[225855]: 2026-01-20 14:56:26.234 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:27.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:27 np0005588919 nova_compute[225855]: 2026-01-20 14:56:27.434 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:27 np0005588919 nova_compute[225855]: 2026-01-20 14:56:27.500 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "e32ecf59-145a-4ae9-a91e-288419407cd0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:27 np0005588919 nova_compute[225855]: 2026-01-20 14:56:27.501 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:27 np0005588919 nova_compute[225855]: 2026-01-20 14:56:27.531 225859 DEBUG nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:56:27 np0005588919 nova_compute[225855]: 2026-01-20 14:56:27.640 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:27 np0005588919 nova_compute[225855]: 2026-01-20 14:56:27.640 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:27 np0005588919 nova_compute[225855]: 2026-01-20 14:56:27.646 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:56:27 np0005588919 nova_compute[225855]: 2026-01-20 14:56:27.646 225859 INFO nova.compute.claims [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:56:27 np0005588919 nova_compute[225855]: 2026-01-20 14:56:27.762 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:28.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1100107449' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.173 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.179 225859 DEBUG nova.compute.provider_tree [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.196 225859 DEBUG nova.scheduler.client.report [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.228 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.229 225859 DEBUG nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.286 225859 DEBUG nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.287 225859 DEBUG nova.network.neutron [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.311 225859 INFO nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.328 225859 DEBUG nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.418 225859 DEBUG nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.419 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.420 225859 INFO nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Creating image(s)#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.446 225859 DEBUG nova.storage.rbd_utils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image e32ecf59-145a-4ae9-a91e-288419407cd0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.473 225859 DEBUG nova.storage.rbd_utils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image e32ecf59-145a-4ae9-a91e-288419407cd0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.499 225859 DEBUG nova.storage.rbd_utils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image e32ecf59-145a-4ae9-a91e-288419407cd0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.502 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.562 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.563 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.564 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.564 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.589 225859 DEBUG nova.storage.rbd_utils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image e32ecf59-145a-4ae9-a91e-288419407cd0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.593 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 e32ecf59-145a-4ae9-a91e-288419407cd0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.913 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 e32ecf59-145a-4ae9-a91e-288419407cd0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.319s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:28 np0005588919 nova_compute[225855]: 2026-01-20 14:56:28.995 225859 DEBUG nova.storage.rbd_utils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] resizing rbd image e32ecf59-145a-4ae9-a91e-288419407cd0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:56:29 np0005588919 nova_compute[225855]: 2026-01-20 14:56:29.097 225859 DEBUG nova.objects.instance [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lazy-loading 'migration_context' on Instance uuid e32ecf59-145a-4ae9-a91e-288419407cd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:29 np0005588919 nova_compute[225855]: 2026-01-20 14:56:29.126 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:56:29 np0005588919 nova_compute[225855]: 2026-01-20 14:56:29.127 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Ensure instance console log exists: /var/lib/nova/instances/e32ecf59-145a-4ae9-a91e-288419407cd0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:56:29 np0005588919 nova_compute[225855]: 2026-01-20 14:56:29.127 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:29 np0005588919 nova_compute[225855]: 2026-01-20 14:56:29.127 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:29 np0005588919 nova_compute[225855]: 2026-01-20 14:56:29.128 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:29 np0005588919 nova_compute[225855]: 2026-01-20 14:56:29.298 225859 DEBUG nova.policy [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '395a5c503218411284bc94c45263d1fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca6cd0afe0ab41e3ab36d21a4129f734', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:56:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:29.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:30.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:31 np0005588919 nova_compute[225855]: 2026-01-20 14:56:31.236 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:31.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:31 np0005588919 nova_compute[225855]: 2026-01-20 14:56:31.565 225859 DEBUG nova.network.neutron [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Successfully created port: 5909a21f-c1fb-4265-a7de-a6b0e6136194 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:56:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:32.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.301 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768920977.2993426, f6f09d34-bc44-451f-98e2-1b0701aeab3a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.301 225859 INFO nova.compute.manager [-] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.324 225859 DEBUG nova.compute.manager [None req-52c34ae0-8357-49ec-b3f8-749e5b63a289 - - - - - -] [instance: f6f09d34-bc44-451f-98e2-1b0701aeab3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:56:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.415 225859 DEBUG nova.network.neutron [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Successfully updated port: 5909a21f-c1fb-4265-a7de-a6b0e6136194 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.434 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "refresh_cache-e32ecf59-145a-4ae9-a91e-288419407cd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.434 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquired lock "refresh_cache-e32ecf59-145a-4ae9-a91e-288419407cd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.435 225859 DEBUG nova.network.neutron [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.436 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.548 225859 DEBUG nova.compute.manager [req-663c50a3-d523-449b-a9bb-b5dfb198dff5 req-dcb08139-88cf-441d-a9f5-3baa9f6754bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Received event network-changed-5909a21f-c1fb-4265-a7de-a6b0e6136194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.549 225859 DEBUG nova.compute.manager [req-663c50a3-d523-449b-a9bb-b5dfb198dff5 req-dcb08139-88cf-441d-a9f5-3baa9f6754bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Refreshing instance network info cache due to event network-changed-5909a21f-c1fb-4265-a7de-a6b0e6136194. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.549 225859 DEBUG oslo_concurrency.lockutils [req-663c50a3-d523-449b-a9bb-b5dfb198dff5 req-dcb08139-88cf-441d-a9f5-3baa9f6754bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-e32ecf59-145a-4ae9-a91e-288419407cd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:32 np0005588919 nova_compute[225855]: 2026-01-20 14:56:32.851 225859 DEBUG nova.network.neutron [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:56:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:33.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.720 225859 DEBUG nova.network.neutron [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Updating instance_info_cache with network_info: [{"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.749 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Releasing lock "refresh_cache-e32ecf59-145a-4ae9-a91e-288419407cd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.749 225859 DEBUG nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Instance network_info: |[{"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.750 225859 DEBUG oslo_concurrency.lockutils [req-663c50a3-d523-449b-a9bb-b5dfb198dff5 req-dcb08139-88cf-441d-a9f5-3baa9f6754bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-e32ecf59-145a-4ae9-a91e-288419407cd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.750 225859 DEBUG nova.network.neutron [req-663c50a3-d523-449b-a9bb-b5dfb198dff5 req-dcb08139-88cf-441d-a9f5-3baa9f6754bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Refreshing network info cache for port 5909a21f-c1fb-4265-a7de-a6b0e6136194 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.753 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Start _get_guest_xml network_info=[{"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.758 225859 WARNING nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.762 225859 DEBUG nova.virt.libvirt.host [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.762 225859 DEBUG nova.virt.libvirt.host [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.770 225859 DEBUG nova.virt.libvirt.host [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.771 225859 DEBUG nova.virt.libvirt.host [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.772 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.772 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.772 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.773 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.773 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.773 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.773 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.774 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.774 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.774 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.774 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.775 225859 DEBUG nova.virt.hardware [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:56:33 np0005588919 nova_compute[225855]: 2026-01-20 14:56:33.777 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:34.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:56:34 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2907952302' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:56:34 np0005588919 nova_compute[225855]: 2026-01-20 14:56:34.251 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:34 np0005588919 nova_compute[225855]: 2026-01-20 14:56:34.279 225859 DEBUG nova.storage.rbd_utils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image e32ecf59-145a-4ae9-a91e-288419407cd0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:34 np0005588919 nova_compute[225855]: 2026-01-20 14:56:34.284 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:35.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:56:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2104464304' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.811 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.815 225859 DEBUG nova.virt.libvirt.vif [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:56:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1537565903',display_name='tempest-ServersTestJSON-server-1537565903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1537565903',id=132,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ca6cd0afe0ab41e3ab36d21a4129f734',ramdisk_id='',reservation_id='r-iv93ouga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-405461620',owner_user_name='tempest-ServersTestJSON-405461620-project-member'},
tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:28Z,user_data=None,user_id='395a5c503218411284bc94c45263d1fb',uuid=e32ecf59-145a-4ae9-a91e-288419407cd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.815 225859 DEBUG nova.network.os_vif_util [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converting VIF {"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.817 225859 DEBUG nova.network.os_vif_util [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:7a:cf,bridge_name='br-int',has_traffic_filtering=True,id=5909a21f-c1fb-4265-a7de-a6b0e6136194,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5909a21f-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.819 225859 DEBUG nova.objects.instance [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lazy-loading 'pci_devices' on Instance uuid e32ecf59-145a-4ae9-a91e-288419407cd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.841 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <uuid>e32ecf59-145a-4ae9-a91e-288419407cd0</uuid>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <name>instance-00000084</name>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServersTestJSON-server-1537565903</nova:name>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:56:33</nova:creationTime>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <nova:user uuid="395a5c503218411284bc94c45263d1fb">tempest-ServersTestJSON-405461620-project-member</nova:user>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <nova:project uuid="ca6cd0afe0ab41e3ab36d21a4129f734">tempest-ServersTestJSON-405461620</nova:project>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <nova:port uuid="5909a21f-c1fb-4265-a7de-a6b0e6136194">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <entry name="serial">e32ecf59-145a-4ae9-a91e-288419407cd0</entry>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <entry name="uuid">e32ecf59-145a-4ae9-a91e-288419407cd0</entry>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/e32ecf59-145a-4ae9-a91e-288419407cd0_disk">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/e32ecf59-145a-4ae9-a91e-288419407cd0_disk.config">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:ac:7a:cf"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <target dev="tap5909a21f-c1"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/e32ecf59-145a-4ae9-a91e-288419407cd0/console.log" append="off"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:56:35 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:56:35 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:56:35 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:56:35 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.843 225859 DEBUG nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Preparing to wait for external event network-vif-plugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.844 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.845 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.845 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.847 225859 DEBUG nova.virt.libvirt.vif [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:56:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1537565903',display_name='tempest-ServersTestJSON-server-1537565903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1537565903',id=132,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ca6cd0afe0ab41e3ab36d21a4129f734',ramdisk_id='',reservation_id='r-iv93ouga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-405461620',owner_user_name='tempest-ServersTestJSON-405461620-project
-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:28Z,user_data=None,user_id='395a5c503218411284bc94c45263d1fb',uuid=e32ecf59-145a-4ae9-a91e-288419407cd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.847 225859 DEBUG nova.network.os_vif_util [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converting VIF {"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.848 225859 DEBUG nova.network.os_vif_util [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:7a:cf,bridge_name='br-int',has_traffic_filtering=True,id=5909a21f-c1fb-4265-a7de-a6b0e6136194,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5909a21f-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.849 225859 DEBUG os_vif [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:7a:cf,bridge_name='br-int',has_traffic_filtering=True,id=5909a21f-c1fb-4265-a7de-a6b0e6136194,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5909a21f-c1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.850 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.851 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.852 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.856 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.856 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5909a21f-c1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.857 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5909a21f-c1, col_values=(('external_ids', {'iface-id': '5909a21f-c1fb-4265-a7de-a6b0e6136194', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:7a:cf', 'vm-uuid': 'e32ecf59-145a-4ae9-a91e-288419407cd0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:35 np0005588919 NetworkManager[49104]: <info>  [1768920995.8592] manager: (tap5909a21f-c1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.861 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.866 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.866 225859 INFO os_vif [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:7a:cf,bridge_name='br-int',has_traffic_filtering=True,id=5909a21f-c1fb-4265-a7de-a6b0e6136194,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5909a21f-c1')#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.930 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.931 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.931 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] No VIF found with MAC fa:16:3e:ac:7a:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.932 225859 INFO nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Using config drive#033[00m
Jan 20 09:56:35 np0005588919 nova_compute[225855]: 2026-01-20 14:56:35.953 225859 DEBUG nova.storage.rbd_utils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image e32ecf59-145a-4ae9-a91e-288419407cd0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:36.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.238 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.248 225859 DEBUG nova.network.neutron [req-663c50a3-d523-449b-a9bb-b5dfb198dff5 req-dcb08139-88cf-441d-a9f5-3baa9f6754bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Updated VIF entry in instance network info cache for port 5909a21f-c1fb-4265-a7de-a6b0e6136194. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.248 225859 DEBUG nova.network.neutron [req-663c50a3-d523-449b-a9bb-b5dfb198dff5 req-dcb08139-88cf-441d-a9f5-3baa9f6754bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Updating instance_info_cache with network_info: [{"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.264 225859 DEBUG oslo_concurrency.lockutils [req-663c50a3-d523-449b-a9bb-b5dfb198dff5 req-dcb08139-88cf-441d-a9f5-3baa9f6754bb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-e32ecf59-145a-4ae9-a91e-288419407cd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.527 225859 INFO nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Creating config drive at /var/lib/nova/instances/e32ecf59-145a-4ae9-a91e-288419407cd0/disk.config#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.532 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e32ecf59-145a-4ae9-a91e-288419407cd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjv1s2hi7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.661 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e32ecf59-145a-4ae9-a91e-288419407cd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjv1s2hi7" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.693 225859 DEBUG nova.storage.rbd_utils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image e32ecf59-145a-4ae9-a91e-288419407cd0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.697 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e32ecf59-145a-4ae9-a91e-288419407cd0/disk.config e32ecf59-145a-4ae9-a91e-288419407cd0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.884 225859 DEBUG oslo_concurrency.processutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e32ecf59-145a-4ae9-a91e-288419407cd0/disk.config e32ecf59-145a-4ae9-a91e-288419407cd0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.187s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.885 225859 INFO nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Deleting local config drive /var/lib/nova/instances/e32ecf59-145a-4ae9-a91e-288419407cd0/disk.config because it was imported into RBD.#033[00m
Jan 20 09:56:36 np0005588919 kernel: tap5909a21f-c1: entered promiscuous mode
Jan 20 09:56:36 np0005588919 NetworkManager[49104]: <info>  [1768920996.9416] manager: (tap5909a21f-c1): new Tun device (/org/freedesktop/NetworkManager/Devices/227)
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.942 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:36Z|00525|binding|INFO|Claiming lport 5909a21f-c1fb-4265-a7de-a6b0e6136194 for this chassis.
Jan 20 09:56:36 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:36Z|00526|binding|INFO|5909a21f-c1fb-4265-a7de-a6b0e6136194: Claiming fa:16:3e:ac:7a:cf 10.100.0.13
Jan 20 09:56:36 np0005588919 nova_compute[225855]: 2026-01-20 14:56:36.948 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:36.957 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:7a:cf 10.100.0.13'], port_security=['fa:16:3e:ac:7a:cf 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e32ecf59-145a-4ae9-a91e-288419407cd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca6cd0afe0ab41e3ab36d21a4129f734', 'neutron:revision_number': '2', 'neutron:security_group_ids': '819ea4ae-b994-44d1-9da3-8b0ca609fb2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee620e3e-ef7e-4826-b394-b8a89442b353, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=5909a21f-c1fb-4265-a7de-a6b0e6136194) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:36.958 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 5909a21f-c1fb-4265-a7de-a6b0e6136194 in datapath f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c bound to our chassis#033[00m
Jan 20 09:56:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:36.960 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c#033[00m
Jan 20 09:56:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:36.970 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[335daa75-0b3f-4ff8-a760-3b065b95333b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:36.971 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf4c8474b-01 in ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:56:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:36.973 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf4c8474b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:56:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:36.973 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3e64f09f-14f7-4773-806b-86edc654ff43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:36.974 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bd86d5d3-3de2-4e69-a5eb-22491f1e0957]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:36 np0005588919 systemd-udevd[277803]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:56:36 np0005588919 systemd-machined[194361]: New machine qemu-62-instance-00000084.
Jan 20 09:56:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:36.989 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4e8523-48ac-42aa-aff0-39f0adaadf4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:36 np0005588919 NetworkManager[49104]: <info>  [1768920996.9940] device (tap5909a21f-c1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:56:36 np0005588919 NetworkManager[49104]: <info>  [1768920996.9952] device (tap5909a21f-c1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:56:37 np0005588919 systemd[1]: Started Virtual Machine qemu-62-instance-00000084.
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.014 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ca03829a-4709-41ad-bd9f-1425f0197666]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.015 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.017 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.024 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:37Z|00527|binding|INFO|Setting lport 5909a21f-c1fb-4265-a7de-a6b0e6136194 ovn-installed in OVS
Jan 20 09:56:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:37Z|00528|binding|INFO|Setting lport 5909a21f-c1fb-4265-a7de-a6b0e6136194 up in Southbound
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.030 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.042 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ec49f21f-052e-4725-ad5b-791bf1ce8799]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 NetworkManager[49104]: <info>  [1768920997.0480] manager: (tapf4c8474b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/228)
Jan 20 09:56:37 np0005588919 systemd-udevd[277806]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.047 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c29af9aa-33ae-430d-ad03-2b48f63b1fd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.077 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f3e5db-0a0b-45ed-99df-6f1f3dd7ac11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.079 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[df7965af-d284-421d-b240-bee400b800fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 NetworkManager[49104]: <info>  [1768920997.1051] device (tapf4c8474b-00): carrier: link connected
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.111 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e737b030-5159-4c3a-a0f9-e1d46851c44c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.125 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7844324f-2e8d-4d3a-9c15-9fbc88f9226e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4c8474b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:a2:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 603603, 'reachable_time': 28670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277835, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.142 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ac3c1e-6a3e-4bbd-979a-b82de0e33bf2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe14:a25f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 603603, 'tstamp': 603603}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277836, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.156 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e7147c71-b40c-41dc-915d-712f1b604476]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4c8474b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:a2:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 603603, 'reachable_time': 28670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 277837, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.183 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ec664a11-75b3-423a-bf2a-1bc4f332e3bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.252 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[13c69e15-ad36-4875-b772-65313f766cdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.254 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4c8474b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.254 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.254 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4c8474b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.258 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:37 np0005588919 NetworkManager[49104]: <info>  [1768920997.2594] manager: (tapf4c8474b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Jan 20 09:56:37 np0005588919 kernel: tapf4c8474b-00: entered promiscuous mode
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.261 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4c8474b-00, col_values=(('external_ids', {'iface-id': '8c6fd3ab-70a8-4e63-99de-f2e15ac0207f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:37 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:37Z|00529|binding|INFO|Releasing lport 8c6fd3ab-70a8-4e63-99de-f2e15ac0207f from this chassis (sb_readonly=0)
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.267 225859 DEBUG nova.compute.manager [req-6b8bd7ce-1fa2-4c23-ad25-133b77685826 req-fe0ec297-5c52-480b-baf1-3dbc6be9bb2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Received event network-vif-plugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.268 225859 DEBUG oslo_concurrency.lockutils [req-6b8bd7ce-1fa2-4c23-ad25-133b77685826 req-fe0ec297-5c52-480b-baf1-3dbc6be9bb2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.269 225859 DEBUG oslo_concurrency.lockutils [req-6b8bd7ce-1fa2-4c23-ad25-133b77685826 req-fe0ec297-5c52-480b-baf1-3dbc6be9bb2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.269 225859 DEBUG oslo_concurrency.lockutils [req-6b8bd7ce-1fa2-4c23-ad25-133b77685826 req-fe0ec297-5c52-480b-baf1-3dbc6be9bb2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.270 225859 DEBUG nova.compute.manager [req-6b8bd7ce-1fa2-4c23-ad25-133b77685826 req-fe0ec297-5c52-480b-baf1-3dbc6be9bb2f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Processing event network-vif-plugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.271 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.297 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.297 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.298 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3f2f228a-ecc3-418c-a5e4-0adbd69ee42a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.299 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c.pid.haproxy
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.300 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'env', 'PROCESS_TAG=haproxy-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:56:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:37.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.693 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920997.6930447, e32ecf59-145a-4ae9-a91e-288419407cd0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:37 np0005588919 podman[277911]: 2026-01-20 14:56:37.694673125 +0000 UTC m=+0.057923616 container create 998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.694 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] VM Started (Lifecycle Event)#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.698 225859 DEBUG nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.704 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.708 225859 INFO nova.virt.libvirt.driver [-] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Instance spawned successfully.#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.708 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.729 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:37 np0005588919 systemd[1]: Started libpod-conmon-998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250.scope.
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.736 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.739 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.740 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.740 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.741 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.741 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.742 225859 DEBUG nova.virt.libvirt.driver [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:37 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:56:37 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6beda3a917b01666983f6717d8a15faa248a8e035acb78285162928c0b4a3550/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:56:37 np0005588919 podman[277911]: 2026-01-20 14:56:37.66438391 +0000 UTC m=+0.027634441 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:56:37 np0005588919 podman[277911]: 2026-01-20 14:56:37.769972431 +0000 UTC m=+0.133222942 container init 998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 20 09:56:37 np0005588919 podman[277911]: 2026-01-20 14:56:37.775449115 +0000 UTC m=+0.138699606 container start 998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.778 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.779 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920997.694136, e32ecf59-145a-4ae9-a91e-288419407cd0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.779 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:56:37 np0005588919 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[277928]: [NOTICE]   (277932) : New worker (277934) forked
Jan 20 09:56:37 np0005588919 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[277928]: [NOTICE]   (277932) : Loading success.
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.814 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.817 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768920997.7039716, e32ecf59-145a-4ae9-a91e-288419407cd0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.817 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.826 225859 INFO nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Took 9.41 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.827 225859 DEBUG nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:37.838 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.839 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.841 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.871 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.894 225859 INFO nova.compute.manager [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Took 10.28 seconds to build instance.#033[00m
Jan 20 09:56:37 np0005588919 nova_compute[225855]: 2026-01-20 14:56:37.915 225859 DEBUG oslo_concurrency.lockutils [None req-dace9bde-26dd-4f65-90a1-a36f1180201a 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.414s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:38.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:38 np0005588919 nova_compute[225855]: 2026-01-20 14:56:38.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:39 np0005588919 nova_compute[225855]: 2026-01-20 14:56:39.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:39 np0005588919 nova_compute[225855]: 2026-01-20 14:56:39.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:56:39 np0005588919 nova_compute[225855]: 2026-01-20 14:56:39.394 225859 DEBUG nova.compute.manager [req-efb092f2-2234-4ee4-afa5-811d6b24b09f req-82e8dd03-24b1-4753-afcd-35f8df64e394 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Received event network-vif-plugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:39 np0005588919 nova_compute[225855]: 2026-01-20 14:56:39.395 225859 DEBUG oslo_concurrency.lockutils [req-efb092f2-2234-4ee4-afa5-811d6b24b09f req-82e8dd03-24b1-4753-afcd-35f8df64e394 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:39 np0005588919 nova_compute[225855]: 2026-01-20 14:56:39.395 225859 DEBUG oslo_concurrency.lockutils [req-efb092f2-2234-4ee4-afa5-811d6b24b09f req-82e8dd03-24b1-4753-afcd-35f8df64e394 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:39 np0005588919 nova_compute[225855]: 2026-01-20 14:56:39.395 225859 DEBUG oslo_concurrency.lockutils [req-efb092f2-2234-4ee4-afa5-811d6b24b09f req-82e8dd03-24b1-4753-afcd-35f8df64e394 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:39 np0005588919 nova_compute[225855]: 2026-01-20 14:56:39.395 225859 DEBUG nova.compute.manager [req-efb092f2-2234-4ee4-afa5-811d6b24b09f req-82e8dd03-24b1-4753-afcd-35f8df64e394 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] No waiting events found dispatching network-vif-plugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:39 np0005588919 nova_compute[225855]: 2026-01-20 14:56:39.395 225859 WARNING nova.compute.manager [req-efb092f2-2234-4ee4-afa5-811d6b24b09f req-82e8dd03-24b1-4753-afcd-35f8df64e394 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Received unexpected event network-vif-plugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 for instance with vm_state active and task_state None.#033[00m
Jan 20 09:56:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:39.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:40.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:40 np0005588919 nova_compute[225855]: 2026-01-20 14:56:40.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:40 np0005588919 nova_compute[225855]: 2026-01-20 14:56:40.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:56:40 np0005588919 nova_compute[225855]: 2026-01-20 14:56:40.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:56:40 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:56:40 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 09:56:40 np0005588919 nova_compute[225855]: 2026-01-20 14:56:40.502 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-e32ecf59-145a-4ae9-a91e-288419407cd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:40 np0005588919 nova_compute[225855]: 2026-01-20 14:56:40.503 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-e32ecf59-145a-4ae9-a91e-288419407cd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:40 np0005588919 nova_compute[225855]: 2026-01-20 14:56:40.503 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:56:40 np0005588919 nova_compute[225855]: 2026-01-20 14:56:40.503 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e32ecf59-145a-4ae9-a91e-288419407cd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:40 np0005588919 nova_compute[225855]: 2026-01-20 14:56:40.859 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.241 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:56:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:41.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.452 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.452 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.473 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "3426109c-5671-4cc7-89b6-fea13983f921" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.474 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.476 225859 DEBUG nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.506 225859 DEBUG nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.578 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.578 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.588 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.589 225859 INFO nova.compute.claims [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.605 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:41 np0005588919 nova_compute[225855]: 2026-01-20 14:56:41.738 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:42.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.061 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Updating instance_info_cache with network_info: [{"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.089 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-e32ecf59-145a-4ae9-a91e-288419407cd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.089 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.090 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1147169689' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.180 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.185 225859 DEBUG nova.compute.provider_tree [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.199 225859 DEBUG nova.scheduler.client.report [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.221 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.222 225859 DEBUG nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.224 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.230 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.230 225859 INFO nova.compute.claims [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.292 225859 DEBUG nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.293 225859 DEBUG nova.network.neutron [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.315 225859 INFO nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.340 225859 DEBUG nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.381 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.459 225859 DEBUG nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.461 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.461 225859 INFO nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Creating image(s)#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.485 225859 DEBUG nova.storage.rbd_utils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.512 225859 DEBUG nova.storage.rbd_utils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.546 225859 DEBUG nova.storage.rbd_utils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.550 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.574 225859 DEBUG nova.policy [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '168ca7898b964a44b76c90912fa89a66', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4d4e37f4fd7f4dbbb25648ec639e0e43', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.610 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.611 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.611 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.612 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.634 225859 DEBUG nova.storage.rbd_utils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.637 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/394105736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.856 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.862 225859 DEBUG nova.compute.provider_tree [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.884 225859 DEBUG nova.scheduler.client.report [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.903 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.904 225859 DEBUG nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.971 225859 DEBUG nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.971 225859 DEBUG nova.network.neutron [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:56:42 np0005588919 nova_compute[225855]: 2026-01-20 14:56:42.990 225859 INFO nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.007 225859 DEBUG nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.090 225859 DEBUG nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.092 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.092 225859 INFO nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Creating image(s)#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.115 225859 DEBUG nova.storage.rbd_utils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3426109c-5671-4cc7-89b6-fea13983f921_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.140 225859 DEBUG nova.storage.rbd_utils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3426109c-5671-4cc7-89b6-fea13983f921_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.167 225859 DEBUG nova.storage.rbd_utils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3426109c-5671-4cc7-89b6-fea13983f921_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.170 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.228 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.229 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.229 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.230 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.261 225859 DEBUG nova.storage.rbd_utils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3426109c-5671-4cc7-89b6-fea13983f921_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.265 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 3426109c-5671-4cc7-89b6-fea13983f921_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.288 225859 DEBUG nova.policy [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '395a5c503218411284bc94c45263d1fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca6cd0afe0ab41e3ab36d21a4129f734', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.295 225859 DEBUG nova.network.neutron [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Successfully created port: 6550efe7-7235-437c-b9f3-728b676371ee _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:56:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:43.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.801 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 3426109c-5671-4cc7-89b6-fea13983f921_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.535s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:43 np0005588919 nova_compute[225855]: 2026-01-20 14:56:43.918 225859 DEBUG nova.storage.rbd_utils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] resizing rbd image 3426109c-5671-4cc7-89b6-fea13983f921_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:56:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:44.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.286 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.649s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.431 225859 DEBUG nova.storage.rbd_utils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] resizing rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.479 225859 DEBUG nova.objects.instance [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lazy-loading 'migration_context' on Instance uuid 3426109c-5671-4cc7-89b6-fea13983f921 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.493 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.494 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Ensure instance console log exists: /var/lib/nova/instances/3426109c-5671-4cc7-89b6-fea13983f921/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.494 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.495 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.495 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.746 225859 DEBUG nova.objects.instance [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'migration_context' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.770 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.770 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Ensure instance console log exists: /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.771 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.771 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:44 np0005588919 nova_compute[225855]: 2026-01-20 14:56:44.771 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:45 np0005588919 podman[278323]: 2026-01-20 14:56:45.061839777 +0000 UTC m=+0.100295190 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:56:45 np0005588919 nova_compute[225855]: 2026-01-20 14:56:45.211 225859 DEBUG nova.network.neutron [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Successfully created port: d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:56:45 np0005588919 nova_compute[225855]: 2026-01-20 14:56:45.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:45.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:45 np0005588919 nova_compute[225855]: 2026-01-20 14:56:45.627 225859 DEBUG nova.network.neutron [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Successfully updated port: 6550efe7-7235-437c-b9f3-728b676371ee _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:56:45 np0005588919 nova_compute[225855]: 2026-01-20 14:56:45.650 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:45 np0005588919 nova_compute[225855]: 2026-01-20 14:56:45.651 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquired lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:45 np0005588919 nova_compute[225855]: 2026-01-20 14:56:45.652 225859 DEBUG nova.network.neutron [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:56:45 np0005588919 nova_compute[225855]: 2026-01-20 14:56:45.861 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:45 np0005588919 nova_compute[225855]: 2026-01-20 14:56:45.911 225859 DEBUG nova.network.neutron [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:45.999 225859 DEBUG nova.network.neutron [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Successfully updated port: d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.016 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "refresh_cache-3426109c-5671-4cc7-89b6-fea13983f921" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.017 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquired lock "refresh_cache-3426109c-5671-4cc7-89b6-fea13983f921" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.018 225859 DEBUG nova.network.neutron [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:56:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:46.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.173 225859 DEBUG nova.compute.manager [req-1a5af0a2-50d9-4cd7-9675-e509d73b28f2 req-ecde0102-bb46-4490-a675-b039f3f48622 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Received event network-changed-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.174 225859 DEBUG nova.compute.manager [req-1a5af0a2-50d9-4cd7-9675-e509d73b28f2 req-ecde0102-bb46-4490-a675-b039f3f48622 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Refreshing instance network info cache due to event network-changed-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.174 225859 DEBUG oslo_concurrency.lockutils [req-1a5af0a2-50d9-4cd7-9675-e509d73b28f2 req-ecde0102-bb46-4490-a675-b039f3f48622 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-3426109c-5671-4cc7-89b6-fea13983f921" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.242 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.261 225859 DEBUG nova.network.neutron [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.387 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.388 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.388 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.389 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.390 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:46 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3207207858' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.863 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.928 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000084 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.929 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000084 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.980 225859 DEBUG nova.compute.manager [req-6ac12c71-92f6-4b79-9191-cc43d7ba390a req-c7771993-1790-4c03-ba46-12fe19c55066 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-changed-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.981 225859 DEBUG nova.compute.manager [req-6ac12c71-92f6-4b79-9191-cc43d7ba390a req-c7771993-1790-4c03-ba46-12fe19c55066 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Refreshing instance network info cache due to event network-changed-6550efe7-7235-437c-b9f3-728b676371ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:56:46 np0005588919 nova_compute[225855]: 2026-01-20 14:56:46.981 225859 DEBUG oslo_concurrency.lockutils [req-6ac12c71-92f6-4b79-9191-cc43d7ba390a req-c7771993-1790-4c03-ba46-12fe19c55066 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.058 225859 DEBUG nova.network.neutron [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updating instance_info_cache with network_info: [{"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.073 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.075 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4201MB free_disk=20.781208038330078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.075 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.075 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.090 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Releasing lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.090 225859 DEBUG nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Instance network_info: |[{"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.091 225859 DEBUG oslo_concurrency.lockutils [req-6ac12c71-92f6-4b79-9191-cc43d7ba390a req-c7771993-1790-4c03-ba46-12fe19c55066 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.091 225859 DEBUG nova.network.neutron [req-6ac12c71-92f6-4b79-9191-cc43d7ba390a req-c7771993-1790-4c03-ba46-12fe19c55066 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Refreshing network info cache for port 6550efe7-7235-437c-b9f3-728b676371ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.093 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Start _get_guest_xml network_info=[{"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.099 225859 WARNING nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.103 225859 DEBUG nova.virt.libvirt.host [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.104 225859 DEBUG nova.virt.libvirt.host [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.109 225859 DEBUG nova.virt.libvirt.host [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.110 225859 DEBUG nova.virt.libvirt.host [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.111 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.111 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.111 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.112 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.112 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.112 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.113 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.113 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.113 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.113 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.114 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.114 225859 DEBUG nova.virt.hardware [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.117 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.197 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance e32ecf59-145a-4ae9-a91e-288419407cd0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.197 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 538fe1f0-b666-4b97-b2ef-317adae0a47a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.198 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 3426109c-5671-4cc7-89b6-fea13983f921 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.198 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.198 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.301 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:47.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:56:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1351465299' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.581 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.616 225859 DEBUG nova.storage.rbd_utils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.621 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.685 225859 DEBUG nova.network.neutron [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Updating instance_info_cache with network_info: [{"id": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "address": "fa:16:3e:4f:e1:78", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd93a212a-0f", "ovs_interfaceid": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.711 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Releasing lock "refresh_cache-3426109c-5671-4cc7-89b6-fea13983f921" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.713 225859 DEBUG nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Instance network_info: |[{"id": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "address": "fa:16:3e:4f:e1:78", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd93a212a-0f", "ovs_interfaceid": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.714 225859 DEBUG oslo_concurrency.lockutils [req-1a5af0a2-50d9-4cd7-9675-e509d73b28f2 req-ecde0102-bb46-4490-a675-b039f3f48622 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-3426109c-5671-4cc7-89b6-fea13983f921" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.714 225859 DEBUG nova.network.neutron [req-1a5af0a2-50d9-4cd7-9675-e509d73b28f2 req-ecde0102-bb46-4490-a675-b039f3f48622 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Refreshing network info cache for port d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.717 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Start _get_guest_xml network_info=[{"id": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "address": "fa:16:3e:4f:e1:78", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd93a212a-0f", "ovs_interfaceid": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.721 225859 WARNING nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.727 225859 DEBUG nova.virt.libvirt.host [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.728 225859 DEBUG nova.virt.libvirt.host [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.730 225859 DEBUG nova.virt.libvirt.host [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.731 225859 DEBUG nova.virt.libvirt.host [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.732 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.732 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.733 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.733 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.734 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.734 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.734 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.734 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.735 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.735 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:56:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.735 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:56:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4132156085' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.736 225859 DEBUG nova.virt.hardware [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.740 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.772 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.778 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.795 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.813 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:56:47 np0005588919 nova_compute[225855]: 2026-01-20 14:56:47.814 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:47.841 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:48.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:56:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1142049157' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.092 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.094 225859 DEBUG nova.virt.libvirt.vif [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-430397789',display_name='tempest-ServerRescueTestJSONUnderV235-server-430397789',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-430397789',id=134,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d4e37f4fd7f4dbbb25648ec639e0e43',ramdisk_id='',reservation_id='r-xjy771y2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-201664875',owner_user_name
='tempest-ServerRescueTestJSONUnderV235-201664875-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:42Z,user_data=None,user_id='168ca7898b964a44b76c90912fa89a66',uuid=538fe1f0-b666-4b97-b2ef-317adae0a47a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.094 225859 DEBUG nova.network.os_vif_util [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Converting VIF {"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.095 225859 DEBUG nova.network.os_vif_util [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:4f:ce,bridge_name='br-int',has_traffic_filtering=True,id=6550efe7-7235-437c-b9f3-728b676371ee,network=Network(87caaa2e-d899-4eed-8b6a-8d19125c693b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6550efe7-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.096 225859 DEBUG nova.objects.instance [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'pci_devices' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.113 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <uuid>538fe1f0-b666-4b97-b2ef-317adae0a47a</uuid>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <name>instance-00000086</name>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-430397789</nova:name>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:56:47</nova:creationTime>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:user uuid="168ca7898b964a44b76c90912fa89a66">tempest-ServerRescueTestJSONUnderV235-201664875-project-member</nova:user>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:project uuid="4d4e37f4fd7f4dbbb25648ec639e0e43">tempest-ServerRescueTestJSONUnderV235-201664875</nova:project>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:port uuid="6550efe7-7235-437c-b9f3-728b676371ee">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="serial">538fe1f0-b666-4b97-b2ef-317adae0a47a</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="uuid">538fe1f0-b666-4b97-b2ef-317adae0a47a</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/538fe1f0-b666-4b97-b2ef-317adae0a47a_disk">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:e3:4f:ce"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <target dev="tap6550efe7-72"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/console.log" append="off"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:56:48 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:56:48 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.115 225859 DEBUG nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Preparing to wait for external event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.115 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.115 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.115 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.116 225859 DEBUG nova.virt.libvirt.vif [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-430397789',display_name='tempest-ServerRescueTestJSONUnderV235-server-430397789',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-430397789',id=134,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d4e37f4fd7f4dbbb25648ec639e0e43',ramdisk_id='',reservation_id='r-xjy771y2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-201664875',owner
_user_name='tempest-ServerRescueTestJSONUnderV235-201664875-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:42Z,user_data=None,user_id='168ca7898b964a44b76c90912fa89a66',uuid=538fe1f0-b666-4b97-b2ef-317adae0a47a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.116 225859 DEBUG nova.network.os_vif_util [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Converting VIF {"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.117 225859 DEBUG nova.network.os_vif_util [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:4f:ce,bridge_name='br-int',has_traffic_filtering=True,id=6550efe7-7235-437c-b9f3-728b676371ee,network=Network(87caaa2e-d899-4eed-8b6a-8d19125c693b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6550efe7-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.117 225859 DEBUG os_vif [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:4f:ce,bridge_name='br-int',has_traffic_filtering=True,id=6550efe7-7235-437c-b9f3-728b676371ee,network=Network(87caaa2e-d899-4eed-8b6a-8d19125c693b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6550efe7-72') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.118 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.118 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.118 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.121 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.121 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6550efe7-72, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.122 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6550efe7-72, col_values=(('external_ids', {'iface-id': '6550efe7-7235-437c-b9f3-728b676371ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e3:4f:ce', 'vm-uuid': '538fe1f0-b666-4b97-b2ef-317adae0a47a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.123 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:48 np0005588919 NetworkManager[49104]: <info>  [1768921008.1240] manager: (tap6550efe7-72): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.127 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.131 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.132 225859 INFO os_vif [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:4f:ce,bridge_name='br-int',has_traffic_filtering=True,id=6550efe7-7235-437c-b9f3-728b676371ee,network=Network(87caaa2e-d899-4eed-8b6a-8d19125c693b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6550efe7-72')#033[00m
Jan 20 09:56:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e309 e309: 3 total, 3 up, 3 in
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.189 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.189 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.190 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] No VIF found with MAC fa:16:3e:e3:4f:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.190 225859 INFO nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Using config drive#033[00m
Jan 20 09:56:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:56:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4275779399' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.215 225859 DEBUG nova.storage.rbd_utils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.225 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.251 225859 DEBUG nova.storage.rbd_utils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3426109c-5671-4cc7-89b6-fea13983f921_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.256 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:56:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4266035393' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.680 225859 INFO nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Creating config drive at /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.685 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdsj22l2k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.704 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.706 225859 DEBUG nova.virt.libvirt.vif [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1537565903',display_name='tempest-ServersTestJSON-server-1537565903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1537565903',id=135,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ca6cd0afe0ab41e3ab36d21a4129f734',ramdisk_id='',reservation_id='r-9isslkfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-405461620',owner_user_name='tempest-ServersTestJSON-405461620-project-member'},
tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:43Z,user_data=None,user_id='395a5c503218411284bc94c45263d1fb',uuid=3426109c-5671-4cc7-89b6-fea13983f921,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "address": "fa:16:3e:4f:e1:78", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd93a212a-0f", "ovs_interfaceid": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.706 225859 DEBUG nova.network.os_vif_util [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converting VIF {"id": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "address": "fa:16:3e:4f:e1:78", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd93a212a-0f", "ovs_interfaceid": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.707 225859 DEBUG nova.network.os_vif_util [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:e1:78,bridge_name='br-int',has_traffic_filtering=True,id=d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd93a212a-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.708 225859 DEBUG nova.objects.instance [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3426109c-5671-4cc7-89b6-fea13983f921 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.724 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <uuid>3426109c-5671-4cc7-89b6-fea13983f921</uuid>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <name>instance-00000087</name>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServersTestJSON-server-1537565903</nova:name>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:56:47</nova:creationTime>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:user uuid="395a5c503218411284bc94c45263d1fb">tempest-ServersTestJSON-405461620-project-member</nova:user>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:project uuid="ca6cd0afe0ab41e3ab36d21a4129f734">tempest-ServersTestJSON-405461620</nova:project>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <nova:port uuid="d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="serial">3426109c-5671-4cc7-89b6-fea13983f921</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="uuid">3426109c-5671-4cc7-89b6-fea13983f921</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/3426109c-5671-4cc7-89b6-fea13983f921_disk">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/3426109c-5671-4cc7-89b6-fea13983f921_disk.config">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:4f:e1:78"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <target dev="tapd93a212a-0f"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/3426109c-5671-4cc7-89b6-fea13983f921/console.log" append="off"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:56:48 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:56:48 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:56:48 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:56:48 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.725 225859 DEBUG nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Preparing to wait for external event network-vif-plugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.725 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "3426109c-5671-4cc7-89b6-fea13983f921-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.725 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.725 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.726 225859 DEBUG nova.virt.libvirt.vif [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1537565903',display_name='tempest-ServersTestJSON-server-1537565903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1537565903',id=135,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ca6cd0afe0ab41e3ab36d21a4129f734',ramdisk_id='',reservation_id='r-9isslkfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-405461620',owner_user_name='tempest-ServersTestJSON-405461620-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:43Z,user_data=None,user_id='395a5c503218411284bc94c45263d1fb',uuid=3426109c-5671-4cc7-89b6-fea13983f921,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "address": "fa:16:3e:4f:e1:78", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd93a212a-0f", "ovs_interfaceid": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.726 225859 DEBUG nova.network.os_vif_util [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converting VIF {"id": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "address": "fa:16:3e:4f:e1:78", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd93a212a-0f", "ovs_interfaceid": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.727 225859 DEBUG nova.network.os_vif_util [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:e1:78,bridge_name='br-int',has_traffic_filtering=True,id=d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd93a212a-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.727 225859 DEBUG os_vif [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:e1:78,bridge_name='br-int',has_traffic_filtering=True,id=d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd93a212a-0f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.728 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.728 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.728 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.731 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.731 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd93a212a-0f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.731 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd93a212a-0f, col_values=(('external_ids', {'iface-id': 'd93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:e1:78', 'vm-uuid': '3426109c-5671-4cc7-89b6-fea13983f921'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:48 np0005588919 NetworkManager[49104]: <info>  [1768921008.7695] manager: (tapd93a212a-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.768 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.771 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.775 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.776 225859 INFO os_vif [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:e1:78,bridge_name='br-int',has_traffic_filtering=True,id=d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd93a212a-0f')#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.812 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdsj22l2k" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.835 225859 DEBUG nova.storage.rbd_utils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.837 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.868 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.869 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.869 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] No VIF found with MAC fa:16:3e:4f:e1:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.869 225859 INFO nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Using config drive#033[00m
Jan 20 09:56:48 np0005588919 nova_compute[225855]: 2026-01-20 14:56:48.898 225859 DEBUG nova.storage.rbd_utils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3426109c-5671-4cc7-89b6-fea13983f921_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.293 225859 DEBUG oslo_concurrency.processutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.294 225859 INFO nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Deleting local config drive /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config because it was imported into RBD.#033[00m
Jan 20 09:56:49 np0005588919 kernel: tap6550efe7-72: entered promiscuous mode
Jan 20 09:56:49 np0005588919 NetworkManager[49104]: <info>  [1768921009.3421] manager: (tap6550efe7-72): new Tun device (/org/freedesktop/NetworkManager/Devices/232)
Jan 20 09:56:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:49Z|00530|binding|INFO|Claiming lport 6550efe7-7235-437c-b9f3-728b676371ee for this chassis.
Jan 20 09:56:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:49Z|00531|binding|INFO|6550efe7-7235-437c-b9f3-728b676371ee: Claiming fa:16:3e:e3:4f:ce 10.100.0.3
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.346 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:49.358 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:4f:ce 10.100.0.3'], port_security=['fa:16:3e:e3:4f:ce 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '538fe1f0-b666-4b97-b2ef-317adae0a47a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87caaa2e-d899-4eed-8b6a-8d19125c693b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d4e37f4fd7f4dbbb25648ec639e0e43', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16f3c0d4-753e-4c8b-b00a-7073cbcfa6dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be68bfdf-b1f2-46c8-82b2-2c275774a706, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6550efe7-7235-437c-b9f3-728b676371ee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:49.360 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6550efe7-7235-437c-b9f3-728b676371ee in datapath 87caaa2e-d899-4eed-8b6a-8d19125c693b bound to our chassis#033[00m
Jan 20 09:56:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:49.361 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 87caaa2e-d899-4eed-8b6a-8d19125c693b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 09:56:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:49.362 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a6496691-77eb-456c-ac4b-08a2c3aea3b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:49 np0005588919 systemd-machined[194361]: New machine qemu-63-instance-00000086.
Jan 20 09:56:49 np0005588919 systemd[1]: Started Virtual Machine qemu-63-instance-00000086.
Jan 20 09:56:49 np0005588919 systemd-udevd[278617]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:56:49 np0005588919 NetworkManager[49104]: <info>  [1768921009.4235] device (tap6550efe7-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:56:49 np0005588919 NetworkManager[49104]: <info>  [1768921009.4244] device (tap6550efe7-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.430 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:49Z|00532|binding|INFO|Setting lport 6550efe7-7235-437c-b9f3-728b676371ee ovn-installed in OVS
Jan 20 09:56:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:49Z|00533|binding|INFO|Setting lport 6550efe7-7235-437c-b9f3-728b676371ee up in Southbound
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.434 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:49.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.498 225859 INFO nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Creating config drive at /var/lib/nova/instances/3426109c-5671-4cc7-89b6-fea13983f921/disk.config#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.510 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3426109c-5671-4cc7-89b6-fea13983f921/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg4kqx6r1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.593 225859 DEBUG nova.network.neutron [req-6ac12c71-92f6-4b79-9191-cc43d7ba390a req-c7771993-1790-4c03-ba46-12fe19c55066 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updated VIF entry in instance network info cache for port 6550efe7-7235-437c-b9f3-728b676371ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.594 225859 DEBUG nova.network.neutron [req-6ac12c71-92f6-4b79-9191-cc43d7ba390a req-c7771993-1790-4c03-ba46-12fe19c55066 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updating instance_info_cache with network_info: [{"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.612 225859 DEBUG oslo_concurrency.lockutils [req-6ac12c71-92f6-4b79-9191-cc43d7ba390a req-c7771993-1790-4c03-ba46-12fe19c55066 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.653 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3426109c-5671-4cc7-89b6-fea13983f921/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg4kqx6r1" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.694 225859 DEBUG nova.storage.rbd_utils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] rbd image 3426109c-5671-4cc7-89b6-fea13983f921_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.698 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3426109c-5671-4cc7-89b6-fea13983f921/disk.config 3426109c-5671-4cc7-89b6-fea13983f921_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.912 225859 DEBUG nova.compute.manager [req-bbd0046f-0a5e-491d-b067-f825cbababaf req-f6d9b735-c480-48e3-a1a4-205c2f48b0ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.913 225859 DEBUG oslo_concurrency.lockutils [req-bbd0046f-0a5e-491d-b067-f825cbababaf req-f6d9b735-c480-48e3-a1a4-205c2f48b0ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.914 225859 DEBUG oslo_concurrency.lockutils [req-bbd0046f-0a5e-491d-b067-f825cbababaf req-f6d9b735-c480-48e3-a1a4-205c2f48b0ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.914 225859 DEBUG oslo_concurrency.lockutils [req-bbd0046f-0a5e-491d-b067-f825cbababaf req-f6d9b735-c480-48e3-a1a4-205c2f48b0ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:49 np0005588919 nova_compute[225855]: 2026-01-20 14:56:49.915 225859 DEBUG nova.compute.manager [req-bbd0046f-0a5e-491d-b067-f825cbababaf req-f6d9b735-c480-48e3-a1a4-205c2f48b0ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Processing event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.002 225859 DEBUG oslo_concurrency.processutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3426109c-5671-4cc7-89b6-fea13983f921/disk.config 3426109c-5671-4cc7-89b6-fea13983f921_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.305s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.003 225859 INFO nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Deleting local config drive /var/lib/nova/instances/3426109c-5671-4cc7-89b6-fea13983f921/disk.config because it was imported into RBD.#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.020 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921010.0199256, 538fe1f0-b666-4b97-b2ef-317adae0a47a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.021 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] VM Started (Lifecycle Event)#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.023 225859 DEBUG nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.031 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.040 225859 INFO nova.virt.libvirt.driver [-] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Instance spawned successfully.#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.041 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:56:50 np0005588919 kernel: tapd93a212a-0f: entered promiscuous mode
Jan 20 09:56:50 np0005588919 NetworkManager[49104]: <info>  [1768921010.0599] manager: (tapd93a212a-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/233)
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.060 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:50Z|00534|binding|INFO|Claiming lport d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 for this chassis.
Jan 20 09:56:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:50Z|00535|binding|INFO|d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8: Claiming fa:16:3e:4f:e1:78 10.100.0.11
Jan 20 09:56:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:50.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:50 np0005588919 NetworkManager[49104]: <info>  [1768921010.0703] device (tapd93a212a-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:56:50 np0005588919 NetworkManager[49104]: <info>  [1768921010.0709] device (tapd93a212a-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:56:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:50Z|00536|binding|INFO|Setting lport d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 ovn-installed in OVS
Jan 20 09:56:50 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:50Z|00537|binding|INFO|Setting lport d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 up in Southbound
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.078 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:e1:78 10.100.0.11'], port_security=['fa:16:3e:4f:e1:78 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3426109c-5671-4cc7-89b6-fea13983f921', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca6cd0afe0ab41e3ab36d21a4129f734', 'neutron:revision_number': '2', 'neutron:security_group_ids': '819ea4ae-b994-44d1-9da3-8b0ca609fb2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee620e3e-ef7e-4826-b394-b8a89442b353, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.078 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.079 140354 INFO neutron.agent.ovn.metadata.agent [-] Port d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 in datapath f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c bound to our chassis#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.081 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.082 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:50 np0005588919 systemd-machined[194361]: New machine qemu-64-instance-00000087.
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.093 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.095 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[deecf02a-a3ed-43e4-ac2d-f0b68d85d550]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.099 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.104 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.104 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.105 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.105 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.105 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.106 225859 DEBUG nova.virt.libvirt.driver [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:50 np0005588919 systemd[1]: Started Virtual Machine qemu-64-instance-00000087.
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.123 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f8df02a3-11ad-4c9f-a8ef-1d628e9c36c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.125 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[661a4755-200c-401e-98cf-f561ccccdc18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.141 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.141 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921010.022909, 538fe1f0-b666-4b97-b2ef-317adae0a47a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.141 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.152 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a2a99abc-5442-461d-bcfa-ebea22768e5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.169 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2112cb1f-cfc0-4c5a-a465-817c9623b6aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4c8474b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:a2:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 603603, 'reachable_time': 28670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278736, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.185 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a9bc48f2-81e5-43e0-b5a2-70417bdb893b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf4c8474b-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 603613, 'tstamp': 603613}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278737, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf4c8474b-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 603617, 'tstamp': 603617}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278737, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.186 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4c8474b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.188 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.189 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.189 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4c8474b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.189 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.190 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4c8474b-00, col_values=(('external_ids', {'iface-id': '8c6fd3ab-70a8-4e63-99de-f2e15ac0207f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:50.190 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.203 225859 DEBUG nova.network.neutron [req-1a5af0a2-50d9-4cd7-9675-e509d73b28f2 req-ecde0102-bb46-4490-a675-b039f3f48622 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Updated VIF entry in instance network info cache for port d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.204 225859 DEBUG nova.network.neutron [req-1a5af0a2-50d9-4cd7-9675-e509d73b28f2 req-ecde0102-bb46-4490-a675-b039f3f48622 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Updating instance_info_cache with network_info: [{"id": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "address": "fa:16:3e:4f:e1:78", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd93a212a-0f", "ovs_interfaceid": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.242 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.246 225859 DEBUG oslo_concurrency.lockutils [req-1a5af0a2-50d9-4cd7-9675-e509d73b28f2 req-ecde0102-bb46-4490-a675-b039f3f48622 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-3426109c-5671-4cc7-89b6-fea13983f921" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.248 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921010.026502, 538fe1f0-b666-4b97-b2ef-317adae0a47a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.248 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.260 225859 INFO nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Took 7.80 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:56:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e310 e310: 3 total, 3 up, 3 in
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.260 225859 DEBUG nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.271 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.276 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.314 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.339 225859 INFO nova.compute.manager [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Took 8.80 seconds to build instance.#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.361 225859 DEBUG oslo_concurrency.lockutils [None req-598146b4-a7b4-497e-8f70-a27771b8bdcf 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.909s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.635 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921010.6349812, 3426109c-5671-4cc7-89b6-fea13983f921 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.635 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] VM Started (Lifecycle Event)#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.660 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.665 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921010.6386104, 3426109c-5671-4cc7-89b6-fea13983f921 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.665 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.685 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.688 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.705 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.814 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:50 np0005588919 nova_compute[225855]: 2026-01-20 14:56:50.815 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:56:51 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:51Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:7a:cf 10.100.0.13
Jan 20 09:56:51 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:51Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:7a:cf 10.100.0.13
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.245 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.434 225859 INFO nova.compute.manager [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Rescuing#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.435 225859 DEBUG oslo_concurrency.lockutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.435 225859 DEBUG oslo_concurrency.lockutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquired lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.435 225859 DEBUG nova.network.neutron [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:56:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:51.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e311 e311: 3 total, 3 up, 3 in
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.975 225859 DEBUG nova.compute.manager [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.976 225859 DEBUG oslo_concurrency.lockutils [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.976 225859 DEBUG oslo_concurrency.lockutils [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.976 225859 DEBUG oslo_concurrency.lockutils [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.976 225859 DEBUG nova.compute.manager [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] No waiting events found dispatching network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.976 225859 WARNING nova.compute.manager [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received unexpected event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.977 225859 DEBUG nova.compute.manager [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Received event network-vif-plugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.977 225859 DEBUG oslo_concurrency.lockutils [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3426109c-5671-4cc7-89b6-fea13983f921-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.977 225859 DEBUG oslo_concurrency.lockutils [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.977 225859 DEBUG oslo_concurrency.lockutils [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.977 225859 DEBUG nova.compute.manager [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Processing event network-vif-plugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.978 225859 DEBUG nova.compute.manager [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Received event network-vif-plugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.978 225859 DEBUG oslo_concurrency.lockutils [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3426109c-5671-4cc7-89b6-fea13983f921-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.978 225859 DEBUG oslo_concurrency.lockutils [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.978 225859 DEBUG oslo_concurrency.lockutils [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.978 225859 DEBUG nova.compute.manager [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] No waiting events found dispatching network-vif-plugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.978 225859 WARNING nova.compute.manager [req-23a1b379-cbf9-4d17-ad19-aeda0ddc83f2 req-50fb98a0-611f-4dc4-9188-371466c7b34f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Received unexpected event network-vif-plugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.979 225859 DEBUG nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.982 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921011.982382, 3426109c-5671-4cc7-89b6-fea13983f921 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.982 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.984 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.986 225859 INFO nova.virt.libvirt.driver [-] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Instance spawned successfully.#033[00m
Jan 20 09:56:51 np0005588919 nova_compute[225855]: 2026-01-20 14:56:51.986 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.000 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.005 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.009 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.010 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.010 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.011 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.011 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.011 225859 DEBUG nova.virt.libvirt.driver [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.036 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:56:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:52.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.070 225859 INFO nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Took 8.98 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.071 225859 DEBUG nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.136 225859 INFO nova.compute.manager [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Took 10.56 seconds to build instance.#033[00m
Jan 20 09:56:52 np0005588919 nova_compute[225855]: 2026-01-20 14:56:52.157 225859 DEBUG oslo_concurrency.lockutils [None req-116c4c26-7bd7-4fed-8899-981d8620cba1 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:53 np0005588919 nova_compute[225855]: 2026-01-20 14:56:53.377 225859 DEBUG nova.network.neutron [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updating instance_info_cache with network_info: [{"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:53.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:53 np0005588919 nova_compute[225855]: 2026-01-20 14:56:53.551 225859 DEBUG oslo_concurrency.lockutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Releasing lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:56:53 np0005588919 nova_compute[225855]: 2026-01-20 14:56:53.770 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:53 np0005588919 nova_compute[225855]: 2026-01-20 14:56:53.815 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 09:56:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:54.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.480 225859 DEBUG oslo_concurrency.lockutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "3426109c-5671-4cc7-89b6-fea13983f921" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.481 225859 DEBUG oslo_concurrency.lockutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.481 225859 DEBUG oslo_concurrency.lockutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "3426109c-5671-4cc7-89b6-fea13983f921-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.481 225859 DEBUG oslo_concurrency.lockutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.481 225859 DEBUG oslo_concurrency.lockutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.482 225859 INFO nova.compute.manager [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Terminating instance#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.483 225859 DEBUG nova.compute.manager [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:56:54 np0005588919 kernel: tapd93a212a-0f (unregistering): left promiscuous mode
Jan 20 09:56:54 np0005588919 NetworkManager[49104]: <info>  [1768921014.5254] device (tapd93a212a-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:56:54 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:54Z|00538|binding|INFO|Releasing lport d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 from this chassis (sb_readonly=0)
Jan 20 09:56:54 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:54Z|00539|binding|INFO|Setting lport d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 down in Southbound
Jan 20 09:56:54 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:54Z|00540|binding|INFO|Removing iface tapd93a212a-0f ovn-installed in OVS
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.580 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.595 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.596 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:e1:78 10.100.0.11'], port_security=['fa:16:3e:4f:e1:78 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3426109c-5671-4cc7-89b6-fea13983f921', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca6cd0afe0ab41e3ab36d21a4129f734', 'neutron:revision_number': '4', 'neutron:security_group_ids': '819ea4ae-b994-44d1-9da3-8b0ca609fb2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee620e3e-ef7e-4826-b394-b8a89442b353, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.598 140354 INFO neutron.agent.ovn.metadata.agent [-] Port d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 in datapath f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c unbound from our chassis#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.599 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.615 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[34f5edcf-42a6-4659-adad-b67b4cb0dae7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:54 np0005588919 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000087.scope: Deactivated successfully.
Jan 20 09:56:54 np0005588919 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000087.scope: Consumed 3.085s CPU time.
Jan 20 09:56:54 np0005588919 systemd-machined[194361]: Machine qemu-64-instance-00000087 terminated.
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.653 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4b379646-98d7-49b0-b538-fa8d9036fa2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.658 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[86bd8d5a-29d6-411d-b120-d3edfc95c16a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.697 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[703cc8b4-4daf-4128-8717-c0400dbd6bb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.703 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.710 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.718 225859 INFO nova.virt.libvirt.driver [-] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Instance destroyed successfully.#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.718 225859 DEBUG nova.objects.instance [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lazy-loading 'resources' on Instance uuid 3426109c-5671-4cc7-89b6-fea13983f921 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.731 225859 DEBUG nova.virt.libvirt.vif [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1537565903',display_name='tempest-ServersTestJSON-server-1537565903',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1537565903',id=135,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:56:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ca6cd0afe0ab41e3ab36d21a4129f734',ramdisk_id='',reservation_id='r-9isslkfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-405461620',owner_user_name='tempest-ServersTestJSON-405461620-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:56:52Z,user_data=None,user_id='395a5c503218411284bc94c45263d1fb',uuid=3426109c-5671-4cc7-89b6-fea13983f921,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "address": "fa:16:3e:4f:e1:78", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd93a212a-0f", "ovs_interfaceid": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.732 225859 DEBUG nova.network.os_vif_util [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converting VIF {"id": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "address": "fa:16:3e:4f:e1:78", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd93a212a-0f", "ovs_interfaceid": "d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.732 225859 DEBUG nova.network.os_vif_util [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:e1:78,bridge_name='br-int',has_traffic_filtering=True,id=d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd93a212a-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.733 225859 DEBUG os_vif [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:e1:78,bridge_name='br-int',has_traffic_filtering=True,id=d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd93a212a-0f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.734 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.734 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd93a212a-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.734 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2020a3db-379d-44e9-86a1-4e6fc2b8d28a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4c8474b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:a2:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 603603, 'reachable_time': 28670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278849, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.736 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.737 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.740 225859 INFO os_vif [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:e1:78,bridge_name='br-int',has_traffic_filtering=True,id=d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd93a212a-0f')#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.753 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f28c9e0c-e527-44a4-a3fc-7c067645bc62]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf4c8474b-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 603613, 'tstamp': 603613}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278855, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf4c8474b-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 603617, 'tstamp': 603617}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278855, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.755 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4c8474b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.757 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4c8474b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.758 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.758 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4c8474b-00, col_values=(('external_ids', {'iface-id': '8c6fd3ab-70a8-4e63-99de-f2e15ac0207f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:54.758 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.759 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.857 225859 DEBUG nova.compute.manager [req-6fe54304-c493-4d66-8e9d-49bcb9e14bf8 req-15db1934-c644-4261-9eae-007b1f0282c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Received event network-vif-unplugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.857 225859 DEBUG oslo_concurrency.lockutils [req-6fe54304-c493-4d66-8e9d-49bcb9e14bf8 req-15db1934-c644-4261-9eae-007b1f0282c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3426109c-5671-4cc7-89b6-fea13983f921-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.857 225859 DEBUG oslo_concurrency.lockutils [req-6fe54304-c493-4d66-8e9d-49bcb9e14bf8 req-15db1934-c644-4261-9eae-007b1f0282c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.857 225859 DEBUG oslo_concurrency.lockutils [req-6fe54304-c493-4d66-8e9d-49bcb9e14bf8 req-15db1934-c644-4261-9eae-007b1f0282c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.858 225859 DEBUG nova.compute.manager [req-6fe54304-c493-4d66-8e9d-49bcb9e14bf8 req-15db1934-c644-4261-9eae-007b1f0282c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] No waiting events found dispatching network-vif-unplugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:54 np0005588919 nova_compute[225855]: 2026-01-20 14:56:54.858 225859 DEBUG nova.compute.manager [req-6fe54304-c493-4d66-8e9d-49bcb9e14bf8 req-15db1934-c644-4261-9eae-007b1f0282c0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Received event network-vif-unplugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:56:55 np0005588919 nova_compute[225855]: 2026-01-20 14:56:55.109 225859 INFO nova.virt.libvirt.driver [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Deleting instance files /var/lib/nova/instances/3426109c-5671-4cc7-89b6-fea13983f921_del#033[00m
Jan 20 09:56:55 np0005588919 nova_compute[225855]: 2026-01-20 14:56:55.110 225859 INFO nova.virt.libvirt.driver [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Deletion of /var/lib/nova/instances/3426109c-5671-4cc7-89b6-fea13983f921_del complete#033[00m
Jan 20 09:56:55 np0005588919 nova_compute[225855]: 2026-01-20 14:56:55.201 225859 INFO nova.compute.manager [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Took 0.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:56:55 np0005588919 nova_compute[225855]: 2026-01-20 14:56:55.202 225859 DEBUG oslo.service.loopingcall [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:56:55 np0005588919 nova_compute[225855]: 2026-01-20 14:56:55.202 225859 DEBUG nova.compute.manager [-] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:56:55 np0005588919 nova_compute[225855]: 2026-01-20 14:56:55.202 225859 DEBUG nova.network.neutron [-] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:56:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:55.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:56.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:56 np0005588919 nova_compute[225855]: 2026-01-20 14:56:56.247 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:56 np0005588919 nova_compute[225855]: 2026-01-20 14:56:56.435 225859 DEBUG nova.network.neutron [-] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:56:56 np0005588919 nova_compute[225855]: 2026-01-20 14:56:56.456 225859 INFO nova.compute.manager [-] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Took 1.25 seconds to deallocate network for instance.#033[00m
Jan 20 09:56:56 np0005588919 nova_compute[225855]: 2026-01-20 14:56:56.533 225859 DEBUG nova.compute.manager [req-628444ff-a249-4e4f-b408-f04196cc158a req-4a38def0-1013-4c32-8f4f-308ccf74815b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Received event network-vif-deleted-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:56 np0005588919 nova_compute[225855]: 2026-01-20 14:56:56.630 225859 DEBUG oslo_concurrency.lockutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:56 np0005588919 nova_compute[225855]: 2026-01-20 14:56:56.630 225859 DEBUG oslo_concurrency.lockutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:56 np0005588919 nova_compute[225855]: 2026-01-20 14:56:56.709 225859 DEBUG oslo_concurrency.processutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:56:57 np0005588919 podman[278896]: 2026-01-20 14:56:57.014682637 +0000 UTC m=+0.056524336 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.083 225859 DEBUG nova.compute.manager [req-0b8f39b7-395f-4d9f-8e90-b631855d6923 req-17352424-488c-4a07-837c-76cfdf4ddd1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Received event network-vif-plugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.083 225859 DEBUG oslo_concurrency.lockutils [req-0b8f39b7-395f-4d9f-8e90-b631855d6923 req-17352424-488c-4a07-837c-76cfdf4ddd1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3426109c-5671-4cc7-89b6-fea13983f921-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.084 225859 DEBUG oslo_concurrency.lockutils [req-0b8f39b7-395f-4d9f-8e90-b631855d6923 req-17352424-488c-4a07-837c-76cfdf4ddd1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.084 225859 DEBUG oslo_concurrency.lockutils [req-0b8f39b7-395f-4d9f-8e90-b631855d6923 req-17352424-488c-4a07-837c-76cfdf4ddd1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.084 225859 DEBUG nova.compute.manager [req-0b8f39b7-395f-4d9f-8e90-b631855d6923 req-17352424-488c-4a07-837c-76cfdf4ddd1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] No waiting events found dispatching network-vif-plugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.085 225859 WARNING nova.compute.manager [req-0b8f39b7-395f-4d9f-8e90-b631855d6923 req-17352424-488c-4a07-837c-76cfdf4ddd1e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Received unexpected event network-vif-plugged-d93a212a-0f1f-4f7e-9e7e-ee3fd5a542e8 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:56:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:56:57 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2810887296' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.198 225859 DEBUG oslo_concurrency.processutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.204 225859 DEBUG nova.compute.provider_tree [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.386 225859 DEBUG nova.scheduler.client.report [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:56:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:57.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.571 225859 DEBUG oslo_concurrency.lockutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.941s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.771 225859 INFO nova.scheduler.client.report [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Deleted allocations for instance 3426109c-5671-4cc7-89b6-fea13983f921#033[00m
Jan 20 09:56:57 np0005588919 nova_compute[225855]: 2026-01-20 14:56:57.945 225859 DEBUG oslo_concurrency.lockutils [None req-b6008c26-9999-4183-b136-a3c591315ebe 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "3426109c-5671-4cc7-89b6-fea13983f921" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.464s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:56:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:56:58.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.444 225859 DEBUG oslo_concurrency.lockutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "e32ecf59-145a-4ae9-a91e-288419407cd0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.444 225859 DEBUG oslo_concurrency.lockutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.445 225859 DEBUG oslo_concurrency.lockutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.445 225859 DEBUG oslo_concurrency.lockutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.445 225859 DEBUG oslo_concurrency.lockutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.447 225859 INFO nova.compute.manager [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Terminating instance#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.448 225859 DEBUG nova.compute.manager [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:56:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:56:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:56:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:56:59.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:56:59 np0005588919 kernel: tap5909a21f-c1 (unregistering): left promiscuous mode
Jan 20 09:56:59 np0005588919 NetworkManager[49104]: <info>  [1768921019.4918] device (tap5909a21f-c1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.501 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:59Z|00541|binding|INFO|Releasing lport 5909a21f-c1fb-4265-a7de-a6b0e6136194 from this chassis (sb_readonly=0)
Jan 20 09:56:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:59Z|00542|binding|INFO|Setting lport 5909a21f-c1fb-4265-a7de-a6b0e6136194 down in Southbound
Jan 20 09:56:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:56:59Z|00543|binding|INFO|Removing iface tap5909a21f-c1 ovn-installed in OVS
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.509 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:7a:cf 10.100.0.13'], port_security=['fa:16:3e:ac:7a:cf 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e32ecf59-145a-4ae9-a91e-288419407cd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca6cd0afe0ab41e3ab36d21a4129f734', 'neutron:revision_number': '4', 'neutron:security_group_ids': '819ea4ae-b994-44d1-9da3-8b0ca609fb2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee620e3e-ef7e-4826-b394-b8a89442b353, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=5909a21f-c1fb-4265-a7de-a6b0e6136194) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.511 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 5909a21f-c1fb-4265-a7de-a6b0e6136194 in datapath f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c unbound from our chassis#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.512 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.513 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[16f70f90-991b-49e2-9401-4ff68d8dffdb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.513 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c namespace which is not needed anymore#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.521 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:59 np0005588919 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000084.scope: Deactivated successfully.
Jan 20 09:56:59 np0005588919 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000084.scope: Consumed 13.577s CPU time.
Jan 20 09:56:59 np0005588919 systemd-machined[194361]: Machine qemu-62-instance-00000084 terminated.
Jan 20 09:56:59 np0005588919 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[277928]: [NOTICE]   (277932) : haproxy version is 2.8.14-c23fe91
Jan 20 09:56:59 np0005588919 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[277928]: [NOTICE]   (277932) : path to executable is /usr/sbin/haproxy
Jan 20 09:56:59 np0005588919 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[277928]: [ALERT]    (277932) : Current worker (277934) exited with code 143 (Terminated)
Jan 20 09:56:59 np0005588919 neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c[277928]: [WARNING]  (277932) : All workers exited. Exiting... (0)
Jan 20 09:56:59 np0005588919 systemd[1]: libpod-998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250.scope: Deactivated successfully.
Jan 20 09:56:59 np0005588919 podman[278943]: 2026-01-20 14:56:59.650726445 +0000 UTC m=+0.047712717 container died 998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.668 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.676 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.682 225859 INFO nova.virt.libvirt.driver [-] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Instance destroyed successfully.#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.683 225859 DEBUG nova.objects.instance [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lazy-loading 'resources' on Instance uuid e32ecf59-145a-4ae9-a91e-288419407cd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:56:59 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250-userdata-shm.mount: Deactivated successfully.
Jan 20 09:56:59 np0005588919 systemd[1]: var-lib-containers-storage-overlay-6beda3a917b01666983f6717d8a15faa248a8e035acb78285162928c0b4a3550-merged.mount: Deactivated successfully.
Jan 20 09:56:59 np0005588919 podman[278943]: 2026-01-20 14:56:59.70688969 +0000 UTC m=+0.103875962 container cleanup 998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:56:59 np0005588919 systemd[1]: libpod-conmon-998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250.scope: Deactivated successfully.
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.721 225859 DEBUG nova.virt.libvirt.vif [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:56:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1537565903',display_name='tempest-ServersTestJSON-server-1537565903',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1537565903',id=132,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:56:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ca6cd0afe0ab41e3ab36d21a4129f734',ramdisk_id='',reservation_id='r-iv93ouga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-405461620',owner_user_name='tempest-ServersTestJSON-405461620-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:56:37Z,user_data=None,user_id='395a5c503218411284bc94c45263d1fb',uuid=e32ecf59-145a-4ae9-a91e-288419407cd0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.722 225859 DEBUG nova.network.os_vif_util [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converting VIF {"id": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "address": "fa:16:3e:ac:7a:cf", "network": {"id": "f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c", "bridge": "br-int", "label": "tempest-ServersTestJSON-1745321011-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ca6cd0afe0ab41e3ab36d21a4129f734", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5909a21f-c1", "ovs_interfaceid": "5909a21f-c1fb-4265-a7de-a6b0e6136194", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.723 225859 DEBUG nova.network.os_vif_util [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ac:7a:cf,bridge_name='br-int',has_traffic_filtering=True,id=5909a21f-c1fb-4265-a7de-a6b0e6136194,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5909a21f-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.724 225859 DEBUG os_vif [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:7a:cf,bridge_name='br-int',has_traffic_filtering=True,id=5909a21f-c1fb-4265-a7de-a6b0e6136194,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5909a21f-c1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.726 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.727 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5909a21f-c1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.733 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.735 225859 INFO os_vif [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:7a:cf,bridge_name='br-int',has_traffic_filtering=True,id=5909a21f-c1fb-4265-a7de-a6b0e6136194,network=Network(f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5909a21f-c1')#033[00m
Jan 20 09:56:59 np0005588919 podman[278984]: 2026-01-20 14:56:59.784671144 +0000 UTC m=+0.051515124 container remove 998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.790 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c74e6b3e-ea07-4d83-8b4f-aa1fa5f7f1c2]: (4, ('Tue Jan 20 02:56:59 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c (998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250)\n998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250\nTue Jan 20 02:56:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c (998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250)\n998e434681eead6e0c698e580159f48be23d2464951b6f3b45d2e6309690a250\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.792 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[58f4c533-1526-485c-b8a4-195562bf05d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.793 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4c8474b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:56:59 np0005588919 kernel: tapf4c8474b-00: left promiscuous mode
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.799 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:59 np0005588919 nova_compute[225855]: 2026-01-20 14:56:59.813 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.816 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7008a930-fc31-418a-8737-78529d1f56f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.836 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[378a5839-aec1-49f3-b302-7f25ee6dfe6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.839 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8a8987-8605-4711-830a-6c542b3ce0c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.858 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f88e5442-f2ee-4e0d-bbad-ee4bda8b0ece]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 603596, 'reachable_time': 19125, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279018, 'error': None, 'target': 'ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:59 np0005588919 systemd[1]: run-netns-ovnmeta\x2df4c8474b\x2d0ca3\x2d4cb0\x2db6dd\x2de6aa302def5c.mount: Deactivated successfully.
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.863 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f4c8474b-0ca3-4cb0-b6dd-e6aa302def5c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 09:56:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:56:59.864 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[587979e2-d1dd-4695-84e8-b4c9da892edd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:56:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e312 e312: 3 total, 3 up, 3 in
Jan 20 09:57:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:00.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.131 225859 DEBUG nova.compute.manager [req-c528debc-36e8-4353-a2e3-ea5f450725c1 req-b389f7c4-186a-4efe-847c-d7fd9a937572 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Received event network-vif-unplugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.132 225859 DEBUG oslo_concurrency.lockutils [req-c528debc-36e8-4353-a2e3-ea5f450725c1 req-b389f7c4-186a-4efe-847c-d7fd9a937572 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.132 225859 DEBUG oslo_concurrency.lockutils [req-c528debc-36e8-4353-a2e3-ea5f450725c1 req-b389f7c4-186a-4efe-847c-d7fd9a937572 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.132 225859 DEBUG oslo_concurrency.lockutils [req-c528debc-36e8-4353-a2e3-ea5f450725c1 req-b389f7c4-186a-4efe-847c-d7fd9a937572 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.133 225859 DEBUG nova.compute.manager [req-c528debc-36e8-4353-a2e3-ea5f450725c1 req-b389f7c4-186a-4efe-847c-d7fd9a937572 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] No waiting events found dispatching network-vif-unplugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.133 225859 DEBUG nova.compute.manager [req-c528debc-36e8-4353-a2e3-ea5f450725c1 req-b389f7c4-186a-4efe-847c-d7fd9a937572 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Received event network-vif-unplugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.207 225859 INFO nova.virt.libvirt.driver [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Deleting instance files /var/lib/nova/instances/e32ecf59-145a-4ae9-a91e-288419407cd0_del#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.208 225859 INFO nova.virt.libvirt.driver [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Deletion of /var/lib/nova/instances/e32ecf59-145a-4ae9-a91e-288419407cd0_del complete#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.273 225859 INFO nova.compute.manager [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Took 0.82 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.274 225859 DEBUG oslo.service.loopingcall [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.274 225859 DEBUG nova.compute.manager [-] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.275 225859 DEBUG nova.network.neutron [-] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:57:00 np0005588919 nova_compute[225855]: 2026-01-20 14:57:00.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:01 np0005588919 nova_compute[225855]: 2026-01-20 14:57:01.249 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:01.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:02.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.346 225859 DEBUG nova.compute.manager [req-5847965d-8864-4f9b-9262-46aac7f2c6bf req-3fe8b7f3-1b01-4be8-b1f4-0531d247f254 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Received event network-vif-plugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.347 225859 DEBUG oslo_concurrency.lockutils [req-5847965d-8864-4f9b-9262-46aac7f2c6bf req-3fe8b7f3-1b01-4be8-b1f4-0531d247f254 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.348 225859 DEBUG oslo_concurrency.lockutils [req-5847965d-8864-4f9b-9262-46aac7f2c6bf req-3fe8b7f3-1b01-4be8-b1f4-0531d247f254 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.348 225859 DEBUG oslo_concurrency.lockutils [req-5847965d-8864-4f9b-9262-46aac7f2c6bf req-3fe8b7f3-1b01-4be8-b1f4-0531d247f254 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.348 225859 DEBUG nova.compute.manager [req-5847965d-8864-4f9b-9262-46aac7f2c6bf req-3fe8b7f3-1b01-4be8-b1f4-0531d247f254 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] No waiting events found dispatching network-vif-plugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.348 225859 WARNING nova.compute.manager [req-5847965d-8864-4f9b-9262-46aac7f2c6bf req-3fe8b7f3-1b01-4be8-b1f4-0531d247f254 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Received unexpected event network-vif-plugged-5909a21f-c1fb-4265-a7de-a6b0e6136194 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:57:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.697 225859 DEBUG nova.network.neutron [-] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.714 225859 INFO nova.compute.manager [-] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Took 2.44 seconds to deallocate network for instance.#033[00m
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.761 225859 DEBUG oslo_concurrency.lockutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.762 225859 DEBUG oslo_concurrency.lockutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.806 225859 DEBUG nova.compute.manager [req-27d8ae0c-49bf-418b-adb3-264f32eeae8e req-f1a4ed8a-e3d9-4390-accb-b6d1da175cb5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Received event network-vif-deleted-5909a21f-c1fb-4265-a7de-a6b0e6136194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:02 np0005588919 nova_compute[225855]: 2026-01-20 14:57:02.834 225859 DEBUG oslo_concurrency.processutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:03 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/234247054' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:03 np0005588919 nova_compute[225855]: 2026-01-20 14:57:03.290 225859 DEBUG oslo_concurrency.processutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:03 np0005588919 nova_compute[225855]: 2026-01-20 14:57:03.298 225859 DEBUG nova.compute.provider_tree [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:03 np0005588919 nova_compute[225855]: 2026-01-20 14:57:03.318 225859 DEBUG nova.scheduler.client.report [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:03 np0005588919 nova_compute[225855]: 2026-01-20 14:57:03.360 225859 DEBUG oslo_concurrency.lockutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:03 np0005588919 nova_compute[225855]: 2026-01-20 14:57:03.386 225859 INFO nova.scheduler.client.report [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Deleted allocations for instance e32ecf59-145a-4ae9-a91e-288419407cd0#033[00m
Jan 20 09:57:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:03.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:03 np0005588919 nova_compute[225855]: 2026-01-20 14:57:03.478 225859 DEBUG oslo_concurrency.lockutils [None req-71e0153c-f8fc-4e1a-86c2-5791b270d513 395a5c503218411284bc94c45263d1fb ca6cd0afe0ab41e3ab36d21a4129f734 - - default default] Lock "e32ecf59-145a-4ae9-a91e-288419407cd0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:03 np0005588919 nova_compute[225855]: 2026-01-20 14:57:03.872 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 09:57:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:04.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:04 np0005588919 nova_compute[225855]: 2026-01-20 14:57:04.728 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:05.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:06.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:06 np0005588919 nova_compute[225855]: 2026-01-20 14:57:06.252 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:06 np0005588919 kernel: tap6550efe7-72 (unregistering): left promiscuous mode
Jan 20 09:57:06 np0005588919 NetworkManager[49104]: <info>  [1768921026.8414] device (tap6550efe7-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:57:06 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:06Z|00544|binding|INFO|Releasing lport 6550efe7-7235-437c-b9f3-728b676371ee from this chassis (sb_readonly=0)
Jan 20 09:57:06 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:06Z|00545|binding|INFO|Setting lport 6550efe7-7235-437c-b9f3-728b676371ee down in Southbound
Jan 20 09:57:06 np0005588919 nova_compute[225855]: 2026-01-20 14:57:06.845 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:06 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:06Z|00546|binding|INFO|Removing iface tap6550efe7-72 ovn-installed in OVS
Jan 20 09:57:06 np0005588919 nova_compute[225855]: 2026-01-20 14:57:06.848 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:06.853 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:4f:ce 10.100.0.3'], port_security=['fa:16:3e:e3:4f:ce 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '538fe1f0-b666-4b97-b2ef-317adae0a47a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87caaa2e-d899-4eed-8b6a-8d19125c693b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d4e37f4fd7f4dbbb25648ec639e0e43', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16f3c0d4-753e-4c8b-b00a-7073cbcfa6dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be68bfdf-b1f2-46c8-82b2-2c275774a706, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6550efe7-7235-437c-b9f3-728b676371ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:06.855 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6550efe7-7235-437c-b9f3-728b676371ee in datapath 87caaa2e-d899-4eed-8b6a-8d19125c693b unbound from our chassis#033[00m
Jan 20 09:57:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:06.856 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 87caaa2e-d899-4eed-8b6a-8d19125c693b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 09:57:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:06.858 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e74ffd80-0325-431f-9630-1e06287569e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:06 np0005588919 nova_compute[225855]: 2026-01-20 14:57:06.861 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:06 np0005588919 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000086.scope: Deactivated successfully.
Jan 20 09:57:06 np0005588919 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000086.scope: Consumed 13.681s CPU time.
Jan 20 09:57:06 np0005588919 systemd-machined[194361]: Machine qemu-63-instance-00000086 terminated.
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.085 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.089 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.101 225859 INFO nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.107 225859 INFO nova.virt.libvirt.driver [-] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Instance destroyed successfully.#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.107 225859 DEBUG nova.objects.instance [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'numa_topology' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.165 225859 DEBUG nova.compute.manager [req-1b828cd4-e917-4e7f-a96c-761f07ad436c req-5e39838f-59bd-4fcc-891e-b8ef3d113d04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-vif-unplugged-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.165 225859 DEBUG oslo_concurrency.lockutils [req-1b828cd4-e917-4e7f-a96c-761f07ad436c req-5e39838f-59bd-4fcc-891e-b8ef3d113d04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.165 225859 DEBUG oslo_concurrency.lockutils [req-1b828cd4-e917-4e7f-a96c-761f07ad436c req-5e39838f-59bd-4fcc-891e-b8ef3d113d04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.166 225859 DEBUG oslo_concurrency.lockutils [req-1b828cd4-e917-4e7f-a96c-761f07ad436c req-5e39838f-59bd-4fcc-891e-b8ef3d113d04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.166 225859 DEBUG nova.compute.manager [req-1b828cd4-e917-4e7f-a96c-761f07ad436c req-5e39838f-59bd-4fcc-891e-b8ef3d113d04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] No waiting events found dispatching network-vif-unplugged-6550efe7-7235-437c-b9f3-728b676371ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.166 225859 WARNING nova.compute.manager [req-1b828cd4-e917-4e7f-a96c-761f07ad436c req-5e39838f-59bd-4fcc-891e-b8ef3d113d04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received unexpected event network-vif-unplugged-6550efe7-7235-437c-b9f3-728b676371ee for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:57:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:07.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.485 225859 INFO nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Attempting rescue#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.486 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.489 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.490 225859 INFO nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Creating image(s)#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.515 225859 DEBUG nova.storage.rbd_utils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.519 225859 DEBUG nova.objects.instance [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.583 225859 DEBUG nova.storage.rbd_utils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.608 225859 DEBUG nova.storage.rbd_utils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.613 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.673 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.675 225859 DEBUG oslo_concurrency.lockutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.675 225859 DEBUG oslo_concurrency.lockutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.676 225859 DEBUG oslo_concurrency.lockutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.700 225859 DEBUG nova.storage.rbd_utils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.704 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.997 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.294s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:07 np0005588919 nova_compute[225855]: 2026-01-20 14:57:07.998 225859 DEBUG nova.objects.instance [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'migration_context' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.013 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.014 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Start _get_guest_xml network_info=[{"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "vif_mac": "fa:16:3e:e3:4f:ce"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.014 225859 DEBUG nova.objects.instance [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'resources' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.032 225859 WARNING nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.037 225859 DEBUG nova.virt.libvirt.host [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.038 225859 DEBUG nova.virt.libvirt.host [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.041 225859 DEBUG nova.virt.libvirt.host [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.041 225859 DEBUG nova.virt.libvirt.host [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.042 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.042 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.043 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.043 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.043 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.043 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.044 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.044 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.044 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.044 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.044 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.045 225859 DEBUG nova.virt.hardware [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.045 225859 DEBUG nova.objects.instance [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.065 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:08.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1167924893' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.511 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.512 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1449155061' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.959 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:08 np0005588919 nova_compute[225855]: 2026-01-20 14:57:08.961 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e313 e313: 3 total, 3 up, 3 in
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.327 225859 DEBUG nova.compute.manager [req-d0791654-f534-423b-9e22-2cf86c20a94e req-1f6df6af-5915-476e-b99c-149056b6cdc7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.328 225859 DEBUG oslo_concurrency.lockutils [req-d0791654-f534-423b-9e22-2cf86c20a94e req-1f6df6af-5915-476e-b99c-149056b6cdc7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.328 225859 DEBUG oslo_concurrency.lockutils [req-d0791654-f534-423b-9e22-2cf86c20a94e req-1f6df6af-5915-476e-b99c-149056b6cdc7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.328 225859 DEBUG oslo_concurrency.lockutils [req-d0791654-f534-423b-9e22-2cf86c20a94e req-1f6df6af-5915-476e-b99c-149056b6cdc7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.329 225859 DEBUG nova.compute.manager [req-d0791654-f534-423b-9e22-2cf86c20a94e req-1f6df6af-5915-476e-b99c-149056b6cdc7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] No waiting events found dispatching network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.329 225859 WARNING nova.compute.manager [req-d0791654-f534-423b-9e22-2cf86c20a94e req-1f6df6af-5915-476e-b99c-149056b6cdc7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received unexpected event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 09:57:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4079294048' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.396 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.398 225859 DEBUG nova.virt.libvirt.vif [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-430397789',display_name='tempest-ServerRescueTestJSONUnderV235-server-430397789',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-430397789',id=134,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:56:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d4e37f4fd7f4dbbb25648ec639e0e43',ramdisk_id='',reservation_id='r-xjy771y2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-201664875',owner_user_name='tempest-ServerRescueTestJSONUnderV235-201664875-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:56:50Z,user_data=None,user_id='168ca7898b964a44b76c90912fa89a66',uuid=538fe1f0-b666-4b97-b2ef-317adae0a47a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "vif_mac": "fa:16:3e:e3:4f:ce"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.398 225859 DEBUG nova.network.os_vif_util [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Converting VIF {"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "vif_mac": "fa:16:3e:e3:4f:ce"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.399 225859 DEBUG nova.network.os_vif_util [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e3:4f:ce,bridge_name='br-int',has_traffic_filtering=True,id=6550efe7-7235-437c-b9f3-728b676371ee,network=Network(87caaa2e-d899-4eed-8b6a-8d19125c693b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6550efe7-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.400 225859 DEBUG nova.objects.instance [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'pci_devices' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.413 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <uuid>538fe1f0-b666-4b97-b2ef-317adae0a47a</uuid>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <name>instance-00000086</name>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-430397789</nova:name>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:57:08</nova:creationTime>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <nova:user uuid="168ca7898b964a44b76c90912fa89a66">tempest-ServerRescueTestJSONUnderV235-201664875-project-member</nova:user>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <nova:project uuid="4d4e37f4fd7f4dbbb25648ec639e0e43">tempest-ServerRescueTestJSONUnderV235-201664875</nova:project>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <nova:port uuid="6550efe7-7235-437c-b9f3-728b676371ee">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <entry name="serial">538fe1f0-b666-4b97-b2ef-317adae0a47a</entry>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <entry name="uuid">538fe1f0-b666-4b97-b2ef-317adae0a47a</entry>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.rescue">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/538fe1f0-b666-4b97-b2ef-317adae0a47a_disk">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <target dev="vdb" bus="virtio"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config.rescue">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:e3:4f:ce"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <target dev="tap6550efe7-72"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/console.log" append="off"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:57:09 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:57:09 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:57:09 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:57:09 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.422 225859 INFO nova.virt.libvirt.driver [-] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Instance destroyed successfully.#033[00m
Jan 20 09:57:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:09.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.475 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.476 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.476 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.476 225859 DEBUG nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] No VIF found with MAC fa:16:3e:e3:4f:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.476 225859 INFO nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Using config drive#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.504 225859 DEBUG nova.storage.rbd_utils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.521 225859 DEBUG nova.objects.instance [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.561 225859 DEBUG nova.objects.instance [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'keypairs' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.716 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921014.7143466, 3426109c-5671-4cc7-89b6-fea13983f921 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.717 225859 INFO nova.compute.manager [-] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.732 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:09 np0005588919 nova_compute[225855]: 2026-01-20 14:57:09.737 225859 DEBUG nova.compute.manager [None req-5fa53512-648d-49fd-8780-c95b446ec4b7 - - - - - -] [instance: 3426109c-5671-4cc7-89b6-fea13983f921] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:10.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e314 e314: 3 total, 3 up, 3 in
Jan 20 09:57:10 np0005588919 nova_compute[225855]: 2026-01-20 14:57:10.212 225859 INFO nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Creating config drive at /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config.rescue#033[00m
Jan 20 09:57:10 np0005588919 nova_compute[225855]: 2026-01-20 14:57:10.217 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdl7b9i4x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:10 np0005588919 nova_compute[225855]: 2026-01-20 14:57:10.354 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdl7b9i4x" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:10 np0005588919 nova_compute[225855]: 2026-01-20 14:57:10.395 225859 DEBUG nova.storage.rbd_utils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] rbd image 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:10 np0005588919 nova_compute[225855]: 2026-01-20 14:57:10.399 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config.rescue 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:10 np0005588919 nova_compute[225855]: 2026-01-20 14:57:10.721 225859 DEBUG oslo_concurrency.processutils [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config.rescue 538fe1f0-b666-4b97-b2ef-317adae0a47a_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.322s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:10 np0005588919 nova_compute[225855]: 2026-01-20 14:57:10.722 225859 INFO nova.virt.libvirt.driver [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Deleting local config drive /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a/disk.config.rescue because it was imported into RBD.#033[00m
Jan 20 09:57:10 np0005588919 kernel: tap6550efe7-72: entered promiscuous mode
Jan 20 09:57:10 np0005588919 NetworkManager[49104]: <info>  [1768921030.7846] manager: (tap6550efe7-72): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Jan 20 09:57:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:10Z|00547|binding|INFO|Claiming lport 6550efe7-7235-437c-b9f3-728b676371ee for this chassis.
Jan 20 09:57:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:10Z|00548|binding|INFO|6550efe7-7235-437c-b9f3-728b676371ee: Claiming fa:16:3e:e3:4f:ce 10.100.0.3
Jan 20 09:57:10 np0005588919 nova_compute[225855]: 2026-01-20 14:57:10.786 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:10.794 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:4f:ce 10.100.0.3'], port_security=['fa:16:3e:e3:4f:ce 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '538fe1f0-b666-4b97-b2ef-317adae0a47a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87caaa2e-d899-4eed-8b6a-8d19125c693b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d4e37f4fd7f4dbbb25648ec639e0e43', 'neutron:revision_number': '5', 'neutron:security_group_ids': '16f3c0d4-753e-4c8b-b00a-7073cbcfa6dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be68bfdf-b1f2-46c8-82b2-2c275774a706, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6550efe7-7235-437c-b9f3-728b676371ee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:10.795 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6550efe7-7235-437c-b9f3-728b676371ee in datapath 87caaa2e-d899-4eed-8b6a-8d19125c693b bound to our chassis#033[00m
Jan 20 09:57:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:10.796 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 87caaa2e-d899-4eed-8b6a-8d19125c693b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 09:57:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:10.797 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[32888d90-943a-4fa6-a682-c2451340a758]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:10Z|00549|binding|INFO|Setting lport 6550efe7-7235-437c-b9f3-728b676371ee up in Southbound
Jan 20 09:57:10 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:10Z|00550|binding|INFO|Setting lport 6550efe7-7235-437c-b9f3-728b676371ee ovn-installed in OVS
Jan 20 09:57:10 np0005588919 nova_compute[225855]: 2026-01-20 14:57:10.809 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:10 np0005588919 nova_compute[225855]: 2026-01-20 14:57:10.814 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:10 np0005588919 systemd-udevd[279297]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:57:10 np0005588919 systemd-machined[194361]: New machine qemu-65-instance-00000086.
Jan 20 09:57:10 np0005588919 NetworkManager[49104]: <info>  [1768921030.8361] device (tap6550efe7-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:57:10 np0005588919 NetworkManager[49104]: <info>  [1768921030.8369] device (tap6550efe7-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:57:10 np0005588919 systemd[1]: Started Virtual Machine qemu-65-instance-00000086.
Jan 20 09:57:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e315 e315: 3 total, 3 up, 3 in
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.253 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.258 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 538fe1f0-b666-4b97-b2ef-317adae0a47a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.258 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921031.2578287, 538fe1f0-b666-4b97-b2ef-317adae0a47a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.259 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.263 225859 DEBUG nova.compute.manager [None req-7ceaa51e-5ae5-4572-8a7a-d5b83daa8d3a 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.298 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.301 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.352 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.353 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921031.258838, 538fe1f0-b666-4b97-b2ef-317adae0a47a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.353 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] VM Started (Lifecycle Event)#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.374 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.378 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.450 225859 DEBUG nova.compute.manager [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.450 225859 DEBUG oslo_concurrency.lockutils [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.451 225859 DEBUG oslo_concurrency.lockutils [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.451 225859 DEBUG oslo_concurrency.lockutils [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.451 225859 DEBUG nova.compute.manager [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] No waiting events found dispatching network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.451 225859 WARNING nova.compute.manager [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received unexpected event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee for instance with vm_state rescued and task_state None.#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.452 225859 DEBUG nova.compute.manager [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.452 225859 DEBUG oslo_concurrency.lockutils [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.452 225859 DEBUG oslo_concurrency.lockutils [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.453 225859 DEBUG oslo_concurrency.lockutils [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.453 225859 DEBUG nova.compute.manager [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] No waiting events found dispatching network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:11 np0005588919 nova_compute[225855]: 2026-01-20 14:57:11.453 225859 WARNING nova.compute.manager [req-841e0443-306c-46ab-9816-902cb5dc85c4 req-8377e3d0-3687-417d-b697-c2226b0e2fbd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received unexpected event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee for instance with vm_state rescued and task_state None.#033[00m
Jan 20 09:57:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:11.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:12.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:13 np0005588919 nova_compute[225855]: 2026-01-20 14:57:13.310 225859 DEBUG nova.compute.manager [req-a1088e4c-91c6-4fec-87ff-b6e6b2a182a1 req-9d56062e-3a50-4a93-808d-6c531ff0647b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-changed-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:13 np0005588919 nova_compute[225855]: 2026-01-20 14:57:13.310 225859 DEBUG nova.compute.manager [req-a1088e4c-91c6-4fec-87ff-b6e6b2a182a1 req-9d56062e-3a50-4a93-808d-6c531ff0647b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Refreshing instance network info cache due to event network-changed-6550efe7-7235-437c-b9f3-728b676371ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:57:13 np0005588919 nova_compute[225855]: 2026-01-20 14:57:13.310 225859 DEBUG oslo_concurrency.lockutils [req-a1088e4c-91c6-4fec-87ff-b6e6b2a182a1 req-9d56062e-3a50-4a93-808d-6c531ff0647b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:13 np0005588919 nova_compute[225855]: 2026-01-20 14:57:13.311 225859 DEBUG oslo_concurrency.lockutils [req-a1088e4c-91c6-4fec-87ff-b6e6b2a182a1 req-9d56062e-3a50-4a93-808d-6c531ff0647b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:13 np0005588919 nova_compute[225855]: 2026-01-20 14:57:13.311 225859 DEBUG nova.network.neutron [req-a1088e4c-91c6-4fec-87ff-b6e6b2a182a1 req-9d56062e-3a50-4a93-808d-6c531ff0647b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Refreshing network info cache for port 6550efe7-7235-437c-b9f3-728b676371ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:57:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:13.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:57:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1115444861' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:57:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:57:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1115444861' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:57:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:14.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:14 np0005588919 nova_compute[225855]: 2026-01-20 14:57:14.609 225859 DEBUG nova.compute.manager [req-1bbe517c-a35b-4cba-ab38-ff961cf7497c req-2f512ad2-cbe5-418e-b8fd-28913ee6540f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-changed-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:14 np0005588919 nova_compute[225855]: 2026-01-20 14:57:14.609 225859 DEBUG nova.compute.manager [req-1bbe517c-a35b-4cba-ab38-ff961cf7497c req-2f512ad2-cbe5-418e-b8fd-28913ee6540f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Refreshing instance network info cache due to event network-changed-6550efe7-7235-437c-b9f3-728b676371ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:57:14 np0005588919 nova_compute[225855]: 2026-01-20 14:57:14.610 225859 DEBUG oslo_concurrency.lockutils [req-1bbe517c-a35b-4cba-ab38-ff961cf7497c req-2f512ad2-cbe5-418e-b8fd-28913ee6540f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:14 np0005588919 nova_compute[225855]: 2026-01-20 14:57:14.681 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921019.6797974, e32ecf59-145a-4ae9-a91e-288419407cd0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:14 np0005588919 nova_compute[225855]: 2026-01-20 14:57:14.681 225859 INFO nova.compute.manager [-] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:57:14 np0005588919 nova_compute[225855]: 2026-01-20 14:57:14.700 225859 DEBUG nova.compute.manager [None req-e2df0b17-f31b-430c-9f6f-ad4fd589e421 - - - - - -] [instance: e32ecf59-145a-4ae9-a91e-288419407cd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:14 np0005588919 nova_compute[225855]: 2026-01-20 14:57:14.734 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:15.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:16 np0005588919 podman[279419]: 2026-01-20 14:57:16.048163115 +0000 UTC m=+0.096572365 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 20 09:57:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:16.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:16 np0005588919 nova_compute[225855]: 2026-01-20 14:57:16.190 225859 DEBUG nova.network.neutron [req-a1088e4c-91c6-4fec-87ff-b6e6b2a182a1 req-9d56062e-3a50-4a93-808d-6c531ff0647b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updated VIF entry in instance network info cache for port 6550efe7-7235-437c-b9f3-728b676371ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:57:16 np0005588919 nova_compute[225855]: 2026-01-20 14:57:16.191 225859 DEBUG nova.network.neutron [req-a1088e4c-91c6-4fec-87ff-b6e6b2a182a1 req-9d56062e-3a50-4a93-808d-6c531ff0647b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updating instance_info_cache with network_info: [{"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:16 np0005588919 nova_compute[225855]: 2026-01-20 14:57:16.204 225859 DEBUG oslo_concurrency.lockutils [req-a1088e4c-91c6-4fec-87ff-b6e6b2a182a1 req-9d56062e-3a50-4a93-808d-6c531ff0647b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:16 np0005588919 nova_compute[225855]: 2026-01-20 14:57:16.205 225859 DEBUG oslo_concurrency.lockutils [req-1bbe517c-a35b-4cba-ab38-ff961cf7497c req-2f512ad2-cbe5-418e-b8fd-28913ee6540f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:16 np0005588919 nova_compute[225855]: 2026-01-20 14:57:16.205 225859 DEBUG nova.network.neutron [req-1bbe517c-a35b-4cba-ab38-ff961cf7497c req-2f512ad2-cbe5-418e-b8fd-28913ee6540f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Refreshing network info cache for port 6550efe7-7235-437c-b9f3-728b676371ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:57:16 np0005588919 nova_compute[225855]: 2026-01-20 14:57:16.254 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:16.416 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:16.417 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:16.417 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:17.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:18.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e316 e316: 3 total, 3 up, 3 in
Jan 20 09:57:18 np0005588919 nova_compute[225855]: 2026-01-20 14:57:18.443 225859 DEBUG nova.network.neutron [req-1bbe517c-a35b-4cba-ab38-ff961cf7497c req-2f512ad2-cbe5-418e-b8fd-28913ee6540f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updated VIF entry in instance network info cache for port 6550efe7-7235-437c-b9f3-728b676371ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:57:18 np0005588919 nova_compute[225855]: 2026-01-20 14:57:18.443 225859 DEBUG nova.network.neutron [req-1bbe517c-a35b-4cba-ab38-ff961cf7497c req-2f512ad2-cbe5-418e-b8fd-28913ee6540f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updating instance_info_cache with network_info: [{"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:18 np0005588919 nova_compute[225855]: 2026-01-20 14:57:18.466 225859 DEBUG oslo_concurrency.lockutils [req-1bbe517c-a35b-4cba-ab38-ff961cf7497c req-2f512ad2-cbe5-418e-b8fd-28913ee6540f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:19.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:19 np0005588919 NetworkManager[49104]: <info>  [1768921039.4920] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Jan 20 09:57:19 np0005588919 NetworkManager[49104]: <info>  [1768921039.4931] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Jan 20 09:57:19 np0005588919 nova_compute[225855]: 2026-01-20 14:57:19.491 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:19 np0005588919 nova_compute[225855]: 2026-01-20 14:57:19.692 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:19 np0005588919 nova_compute[225855]: 2026-01-20 14:57:19.715 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:19 np0005588919 nova_compute[225855]: 2026-01-20 14:57:19.735 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e317 e317: 3 total, 3 up, 3 in
Jan 20 09:57:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:20.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:21 np0005588919 nova_compute[225855]: 2026-01-20 14:57:21.251 225859 DEBUG nova.compute.manager [req-87b9a3a5-4e98-4201-9755-e486bd6571dc req-c541d0da-fda3-4442-a775-ba4641e234f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-changed-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:21 np0005588919 nova_compute[225855]: 2026-01-20 14:57:21.251 225859 DEBUG nova.compute.manager [req-87b9a3a5-4e98-4201-9755-e486bd6571dc req-c541d0da-fda3-4442-a775-ba4641e234f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Refreshing instance network info cache due to event network-changed-6550efe7-7235-437c-b9f3-728b676371ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:57:21 np0005588919 nova_compute[225855]: 2026-01-20 14:57:21.251 225859 DEBUG oslo_concurrency.lockutils [req-87b9a3a5-4e98-4201-9755-e486bd6571dc req-c541d0da-fda3-4442-a775-ba4641e234f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:21 np0005588919 nova_compute[225855]: 2026-01-20 14:57:21.252 225859 DEBUG oslo_concurrency.lockutils [req-87b9a3a5-4e98-4201-9755-e486bd6571dc req-c541d0da-fda3-4442-a775-ba4641e234f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:21 np0005588919 nova_compute[225855]: 2026-01-20 14:57:21.252 225859 DEBUG nova.network.neutron [req-87b9a3a5-4e98-4201-9755-e486bd6571dc req-c541d0da-fda3-4442-a775-ba4641e234f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Refreshing network info cache for port 6550efe7-7235-437c-b9f3-728b676371ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:57:21 np0005588919 nova_compute[225855]: 2026-01-20 14:57:21.257 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:21.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:22.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:57:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 43K writes, 171K keys, 43K commit groups, 1.0 writes per commit group, ingest: 0.16 GB, 0.05 MB/s#012Cumulative WAL: 43K writes, 15K syncs, 2.75 writes per sync, written: 0.16 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 12K writes, 47K keys, 12K commit groups, 1.0 writes per commit group, ingest: 46.56 MB, 0.08 MB/s#012Interval WAL: 12K writes, 4997 syncs, 2.47 writes per sync, written: 0.05 GB, 0.08 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 09:57:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e317 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:23 np0005588919 nova_compute[225855]: 2026-01-20 14:57:23.214 225859 DEBUG nova.network.neutron [req-87b9a3a5-4e98-4201-9755-e486bd6571dc req-c541d0da-fda3-4442-a775-ba4641e234f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updated VIF entry in instance network info cache for port 6550efe7-7235-437c-b9f3-728b676371ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:57:23 np0005588919 nova_compute[225855]: 2026-01-20 14:57:23.214 225859 DEBUG nova.network.neutron [req-87b9a3a5-4e98-4201-9755-e486bd6571dc req-c541d0da-fda3-4442-a775-ba4641e234f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updating instance_info_cache with network_info: [{"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:23 np0005588919 nova_compute[225855]: 2026-01-20 14:57:23.241 225859 DEBUG oslo_concurrency.lockutils [req-87b9a3a5-4e98-4201-9755-e486bd6571dc req-c541d0da-fda3-4442-a775-ba4641e234f0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:23.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:24.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:24 np0005588919 nova_compute[225855]: 2026-01-20 14:57:24.737 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:25.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:25 np0005588919 nova_compute[225855]: 2026-01-20 14:57:25.500 225859 DEBUG nova.compute.manager [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-changed-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:25 np0005588919 nova_compute[225855]: 2026-01-20 14:57:25.500 225859 DEBUG nova.compute.manager [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Refreshing instance network info cache due to event network-changed-6550efe7-7235-437c-b9f3-728b676371ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:57:25 np0005588919 nova_compute[225855]: 2026-01-20 14:57:25.500 225859 DEBUG oslo_concurrency.lockutils [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:25 np0005588919 nova_compute[225855]: 2026-01-20 14:57:25.501 225859 DEBUG oslo_concurrency.lockutils [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:25 np0005588919 nova_compute[225855]: 2026-01-20 14:57:25.501 225859 DEBUG nova.network.neutron [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Refreshing network info cache for port 6550efe7-7235-437c-b9f3-728b676371ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:57:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:26.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:26 np0005588919 nova_compute[225855]: 2026-01-20 14:57:26.258 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:27.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e317 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:28 np0005588919 podman[279454]: 2026-01-20 14:57:28.083031161 +0000 UTC m=+0.110338374 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 09:57:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:28.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:28 np0005588919 nova_compute[225855]: 2026-01-20 14:57:28.412 225859 DEBUG nova.network.neutron [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updated VIF entry in instance network info cache for port 6550efe7-7235-437c-b9f3-728b676371ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:57:28 np0005588919 nova_compute[225855]: 2026-01-20 14:57:28.413 225859 DEBUG nova.network.neutron [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updating instance_info_cache with network_info: [{"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:28 np0005588919 nova_compute[225855]: 2026-01-20 14:57:28.456 225859 DEBUG oslo_concurrency.lockutils [req-f4718a15-cde7-4e38-ae03-2cfac926ba5b req-a94a2c68-9914-47fb-89ae-617fc949dc96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-538fe1f0-b666-4b97-b2ef-317adae0a47a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:28 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [P] New memtable created with log file: #51. Immutable memtables: 0.
Jan 20 09:57:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:29.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.738 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.758 225859 DEBUG oslo_concurrency.lockutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.759 225859 DEBUG oslo_concurrency.lockutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.759 225859 DEBUG oslo_concurrency.lockutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.759 225859 DEBUG oslo_concurrency.lockutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.759 225859 DEBUG oslo_concurrency.lockutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.760 225859 INFO nova.compute.manager [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Terminating instance#033[00m
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.761 225859 DEBUG nova.compute.manager [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:57:29 np0005588919 kernel: tap6550efe7-72 (unregistering): left promiscuous mode
Jan 20 09:57:29 np0005588919 NetworkManager[49104]: <info>  [1768921049.8282] device (tap6550efe7-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.874 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:29Z|00551|binding|INFO|Releasing lport 6550efe7-7235-437c-b9f3-728b676371ee from this chassis (sb_readonly=0)
Jan 20 09:57:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:29Z|00552|binding|INFO|Setting lport 6550efe7-7235-437c-b9f3-728b676371ee down in Southbound
Jan 20 09:57:29 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:29Z|00553|binding|INFO|Removing iface tap6550efe7-72 ovn-installed in OVS
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.876 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:29.881 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:4f:ce 10.100.0.3'], port_security=['fa:16:3e:e3:4f:ce 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '538fe1f0-b666-4b97-b2ef-317adae0a47a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87caaa2e-d899-4eed-8b6a-8d19125c693b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d4e37f4fd7f4dbbb25648ec639e0e43', 'neutron:revision_number': '8', 'neutron:security_group_ids': '16f3c0d4-753e-4c8b-b00a-7073cbcfa6dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be68bfdf-b1f2-46c8-82b2-2c275774a706, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6550efe7-7235-437c-b9f3-728b676371ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:29.883 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6550efe7-7235-437c-b9f3-728b676371ee in datapath 87caaa2e-d899-4eed-8b6a-8d19125c693b unbound from our chassis#033[00m
Jan 20 09:57:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:29.883 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 87caaa2e-d899-4eed-8b6a-8d19125c693b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 09:57:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:29.884 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0b99a7f5-be60-487e-b2a6-de3cf20ee086]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:29 np0005588919 nova_compute[225855]: 2026-01-20 14:57:29.893 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:29 np0005588919 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000086.scope: Deactivated successfully.
Jan 20 09:57:29 np0005588919 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000086.scope: Consumed 13.460s CPU time.
Jan 20 09:57:29 np0005588919 systemd-machined[194361]: Machine qemu-65-instance-00000086 terminated.
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.005 225859 INFO nova.virt.libvirt.driver [-] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Instance destroyed successfully.#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.006 225859 DEBUG nova.objects.instance [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lazy-loading 'resources' on Instance uuid 538fe1f0-b666-4b97-b2ef-317adae0a47a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.016 225859 DEBUG nova.virt.libvirt.vif [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-430397789',display_name='tempest-ServerRescueTestJSONUnderV235-server-430397789',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-430397789',id=134,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:57:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d4e37f4fd7f4dbbb25648ec639e0e43',ramdisk_id='',reservation_id='r-xjy771y2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-201664875',owner_user_name='tempest-ServerRescueTestJSONUnderV235-201664875-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:57:11Z,user_data=None,user_id='168ca7898b964a44b76c90912fa89a66',uuid=538fe1f0-b666-4b97-b2ef-317adae0a47a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.017 225859 DEBUG nova.network.os_vif_util [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Converting VIF {"id": "6550efe7-7235-437c-b9f3-728b676371ee", "address": "fa:16:3e:e3:4f:ce", "network": {"id": "87caaa2e-d899-4eed-8b6a-8d19125c693b", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-347508957-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4d4e37f4fd7f4dbbb25648ec639e0e43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6550efe7-72", "ovs_interfaceid": "6550efe7-7235-437c-b9f3-728b676371ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.017 225859 DEBUG nova.network.os_vif_util [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e3:4f:ce,bridge_name='br-int',has_traffic_filtering=True,id=6550efe7-7235-437c-b9f3-728b676371ee,network=Network(87caaa2e-d899-4eed-8b6a-8d19125c693b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6550efe7-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.018 225859 DEBUG os_vif [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:4f:ce,bridge_name='br-int',has_traffic_filtering=True,id=6550efe7-7235-437c-b9f3-728b676371ee,network=Network(87caaa2e-d899-4eed-8b6a-8d19125c693b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6550efe7-72') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.021 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.021 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6550efe7-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.023 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.025 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.027 225859 INFO os_vif [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:4f:ce,bridge_name='br-int',has_traffic_filtering=True,id=6550efe7-7235-437c-b9f3-728b676371ee,network=Network(87caaa2e-d899-4eed-8b6a-8d19125c693b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6550efe7-72')#033[00m
Jan 20 09:57:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2654572084' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:30.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.377 225859 DEBUG nova.compute.manager [req-a9e68a66-8cd2-4b84-945f-3935fada19a2 req-4b090397-1a12-4fc5-a7da-f2e42b79bde1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-vif-unplugged-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.378 225859 DEBUG oslo_concurrency.lockutils [req-a9e68a66-8cd2-4b84-945f-3935fada19a2 req-4b090397-1a12-4fc5-a7da-f2e42b79bde1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.378 225859 DEBUG oslo_concurrency.lockutils [req-a9e68a66-8cd2-4b84-945f-3935fada19a2 req-4b090397-1a12-4fc5-a7da-f2e42b79bde1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.379 225859 DEBUG oslo_concurrency.lockutils [req-a9e68a66-8cd2-4b84-945f-3935fada19a2 req-4b090397-1a12-4fc5-a7da-f2e42b79bde1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.379 225859 DEBUG nova.compute.manager [req-a9e68a66-8cd2-4b84-945f-3935fada19a2 req-4b090397-1a12-4fc5-a7da-f2e42b79bde1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] No waiting events found dispatching network-vif-unplugged-6550efe7-7235-437c-b9f3-728b676371ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:30 np0005588919 nova_compute[225855]: 2026-01-20 14:57:30.379 225859 DEBUG nova.compute.manager [req-a9e68a66-8cd2-4b84-945f-3935fada19a2 req-4b090397-1a12-4fc5-a7da-f2e42b79bde1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-vif-unplugged-6550efe7-7235-437c-b9f3-728b676371ee for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:57:31 np0005588919 nova_compute[225855]: 2026-01-20 14:57:31.005 225859 INFO nova.virt.libvirt.driver [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Deleting instance files /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a_del#033[00m
Jan 20 09:57:31 np0005588919 nova_compute[225855]: 2026-01-20 14:57:31.006 225859 INFO nova.virt.libvirt.driver [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Deletion of /var/lib/nova/instances/538fe1f0-b666-4b97-b2ef-317adae0a47a_del complete#033[00m
Jan 20 09:57:31 np0005588919 nova_compute[225855]: 2026-01-20 14:57:31.073 225859 INFO nova.compute.manager [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Took 1.31 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:57:31 np0005588919 nova_compute[225855]: 2026-01-20 14:57:31.074 225859 DEBUG oslo.service.loopingcall [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:57:31 np0005588919 nova_compute[225855]: 2026-01-20 14:57:31.075 225859 DEBUG nova.compute.manager [-] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:57:31 np0005588919 nova_compute[225855]: 2026-01-20 14:57:31.075 225859 DEBUG nova.network.neutron [-] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:57:31 np0005588919 nova_compute[225855]: 2026-01-20 14:57:31.261 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:31.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:32.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.381 225859 DEBUG nova.network.neutron [-] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.403 225859 INFO nova.compute.manager [-] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Took 1.33 seconds to deallocate network for instance.#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.448 225859 DEBUG oslo_concurrency.lockutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.448 225859 DEBUG oslo_concurrency.lockutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.476 225859 DEBUG nova.compute.manager [req-c69a41d3-e66d-473b-96f6-c004ee7ad447 req-4b83c1a8-f345-4a8c-a0af-12a187c28569 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.476 225859 DEBUG oslo_concurrency.lockutils [req-c69a41d3-e66d-473b-96f6-c004ee7ad447 req-4b83c1a8-f345-4a8c-a0af-12a187c28569 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.476 225859 DEBUG oslo_concurrency.lockutils [req-c69a41d3-e66d-473b-96f6-c004ee7ad447 req-4b83c1a8-f345-4a8c-a0af-12a187c28569 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.477 225859 DEBUG oslo_concurrency.lockutils [req-c69a41d3-e66d-473b-96f6-c004ee7ad447 req-4b83c1a8-f345-4a8c-a0af-12a187c28569 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.477 225859 DEBUG nova.compute.manager [req-c69a41d3-e66d-473b-96f6-c004ee7ad447 req-4b83c1a8-f345-4a8c-a0af-12a187c28569 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] No waiting events found dispatching network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.477 225859 WARNING nova.compute.manager [req-c69a41d3-e66d-473b-96f6-c004ee7ad447 req-4b83c1a8-f345-4a8c-a0af-12a187c28569 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received unexpected event network-vif-plugged-6550efe7-7235-437c-b9f3-728b676371ee for instance with vm_state deleted and task_state None.#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.496 225859 DEBUG oslo_concurrency.processutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e317 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:32 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1816590284' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.953 225859 DEBUG oslo_concurrency.processutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.959 225859 DEBUG nova.compute.provider_tree [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:32 np0005588919 nova_compute[225855]: 2026-01-20 14:57:32.981 225859 DEBUG nova.scheduler.client.report [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:33 np0005588919 nova_compute[225855]: 2026-01-20 14:57:33.023 225859 DEBUG oslo_concurrency.lockutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:33 np0005588919 nova_compute[225855]: 2026-01-20 14:57:33.058 225859 INFO nova.scheduler.client.report [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Deleted allocations for instance 538fe1f0-b666-4b97-b2ef-317adae0a47a#033[00m
Jan 20 09:57:33 np0005588919 nova_compute[225855]: 2026-01-20 14:57:33.180 225859 DEBUG oslo_concurrency.lockutils [None req-c0990b97-4a4d-4801-b9ca-ab15bce827dc 168ca7898b964a44b76c90912fa89a66 4d4e37f4fd7f4dbbb25648ec639e0e43 - - default default] Lock "538fe1f0-b666-4b97-b2ef-317adae0a47a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e318 e318: 3 total, 3 up, 3 in
Jan 20 09:57:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:33.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:33 np0005588919 nova_compute[225855]: 2026-01-20 14:57:33.908 225859 DEBUG nova.compute.manager [req-559b044e-78df-473b-85df-b624d1fd284a req-12db5d70-896f-4bcc-9a31-8a8e532fc4e6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Received event network-vif-deleted-6550efe7-7235-437c-b9f3-728b676371ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:34.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:35 np0005588919 nova_compute[225855]: 2026-01-20 14:57:35.024 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:35.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:57:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:57:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/759606216' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:36.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:36 np0005588919 nova_compute[225855]: 2026-01-20 14:57:36.262 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:36 np0005588919 nova_compute[225855]: 2026-01-20 14:57:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:37 np0005588919 nova_compute[225855]: 2026-01-20 14:57:37.339 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:37 np0005588919 nova_compute[225855]: 2026-01-20 14:57:37.339 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:37 np0005588919 nova_compute[225855]: 2026-01-20 14:57:37.360 225859 DEBUG nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:57:37 np0005588919 nova_compute[225855]: 2026-01-20 14:57:37.442 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:37 np0005588919 nova_compute[225855]: 2026-01-20 14:57:37.442 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:37 np0005588919 nova_compute[225855]: 2026-01-20 14:57:37.448 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:57:37 np0005588919 nova_compute[225855]: 2026-01-20 14:57:37.449 225859 INFO nova.compute.claims [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:57:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:37.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:37.538 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:37.539 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:57:37 np0005588919 nova_compute[225855]: 2026-01-20 14:57:37.541 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:37 np0005588919 nova_compute[225855]: 2026-01-20 14:57:37.600 225859 DEBUG oslo_concurrency.processutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e319 e319: 3 total, 3 up, 3 in
Jan 20 09:57:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4005775922' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.053 225859 DEBUG oslo_concurrency.processutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.061 225859 DEBUG nova.compute.provider_tree [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.082 225859 DEBUG nova.scheduler.client.report [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.106 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.664s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.107 225859 DEBUG nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:57:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:38.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.155 225859 DEBUG nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.155 225859 DEBUG nova.network.neutron [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.173 225859 INFO nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.200 225859 DEBUG nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.247 225859 INFO nova.virt.block_device [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Booting with volume 5728e8f8-a711-41d5-aa04-a1d9faada8d9 at /dev/vda#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.367 225859 DEBUG nova.policy [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ed2c9bd268d1491fa3484d86bcdb9ec6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '107c1f3b5b7b413d9a389ca1166e331f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.440 225859 DEBUG os_brick.utils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.442 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.453 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.454 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[88dcf51e-1c54-4265-ba65-c0c0bf856399]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.455 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.463 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.463 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[c67e2fa4-47b1-49a0-920e-824fb90f414f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.464 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.473 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.473 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[42682ae7-2e00-4760-a46e-11643b0a2530]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.475 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[3387c024-4eeb-491d-bf3a-f594e7dc4a13]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.475 225859 DEBUG oslo_concurrency.processutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.499 225859 DEBUG oslo_concurrency.processutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.502 225859 DEBUG os_brick.initiator.connectors.lightos [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.502 225859 DEBUG os_brick.initiator.connectors.lightos [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.502 225859 DEBUG os_brick.initiator.connectors.lightos [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.503 225859 DEBUG os_brick.utils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] <== get_connector_properties: return (61ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:57:38 np0005588919 nova_compute[225855]: 2026-01-20 14:57:38.503 225859 DEBUG nova.virt.block_device [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating existing volume attachment record: 25d46c6d-0955-42e9-9edd-2c90ded91a6c _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:57:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:57:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:39.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:57:39 np0005588919 nova_compute[225855]: 2026-01-20 14:57:39.546 225859 DEBUG nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:57:39 np0005588919 nova_compute[225855]: 2026-01-20 14:57:39.548 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:57:39 np0005588919 nova_compute[225855]: 2026-01-20 14:57:39.548 225859 INFO nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Creating image(s)#033[00m
Jan 20 09:57:39 np0005588919 nova_compute[225855]: 2026-01-20 14:57:39.549 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:57:39 np0005588919 nova_compute[225855]: 2026-01-20 14:57:39.549 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Ensure instance console log exists: /var/lib/nova/instances/1ebdefed-0903-4d72-b78d-912666c5ce61/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:57:39 np0005588919 nova_compute[225855]: 2026-01-20 14:57:39.549 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:39 np0005588919 nova_compute[225855]: 2026-01-20 14:57:39.550 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:39 np0005588919 nova_compute[225855]: 2026-01-20 14:57:39.550 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:39 np0005588919 nova_compute[225855]: 2026-01-20 14:57:39.883 225859 DEBUG nova.network.neutron [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Successfully created port: 3067803c-07f3-4a15-a5ee-47f9a770efca _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:57:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e320 e320: 3 total, 3 up, 3 in
Jan 20 09:57:39 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #100. Immutable memtables: 0.
Jan 20 09:57:39 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:39.991235) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 09:57:39 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 61] Flushing memtable with next log file: 100
Jan 20 09:57:39 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921059991392, "job": 61, "event": "flush_started", "num_memtables": 1, "num_entries": 1600, "num_deletes": 256, "total_data_size": 3286196, "memory_usage": 3359312, "flush_reason": "Manual Compaction"}
Jan 20 09:57:39 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 61] Level-0 flush table #101: started
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921060012444, "cf_name": "default", "job": 61, "event": "table_file_creation", "file_number": 101, "file_size": 2143187, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 50922, "largest_seqno": 52517, "table_properties": {"data_size": 2136383, "index_size": 3811, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 15753, "raw_average_key_size": 20, "raw_value_size": 2122188, "raw_average_value_size": 2822, "num_data_blocks": 166, "num_entries": 752, "num_filter_entries": 752, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768920955, "oldest_key_time": 1768920955, "file_creation_time": 1768921059, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 101, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 61] Flush lasted 21276 microseconds, and 5882 cpu microseconds.
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.012536) [db/flush_job.cc:967] [default] [JOB 61] Level-0 flush table #101: 2143187 bytes OK
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.012570) [db/memtable_list.cc:519] [default] Level-0 commit table #101 started
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.015460) [db/memtable_list.cc:722] [default] Level-0 commit table #101: memtable #1 done
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.015479) EVENT_LOG_v1 {"time_micros": 1768921060015474, "job": 61, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.015505) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 61] Try to delete WAL files size 3278640, prev total WAL file size 3278640, number of live WAL files 2.
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000097.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.016668) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034323637' seq:72057594037927935, type:22 .. '7061786F730034353139' seq:0, type:0; will stop at (end)
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 62] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 61 Base level 0, inputs: [101(2092KB)], [99(9849KB)]
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921060016707, "job": 62, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [101], "files_L6": [99], "score": -1, "input_data_size": 12229236, "oldest_snapshot_seqno": -1}
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.026 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:40.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 62] Generated table #102: 7779 keys, 10351315 bytes, temperature: kUnknown
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921060128189, "cf_name": "default", "job": 62, "event": "table_file_creation", "file_number": 102, "file_size": 10351315, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10300487, "index_size": 30300, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19461, "raw_key_size": 201302, "raw_average_key_size": 25, "raw_value_size": 10162771, "raw_average_value_size": 1306, "num_data_blocks": 1188, "num_entries": 7779, "num_filter_entries": 7779, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921060, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 102, "seqno_to_time_mapping": "N/A"}}
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.128401) [db/compaction/compaction_job.cc:1663] [default] [JOB 62] Compacted 1@0 + 1@6 files to L6 => 10351315 bytes
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.129916) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.6 rd, 92.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 9.6 +0.0 blob) out(9.9 +0.0 blob), read-write-amplify(10.5) write-amplify(4.8) OK, records in: 8312, records dropped: 533 output_compression: NoCompression
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.129933) EVENT_LOG_v1 {"time_micros": 1768921060129924, "job": 62, "event": "compaction_finished", "compaction_time_micros": 111541, "compaction_time_cpu_micros": 37984, "output_level": 6, "num_output_files": 1, "total_output_size": 10351315, "num_input_records": 8312, "num_output_records": 7779, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000101.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921060130432, "job": 62, "event": "table_file_deletion", "file_number": 101}
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000099.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921060132232, "job": 62, "event": "table_file_deletion", "file_number": 99}
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.016552) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.132314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.132322) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.132325) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.132328) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-14:57:40.132331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.360 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.360 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.612 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.612 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.632 225859 DEBUG nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.718 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.718 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.724 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.724 225859 INFO nova.compute.claims [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.839 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.931 225859 DEBUG oslo_concurrency.processutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:40 np0005588919 nova_compute[225855]: 2026-01-20 14:57:40.992 225859 DEBUG nova.network.neutron [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Successfully updated port: 3067803c-07f3-4a15-a5ee-47f9a770efca _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.009 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.010 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquired lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.010 225859 DEBUG nova.network.neutron [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.106 225859 DEBUG nova.compute.manager [req-14f5a19f-bf01-4364-9068-6d17f04bf8d0 req-12031be3-e681-499f-bd66-2c01dbc0736c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.107 225859 DEBUG nova.compute.manager [req-14f5a19f-bf01-4364-9068-6d17f04bf8d0 req-12031be3-e681-499f-bd66-2c01dbc0736c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing instance network info cache due to event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.108 225859 DEBUG oslo_concurrency.lockutils [req-14f5a19f-bf01-4364-9068-6d17f04bf8d0 req-12031be3-e681-499f-bd66-2c01dbc0736c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.157 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.202 225859 DEBUG nova.network.neutron [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.265 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:57:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:41 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/721238381' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.411 225859 DEBUG oslo_concurrency.processutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.420 225859 DEBUG nova.compute.provider_tree [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.442 225859 DEBUG nova.scheduler.client.report [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.466 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.468 225859 DEBUG nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 09:57:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:41.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.522 225859 DEBUG nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.523 225859 DEBUG nova.network.neutron [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 09:57:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:41.541 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.547 225859 INFO nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.569 225859 DEBUG nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.654 225859 INFO nova.virt.block_device [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Booting with volume 002e39e3-1bec-4033-aca2-f1428e495087 at /dev/vda#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.884 225859 DEBUG os_brick.utils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.886 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.908 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.909 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[a566225c-38dd-4ec3-b98a-48e62eface13]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.911 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.927 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.927 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d4938f-eeda-4ae2-903c-92eec5a25b5d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.929 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.943 225859 DEBUG nova.policy [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ed2c9bd268d1491fa3484d86bcdb9ec6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '107c1f3b5b7b413d9a389ca1166e331f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.942 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.943 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[653a7e03-824d-4060-80da-dea418d08767]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.947 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[52e22383-c219-4a16-bb5e-bd7a8b742f9a]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.948 225859 DEBUG oslo_concurrency.processutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.974 225859 DEBUG oslo_concurrency.processutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.977 225859 DEBUG os_brick.initiator.connectors.lightos [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.977 225859 DEBUG os_brick.initiator.connectors.lightos [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.977 225859 DEBUG os_brick.initiator.connectors.lightos [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.978 225859 DEBUG os_brick.utils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] <== get_connector_properties: return (92ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:57:41 np0005588919 nova_compute[225855]: 2026-01-20 14:57:41.978 225859 DEBUG nova.virt.block_device [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating existing volume attachment record: b8b8cc31-54c0-4f4d-80cc-6fca4e9cae9f _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:57:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:42.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.377 225859 DEBUG nova.network.neutron [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating instance_info_cache with network_info: [{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.398 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Releasing lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.399 225859 DEBUG nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Instance network_info: |[{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.399 225859 DEBUG oslo_concurrency.lockutils [req-14f5a19f-bf01-4364-9068-6d17f04bf8d0 req-12031be3-e681-499f-bd66-2c01dbc0736c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.400 225859 DEBUG nova.network.neutron [req-14f5a19f-bf01-4364-9068-6d17f04bf8d0 req-12031be3-e681-499f-bd66-2c01dbc0736c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.406 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Start _get_guest_xml network_info=[{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-5728e8f8-a711-41d5-aa04-a1d9faada8d9', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '5728e8f8-a711-41d5-aa04-a1d9faada8d9', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1ebdefed-0903-4d72-b78d-912666c5ce61', 'attached_at': '', 'detached_at': '', 'volume_id': '5728e8f8-a711-41d5-aa04-a1d9faada8d9', 'serial': '5728e8f8-a711-41d5-aa04-a1d9faada8d9'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '25d46c6d-0955-42e9-9edd-2c90ded91a6c', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.412 225859 WARNING nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.419 225859 DEBUG nova.virt.libvirt.host [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.420 225859 DEBUG nova.virt.libvirt.host [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.432 225859 DEBUG nova.virt.libvirt.host [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.433 225859 DEBUG nova.virt.libvirt.host [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.435 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.436 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.436 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.437 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.437 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.438 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.438 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.439 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.439 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.440 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.440 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.441 225859 DEBUG nova.virt.hardware [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.484 225859 DEBUG nova.storage.rbd_utils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] rbd image 1ebdefed-0903-4d72-b78d-912666c5ce61_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.491 225859 DEBUG oslo_concurrency.processutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e320 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/663477354' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2004004617' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:42 np0005588919 nova_compute[225855]: 2026-01-20 14:57:42.996 225859 DEBUG oslo_concurrency.processutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.046 225859 DEBUG nova.virt.libvirt.vif [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:57:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1983668831',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1983668831',id=139,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBbRGX2xZT3D1ftqdKpZwTwb/ukXbRv/O5UyYYLjii3gk46qsw4SNMi6p0GpNIY5l/f9OSIg9UlRsUFQqLszBoQT2vJic2iOBlI6VLyxyg71obcHOZQEGpjfcTfqUsJeQ==',key_name='tempest-TestInstancesWithCinderVolumes-1812188149',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='107c1f3b5b7b413d9a389ca1166e331f',ramdisk_id='',reservation_id='r-c4vqjrp4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-1174033615',owner_user_name='tempest-TestInstancesWithCinderVolumes-1174033615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:57:38Z,user_data=None,user_id='ed2c9bd268d1491fa3484d86bcdb9ec6',uuid=1ebdefed-0903-4d72-b78d-912666c5ce61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.047 225859 DEBUG nova.network.os_vif_util [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converting VIF {"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.048 225859 DEBUG nova.network.os_vif_util [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:b7:b1,bridge_name='br-int',has_traffic_filtering=True,id=3067803c-07f3-4a15-a5ee-47f9a770efca,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3067803c-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.051 225859 DEBUG nova.objects.instance [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'pci_devices' on Instance uuid 1ebdefed-0903-4d72-b78d-912666c5ce61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.073 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <uuid>1ebdefed-0903-4d72-b78d-912666c5ce61</uuid>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <name>instance-0000008b</name>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestInstancesWithCinderVolumes-server-1983668831</nova:name>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:57:42</nova:creationTime>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <nova:user uuid="ed2c9bd268d1491fa3484d86bcdb9ec6">tempest-TestInstancesWithCinderVolumes-1174033615-project-member</nova:user>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <nova:project uuid="107c1f3b5b7b413d9a389ca1166e331f">tempest-TestInstancesWithCinderVolumes-1174033615</nova:project>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <nova:port uuid="3067803c-07f3-4a15-a5ee-47f9a770efca">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <entry name="serial">1ebdefed-0903-4d72-b78d-912666c5ce61</entry>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <entry name="uuid">1ebdefed-0903-4d72-b78d-912666c5ce61</entry>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/1ebdefed-0903-4d72-b78d-912666c5ce61_disk.config">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-5728e8f8-a711-41d5-aa04-a1d9faada8d9">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <serial>5728e8f8-a711-41d5-aa04-a1d9faada8d9</serial>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:cd:b7:b1"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <target dev="tap3067803c-07"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/1ebdefed-0903-4d72-b78d-912666c5ce61/console.log" append="off"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:57:43 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:57:43 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:57:43 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:57:43 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.075 225859 DEBUG nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Preparing to wait for external event network-vif-plugged-3067803c-07f3-4a15-a5ee-47f9a770efca prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.075 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.076 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.076 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.076 225859 DEBUG nova.virt.libvirt.vif [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:57:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1983668831',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1983668831',id=139,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBbRGX2xZT3D1ftqdKpZwTwb/ukXbRv/O5UyYYLjii3gk46qsw4SNMi6p0GpNIY5l/f9OSIg9UlRsUFQqLszBoQT2vJic2iOBlI6VLyxyg71obcHOZQEGpjfcTfqUsJeQ==',key_name='tempest-TestInstancesWithCinderVolumes-1812188149',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='107c1f3b5b7b413d9a389ca1166e331f',ramdisk_id='',reservation_id='r-c4vqjrp4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-1174033615',owner_user_name='tempest-TestInstancesWithCinderVolumes-1174033615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:57:38Z,user_data=None,user_id='ed2c9bd268d1491fa3484d86bcdb9ec6',uuid=1ebdefed-0903-4d72-b78d-912666c5ce61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.077 225859 DEBUG nova.network.os_vif_util [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converting VIF {"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.077 225859 DEBUG nova.network.os_vif_util [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:b7:b1,bridge_name='br-int',has_traffic_filtering=True,id=3067803c-07f3-4a15-a5ee-47f9a770efca,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3067803c-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.077 225859 DEBUG os_vif [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:b7:b1,bridge_name='br-int',has_traffic_filtering=True,id=3067803c-07f3-4a15-a5ee-47f9a770efca,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3067803c-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.078 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.078 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.079 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.082 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.082 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3067803c-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.082 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3067803c-07, col_values=(('external_ids', {'iface-id': '3067803c-07f3-4a15-a5ee-47f9a770efca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:b7:b1', 'vm-uuid': '1ebdefed-0903-4d72-b78d-912666c5ce61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.083 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:43 np0005588919 NetworkManager[49104]: <info>  [1768921063.0854] manager: (tap3067803c-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.086 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.091 225859 INFO os_vif [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:b7:b1,bridge_name='br-int',has_traffic_filtering=True,id=3067803c-07f3-4a15-a5ee-47f9a770efca,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3067803c-07')#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.211 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.211 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.211 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No VIF found with MAC fa:16:3e:cd:b7:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.212 225859 INFO nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Using config drive#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.236 225859 DEBUG nova.storage.rbd_utils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] rbd image 1ebdefed-0903-4d72-b78d-912666c5ce61_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.272 225859 DEBUG nova.network.neutron [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Successfully created port: 7c572239-9b2e-493c-8be5-632f27cc634a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.440 225859 DEBUG nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.441 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.441 225859 INFO nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Creating image(s)#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.442 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.442 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Ensure instance console log exists: /var/lib/nova/instances/b4c1468d-9914-426a-9464-c1167de53632/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.442 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.442 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:43 np0005588919 nova_compute[225855]: 2026-01-20 14:57:43.443 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:57:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:43.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.084 225859 INFO nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Creating config drive at /var/lib/nova/instances/1ebdefed-0903-4d72-b78d-912666c5ce61/disk.config#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.088 225859 DEBUG oslo_concurrency.processutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1ebdefed-0903-4d72-b78d-912666c5ce61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpperrk6d_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:44.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.231 225859 DEBUG oslo_concurrency.processutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1ebdefed-0903-4d72-b78d-912666c5ce61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpperrk6d_" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.310 225859 DEBUG nova.storage.rbd_utils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] rbd image 1ebdefed-0903-4d72-b78d-912666c5ce61_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.315 225859 DEBUG oslo_concurrency.processutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1ebdefed-0903-4d72-b78d-912666c5ce61/disk.config 1ebdefed-0903-4d72-b78d-912666c5ce61_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.436 225859 DEBUG nova.network.neutron [req-14f5a19f-bf01-4364-9068-6d17f04bf8d0 req-12031be3-e681-499f-bd66-2c01dbc0736c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updated VIF entry in instance network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.437 225859 DEBUG nova.network.neutron [req-14f5a19f-bf01-4364-9068-6d17f04bf8d0 req-12031be3-e681-499f-bd66-2c01dbc0736c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating instance_info_cache with network_info: [{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.497 225859 DEBUG oslo_concurrency.processutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1ebdefed-0903-4d72-b78d-912666c5ce61/disk.config 1ebdefed-0903-4d72-b78d-912666c5ce61_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.498 225859 INFO nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Deleting local config drive /var/lib/nova/instances/1ebdefed-0903-4d72-b78d-912666c5ce61/disk.config because it was imported into RBD.#033[00m
Jan 20 09:57:44 np0005588919 kernel: tap3067803c-07: entered promiscuous mode
Jan 20 09:57:44 np0005588919 NetworkManager[49104]: <info>  [1768921064.5510] manager: (tap3067803c-07): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.550 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:44Z|00554|binding|INFO|Claiming lport 3067803c-07f3-4a15-a5ee-47f9a770efca for this chassis.
Jan 20 09:57:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:44Z|00555|binding|INFO|3067803c-07f3-4a15-a5ee-47f9a770efca: Claiming fa:16:3e:cd:b7:b1 10.100.0.10
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.556 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.569 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:b7:b1 10.100.0.10'], port_security=['fa:16:3e:cd:b7:b1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1ebdefed-0903-4d72-b78d-912666c5ce61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d966e1-4d26-414a-920e-0be2d77abb59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '107c1f3b5b7b413d9a389ca1166e331f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '207accdf-2d5c-48e9-bf02-5dfcc7d28063', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a2edf59-0338-43ad-aa77-d6a806c781a6, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=3067803c-07f3-4a15-a5ee-47f9a770efca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.571 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 3067803c-07f3-4a15-a5ee-47f9a770efca in datapath 58d966e1-4d26-414a-920e-0be2d77abb59 bound to our chassis#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.573 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58d966e1-4d26-414a-920e-0be2d77abb59#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.580 225859 DEBUG oslo_concurrency.lockutils [req-14f5a19f-bf01-4364-9068-6d17f04bf8d0 req-12031be3-e681-499f-bd66-2c01dbc0736c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:44 np0005588919 systemd-machined[194361]: New machine qemu-66-instance-0000008b.
Jan 20 09:57:44 np0005588919 systemd-udevd[280064]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.587 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[85bc5411-7e3f-4c88-99dd-8aa8d9f6653e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.588 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58d966e1-41 in ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.590 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58d966e1-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.590 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ceba51b4-66bf-41fb-9d73-f42e4847bd41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.591 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f160bf01-6a7a-40b8-80b9-b469a4c872e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.603 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[a614bb58-4944-4cce-82c3-173d607df94b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 NetworkManager[49104]: <info>  [1768921064.6057] device (tap3067803c-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:57:44 np0005588919 NetworkManager[49104]: <info>  [1768921064.6067] device (tap3067803c-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:57:44 np0005588919 systemd[1]: Started Virtual Machine qemu-66-instance-0000008b.
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.631 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1e519688-760a-49fc-bc55-7a3802aff371]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.642 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:44Z|00556|binding|INFO|Setting lport 3067803c-07f3-4a15-a5ee-47f9a770efca ovn-installed in OVS
Jan 20 09:57:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:44Z|00557|binding|INFO|Setting lport 3067803c-07f3-4a15-a5ee-47f9a770efca up in Southbound
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.648 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.677 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[22763f14-d4c6-4449-a227-19a5819441f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.682 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7b28a384-cde4-4d31-bfdb-f1a3339ae807]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 NetworkManager[49104]: <info>  [1768921064.6832] manager: (tap58d966e1-40): new Veth device (/org/freedesktop/NetworkManager/Devices/239)
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.713 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8f430bad-abed-419c-92d1-8a0dede51621]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.716 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c393bf1d-b34e-4bf8-9de6-52dc967814fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 NetworkManager[49104]: <info>  [1768921064.7372] device (tap58d966e1-40): carrier: link connected
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.741 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[61d0a49c-5d16-4a65-bce8-db9f09e202c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.761 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bc2839f9-28f1-4092-8bbc-7bf652f749b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d966e1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:c8:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610366, 'reachable_time': 35351, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280096, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.776 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2b4e4a-c123-4cf8-bfb4-713feb6061f3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec9:c82a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 610366, 'tstamp': 610366}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280097, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.796 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2dcc0e89-b368-441f-afce-811d45de02a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d966e1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:c8:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610366, 'reachable_time': 35351, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 280105, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.827 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2647c2a4-5ba8-4194-a29d-2a9153abf2e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.897 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e56a2aa4-9007-4a60-97c0-98ce7639d002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.899 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d966e1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.899 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.899 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58d966e1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.955 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:44 np0005588919 NetworkManager[49104]: <info>  [1768921064.9562] manager: (tap58d966e1-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Jan 20 09:57:44 np0005588919 kernel: tap58d966e1-40: entered promiscuous mode
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.959 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.961 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58d966e1-40, col_values=(('external_ids', {'iface-id': '1623097d-35b0-4d71-9dc2-c4d659492102'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.963 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:44 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:44Z|00558|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.964 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.964 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58d966e1-4d26-414a-920e-0be2d77abb59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58d966e1-4d26-414a-920e-0be2d77abb59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.965 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e9cf55f3-66ca-4efa-b667-5931b46097a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.966 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-58d966e1-4d26-414a-920e-0be2d77abb59
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/58d966e1-4d26-414a-920e-0be2d77abb59.pid.haproxy
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 58d966e1-4d26-414a-920e-0be2d77abb59
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 09:57:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:44.967 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'env', 'PROCESS_TAG=haproxy-58d966e1-4d26-414a-920e-0be2d77abb59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58d966e1-4d26-414a-920e-0be2d77abb59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 09:57:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 e321: 3 total, 3 up, 3 in
Jan 20 09:57:44 np0005588919 nova_compute[225855]: 2026-01-20 14:57:44.980 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.004 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921050.003141, 538fe1f0-b666-4b97-b2ef-317adae0a47a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.004 225859 INFO nova.compute.manager [-] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] VM Stopped (Lifecycle Event)#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.020 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921065.01918, 1ebdefed-0903-4d72-b78d-912666c5ce61 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.020 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] VM Started (Lifecycle Event)#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.034 225859 DEBUG nova.compute.manager [None req-f3c6c40c-0f7f-433b-bf59-38cded4bc938 - - - - - -] [instance: 538fe1f0-b666-4b97-b2ef-317adae0a47a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.054 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.059 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921065.019513, 1ebdefed-0903-4d72-b78d-912666c5ce61 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.059 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.076 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.080 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.098 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.335 225859 DEBUG nova.network.neutron [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Successfully updated port: 7c572239-9b2e-493c-8be5-632f27cc634a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.338 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.347 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.347 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquired lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.347 225859 DEBUG nova.network.neutron [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 09:57:45 np0005588919 podman[280172]: 2026-01-20 14:57:45.381598484 +0000 UTC m=+0.060356254 container create f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 09:57:45 np0005588919 systemd[1]: Started libpod-conmon-f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc.scope.
Jan 20 09:57:45 np0005588919 podman[280172]: 2026-01-20 14:57:45.353161812 +0000 UTC m=+0.031919602 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 09:57:45 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:57:45 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad0bfac44a2e71e205ef5911174c9794e1609c1289dfb400ba4190926919b056/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 09:57:45 np0005588919 podman[280172]: 2026-01-20 14:57:45.471944653 +0000 UTC m=+0.150702513 container init f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 20 09:57:45 np0005588919 podman[280172]: 2026-01-20 14:57:45.47893925 +0000 UTC m=+0.157697060 container start f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:57:45 np0005588919 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[280188]: [NOTICE]   (280192) : New worker (280194) forked
Jan 20 09:57:45 np0005588919 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[280188]: [NOTICE]   (280192) : Loading success.
Jan 20 09:57:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:45.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.624 225859 DEBUG nova.compute.manager [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-vif-plugged-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.625 225859 DEBUG oslo_concurrency.lockutils [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.626 225859 DEBUG oslo_concurrency.lockutils [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.627 225859 DEBUG oslo_concurrency.lockutils [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.627 225859 DEBUG nova.compute.manager [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Processing event network-vif-plugged-3067803c-07f3-4a15-a5ee-47f9a770efca _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.628 225859 DEBUG nova.compute.manager [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.628 225859 DEBUG nova.compute.manager [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing instance network info cache due to event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.629 225859 DEBUG oslo_concurrency.lockutils [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.630 225859 DEBUG nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.635 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921065.6355221, 1ebdefed-0903-4d72-b78d-912666c5ce61 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.636 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.639 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.641 225859 INFO nova.virt.libvirt.driver [-] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Instance spawned successfully.#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.642 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.657 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.665 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.669 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.669 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.670 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.670 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.670 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.671 225859 DEBUG nova.virt.libvirt.driver [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.705 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.739 225859 INFO nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Took 6.19 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.740 225859 DEBUG nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.817 225859 INFO nova.compute.manager [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Took 8.39 seconds to build instance.#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.852 225859 DEBUG oslo_concurrency.lockutils [None req-6ee258f6-71b3-490f-ac4f-856603e106f3 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:45 np0005588919 nova_compute[225855]: 2026-01-20 14:57:45.879 225859 DEBUG nova.network.neutron [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 09:57:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:46.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:46 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.268 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:46 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.935 225859 DEBUG nova.network.neutron [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating instance_info_cache with network_info: [{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:46 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.974 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Releasing lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:46 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.975 225859 DEBUG nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Instance network_info: |[{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 09:57:46 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.975 225859 DEBUG oslo_concurrency.lockutils [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:57:46 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.976 225859 DEBUG nova.network.neutron [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:57:46 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.979 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Start _get_guest_xml network_info=[{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-002e39e3-1bec-4033-aca2-f1428e495087', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '002e39e3-1bec-4033-aca2-f1428e495087', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'b4c1468d-9914-426a-9464-c1167de53632', 'attached_at': '', 'detached_at': '', 'volume_id': '002e39e3-1bec-4033-aca2-f1428e495087', 'serial': '002e39e3-1bec-4033-aca2-f1428e495087'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': 'b8b8cc31-54c0-4f4d-80cc-6fca4e9cae9f', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 09:57:46 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.984 225859 WARNING nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:57:46 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.989 225859 DEBUG nova.virt.libvirt.host [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 09:57:46 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.990 225859 DEBUG nova.virt.libvirt.host [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.992 225859 DEBUG nova.virt.libvirt.host [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.992 225859 DEBUG nova.virt.libvirt.host [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.993 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.993 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.994 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.994 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.994 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.994 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.994 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.995 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.995 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.995 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.995 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:46.995 225859 DEBUG nova.virt.hardware [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.023 225859 DEBUG nova.storage.rbd_utils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] rbd image b4c1468d-9914-426a-9464-c1167de53632_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.027 225859 DEBUG oslo_concurrency.processutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:47 np0005588919 podman[280204]: 2026-01-20 14:57:47.05417519 +0000 UTC m=+0.086643436 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.385 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.385 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.386 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.386 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.386 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2911748478' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.516 225859 DEBUG oslo_concurrency.processutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:47.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.695 225859 DEBUG nova.virt.libvirt.vif [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:57:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-65714861',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-65714861',id=140,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBbRGX2xZT3D1ftqdKpZwTwb/ukXbRv/O5UyYYLjii3gk46qsw4SNMi6p0GpNIY5l/f9OSIg9UlRsUFQqLszBoQT2vJic2iOBlI6VLyxyg71obcHOZQEGpjfcTfqUsJeQ==',key_name='tempest-TestInstancesWithCinderVolumes-1812188149',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='107c1f3b5b7b413d9a389ca1166e331f',ramdisk_id='',reservation_id='r-8l7rw241',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-1174033615',owner_user_name='tempest-TestInstancesWithCinderVolumes-1174033615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:57:41Z,user_data=None,user_id='ed2c9bd268d1491fa3484d86bcdb9ec6',uuid=b4c1468d-9914-426a-9464-c1167de53632,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.696 225859 DEBUG nova.network.os_vif_util [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converting VIF {"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.697 225859 DEBUG nova.network.os_vif_util [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:6a:1f,bridge_name='br-int',has_traffic_filtering=True,id=7c572239-9b2e-493c-8be5-632f27cc634a,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c572239-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.698 225859 DEBUG nova.objects.instance [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'pci_devices' on Instance uuid b4c1468d-9914-426a-9464-c1167de53632 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.751 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] End _get_guest_xml xml=<domain type="kvm">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <uuid>b4c1468d-9914-426a-9464-c1167de53632</uuid>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <name>instance-0000008c</name>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestInstancesWithCinderVolumes-server-65714861</nova:name>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 14:57:46</nova:creationTime>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <nova:user uuid="ed2c9bd268d1491fa3484d86bcdb9ec6">tempest-TestInstancesWithCinderVolumes-1174033615-project-member</nova:user>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <nova:project uuid="107c1f3b5b7b413d9a389ca1166e331f">tempest-TestInstancesWithCinderVolumes-1174033615</nova:project>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <nova:port uuid="7c572239-9b2e-493c-8be5-632f27cc634a">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <system>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <entry name="serial">b4c1468d-9914-426a-9464-c1167de53632</entry>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <entry name="uuid">b4c1468d-9914-426a-9464-c1167de53632</entry>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    </system>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <os>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  </os>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <features>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  </features>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  </clock>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  <devices>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/b4c1468d-9914-426a-9464-c1167de53632_disk.config">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-002e39e3-1bec-4033-aca2-f1428e495087">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      </source>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      </auth>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <serial>002e39e3-1bec-4033-aca2-f1428e495087</serial>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    </disk>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:d9:6a:1f"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <target dev="tap7c572239-9b"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    </interface>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/b4c1468d-9914-426a-9464-c1167de53632/console.log" append="off"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    </serial>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <video>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    </video>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    </rng>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 09:57:47 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 09:57:47 np0005588919 nova_compute[225855]:  </devices>
Jan 20 09:57:47 np0005588919 nova_compute[225855]: </domain>
Jan 20 09:57:47 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.765 225859 DEBUG nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Preparing to wait for external event network-vif-plugged-7c572239-9b2e-493c-8be5-632f27cc634a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.766 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.766 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.766 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.768 225859 DEBUG nova.virt.libvirt.vif [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T14:57:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-65714861',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-65714861',id=140,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBbRGX2xZT3D1ftqdKpZwTwb/ukXbRv/O5UyYYLjii3gk46qsw4SNMi6p0GpNIY5l/f9OSIg9UlRsUFQqLszBoQT2vJic2iOBlI6VLyxyg71obcHOZQEGpjfcTfqUsJeQ==',key_name='tempest-TestInstancesWithCinderVolumes-1812188149',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='107c1f3b5b7b413d9a389ca1166e331f',ramdisk_id='',reservation_id='r-8l7rw241',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',o
wner_project_name='tempest-TestInstancesWithCinderVolumes-1174033615',owner_user_name='tempest-TestInstancesWithCinderVolumes-1174033615-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T14:57:41Z,user_data=None,user_id='ed2c9bd268d1491fa3484d86bcdb9ec6',uuid=b4c1468d-9914-426a-9464-c1167de53632,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.768 225859 DEBUG nova.network.os_vif_util [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converting VIF {"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.770 225859 DEBUG nova.network.os_vif_util [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:6a:1f,bridge_name='br-int',has_traffic_filtering=True,id=7c572239-9b2e-493c-8be5-632f27cc634a,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c572239-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.771 225859 DEBUG os_vif [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:6a:1f,bridge_name='br-int',has_traffic_filtering=True,id=7c572239-9b2e-493c-8be5-632f27cc634a,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c572239-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.772 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.773 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.774 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.778 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.779 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c572239-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.781 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7c572239-9b, col_values=(('external_ids', {'iface-id': '7c572239-9b2e-493c-8be5-632f27cc634a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:6a:1f', 'vm-uuid': 'b4c1468d-9914-426a-9464-c1167de53632'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.783 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:47 np0005588919 NetworkManager[49104]: <info>  [1768921067.7845] manager: (tap7c572239-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.789 225859 DEBUG nova.compute.manager [req-929ba64e-9ddf-49e3-890a-8757a6aa0e6c req-bd027a3e-3670-4dc9-a6ae-08ac1e85deb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-vif-plugged-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.790 225859 DEBUG oslo_concurrency.lockutils [req-929ba64e-9ddf-49e3-890a-8757a6aa0e6c req-bd027a3e-3670-4dc9-a6ae-08ac1e85deb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.791 225859 DEBUG oslo_concurrency.lockutils [req-929ba64e-9ddf-49e3-890a-8757a6aa0e6c req-bd027a3e-3670-4dc9-a6ae-08ac1e85deb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.791 225859 DEBUG oslo_concurrency.lockutils [req-929ba64e-9ddf-49e3-890a-8757a6aa0e6c req-bd027a3e-3670-4dc9-a6ae-08ac1e85deb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.792 225859 DEBUG nova.compute.manager [req-929ba64e-9ddf-49e3-890a-8757a6aa0e6c req-bd027a3e-3670-4dc9-a6ae-08ac1e85deb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] No waiting events found dispatching network-vif-plugged-3067803c-07f3-4a15-a5ee-47f9a770efca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.792 225859 WARNING nova.compute.manager [req-929ba64e-9ddf-49e3-890a-8757a6aa0e6c req-bd027a3e-3670-4dc9-a6ae-08ac1e85deb8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received unexpected event network-vif-plugged-3067803c-07f3-4a15-a5ee-47f9a770efca for instance with vm_state active and task_state None.#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.793 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.795 225859 INFO os_vif [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:6a:1f,bridge_name='br-int',has_traffic_filtering=True,id=7c572239-9b2e-493c-8be5-632f27cc634a,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c572239-9b')#033[00m
Jan 20 09:57:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1045157438' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.838 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.871 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.872 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.873 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No VIF found with MAC fa:16:3e:d9:6a:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.874 225859 INFO nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Using config drive#033[00m
Jan 20 09:57:47 np0005588919 nova_compute[225855]: 2026-01-20 14:57:47.900 225859 DEBUG nova.storage.rbd_utils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] rbd image b4c1468d-9914-426a-9464-c1167de53632_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.032 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.033 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.037 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.037 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:57:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:48.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.225 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.227 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4229MB free_disk=20.876060485839844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.228 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.228 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.480 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 1ebdefed-0903-4d72-b78d-912666c5ce61 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.481 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance b4c1468d-9914-426a-9464-c1167de53632 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.482 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.482 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:57:48 np0005588919 nova_compute[225855]: 2026-01-20 14:57:48.624 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:57:49 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3301246247' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.107 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.115 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.142 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.324 225859 INFO nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Creating config drive at /var/lib/nova/instances/b4c1468d-9914-426a-9464-c1167de53632/disk.config#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.330 225859 DEBUG oslo_concurrency.processutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b4c1468d-9914-426a-9464-c1167de53632/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp89qkppa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.488 225859 DEBUG oslo_concurrency.processutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b4c1468d-9914-426a-9464-c1167de53632/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp89qkppa" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:49.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.533 225859 DEBUG nova.storage.rbd_utils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] rbd image b4c1468d-9914-426a-9464-c1167de53632_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.538 225859 DEBUG oslo_concurrency.processutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b4c1468d-9914-426a-9464-c1167de53632/disk.config b4c1468d-9914-426a-9464-c1167de53632_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.608 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.609 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.753 225859 DEBUG oslo_concurrency.processutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b4c1468d-9914-426a-9464-c1167de53632/disk.config b4c1468d-9914-426a-9464-c1167de53632_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.215s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.754 225859 INFO nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Deleting local config drive /var/lib/nova/instances/b4c1468d-9914-426a-9464-c1167de53632/disk.config because it was imported into RBD.#033[00m
Jan 20 09:57:49 np0005588919 kernel: tap7c572239-9b: entered promiscuous mode
Jan 20 09:57:49 np0005588919 NetworkManager[49104]: <info>  [1768921069.8320] manager: (tap7c572239-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Jan 20 09:57:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:49Z|00559|binding|INFO|Claiming lport 7c572239-9b2e-493c-8be5-632f27cc634a for this chassis.
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.831 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:49Z|00560|binding|INFO|7c572239-9b2e-493c-8be5-632f27cc634a: Claiming fa:16:3e:d9:6a:1f 10.100.0.9
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.845 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:6a:1f 10.100.0.9'], port_security=['fa:16:3e:d9:6a:1f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b4c1468d-9914-426a-9464-c1167de53632', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d966e1-4d26-414a-920e-0be2d77abb59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '107c1f3b5b7b413d9a389ca1166e331f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '207accdf-2d5c-48e9-bf02-5dfcc7d28063', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a2edf59-0338-43ad-aa77-d6a806c781a6, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=7c572239-9b2e-493c-8be5-632f27cc634a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.847 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 7c572239-9b2e-493c-8be5-632f27cc634a in datapath 58d966e1-4d26-414a-920e-0be2d77abb59 bound to our chassis#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.849 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58d966e1-4d26-414a-920e-0be2d77abb59#033[00m
Jan 20 09:57:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:49Z|00561|binding|INFO|Setting lport 7c572239-9b2e-493c-8be5-632f27cc634a ovn-installed in OVS
Jan 20 09:57:49 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:49Z|00562|binding|INFO|Setting lport 7c572239-9b2e-493c-8be5-632f27cc634a up in Southbound
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.854 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.860 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.869 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ecfad7-d166-45d4-92b8-47b3eec4a47e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:49 np0005588919 systemd-machined[194361]: New machine qemu-67-instance-0000008c.
Jan 20 09:57:49 np0005588919 systemd[1]: Started Virtual Machine qemu-67-instance-0000008c.
Jan 20 09:57:49 np0005588919 systemd-udevd[280396]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.907 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3144f15d-a2df-4885-8a7a-55b546f2d681]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.911 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[295d6385-ed6e-4a9f-a800-d636030ce5f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:49 np0005588919 NetworkManager[49104]: <info>  [1768921069.9172] device (tap7c572239-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 09:57:49 np0005588919 NetworkManager[49104]: <info>  [1768921069.9181] device (tap7c572239-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.942 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[096d4bbe-0918-4f13-b3cd-abf16c66b5c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.959 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9b6341f2-ed4e-44ee-a3a2-09ae3f7a0ddf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d966e1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:c8:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610366, 'reachable_time': 35351, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280404, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.990 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2b1e8aaf-ce4e-4711-894c-b24e09c01a6c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap58d966e1-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 610378, 'tstamp': 610378}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280407, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap58d966e1-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 610381, 'tstamp': 610381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280407, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.992 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d966e1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:49 np0005588919 nova_compute[225855]: 2026-01-20 14:57:49.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.994 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58d966e1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.995 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.995 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58d966e1-40, col_values=(('external_ids', {'iface-id': '1623097d-35b0-4d71-9dc2-c4d659492102'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:57:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:57:49.995 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:57:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:50.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:50 np0005588919 nova_compute[225855]: 2026-01-20 14:57:50.698 225859 DEBUG nova.compute.manager [req-cf40b2e5-cf3b-44ca-a3eb-1b9dab98a6bc req-be2b7343-0f5f-46d3-b515-426e50b6ded3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-vif-plugged-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:50 np0005588919 nova_compute[225855]: 2026-01-20 14:57:50.700 225859 DEBUG oslo_concurrency.lockutils [req-cf40b2e5-cf3b-44ca-a3eb-1b9dab98a6bc req-be2b7343-0f5f-46d3-b515-426e50b6ded3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:50 np0005588919 nova_compute[225855]: 2026-01-20 14:57:50.700 225859 DEBUG oslo_concurrency.lockutils [req-cf40b2e5-cf3b-44ca-a3eb-1b9dab98a6bc req-be2b7343-0f5f-46d3-b515-426e50b6ded3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:50 np0005588919 nova_compute[225855]: 2026-01-20 14:57:50.700 225859 DEBUG oslo_concurrency.lockutils [req-cf40b2e5-cf3b-44ca-a3eb-1b9dab98a6bc req-be2b7343-0f5f-46d3-b515-426e50b6ded3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:50 np0005588919 nova_compute[225855]: 2026-01-20 14:57:50.701 225859 DEBUG nova.compute.manager [req-cf40b2e5-cf3b-44ca-a3eb-1b9dab98a6bc req-be2b7343-0f5f-46d3-b515-426e50b6ded3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Processing event network-vif-plugged-7c572239-9b2e-493c-8be5-632f27cc634a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.085 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921071.0852692, b4c1468d-9914-426a-9464-c1167de53632 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.086 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] VM Started (Lifecycle Event)#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.088 225859 DEBUG nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.091 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.094 225859 INFO nova.virt.libvirt.driver [-] [instance: b4c1468d-9914-426a-9464-c1167de53632] Instance spawned successfully.#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.094 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.159 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.163 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.198 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.199 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.199 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.200 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.200 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.200 225859 DEBUG nova.virt.libvirt.driver [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.252 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.253 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921071.0862257, b4c1468d-9914-426a-9464-c1167de53632 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.253 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] VM Paused (Lifecycle Event)#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.269 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.360 225859 INFO nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Took 7.92 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.361 225859 DEBUG nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.427 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.431 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921071.0908244, b4c1468d-9914-426a-9464-c1167de53632 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.432 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] VM Resumed (Lifecycle Event)#033[00m
Jan 20 09:57:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:51.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.571 225859 INFO nova.compute.manager [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Took 10.87 seconds to build instance.#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.604 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.609 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 09:57:51 np0005588919 nova_compute[225855]: 2026-01-20 14:57:51.649 225859 DEBUG oslo_concurrency.lockutils [None req-0a7677da-1522-4eea-a664-8af5ee58a977 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:52.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.610 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.611 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:57:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.667 225859 DEBUG nova.network.neutron [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updated VIF entry in instance network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.668 225859 DEBUG nova.network.neutron [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating instance_info_cache with network_info: [{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.686 225859 DEBUG oslo_concurrency.lockutils [req-f4ced9dd-9f62-47e7-9fc5-556d98108fde req-dbecc6d7-69fd-41aa-9235-62d0c43035af 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.783 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.920 225859 DEBUG nova.compute.manager [req-07e5061e-243d-4cb0-a7f8-0d2b32f10e4f req-804f87cf-26af-491a-82bf-76d703a8605f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-vif-plugged-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.921 225859 DEBUG oslo_concurrency.lockutils [req-07e5061e-243d-4cb0-a7f8-0d2b32f10e4f req-804f87cf-26af-491a-82bf-76d703a8605f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.921 225859 DEBUG oslo_concurrency.lockutils [req-07e5061e-243d-4cb0-a7f8-0d2b32f10e4f req-804f87cf-26af-491a-82bf-76d703a8605f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.922 225859 DEBUG oslo_concurrency.lockutils [req-07e5061e-243d-4cb0-a7f8-0d2b32f10e4f req-804f87cf-26af-491a-82bf-76d703a8605f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.922 225859 DEBUG nova.compute.manager [req-07e5061e-243d-4cb0-a7f8-0d2b32f10e4f req-804f87cf-26af-491a-82bf-76d703a8605f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] No waiting events found dispatching network-vif-plugged-7c572239-9b2e-493c-8be5-632f27cc634a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:57:52 np0005588919 nova_compute[225855]: 2026-01-20 14:57:52.922 225859 WARNING nova.compute.manager [req-07e5061e-243d-4cb0-a7f8-0d2b32f10e4f req-804f87cf-26af-491a-82bf-76d703a8605f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received unexpected event network-vif-plugged-7c572239-9b2e-493c-8be5-632f27cc634a for instance with vm_state active and task_state None.#033[00m
Jan 20 09:57:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:57:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:53.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:57:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:54.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 09:57:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.0 total, 600.0 interval#012Cumulative writes: 10K writes, 52K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s#012Cumulative WAL: 10K writes, 10K syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1854 writes, 9385 keys, 1854 commit groups, 1.0 writes per commit group, ingest: 17.68 MB, 0.03 MB/s#012Interval WAL: 1855 writes, 1855 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     73.2      0.86              0.23        31    0.028       0      0       0.0       0.0#012  L6      1/0    9.87 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   4.4    106.8     89.5      3.06              0.90        30    0.102    183K    16K       0.0       0.0#012 Sum      1/0    9.87 MB   0.0      0.3     0.1      0.3       0.3      0.1       0.0   5.4     83.4     85.9      3.92              1.13        61    0.064    183K    16K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.5     90.0     90.7      0.93              0.26        14    0.066     54K   3667       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0    106.8     89.5      3.06              0.90        30    0.102    183K    16K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     73.3      0.86              0.23        30    0.029       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3600.0 total, 600.0 interval#012Flush(GB): cumulative 0.061, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.33 GB write, 0.09 MB/s write, 0.32 GB read, 0.09 MB/s read, 3.9 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.14 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 304.00 MB usage: 38.25 MB table_size: 0 occupancy: 18446744073709551615 collections: 7 last_copies: 0 last_secs: 0.00042 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2201,36.88 MB,12.1318%) FilterBlock(61,519.36 KB,0.166838%) IndexBlock(61,877.77 KB,0.281971%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 09:57:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:55.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:57:56 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2303621507' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:57:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:56.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:56 np0005588919 nova_compute[225855]: 2026-01-20 14:57:56.271 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:57:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:57.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:57:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:57:57 np0005588919 nova_compute[225855]: 2026-01-20 14:57:57.785 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:57:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:57:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:57:58.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:57:59 np0005588919 podman[280505]: 2026-01-20 14:57:59.055937531 +0000 UTC m=+0.069734088 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 20 09:57:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:57:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:57:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:57:59.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:57:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:59Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cd:b7:b1 10.100.0.10
Jan 20 09:57:59 np0005588919 ovn_controller[130490]: 2026-01-20T14:57:59Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cd:b7:b1 10.100.0.10
Jan 20 09:58:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:00.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:00 np0005588919 nova_compute[225855]: 2026-01-20 14:58:00.153 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:00 np0005588919 NetworkManager[49104]: <info>  [1768921080.1544] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Jan 20 09:58:00 np0005588919 NetworkManager[49104]: <info>  [1768921080.1555] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Jan 20 09:58:00 np0005588919 nova_compute[225855]: 2026-01-20 14:58:00.351 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:00 np0005588919 ovn_controller[130490]: 2026-01-20T14:58:00Z|00563|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:58:00 np0005588919 nova_compute[225855]: 2026-01-20 14:58:00.374 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:01 np0005588919 nova_compute[225855]: 2026-01-20 14:58:01.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:01 np0005588919 nova_compute[225855]: 2026-01-20 14:58:01.274 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:01.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:02.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:02 np0005588919 nova_compute[225855]: 2026-01-20 14:58:02.787 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:03.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:04.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:04 np0005588919 ovn_controller[130490]: 2026-01-20T14:58:04Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d9:6a:1f 10.100.0.9
Jan 20 09:58:04 np0005588919 ovn_controller[130490]: 2026-01-20T14:58:04Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:6a:1f 10.100.0.9
Jan 20 09:58:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:05.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:06.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:06 np0005588919 nova_compute[225855]: 2026-01-20 14:58:06.278 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.340 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.341 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.342 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.342 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.342 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.342 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.429 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.430 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.430 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] 1ebdefed-0903-4d72-b78d-912666c5ce61 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.431 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] b4c1468d-9914-426a-9464-c1167de53632 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.431 225859 WARNING nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.431 225859 WARNING nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.431 225859 INFO nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Removable base files: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.431 225859 INFO nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.432 225859 INFO nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.432 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.432 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.432 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 20 09:58:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:07.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:07 np0005588919 nova_compute[225855]: 2026-01-20 14:58:07.789 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:08.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:09.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:10.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:11 np0005588919 nova_compute[225855]: 2026-01-20 14:58:11.278 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:11.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:12.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:12 np0005588919 nova_compute[225855]: 2026-01-20 14:58:12.624 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:12 np0005588919 nova_compute[225855]: 2026-01-20 14:58:12.793 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:13.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:14.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:15.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 20 09:58:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:16.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 20 09:58:16 np0005588919 nova_compute[225855]: 2026-01-20 14:58:16.281 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:58:16.419 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:58:16.419 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:58:16.420 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:17.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:17 np0005588919 nova_compute[225855]: 2026-01-20 14:58:17.796 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:18 np0005588919 podman[280584]: 2026-01-20 14:58:18.036509511 +0000 UTC m=+0.082656103 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 20 09:58:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:18.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:58:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3000352245' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:58:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:58:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3000352245' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:58:18 np0005588919 nova_compute[225855]: 2026-01-20 14:58:18.600 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:19.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:20.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:21 np0005588919 nova_compute[225855]: 2026-01-20 14:58:21.282 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:58:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:21.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:58:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:22.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:22 np0005588919 nova_compute[225855]: 2026-01-20 14:58:22.514 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:22 np0005588919 nova_compute[225855]: 2026-01-20 14:58:22.798 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:23 np0005588919 nova_compute[225855]: 2026-01-20 14:58:23.523 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:23 np0005588919 ovn_controller[130490]: 2026-01-20T14:58:23Z|00564|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:58:23 np0005588919 nova_compute[225855]: 2026-01-20 14:58:23.552 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:23.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:23 np0005588919 ovn_controller[130490]: 2026-01-20T14:58:23Z|00565|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:58:23 np0005588919 nova_compute[225855]: 2026-01-20 14:58:23.854 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:24.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:25.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:26.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:26 np0005588919 nova_compute[225855]: 2026-01-20 14:58:26.284 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:27.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:27 np0005588919 nova_compute[225855]: 2026-01-20 14:58:27.799 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:28.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:29.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:30 np0005588919 podman[280617]: 2026-01-20 14:58:30.00074224 +0000 UTC m=+0.047201622 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 20 09:58:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:30.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:30 np0005588919 nova_compute[225855]: 2026-01-20 14:58:30.772 225859 DEBUG oslo_concurrency.lockutils [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:30 np0005588919 nova_compute[225855]: 2026-01-20 14:58:30.772 225859 DEBUG oslo_concurrency.lockutils [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:30 np0005588919 nova_compute[225855]: 2026-01-20 14:58:30.793 225859 DEBUG nova.objects.instance [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid 1ebdefed-0903-4d72-b78d-912666c5ce61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:58:30 np0005588919 nova_compute[225855]: 2026-01-20 14:58:30.855 225859 DEBUG oslo_concurrency.lockutils [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.151 225859 DEBUG oslo_concurrency.lockutils [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.151 225859 DEBUG oslo_concurrency.lockutils [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.152 225859 INFO nova.compute.manager [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Attaching volume 3381d324-93a9-4d2f-ab25-8460bb2b8e95 to /dev/vdb
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.287 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.460 225859 DEBUG os_brick.utils [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.461 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.474 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.475 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[f435e800-c679-496d-ad9f-991d52216fd7]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.477 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.486 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.486 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[5f2a7265-60d6-4baf-af34-d017ced7f9a5]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.488 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.496 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.497 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[dd05cb19-022d-486e-ad5d-a521073f1907]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.499 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[6eaf37dc-44df-4213-995e-d13dd4b7e4b3]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.499 225859 DEBUG oslo_concurrency.processutils [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.524 225859 DEBUG oslo_concurrency.processutils [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "nvme version" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.526 225859 DEBUG os_brick.initiator.connectors.lightos [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.526 225859 DEBUG os_brick.initiator.connectors.lightos [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.527 225859 DEBUG os_brick.initiator.connectors.lightos [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.527 225859 DEBUG os_brick.utils [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] <== get_connector_properties: return (65ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 20 09:58:31 np0005588919 nova_compute[225855]: 2026-01-20 14:58:31.527 225859 DEBUG nova.virt.block_device [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating existing volume attachment record: 33fc5a49-bcff-4298-84b6-8c4fe61f57ab _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 20 09:58:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:31.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:32.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:32 np0005588919 nova_compute[225855]: 2026-01-20 14:58:32.426 225859 DEBUG nova.objects.instance [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid 1ebdefed-0903-4d72-b78d-912666c5ce61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:58:32 np0005588919 nova_compute[225855]: 2026-01-20 14:58:32.464 225859 DEBUG nova.virt.libvirt.driver [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Attempting to attach volume 3381d324-93a9-4d2f-ab25-8460bb2b8e95 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168
Jan 20 09:58:32 np0005588919 nova_compute[225855]: 2026-01-20 14:58:32.467 225859 DEBUG nova.virt.libvirt.guest [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:58:32 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:58:32 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-3381d324-93a9-4d2f-ab25-8460bb2b8e95">
Jan 20 09:58:32 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:58:32 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:58:32 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:58:32 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:58:32 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 09:58:32 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:58:32 np0005588919 nova_compute[225855]:  </auth>
Jan 20 09:58:32 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:58:32 np0005588919 nova_compute[225855]:  <serial>3381d324-93a9-4d2f-ab25-8460bb2b8e95</serial>
Jan 20 09:58:32 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:58:32 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 20 09:58:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:32 np0005588919 nova_compute[225855]: 2026-01-20 14:58:32.678 225859 DEBUG nova.virt.libvirt.driver [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:58:32 np0005588919 nova_compute[225855]: 2026-01-20 14:58:32.679 225859 DEBUG nova.virt.libvirt.driver [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:58:32 np0005588919 nova_compute[225855]: 2026-01-20 14:58:32.680 225859 DEBUG nova.virt.libvirt.driver [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:58:32 np0005588919 nova_compute[225855]: 2026-01-20 14:58:32.680 225859 DEBUG nova.virt.libvirt.driver [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No VIF found with MAC fa:16:3e:cd:b7:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 09:58:32 np0005588919 nova_compute[225855]: 2026-01-20 14:58:32.802 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:58:33 np0005588919 nova_compute[225855]: 2026-01-20 14:58:33.013 225859 DEBUG oslo_concurrency.lockutils [None req-0af08aac-4274-42e8-a15a-78e36ad79581 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.861s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:58:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:33.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:34.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.194 225859 DEBUG oslo_concurrency.lockutils [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.197 225859 DEBUG oslo_concurrency.lockutils [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.221 225859 DEBUG nova.objects.instance [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid 1ebdefed-0903-4d72-b78d-912666c5ce61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.309 225859 DEBUG oslo_concurrency.lockutils [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.575 225859 DEBUG oslo_concurrency.lockutils [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.576 225859 DEBUG oslo_concurrency.lockutils [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.576 225859 INFO nova.compute.manager [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Attaching volume 0d487092-de99-40b0-be3f-425947d7010c to /dev/vdc
Jan 20 09:58:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:35.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.742 225859 DEBUG os_brick.utils [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.744 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.756 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.756 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[5cef7ea8-bbd6-4261-895d-5eaec3439c85]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.758 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.768 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.768 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[4789073c-dc63-4300-8a67-09379cd2c259]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.770 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.780 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.780 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4b15bd-4e02-43d1-9f9d-c89df80f63b3]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.782 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[970f23d2-4f2a-4d38-bc79-34d366de9072]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.782 225859 DEBUG oslo_concurrency.processutils [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.810 225859 DEBUG oslo_concurrency.processutils [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "nvme version" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.813 225859 DEBUG os_brick.initiator.connectors.lightos [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.813 225859 DEBUG os_brick.initiator.connectors.lightos [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.813 225859 DEBUG os_brick.initiator.connectors.lightos [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.814 225859 DEBUG os_brick.utils [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] <== get_connector_properties: return (70ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 20 09:58:35 np0005588919 nova_compute[225855]: 2026-01-20 14:58:35.814 225859 DEBUG nova.virt.block_device [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating existing volume attachment record: aa9114da-45ac-4223-b30c-7a489822200c _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 20 09:58:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:36.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:36 np0005588919 nova_compute[225855]: 2026-01-20 14:58:36.744 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:58:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:37.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:37 np0005588919 nova_compute[225855]: 2026-01-20 14:58:37.843 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:58:37 np0005588919 nova_compute[225855]: 2026-01-20 14:58:37.849 225859 DEBUG nova.objects.instance [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid 1ebdefed-0903-4d72-b78d-912666c5ce61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:58:37 np0005588919 nova_compute[225855]: 2026-01-20 14:58:37.880 225859 DEBUG nova.virt.libvirt.driver [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Attempting to attach volume 0d487092-de99-40b0-be3f-425947d7010c with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168
Jan 20 09:58:37 np0005588919 nova_compute[225855]: 2026-01-20 14:58:37.883 225859 DEBUG nova.virt.libvirt.guest [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:58:37 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:58:37 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-0d487092-de99-40b0-be3f-425947d7010c">
Jan 20 09:58:37 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:58:37 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:58:37 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:58:37 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:58:37 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 09:58:37 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:58:37 np0005588919 nova_compute[225855]:  </auth>
Jan 20 09:58:37 np0005588919 nova_compute[225855]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:58:37 np0005588919 nova_compute[225855]:  <serial>0d487092-de99-40b0-be3f-425947d7010c</serial>
Jan 20 09:58:37 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:58:37 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 20 09:58:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:58:37.934 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:58:37 np0005588919 nova_compute[225855]: 2026-01-20 14:58:37.934 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:58:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:58:37.936 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 09:58:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:58:37.937 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:58:38 np0005588919 nova_compute[225855]: 2026-01-20 14:58:38.016 225859 DEBUG nova.virt.libvirt.driver [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:58:38 np0005588919 nova_compute[225855]: 2026-01-20 14:58:38.017 225859 DEBUG nova.virt.libvirt.driver [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:58:38 np0005588919 nova_compute[225855]: 2026-01-20 14:58:38.017 225859 DEBUG nova.virt.libvirt.driver [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:58:38 np0005588919 nova_compute[225855]: 2026-01-20 14:58:38.017 225859 DEBUG nova.virt.libvirt.driver [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 09:58:38 np0005588919 nova_compute[225855]: 2026-01-20 14:58:38.017 225859 DEBUG nova.virt.libvirt.driver [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No VIF found with MAC fa:16:3e:cd:b7:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 09:58:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:38.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:38 np0005588919 nova_compute[225855]: 2026-01-20 14:58:38.432 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:58:39 np0005588919 nova_compute[225855]: 2026-01-20 14:58:39.030 225859 DEBUG oslo_concurrency.lockutils [None req-12d62678-a738-4b9a-89aa-a51e996a5b41 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:58:39 np0005588919 nova_compute[225855]: 2026-01-20 14:58:39.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 09:58:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:39.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:40.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:40 np0005588919 podman[281014]: 2026-01-20 14:58:40.880060775 +0000 UTC m=+0.042030676 container create cc7b8f5914dd70d3c1118742ed90c9d05585301c0188a641cfa384dca5a7b650 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_bartik, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 20 09:58:40 np0005588919 systemd[1]: Started libpod-conmon-cc7b8f5914dd70d3c1118742ed90c9d05585301c0188a641cfa384dca5a7b650.scope.
Jan 20 09:58:40 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:58:40 np0005588919 podman[281014]: 2026-01-20 14:58:40.86037648 +0000 UTC m=+0.022346411 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 09:58:40 np0005588919 podman[281014]: 2026-01-20 14:58:40.970988751 +0000 UTC m=+0.132958662 container init cc7b8f5914dd70d3c1118742ed90c9d05585301c0188a641cfa384dca5a7b650 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_bartik, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Jan 20 09:58:40 np0005588919 podman[281014]: 2026-01-20 14:58:40.978823952 +0000 UTC m=+0.140793863 container start cc7b8f5914dd70d3c1118742ed90c9d05585301c0188a641cfa384dca5a7b650 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_bartik, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:58:40 np0005588919 podman[281014]: 2026-01-20 14:58:40.982440894 +0000 UTC m=+0.144410815 container attach cc7b8f5914dd70d3c1118742ed90c9d05585301c0188a641cfa384dca5a7b650 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_bartik, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Jan 20 09:58:40 np0005588919 hopeful_bartik[281030]: 167 167
Jan 20 09:58:40 np0005588919 systemd[1]: libpod-cc7b8f5914dd70d3c1118742ed90c9d05585301c0188a641cfa384dca5a7b650.scope: Deactivated successfully.
Jan 20 09:58:40 np0005588919 conmon[281030]: conmon cc7b8f5914dd70d3c111 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cc7b8f5914dd70d3c1118742ed90c9d05585301c0188a641cfa384dca5a7b650.scope/container/memory.events
Jan 20 09:58:40 np0005588919 podman[281014]: 2026-01-20 14:58:40.987830636 +0000 UTC m=+0.149800537 container died cc7b8f5914dd70d3c1118742ed90c9d05585301c0188a641cfa384dca5a7b650 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_bartik, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 09:58:41 np0005588919 systemd[1]: var-lib-containers-storage-overlay-36f29b6d777380cd9507e337a3c5538750b36403649d8f3b104c6741d0c68e4b-merged.mount: Deactivated successfully.
Jan 20 09:58:41 np0005588919 podman[281014]: 2026-01-20 14:58:41.031422086 +0000 UTC m=+0.193391987 container remove cc7b8f5914dd70d3c1118742ed90c9d05585301c0188a641cfa384dca5a7b650 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=hopeful_bartik, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 09:58:41 np0005588919 systemd[1]: libpod-conmon-cc7b8f5914dd70d3c1118742ed90c9d05585301c0188a641cfa384dca5a7b650.scope: Deactivated successfully.
Jan 20 09:58:41 np0005588919 podman[281054]: 2026-01-20 14:58:41.203257904 +0000 UTC m=+0.047069660 container create 23078b88548d945c9f4e8597c26978ce3a7e785df42bedfca978cb6e87ecbd3b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_ride, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 09:58:41 np0005588919 systemd[1]: Started libpod-conmon-23078b88548d945c9f4e8597c26978ce3a7e785df42bedfca978cb6e87ecbd3b.scope.
Jan 20 09:58:41 np0005588919 systemd[1]: Started libcrun container.
Jan 20 09:58:41 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbfe4ec9ae8e87949c3114b277088125c28f9a2cdb29103e7a54dd08f6848943/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 09:58:41 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbfe4ec9ae8e87949c3114b277088125c28f9a2cdb29103e7a54dd08f6848943/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 09:58:41 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbfe4ec9ae8e87949c3114b277088125c28f9a2cdb29103e7a54dd08f6848943/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 09:58:41 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbfe4ec9ae8e87949c3114b277088125c28f9a2cdb29103e7a54dd08f6848943/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 09:58:41 np0005588919 podman[281054]: 2026-01-20 14:58:41.183059604 +0000 UTC m=+0.026871380 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 09:58:41 np0005588919 podman[281054]: 2026-01-20 14:58:41.277112567 +0000 UTC m=+0.120924343 container init 23078b88548d945c9f4e8597c26978ce3a7e785df42bedfca978cb6e87ecbd3b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_ride, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Jan 20 09:58:41 np0005588919 podman[281054]: 2026-01-20 14:58:41.282785437 +0000 UTC m=+0.126597193 container start 23078b88548d945c9f4e8597c26978ce3a7e785df42bedfca978cb6e87ecbd3b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_ride, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 20 09:58:41 np0005588919 podman[281054]: 2026-01-20 14:58:41.286999646 +0000 UTC m=+0.130811402 container attach 23078b88548d945c9f4e8597c26978ce3a7e785df42bedfca978cb6e87ecbd3b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_ride, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 09:58:41 np0005588919 nova_compute[225855]: 2026-01-20 14:58:41.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:41 np0005588919 nova_compute[225855]: 2026-01-20 14:58:41.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:58:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:41.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:41 np0005588919 nova_compute[225855]: 2026-01-20 14:58:41.747 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:42.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:42 np0005588919 nova_compute[225855]: 2026-01-20 14:58:42.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:42 np0005588919 nova_compute[225855]: 2026-01-20 14:58:42.343 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:58:42 np0005588919 nova_compute[225855]: 2026-01-20 14:58:42.343 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 09:58:42 np0005588919 interesting_ride[281070]: [
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:    {
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:        "available": false,
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:        "ceph_device": false,
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:        "lsm_data": {},
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:        "lvs": [],
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:        "path": "/dev/sr0",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:        "rejected_reasons": [
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "Insufficient space (<5GB)",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "Has a FileSystem"
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:        ],
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:        "sys_api": {
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "actuators": null,
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "device_nodes": "sr0",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "devname": "sr0",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "human_readable_size": "482.00 KB",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "id_bus": "ata",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "model": "QEMU DVD-ROM",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "nr_requests": "2",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "parent": "/dev/sr0",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "partitions": {},
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "path": "/dev/sr0",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "removable": "1",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "rev": "2.5+",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "ro": "0",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "rotational": "1",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "sas_address": "",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "sas_device_handle": "",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "scheduler_mode": "mq-deadline",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "sectors": 0,
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "sectorsize": "2048",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "size": 493568.0,
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "support_discard": "2048",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "type": "disk",
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:            "vendor": "QEMU"
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:        }
Jan 20 09:58:42 np0005588919 interesting_ride[281070]:    }
Jan 20 09:58:42 np0005588919 interesting_ride[281070]: ]
Jan 20 09:58:42 np0005588919 systemd[1]: libpod-23078b88548d945c9f4e8597c26978ce3a7e785df42bedfca978cb6e87ecbd3b.scope: Deactivated successfully.
Jan 20 09:58:42 np0005588919 podman[281054]: 2026-01-20 14:58:42.485889889 +0000 UTC m=+1.329701645 container died 23078b88548d945c9f4e8597c26978ce3a7e785df42bedfca978cb6e87ecbd3b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_ride, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 20 09:58:42 np0005588919 systemd[1]: libpod-23078b88548d945c9f4e8597c26978ce3a7e785df42bedfca978cb6e87ecbd3b.scope: Consumed 1.183s CPU time.
Jan 20 09:58:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:42 np0005588919 systemd[1]: var-lib-containers-storage-overlay-dbfe4ec9ae8e87949c3114b277088125c28f9a2cdb29103e7a54dd08f6848943-merged.mount: Deactivated successfully.
Jan 20 09:58:42 np0005588919 podman[281054]: 2026-01-20 14:58:42.747688285 +0000 UTC m=+1.591500041 container remove 23078b88548d945c9f4e8597c26978ce3a7e785df42bedfca978cb6e87ecbd3b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=interesting_ride, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:58:42 np0005588919 systemd[1]: libpod-conmon-23078b88548d945c9f4e8597c26978ce3a7e785df42bedfca978cb6e87ecbd3b.scope: Deactivated successfully.
Jan 20 09:58:42 np0005588919 nova_compute[225855]: 2026-01-20 14:58:42.845 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:43 np0005588919 nova_compute[225855]: 2026-01-20 14:58:43.058 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:43 np0005588919 nova_compute[225855]: 2026-01-20 14:58:43.058 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:43 np0005588919 nova_compute[225855]: 2026-01-20 14:58:43.059 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:58:43 np0005588919 nova_compute[225855]: 2026-01-20 14:58:43.059 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1ebdefed-0903-4d72-b78d-912666c5ce61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:58:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:43.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:43 np0005588919 nova_compute[225855]: 2026-01-20 14:58:43.615 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:43 np0005588919 NetworkManager[49104]: <info>  [1768921123.6158] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Jan 20 09:58:43 np0005588919 NetworkManager[49104]: <info>  [1768921123.6170] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Jan 20 09:58:43 np0005588919 nova_compute[225855]: 2026-01-20 14:58:43.868 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:43 np0005588919 ovn_controller[130490]: 2026-01-20T14:58:43Z|00566|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:58:43 np0005588919 nova_compute[225855]: 2026-01-20 14:58:43.893 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:44 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:44.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:45 np0005588919 nova_compute[225855]: 2026-01-20 14:58:45.006 225859 DEBUG nova.compute.manager [req-d9775999-6ef5-477b-9731-7984c7264ff7 req-e178faaa-69bb-40e9-aa96-3cce2867e5bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:58:45 np0005588919 nova_compute[225855]: 2026-01-20 14:58:45.006 225859 DEBUG nova.compute.manager [req-d9775999-6ef5-477b-9731-7984c7264ff7 req-e178faaa-69bb-40e9-aa96-3cce2867e5bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing instance network info cache due to event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:58:45 np0005588919 nova_compute[225855]: 2026-01-20 14:58:45.007 225859 DEBUG oslo_concurrency.lockutils [req-d9775999-6ef5-477b-9731-7984c7264ff7 req-e178faaa-69bb-40e9-aa96-3cce2867e5bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:45.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:46.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:46 np0005588919 nova_compute[225855]: 2026-01-20 14:58:46.749 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:58:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:58:47 np0005588919 nova_compute[225855]: 2026-01-20 14:58:47.514 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating instance_info_cache with network_info: [{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:58:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:58:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/579533939' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:58:47 np0005588919 nova_compute[225855]: 2026-01-20 14:58:47.594 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:58:47 np0005588919 nova_compute[225855]: 2026-01-20 14:58:47.595 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:58:47 np0005588919 nova_compute[225855]: 2026-01-20 14:58:47.595 225859 DEBUG oslo_concurrency.lockutils [req-d9775999-6ef5-477b-9731-7984c7264ff7 req-e178faaa-69bb-40e9-aa96-3cce2867e5bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:47 np0005588919 nova_compute[225855]: 2026-01-20 14:58:47.595 225859 DEBUG nova.network.neutron [req-d9775999-6ef5-477b-9731-7984c7264ff7 req-e178faaa-69bb-40e9-aa96-3cce2867e5bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:58:47 np0005588919 nova_compute[225855]: 2026-01-20 14:58:47.596 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:47 np0005588919 nova_compute[225855]: 2026-01-20 14:58:47.597 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:47 np0005588919 nova_compute[225855]: 2026-01-20 14:58:47.597 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:47.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:47 np0005588919 nova_compute[225855]: 2026-01-20 14:58:47.848 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:48.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:49 np0005588919 podman[282214]: 2026-01-20 14:58:49.070755656 +0000 UTC m=+0.100107715 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.360 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.392 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.392 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.392 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.393 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.393 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.458 225859 DEBUG nova.compute.manager [req-1228f766-6739-4e95-bcd3-57f32a6438c2 req-951e0959-c92a-4128-9f5d-3bfe55509b04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.459 225859 DEBUG nova.compute.manager [req-1228f766-6739-4e95-bcd3-57f32a6438c2 req-951e0959-c92a-4128-9f5d-3bfe55509b04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing instance network info cache due to event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.459 225859 DEBUG oslo_concurrency.lockutils [req-1228f766-6739-4e95-bcd3-57f32a6438c2 req-951e0959-c92a-4128-9f5d-3bfe55509b04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:49.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:58:49 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2379891077' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.856 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.942 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.943 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.947 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.948 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.948 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:58:49 np0005588919 nova_compute[225855]: 2026-01-20 14:58:49.948 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.109 225859 DEBUG nova.network.neutron [req-d9775999-6ef5-477b-9731-7984c7264ff7 req-e178faaa-69bb-40e9-aa96-3cce2867e5bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updated VIF entry in instance network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.109 225859 DEBUG nova.network.neutron [req-d9775999-6ef5-477b-9731-7984c7264ff7 req-e178faaa-69bb-40e9-aa96-3cce2867e5bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating instance_info_cache with network_info: [{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.116 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.117 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3872MB free_disk=20.94619369506836GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.117 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.118 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.127 225859 DEBUG oslo_concurrency.lockutils [req-d9775999-6ef5-477b-9731-7984c7264ff7 req-e178faaa-69bb-40e9-aa96-3cce2867e5bf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.128 225859 DEBUG oslo_concurrency.lockutils [req-1228f766-6739-4e95-bcd3-57f32a6438c2 req-951e0959-c92a-4128-9f5d-3bfe55509b04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.128 225859 DEBUG nova.network.neutron [req-1228f766-6739-4e95-bcd3-57f32a6438c2 req-951e0959-c92a-4128-9f5d-3bfe55509b04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:58:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:50.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.210 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 1ebdefed-0903-4d72-b78d-912666c5ce61 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.211 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance b4c1468d-9914-426a-9464-c1167de53632 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.211 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.211 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.303 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.377 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.377 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.404 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.429 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.506 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:58:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:58:50 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3942644152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.951 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.960 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:58:50 np0005588919 nova_compute[225855]: 2026-01-20 14:58:50.984 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:58:51 np0005588919 nova_compute[225855]: 2026-01-20 14:58:51.010 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:58:51 np0005588919 nova_compute[225855]: 2026-01-20 14:58:51.010 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.893s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:58:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:51.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:51 np0005588919 nova_compute[225855]: 2026-01-20 14:58:51.751 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:58:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:52.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:58:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:52 np0005588919 nova_compute[225855]: 2026-01-20 14:58:52.851 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:52 np0005588919 nova_compute[225855]: 2026-01-20 14:58:52.880 225859 DEBUG nova.compute.manager [req-60aaeb3a-d272-4037-a354-4bac3988ba44 req-ddf24908-4dd9-46f1-b484-1dc55b6d0ebd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:58:52 np0005588919 nova_compute[225855]: 2026-01-20 14:58:52.880 225859 DEBUG nova.compute.manager [req-60aaeb3a-d272-4037-a354-4bac3988ba44 req-ddf24908-4dd9-46f1-b484-1dc55b6d0ebd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing instance network info cache due to event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:58:52 np0005588919 nova_compute[225855]: 2026-01-20 14:58:52.880 225859 DEBUG oslo_concurrency.lockutils [req-60aaeb3a-d272-4037-a354-4bac3988ba44 req-ddf24908-4dd9-46f1-b484-1dc55b6d0ebd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:52 np0005588919 nova_compute[225855]: 2026-01-20 14:58:52.989 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:52 np0005588919 nova_compute[225855]: 2026-01-20 14:58:52.990 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:53 np0005588919 nova_compute[225855]: 2026-01-20 14:58:53.094 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:53 np0005588919 nova_compute[225855]: 2026-01-20 14:58:53.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:53 np0005588919 nova_compute[225855]: 2026-01-20 14:58:53.342 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 09:58:53 np0005588919 nova_compute[225855]: 2026-01-20 14:58:53.372 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 09:58:53 np0005588919 nova_compute[225855]: 2026-01-20 14:58:53.475 225859 DEBUG nova.network.neutron [req-1228f766-6739-4e95-bcd3-57f32a6438c2 req-951e0959-c92a-4128-9f5d-3bfe55509b04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updated VIF entry in instance network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:58:53 np0005588919 nova_compute[225855]: 2026-01-20 14:58:53.476 225859 DEBUG nova.network.neutron [req-1228f766-6739-4e95-bcd3-57f32a6438c2 req-951e0959-c92a-4128-9f5d-3bfe55509b04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating instance_info_cache with network_info: [{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:58:53 np0005588919 nova_compute[225855]: 2026-01-20 14:58:53.491 225859 DEBUG oslo_concurrency.lockutils [req-1228f766-6739-4e95-bcd3-57f32a6438c2 req-951e0959-c92a-4128-9f5d-3bfe55509b04 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:58:53 np0005588919 nova_compute[225855]: 2026-01-20 14:58:53.492 225859 DEBUG oslo_concurrency.lockutils [req-60aaeb3a-d272-4037-a354-4bac3988ba44 req-ddf24908-4dd9-46f1-b484-1dc55b6d0ebd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:53 np0005588919 nova_compute[225855]: 2026-01-20 14:58:53.492 225859 DEBUG nova.network.neutron [req-60aaeb3a-d272-4037-a354-4bac3988ba44 req-ddf24908-4dd9-46f1-b484-1dc55b6d0ebd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:58:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:53.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:58:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:54.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:55 np0005588919 nova_compute[225855]: 2026-01-20 14:58:55.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:58:55 np0005588919 nova_compute[225855]: 2026-01-20 14:58:55.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 09:58:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:55.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:56 np0005588919 nova_compute[225855]: 2026-01-20 14:58:56.030 225859 DEBUG nova.network.neutron [req-60aaeb3a-d272-4037-a354-4bac3988ba44 req-ddf24908-4dd9-46f1-b484-1dc55b6d0ebd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updated VIF entry in instance network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:58:56 np0005588919 nova_compute[225855]: 2026-01-20 14:58:56.031 225859 DEBUG nova.network.neutron [req-60aaeb3a-d272-4037-a354-4bac3988ba44 req-ddf24908-4dd9-46f1-b484-1dc55b6d0ebd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating instance_info_cache with network_info: [{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:58:56 np0005588919 nova_compute[225855]: 2026-01-20 14:58:56.047 225859 DEBUG oslo_concurrency.lockutils [req-60aaeb3a-d272-4037-a354-4bac3988ba44 req-ddf24908-4dd9-46f1-b484-1dc55b6d0ebd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:58:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:56.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:56 np0005588919 nova_compute[225855]: 2026-01-20 14:58:56.753 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 09:58:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:57.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 09:58:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:58:57 np0005588919 nova_compute[225855]: 2026-01-20 14:58:57.876 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:58:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:58:58.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:58 np0005588919 nova_compute[225855]: 2026-01-20 14:58:58.302 225859 DEBUG nova.compute.manager [req-a03f3cd4-0774-4e61-afbb-563b287813be req-ba37c252-f184-4a83-bea2-a386e7d54318 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:58:58 np0005588919 nova_compute[225855]: 2026-01-20 14:58:58.302 225859 DEBUG nova.compute.manager [req-a03f3cd4-0774-4e61-afbb-563b287813be req-ba37c252-f184-4a83-bea2-a386e7d54318 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing instance network info cache due to event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:58:58 np0005588919 nova_compute[225855]: 2026-01-20 14:58:58.303 225859 DEBUG oslo_concurrency.lockutils [req-a03f3cd4-0774-4e61-afbb-563b287813be req-ba37c252-f184-4a83-bea2-a386e7d54318 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:58:58 np0005588919 nova_compute[225855]: 2026-01-20 14:58:58.303 225859 DEBUG oslo_concurrency.lockutils [req-a03f3cd4-0774-4e61-afbb-563b287813be req-ba37c252-f184-4a83-bea2-a386e7d54318 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:58:58 np0005588919 nova_compute[225855]: 2026-01-20 14:58:58.303 225859 DEBUG nova.network.neutron [req-a03f3cd4-0774-4e61-afbb-563b287813be req-ba37c252-f184-4a83-bea2-a386e7d54318 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:58:58 np0005588919 nova_compute[225855]: 2026-01-20 14:58:58.864 225859 DEBUG oslo_concurrency.lockutils [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:58:58 np0005588919 nova_compute[225855]: 2026-01-20 14:58:58.865 225859 DEBUG oslo_concurrency.lockutils [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:58:58 np0005588919 nova_compute[225855]: 2026-01-20 14:58:58.885 225859 INFO nova.compute.manager [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Detaching volume 3381d324-93a9-4d2f-ab25-8460bb2b8e95#033[00m
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.257 225859 INFO nova.virt.block_device [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Attempting to driver detach volume 3381d324-93a9-4d2f-ab25-8460bb2b8e95 from mountpoint /dev/vdb#033[00m
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.264 225859 DEBUG nova.virt.libvirt.driver [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Attempting to detach device vdb from instance 1ebdefed-0903-4d72-b78d-912666c5ce61 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.265 225859 DEBUG nova.virt.libvirt.guest [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-3381d324-93a9-4d2f-ab25-8460bb2b8e95">
Jan 20 09:58:59 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  <serial>3381d324-93a9-4d2f-ab25-8460bb2b8e95</serial>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:58:59 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.272 225859 INFO nova.virt.libvirt.driver [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdb from instance 1ebdefed-0903-4d72-b78d-912666c5ce61 from the persistent domain config.#033[00m
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.273 225859 DEBUG nova.virt.libvirt.driver [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 1ebdefed-0903-4d72-b78d-912666c5ce61 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.273 225859 DEBUG nova.virt.libvirt.guest [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-3381d324-93a9-4d2f-ab25-8460bb2b8e95">
Jan 20 09:58:59 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  <serial>3381d324-93a9-4d2f-ab25-8460bb2b8e95</serial>
Jan 20 09:58:59 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:58:59 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:58:59 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.332 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768921139.3320618, 1ebdefed-0903-4d72-b78d-912666c5ce61 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.334 225859 DEBUG nova.virt.libvirt.driver [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 1ebdefed-0903-4d72-b78d-912666c5ce61 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.336 225859 INFO nova.virt.libvirt.driver [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdb from instance 1ebdefed-0903-4d72-b78d-912666c5ce61 from the live domain config.#033[00m
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.609 225859 DEBUG nova.objects.instance [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid 1ebdefed-0903-4d72-b78d-912666c5ce61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:58:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:58:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:58:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:58:59.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:58:59 np0005588919 nova_compute[225855]: 2026-01-20 14:58:59.649 225859 DEBUG oslo_concurrency.lockutils [None req-56e6039d-e3ad-400d-9aed-443fde2a9d46 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:00.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:00 np0005588919 nova_compute[225855]: 2026-01-20 14:59:00.551 225859 DEBUG nova.network.neutron [req-a03f3cd4-0774-4e61-afbb-563b287813be req-ba37c252-f184-4a83-bea2-a386e7d54318 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updated VIF entry in instance network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:00 np0005588919 nova_compute[225855]: 2026-01-20 14:59:00.551 225859 DEBUG nova.network.neutron [req-a03f3cd4-0774-4e61-afbb-563b287813be req-ba37c252-f184-4a83-bea2-a386e7d54318 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating instance_info_cache with network_info: [{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:00 np0005588919 nova_compute[225855]: 2026-01-20 14:59:00.573 225859 DEBUG oslo_concurrency.lockutils [req-a03f3cd4-0774-4e61-afbb-563b287813be req-ba37c252-f184-4a83-bea2-a386e7d54318 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:00 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:00Z|00567|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:59:00 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2943226124' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:59:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:59:00 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2943226124' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:59:00 np0005588919 nova_compute[225855]: 2026-01-20 14:59:00.881 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:01 np0005588919 podman[282393]: 2026-01-20 14:59:01.014881828 +0000 UTC m=+0.057395551 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 20 09:59:01 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:01Z|00568|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:01 np0005588919 nova_compute[225855]: 2026-01-20 14:59:01.195 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:01.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:01 np0005588919 nova_compute[225855]: 2026-01-20 14:59:01.756 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:02.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:02 np0005588919 nova_compute[225855]: 2026-01-20 14:59:02.878 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.271 225859 DEBUG oslo_concurrency.lockutils [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.271 225859 DEBUG oslo_concurrency.lockutils [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.327 225859 INFO nova.compute.manager [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Detaching volume 0d487092-de99-40b0-be3f-425947d7010c#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.402 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.479 225859 INFO nova.virt.block_device [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Attempting to driver detach volume 0d487092-de99-40b0-be3f-425947d7010c from mountpoint /dev/vdc#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.487 225859 DEBUG nova.virt.libvirt.driver [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Attempting to detach device vdc from instance 1ebdefed-0903-4d72-b78d-912666c5ce61 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.488 225859 DEBUG nova.virt.libvirt.guest [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-0d487092-de99-40b0-be3f-425947d7010c">
Jan 20 09:59:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  <serial>0d487092-de99-40b0-be3f-425947d7010c</serial>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:59:03 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.494 225859 INFO nova.virt.libvirt.driver [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdc from instance 1ebdefed-0903-4d72-b78d-912666c5ce61 from the persistent domain config.#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.495 225859 DEBUG nova.virt.libvirt.driver [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance 1ebdefed-0903-4d72-b78d-912666c5ce61 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.495 225859 DEBUG nova.virt.libvirt.guest [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-0d487092-de99-40b0-be3f-425947d7010c">
Jan 20 09:59:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  <serial>0d487092-de99-40b0-be3f-425947d7010c</serial>
Jan 20 09:59:03 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 20 09:59:03 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:59:03 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.557 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768921143.5572193, 1ebdefed-0903-4d72-b78d-912666c5ce61 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.558 225859 DEBUG nova.virt.libvirt.driver [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance 1ebdefed-0903-4d72-b78d-912666c5ce61 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.560 225859 INFO nova.virt.libvirt.driver [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdc from instance 1ebdefed-0903-4d72-b78d-912666c5ce61 from the live domain config.#033[00m
Jan 20 09:59:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:03.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.825 225859 DEBUG nova.objects.instance [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid 1ebdefed-0903-4d72-b78d-912666c5ce61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:03 np0005588919 nova_compute[225855]: 2026-01-20 14:59:03.875 225859 DEBUG oslo_concurrency.lockutils [None req-500eed47-730e-41d1-8c78-dcb0ded30f39 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:04.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:05.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:05 np0005588919 nova_compute[225855]: 2026-01-20 14:59:05.824 225859 DEBUG nova.compute.manager [req-0e40f913-f074-49f7-b0da-ea75d7bbfac4 req-c9f2fdc6-19d0-4b1f-9289-2b31e42ea5e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:05 np0005588919 nova_compute[225855]: 2026-01-20 14:59:05.825 225859 DEBUG nova.compute.manager [req-0e40f913-f074-49f7-b0da-ea75d7bbfac4 req-c9f2fdc6-19d0-4b1f-9289-2b31e42ea5e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing instance network info cache due to event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:59:05 np0005588919 nova_compute[225855]: 2026-01-20 14:59:05.825 225859 DEBUG oslo_concurrency.lockutils [req-0e40f913-f074-49f7-b0da-ea75d7bbfac4 req-c9f2fdc6-19d0-4b1f-9289-2b31e42ea5e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:05 np0005588919 nova_compute[225855]: 2026-01-20 14:59:05.825 225859 DEBUG oslo_concurrency.lockutils [req-0e40f913-f074-49f7-b0da-ea75d7bbfac4 req-c9f2fdc6-19d0-4b1f-9289-2b31e42ea5e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:05 np0005588919 nova_compute[225855]: 2026-01-20 14:59:05.826 225859 DEBUG nova.network.neutron [req-0e40f913-f074-49f7-b0da-ea75d7bbfac4 req-c9f2fdc6-19d0-4b1f-9289-2b31e42ea5e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:59:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:06.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:06 np0005588919 nova_compute[225855]: 2026-01-20 14:59:06.758 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:07.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:07 np0005588919 nova_compute[225855]: 2026-01-20 14:59:07.879 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:59:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:08.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:59:08 np0005588919 nova_compute[225855]: 2026-01-20 14:59:08.396 225859 DEBUG nova.network.neutron [req-0e40f913-f074-49f7-b0da-ea75d7bbfac4 req-c9f2fdc6-19d0-4b1f-9289-2b31e42ea5e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updated VIF entry in instance network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:08 np0005588919 nova_compute[225855]: 2026-01-20 14:59:08.397 225859 DEBUG nova.network.neutron [req-0e40f913-f074-49f7-b0da-ea75d7bbfac4 req-c9f2fdc6-19d0-4b1f-9289-2b31e42ea5e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating instance_info_cache with network_info: [{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:08Z|00569|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:08 np0005588919 nova_compute[225855]: 2026-01-20 14:59:08.517 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:08 np0005588919 nova_compute[225855]: 2026-01-20 14:59:08.520 225859 DEBUG oslo_concurrency.lockutils [req-0e40f913-f074-49f7-b0da-ea75d7bbfac4 req-c9f2fdc6-19d0-4b1f-9289-2b31e42ea5e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:08 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:08Z|00570|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:08 np0005588919 nova_compute[225855]: 2026-01-20 14:59:08.702 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:09.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:10.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:11 np0005588919 nova_compute[225855]: 2026-01-20 14:59:11.579 225859 DEBUG oslo_concurrency.lockutils [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:11 np0005588919 nova_compute[225855]: 2026-01-20 14:59:11.579 225859 DEBUG oslo_concurrency.lockutils [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:11 np0005588919 nova_compute[225855]: 2026-01-20 14:59:11.602 225859 DEBUG nova.objects.instance [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid b4c1468d-9914-426a-9464-c1167de53632 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:11.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:11 np0005588919 nova_compute[225855]: 2026-01-20 14:59:11.668 225859 DEBUG oslo_concurrency.lockutils [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:11 np0005588919 nova_compute[225855]: 2026-01-20 14:59:11.759 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.058 225859 DEBUG oslo_concurrency.lockutils [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.059 225859 DEBUG oslo_concurrency.lockutils [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.059 225859 INFO nova.compute.manager [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Attaching volume b0619b28-88eb-4051-9e30-36100f39c117 to /dev/vdb#033[00m
Jan 20 09:59:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:12.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.279 225859 DEBUG os_brick.utils [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.280 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.290 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.290 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[c6cb6dab-e494-44c3-9c62-78fa6b4811a7]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.292 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.299 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.299 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[984c0414-3218-472e-b16f-f2a8798187ce]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.300 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.307 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.307 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[918b5cf1-8bae-4c41-9784-d406be505af9]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.309 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[69882a38-eab2-471f-bddb-a8d00c60d926]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.309 225859 DEBUG oslo_concurrency.processutils [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.334 225859 DEBUG oslo_concurrency.processutils [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "nvme version" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.336 225859 DEBUG os_brick.initiator.connectors.lightos [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.336 225859 DEBUG os_brick.initiator.connectors.lightos [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.337 225859 DEBUG os_brick.initiator.connectors.lightos [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.337 225859 DEBUG os_brick.utils [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] <== get_connector_properties: return (58ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.338 225859 DEBUG nova.virt.block_device [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating existing volume attachment record: 17fc8ff5-edb4-4dc8-aba8-9dd268a5f344 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:59:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:12 np0005588919 nova_compute[225855]: 2026-01-20 14:59:12.882 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:59:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/19077082' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:59:13 np0005588919 nova_compute[225855]: 2026-01-20 14:59:13.111 225859 DEBUG nova.objects.instance [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid b4c1468d-9914-426a-9464-c1167de53632 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:13 np0005588919 nova_compute[225855]: 2026-01-20 14:59:13.132 225859 DEBUG nova.virt.libvirt.driver [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Attempting to attach volume b0619b28-88eb-4051-9e30-36100f39c117 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 09:59:13 np0005588919 nova_compute[225855]: 2026-01-20 14:59:13.134 225859 DEBUG nova.virt.libvirt.guest [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:59:13 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:59:13 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-b0619b28-88eb-4051-9e30-36100f39c117">
Jan 20 09:59:13 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:13 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:13 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:13 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:59:13 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 09:59:13 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:59:13 np0005588919 nova_compute[225855]:  </auth>
Jan 20 09:59:13 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:59:13 np0005588919 nova_compute[225855]:  <serial>b0619b28-88eb-4051-9e30-36100f39c117</serial>
Jan 20 09:59:13 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:59:13 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:59:13 np0005588919 nova_compute[225855]: 2026-01-20 14:59:13.249 225859 DEBUG nova.virt.libvirt.driver [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:13 np0005588919 nova_compute[225855]: 2026-01-20 14:59:13.250 225859 DEBUG nova.virt.libvirt.driver [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:13 np0005588919 nova_compute[225855]: 2026-01-20 14:59:13.250 225859 DEBUG nova.virt.libvirt.driver [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:13 np0005588919 nova_compute[225855]: 2026-01-20 14:59:13.250 225859 DEBUG nova.virt.libvirt.driver [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No VIF found with MAC fa:16:3e:d9:6a:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:59:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:59:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/857366021' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:59:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:59:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/857366021' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:59:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:13.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:14.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:15.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:16.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:16 np0005588919 nova_compute[225855]: 2026-01-20 14:59:16.381 225859 DEBUG oslo_concurrency.lockutils [None req-6570a512-ed7a-403c-8c93-7866d8503ce8 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 4.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:16.419 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:16.420 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:16.421 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:16 np0005588919 nova_compute[225855]: 2026-01-20 14:59:16.762 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:16 np0005588919 nova_compute[225855]: 2026-01-20 14:59:16.888 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:17.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:17 np0005588919 nova_compute[225855]: 2026-01-20 14:59:17.884 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:18.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.378 225859 DEBUG oslo_concurrency.lockutils [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.379 225859 DEBUG oslo_concurrency.lockutils [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.446 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Triggering sync for uuid 1ebdefed-0903-4d72-b78d-912666c5ce61 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.446 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Triggering sync for uuid b4c1468d-9914-426a-9464-c1167de53632 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.447 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.447 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.447 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.455 225859 DEBUG nova.objects.instance [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid b4c1468d-9914-426a-9464-c1167de53632 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.468 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.500 225859 DEBUG oslo_concurrency.lockutils [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.500 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "b4c1468d-9914-426a-9464-c1167de53632" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.545 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "b4c1468d-9914-426a-9464-c1167de53632" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.927 225859 DEBUG oslo_concurrency.lockutils [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.928 225859 DEBUG oslo_concurrency.lockutils [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:18 np0005588919 nova_compute[225855]: 2026-01-20 14:59:18.928 225859 INFO nova.compute.manager [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Attaching volume ab9a0bc3-5c61-456d-9fc8-3cb8ce358e66 to /dev/vdc#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.115 225859 DEBUG os_brick.utils [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.116 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.125 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.126 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[3f502614-d0bb-472e-9503-40ff11cd9649]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.127 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.133 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.134 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[f187c59e-974e-4864-8ccb-6bcc5c2f8912]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.135 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.142 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.142 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[62123aa2-3a74-4f07-91e9-26d55759b95d]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.143 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[01a2bcaf-e2a4-41a2-b562-a2e49f499352]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.144 225859 DEBUG oslo_concurrency.processutils [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.171 225859 DEBUG oslo_concurrency.processutils [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "nvme version" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.173 225859 DEBUG os_brick.initiator.connectors.lightos [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.173 225859 DEBUG os_brick.initiator.connectors.lightos [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.173 225859 DEBUG os_brick.initiator.connectors.lightos [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.174 225859 DEBUG os_brick.utils [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] <== get_connector_properties: return (58ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 09:59:19 np0005588919 nova_compute[225855]: 2026-01-20 14:59:19.174 225859 DEBUG nova.virt.block_device [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating existing volume attachment record: c891cf33-bcd1-4101-99a6-243f9f47c95b _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 09:59:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:19.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:20 np0005588919 podman[282512]: 2026-01-20 14:59:20.065943465 +0000 UTC m=+0.101552066 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 20 09:59:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 09:59:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2904376589' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 09:59:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:20.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:20 np0005588919 nova_compute[225855]: 2026-01-20 14:59:20.235 225859 DEBUG nova.objects.instance [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid b4c1468d-9914-426a-9464-c1167de53632 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:20 np0005588919 nova_compute[225855]: 2026-01-20 14:59:20.264 225859 DEBUG nova.virt.libvirt.driver [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Attempting to attach volume ab9a0bc3-5c61-456d-9fc8-3cb8ce358e66 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 09:59:20 np0005588919 nova_compute[225855]: 2026-01-20 14:59:20.266 225859 DEBUG nova.virt.libvirt.guest [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 09:59:20 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:59:20 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-ab9a0bc3-5c61-456d-9fc8-3cb8ce358e66">
Jan 20 09:59:20 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:20 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:20 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:20 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:59:20 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 09:59:20 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 09:59:20 np0005588919 nova_compute[225855]:  </auth>
Jan 20 09:59:20 np0005588919 nova_compute[225855]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:59:20 np0005588919 nova_compute[225855]:  <serial>ab9a0bc3-5c61-456d-9fc8-3cb8ce358e66</serial>
Jan 20 09:59:20 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:59:20 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 09:59:20 np0005588919 nova_compute[225855]: 2026-01-20 14:59:20.473 225859 DEBUG nova.virt.libvirt.driver [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:20 np0005588919 nova_compute[225855]: 2026-01-20 14:59:20.474 225859 DEBUG nova.virt.libvirt.driver [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:20 np0005588919 nova_compute[225855]: 2026-01-20 14:59:20.474 225859 DEBUG nova.virt.libvirt.driver [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:20 np0005588919 nova_compute[225855]: 2026-01-20 14:59:20.474 225859 DEBUG nova.virt.libvirt.driver [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 09:59:20 np0005588919 nova_compute[225855]: 2026-01-20 14:59:20.474 225859 DEBUG nova.virt.libvirt.driver [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] No VIF found with MAC fa:16:3e:d9:6a:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 09:59:20 np0005588919 nova_compute[225855]: 2026-01-20 14:59:20.893 225859 DEBUG oslo_concurrency.lockutils [None req-12e9d29c-04e8-4082-a7d9-9ccfe2314a89 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:21.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:21 np0005588919 nova_compute[225855]: 2026-01-20 14:59:21.763 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:22.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:22 np0005588919 nova_compute[225855]: 2026-01-20 14:59:22.886 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:23 np0005588919 nova_compute[225855]: 2026-01-20 14:59:23.530 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:23 np0005588919 NetworkManager[49104]: <info>  [1768921163.5311] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Jan 20 09:59:23 np0005588919 NetworkManager[49104]: <info>  [1768921163.5318] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Jan 20 09:59:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:23.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:23 np0005588919 nova_compute[225855]: 2026-01-20 14:59:23.701 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:23 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:23Z|00571|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:23 np0005588919 nova_compute[225855]: 2026-01-20 14:59:23.719 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:24 np0005588919 nova_compute[225855]: 2026-01-20 14:59:24.168 225859 DEBUG nova.compute.manager [req-682a67f2-50bf-43a3-a24d-d8492a545836 req-a973bf47-e1db-4a7b-b77c-bc6c46e3543a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:24 np0005588919 nova_compute[225855]: 2026-01-20 14:59:24.168 225859 DEBUG nova.compute.manager [req-682a67f2-50bf-43a3-a24d-d8492a545836 req-a973bf47-e1db-4a7b-b77c-bc6c46e3543a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing instance network info cache due to event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:59:24 np0005588919 nova_compute[225855]: 2026-01-20 14:59:24.169 225859 DEBUG oslo_concurrency.lockutils [req-682a67f2-50bf-43a3-a24d-d8492a545836 req-a973bf47-e1db-4a7b-b77c-bc6c46e3543a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:24 np0005588919 nova_compute[225855]: 2026-01-20 14:59:24.169 225859 DEBUG oslo_concurrency.lockutils [req-682a67f2-50bf-43a3-a24d-d8492a545836 req-a973bf47-e1db-4a7b-b77c-bc6c46e3543a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:24 np0005588919 nova_compute[225855]: 2026-01-20 14:59:24.169 225859 DEBUG nova.network.neutron [req-682a67f2-50bf-43a3-a24d-d8492a545836 req-a973bf47-e1db-4a7b-b77c-bc6c46e3543a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:59:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:24.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:25.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:25 np0005588919 nova_compute[225855]: 2026-01-20 14:59:25.960 225859 DEBUG nova.network.neutron [req-682a67f2-50bf-43a3-a24d-d8492a545836 req-a973bf47-e1db-4a7b-b77c-bc6c46e3543a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updated VIF entry in instance network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:25 np0005588919 nova_compute[225855]: 2026-01-20 14:59:25.961 225859 DEBUG nova.network.neutron [req-682a67f2-50bf-43a3-a24d-d8492a545836 req-a973bf47-e1db-4a7b-b77c-bc6c46e3543a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating instance_info_cache with network_info: [{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:26 np0005588919 nova_compute[225855]: 2026-01-20 14:59:26.000 225859 DEBUG oslo_concurrency.lockutils [req-682a67f2-50bf-43a3-a24d-d8492a545836 req-a973bf47-e1db-4a7b-b77c-bc6c46e3543a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:26.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:26 np0005588919 nova_compute[225855]: 2026-01-20 14:59:26.300 225859 DEBUG nova.compute.manager [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:26 np0005588919 nova_compute[225855]: 2026-01-20 14:59:26.300 225859 DEBUG nova.compute.manager [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing instance network info cache due to event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:59:26 np0005588919 nova_compute[225855]: 2026-01-20 14:59:26.301 225859 DEBUG oslo_concurrency.lockutils [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:26 np0005588919 nova_compute[225855]: 2026-01-20 14:59:26.301 225859 DEBUG oslo_concurrency.lockutils [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:26 np0005588919 nova_compute[225855]: 2026-01-20 14:59:26.301 225859 DEBUG nova.network.neutron [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:59:26 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:26Z|00572|binding|INFO|Releasing lport 1623097d-35b0-4d71-9dc2-c4d659492102 from this chassis (sb_readonly=0)
Jan 20 09:59:26 np0005588919 nova_compute[225855]: 2026-01-20 14:59:26.650 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:26 np0005588919 nova_compute[225855]: 2026-01-20 14:59:26.765 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:27.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:27 np0005588919 nova_compute[225855]: 2026-01-20 14:59:27.867 225859 DEBUG nova.network.neutron [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updated VIF entry in instance network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:27 np0005588919 nova_compute[225855]: 2026-01-20 14:59:27.868 225859 DEBUG nova.network.neutron [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating instance_info_cache with network_info: [{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:27 np0005588919 nova_compute[225855]: 2026-01-20 14:59:27.888 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:27 np0005588919 nova_compute[225855]: 2026-01-20 14:59:27.890 225859 DEBUG oslo_concurrency.lockutils [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:27 np0005588919 nova_compute[225855]: 2026-01-20 14:59:27.890 225859 DEBUG nova.compute.manager [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:27 np0005588919 nova_compute[225855]: 2026-01-20 14:59:27.890 225859 DEBUG nova.compute.manager [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing instance network info cache due to event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:59:27 np0005588919 nova_compute[225855]: 2026-01-20 14:59:27.890 225859 DEBUG oslo_concurrency.lockutils [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:27 np0005588919 nova_compute[225855]: 2026-01-20 14:59:27.891 225859 DEBUG oslo_concurrency.lockutils [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:27 np0005588919 nova_compute[225855]: 2026-01-20 14:59:27.891 225859 DEBUG nova.network.neutron [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:59:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:28.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:29 np0005588919 nova_compute[225855]: 2026-01-20 14:59:29.516 225859 DEBUG nova.network.neutron [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updated VIF entry in instance network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:29 np0005588919 nova_compute[225855]: 2026-01-20 14:59:29.516 225859 DEBUG nova.network.neutron [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating instance_info_cache with network_info: [{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:29 np0005588919 nova_compute[225855]: 2026-01-20 14:59:29.551 225859 DEBUG oslo_concurrency.lockutils [req-6d8759a5-db1b-4688-b918-5a947368cf1f req-e6514c73-467e-46b9-8470-848fbf9e5278 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:29.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:30 np0005588919 nova_compute[225855]: 2026-01-20 14:59:30.208 225859 DEBUG nova.compute.manager [req-a66d5549-f4d4-4c87-9c17-fc333baab02a req-9fbf6400-3219-4b7d-9055-230cb1d7330c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:30 np0005588919 nova_compute[225855]: 2026-01-20 14:59:30.208 225859 DEBUG nova.compute.manager [req-a66d5549-f4d4-4c87-9c17-fc333baab02a req-9fbf6400-3219-4b7d-9055-230cb1d7330c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing instance network info cache due to event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:59:30 np0005588919 nova_compute[225855]: 2026-01-20 14:59:30.209 225859 DEBUG oslo_concurrency.lockutils [req-a66d5549-f4d4-4c87-9c17-fc333baab02a req-9fbf6400-3219-4b7d-9055-230cb1d7330c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:30 np0005588919 nova_compute[225855]: 2026-01-20 14:59:30.209 225859 DEBUG oslo_concurrency.lockutils [req-a66d5549-f4d4-4c87-9c17-fc333baab02a req-9fbf6400-3219-4b7d-9055-230cb1d7330c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:30 np0005588919 nova_compute[225855]: 2026-01-20 14:59:30.209 225859 DEBUG nova.network.neutron [req-a66d5549-f4d4-4c87-9c17-fc333baab02a req-9fbf6400-3219-4b7d-9055-230cb1d7330c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:59:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:30.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:31.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:31 np0005588919 nova_compute[225855]: 2026-01-20 14:59:31.767 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:32 np0005588919 podman[282565]: 2026-01-20 14:59:32.003738749 +0000 UTC m=+0.048705545 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 09:59:32 np0005588919 nova_compute[225855]: 2026-01-20 14:59:32.097 225859 DEBUG nova.network.neutron [req-a66d5549-f4d4-4c87-9c17-fc333baab02a req-9fbf6400-3219-4b7d-9055-230cb1d7330c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updated VIF entry in instance network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:32 np0005588919 nova_compute[225855]: 2026-01-20 14:59:32.097 225859 DEBUG nova.network.neutron [req-a66d5549-f4d4-4c87-9c17-fc333baab02a req-9fbf6400-3219-4b7d-9055-230cb1d7330c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating instance_info_cache with network_info: [{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:32 np0005588919 nova_compute[225855]: 2026-01-20 14:59:32.149 225859 DEBUG oslo_concurrency.lockutils [req-a66d5549-f4d4-4c87-9c17-fc333baab02a req-9fbf6400-3219-4b7d-9055-230cb1d7330c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:32.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:32 np0005588919 nova_compute[225855]: 2026-01-20 14:59:32.493 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:32 np0005588919 nova_compute[225855]: 2026-01-20 14:59:32.890 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.248 225859 DEBUG oslo_concurrency.lockutils [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.249 225859 DEBUG oslo_concurrency.lockutils [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.264 225859 INFO nova.compute.manager [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Detaching volume b0619b28-88eb-4051-9e30-36100f39c117#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.474 225859 INFO nova.virt.block_device [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Attempting to driver detach volume b0619b28-88eb-4051-9e30-36100f39c117 from mountpoint /dev/vdb#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.486 225859 DEBUG nova.virt.libvirt.driver [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Attempting to detach device vdb from instance b4c1468d-9914-426a-9464-c1167de53632 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.487 225859 DEBUG nova.virt.libvirt.guest [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-b0619b28-88eb-4051-9e30-36100f39c117">
Jan 20 09:59:33 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  <serial>b0619b28-88eb-4051-9e30-36100f39c117</serial>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:59:33 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.495 225859 INFO nova.virt.libvirt.driver [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdb from instance b4c1468d-9914-426a-9464-c1167de53632 from the persistent domain config.#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.495 225859 DEBUG nova.virt.libvirt.driver [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance b4c1468d-9914-426a-9464-c1167de53632 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.496 225859 DEBUG nova.virt.libvirt.guest [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-b0619b28-88eb-4051-9e30-36100f39c117">
Jan 20 09:59:33 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  <serial>b0619b28-88eb-4051-9e30-36100f39c117</serial>
Jan 20 09:59:33 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 09:59:33 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:59:33 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.564 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768921173.5637622, b4c1468d-9914-426a-9464-c1167de53632 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.566 225859 DEBUG nova.virt.libvirt.driver [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance b4c1468d-9914-426a-9464-c1167de53632 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.568 225859 INFO nova.virt.libvirt.driver [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdb from instance b4c1468d-9914-426a-9464-c1167de53632 from the live domain config.#033[00m
Jan 20 09:59:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:33.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.889 225859 DEBUG nova.objects.instance [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid b4c1468d-9914-426a-9464-c1167de53632 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:33 np0005588919 nova_compute[225855]: 2026-01-20 14:59:33.937 225859 DEBUG oslo_concurrency.lockutils [None req-b80c68e5-e08d-494b-a7ad-c2e7cbbeaae4 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:34.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:35 np0005588919 nova_compute[225855]: 2026-01-20 14:59:35.441 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:35.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:36.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:36 np0005588919 nova_compute[225855]: 2026-01-20 14:59:36.769 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:36 np0005588919 nova_compute[225855]: 2026-01-20 14:59:36.860 225859 DEBUG oslo_concurrency.lockutils [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:36 np0005588919 nova_compute[225855]: 2026-01-20 14:59:36.861 225859 DEBUG oslo_concurrency.lockutils [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:36 np0005588919 nova_compute[225855]: 2026-01-20 14:59:36.874 225859 INFO nova.compute.manager [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Detaching volume ab9a0bc3-5c61-456d-9fc8-3cb8ce358e66#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.017 225859 INFO nova.virt.block_device [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Attempting to driver detach volume ab9a0bc3-5c61-456d-9fc8-3cb8ce358e66 from mountpoint /dev/vdc#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.025 225859 DEBUG nova.virt.libvirt.driver [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Attempting to detach device vdc from instance b4c1468d-9914-426a-9464-c1167de53632 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.026 225859 DEBUG nova.virt.libvirt.guest [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-ab9a0bc3-5c61-456d-9fc8-3cb8ce358e66">
Jan 20 09:59:37 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  <serial>ab9a0bc3-5c61-456d-9fc8-3cb8ce358e66</serial>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:59:37 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.049 225859 INFO nova.virt.libvirt.driver [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdc from instance b4c1468d-9914-426a-9464-c1167de53632 from the persistent domain config.#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.049 225859 DEBUG nova.virt.libvirt.driver [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance b4c1468d-9914-426a-9464-c1167de53632 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.050 225859 DEBUG nova.virt.libvirt.guest [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-ab9a0bc3-5c61-456d-9fc8-3cb8ce358e66">
Jan 20 09:59:37 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  </source>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  <target dev="vdc" bus="virtio"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  <serial>ab9a0bc3-5c61-456d-9fc8-3cb8ce358e66</serial>
Jan 20 09:59:37 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 20 09:59:37 np0005588919 nova_compute[225855]: </disk>
Jan 20 09:59:37 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.131 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768921177.1310332, b4c1468d-9914-426a-9464-c1167de53632 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.133 225859 DEBUG nova.virt.libvirt.driver [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance b4c1468d-9914-426a-9464-c1167de53632 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.135 225859 INFO nova.virt.libvirt.driver [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully detached device vdc from instance b4c1468d-9914-426a-9464-c1167de53632 from the live domain config.#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.347 225859 DEBUG nova.objects.instance [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'flavor' on Instance uuid b4c1468d-9914-426a-9464-c1167de53632 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.396 225859 DEBUG oslo_concurrency.lockutils [None req-cd488a83-9815-4af5-90e2-201aab21d807 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:37.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:37 np0005588919 nova_compute[225855]: 2026-01-20 14:59:37.892 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:59:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:38.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:59:38 np0005588919 nova_compute[225855]: 2026-01-20 14:59:38.855 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:38 np0005588919 nova_compute[225855]: 2026-01-20 14:59:38.885 225859 DEBUG nova.compute.manager [req-7530eee5-eefe-4483-a826-64dfa87a7d90 req-6fa91407-0234-4c78-8cbc-125bec97a4ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:38 np0005588919 nova_compute[225855]: 2026-01-20 14:59:38.885 225859 DEBUG nova.compute.manager [req-7530eee5-eefe-4483-a826-64dfa87a7d90 req-6fa91407-0234-4c78-8cbc-125bec97a4ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing instance network info cache due to event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:59:38 np0005588919 nova_compute[225855]: 2026-01-20 14:59:38.886 225859 DEBUG oslo_concurrency.lockutils [req-7530eee5-eefe-4483-a826-64dfa87a7d90 req-6fa91407-0234-4c78-8cbc-125bec97a4ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:38 np0005588919 nova_compute[225855]: 2026-01-20 14:59:38.886 225859 DEBUG oslo_concurrency.lockutils [req-7530eee5-eefe-4483-a826-64dfa87a7d90 req-6fa91407-0234-4c78-8cbc-125bec97a4ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:38 np0005588919 nova_compute[225855]: 2026-01-20 14:59:38.886 225859 DEBUG nova.network.neutron [req-7530eee5-eefe-4483-a826-64dfa87a7d90 req-6fa91407-0234-4c78-8cbc-125bec97a4ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:59:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 09:59:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/385370313' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 09:59:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 09:59:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/385370313' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 09:59:39 np0005588919 nova_compute[225855]: 2026-01-20 14:59:39.198 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:39.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:39 np0005588919 nova_compute[225855]: 2026-01-20 14:59:39.898 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:40.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:40 np0005588919 nova_compute[225855]: 2026-01-20 14:59:40.768 225859 DEBUG nova.network.neutron [req-7530eee5-eefe-4483-a826-64dfa87a7d90 req-6fa91407-0234-4c78-8cbc-125bec97a4ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updated VIF entry in instance network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:40 np0005588919 nova_compute[225855]: 2026-01-20 14:59:40.769 225859 DEBUG nova.network.neutron [req-7530eee5-eefe-4483-a826-64dfa87a7d90 req-6fa91407-0234-4c78-8cbc-125bec97a4ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating instance_info_cache with network_info: [{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:40 np0005588919 nova_compute[225855]: 2026-01-20 14:59:40.798 225859 DEBUG oslo_concurrency.lockutils [req-7530eee5-eefe-4483-a826-64dfa87a7d90 req-6fa91407-0234-4c78-8cbc-125bec97a4ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:41.017 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:59:41 np0005588919 nova_compute[225855]: 2026-01-20 14:59:41.017 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:41.018 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 09:59:41 np0005588919 nova_compute[225855]: 2026-01-20 14:59:41.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:41.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:41 np0005588919 nova_compute[225855]: 2026-01-20 14:59:41.771 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:42.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:42 np0005588919 nova_compute[225855]: 2026-01-20 14:59:42.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:42 np0005588919 nova_compute[225855]: 2026-01-20 14:59:42.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 09:59:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:42 np0005588919 nova_compute[225855]: 2026-01-20 14:59:42.894 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:43 np0005588919 nova_compute[225855]: 2026-01-20 14:59:43.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:43 np0005588919 nova_compute[225855]: 2026-01-20 14:59:43.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 09:59:43 np0005588919 nova_compute[225855]: 2026-01-20 14:59:43.586 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:43 np0005588919 nova_compute[225855]: 2026-01-20 14:59:43.586 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:43 np0005588919 nova_compute[225855]: 2026-01-20 14:59:43.587 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 09:59:43 np0005588919 nova_compute[225855]: 2026-01-20 14:59:43.641 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:43 np0005588919 nova_compute[225855]: 2026-01-20 14:59:43.658 225859 DEBUG nova.compute.manager [req-ade06d21-a8ba-44d2-8d65-9051a1ae8bdf req-2acbb262-49bf-49ef-b902-910eaa03f9a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:43 np0005588919 nova_compute[225855]: 2026-01-20 14:59:43.658 225859 DEBUG nova.compute.manager [req-ade06d21-a8ba-44d2-8d65-9051a1ae8bdf req-2acbb262-49bf-49ef-b902-910eaa03f9a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing instance network info cache due to event network-changed-7c572239-9b2e-493c-8be5-632f27cc634a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:59:43 np0005588919 nova_compute[225855]: 2026-01-20 14:59:43.659 225859 DEBUG oslo_concurrency.lockutils [req-ade06d21-a8ba-44d2-8d65-9051a1ae8bdf req-2acbb262-49bf-49ef-b902-910eaa03f9a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:43.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 09:59:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:44.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 09:59:45 np0005588919 nova_compute[225855]: 2026-01-20 14:59:45.520 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating instance_info_cache with network_info: [{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:45 np0005588919 nova_compute[225855]: 2026-01-20 14:59:45.596 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:45 np0005588919 nova_compute[225855]: 2026-01-20 14:59:45.596 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 09:59:45 np0005588919 nova_compute[225855]: 2026-01-20 14:59:45.597 225859 DEBUG oslo_concurrency.lockutils [req-ade06d21-a8ba-44d2-8d65-9051a1ae8bdf req-2acbb262-49bf-49ef-b902-910eaa03f9a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:45 np0005588919 nova_compute[225855]: 2026-01-20 14:59:45.597 225859 DEBUG nova.network.neutron [req-ade06d21-a8ba-44d2-8d65-9051a1ae8bdf req-2acbb262-49bf-49ef-b902-910eaa03f9a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Refreshing network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:59:45 np0005588919 nova_compute[225855]: 2026-01-20 14:59:45.598 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:45.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:46.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:46 np0005588919 nova_compute[225855]: 2026-01-20 14:59:46.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:46 np0005588919 nova_compute[225855]: 2026-01-20 14:59:46.773 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:47 np0005588919 nova_compute[225855]: 2026-01-20 14:59:47.014 225859 DEBUG nova.compute.manager [req-dfb5baab-cb45-49f0-ab4a-92873c87e12d req-cd87dfdc-674d-4cde-932a-7f89d3c335b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:47 np0005588919 nova_compute[225855]: 2026-01-20 14:59:47.015 225859 DEBUG nova.compute.manager [req-dfb5baab-cb45-49f0-ab4a-92873c87e12d req-cd87dfdc-674d-4cde-932a-7f89d3c335b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing instance network info cache due to event network-changed-3067803c-07f3-4a15-a5ee-47f9a770efca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 09:59:47 np0005588919 nova_compute[225855]: 2026-01-20 14:59:47.015 225859 DEBUG oslo_concurrency.lockutils [req-dfb5baab-cb45-49f0-ab4a-92873c87e12d req-cd87dfdc-674d-4cde-932a-7f89d3c335b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 09:59:47 np0005588919 nova_compute[225855]: 2026-01-20 14:59:47.016 225859 DEBUG oslo_concurrency.lockutils [req-dfb5baab-cb45-49f0-ab4a-92873c87e12d req-cd87dfdc-674d-4cde-932a-7f89d3c335b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 09:59:47 np0005588919 nova_compute[225855]: 2026-01-20 14:59:47.016 225859 DEBUG nova.network.neutron [req-dfb5baab-cb45-49f0-ab4a-92873c87e12d req-cd87dfdc-674d-4cde-932a-7f89d3c335b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Refreshing network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 09:59:47 np0005588919 nova_compute[225855]: 2026-01-20 14:59:47.354 225859 DEBUG nova.network.neutron [req-ade06d21-a8ba-44d2-8d65-9051a1ae8bdf req-2acbb262-49bf-49ef-b902-910eaa03f9a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updated VIF entry in instance network info cache for port 7c572239-9b2e-493c-8be5-632f27cc634a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:47 np0005588919 nova_compute[225855]: 2026-01-20 14:59:47.355 225859 DEBUG nova.network.neutron [req-ade06d21-a8ba-44d2-8d65-9051a1ae8bdf req-2acbb262-49bf-49ef-b902-910eaa03f9a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating instance_info_cache with network_info: [{"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:47 np0005588919 nova_compute[225855]: 2026-01-20 14:59:47.377 225859 DEBUG oslo_concurrency.lockutils [req-ade06d21-a8ba-44d2-8d65-9051a1ae8bdf req-2acbb262-49bf-49ef-b902-910eaa03f9a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c1468d-9914-426a-9464-c1167de53632" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 09:59:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:47.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 09:59:47 np0005588919 nova_compute[225855]: 2026-01-20 14:59:47.897 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:48.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:49.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:49 np0005588919 nova_compute[225855]: 2026-01-20 14:59:49.996 225859 DEBUG nova.network.neutron [req-dfb5baab-cb45-49f0-ab4a-92873c87e12d req-cd87dfdc-674d-4cde-932a-7f89d3c335b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updated VIF entry in instance network info cache for port 3067803c-07f3-4a15-a5ee-47f9a770efca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 09:59:49 np0005588919 nova_compute[225855]: 2026-01-20 14:59:49.997 225859 DEBUG nova.network.neutron [req-dfb5baab-cb45-49f0-ab4a-92873c87e12d req-cd87dfdc-674d-4cde-932a-7f89d3c335b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating instance_info_cache with network_info: [{"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.016 225859 DEBUG oslo_concurrency.lockutils [req-dfb5baab-cb45-49f0-ab4a-92873c87e12d req-cd87dfdc-674d-4cde-932a-7f89d3c335b1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-1ebdefed-0903-4d72-b78d-912666c5ce61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 09:59:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:50.020 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:50.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.372 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.372 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.372 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.372 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.373 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:59:50 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1363363369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.839 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.896 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.897 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.900 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:59:50 np0005588919 nova_compute[225855]: 2026-01-20 14:59:50.900 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 09:59:51 np0005588919 podman[282670]: 2026-01-20 14:59:51.042544575 +0000 UTC m=+0.084968938 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.070 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.071 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3943MB free_disk=20.92159652709961GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.072 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.072 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.176 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 1ebdefed-0903-4d72-b78d-912666c5ce61 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.177 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance b4c1468d-9914-426a-9464-c1167de53632 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.177 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.177 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.238 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:59:51 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2238468495' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.682 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.688 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:59:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:51.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.704 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.706 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.706 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:51 np0005588919 nova_compute[225855]: 2026-01-20 14:59:51.776 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.168 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:52.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.739 225859 DEBUG oslo_concurrency.lockutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.740 225859 DEBUG oslo_concurrency.lockutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.741 225859 DEBUG oslo_concurrency.lockutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.741 225859 DEBUG oslo_concurrency.lockutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.741 225859 DEBUG oslo_concurrency.lockutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.743 225859 INFO nova.compute.manager [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Terminating instance#033[00m
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.744 225859 DEBUG nova.compute.manager [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 09:59:52 np0005588919 kernel: tap7c572239-9b (unregistering): left promiscuous mode
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.899 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588919 NetworkManager[49104]: <info>  [1768921192.9029] device (tap7c572239-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.914 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:52Z|00573|binding|INFO|Releasing lport 7c572239-9b2e-493c-8be5-632f27cc634a from this chassis (sb_readonly=0)
Jan 20 09:59:52 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:52Z|00574|binding|INFO|Setting lport 7c572239-9b2e-493c-8be5-632f27cc634a down in Southbound
Jan 20 09:59:52 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:52Z|00575|binding|INFO|Removing iface tap7c572239-9b ovn-installed in OVS
Jan 20 09:59:52 np0005588919 nova_compute[225855]: 2026-01-20 14:59:52.931 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:52 np0005588919 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008c.scope: Deactivated successfully.
Jan 20 09:59:52 np0005588919 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008c.scope: Consumed 19.864s CPU time.
Jan 20 09:59:52 np0005588919 systemd-machined[194361]: Machine qemu-67-instance-0000008c terminated.
Jan 20 09:59:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:52.975 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:6a:1f 10.100.0.9'], port_security=['fa:16:3e:d9:6a:1f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b4c1468d-9914-426a-9464-c1167de53632', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d966e1-4d26-414a-920e-0be2d77abb59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '107c1f3b5b7b413d9a389ca1166e331f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '207accdf-2d5c-48e9-bf02-5dfcc7d28063', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a2edf59-0338-43ad-aa77-d6a806c781a6, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=7c572239-9b2e-493c-8be5-632f27cc634a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 09:59:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:52.976 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 7c572239-9b2e-493c-8be5-632f27cc634a in datapath 58d966e1-4d26-414a-920e-0be2d77abb59 unbound from our chassis#033[00m
Jan 20 09:59:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:52.977 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58d966e1-4d26-414a-920e-0be2d77abb59#033[00m
Jan 20 09:59:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:52.995 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[61dc4c9f-dd68-4779-9483-b81d4df41da8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:53.026 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[aaee87f1-5c69-4301-96fa-c636ec34d97a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:53.029 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f61595b8-b66d-4005-a9d2-7eb9bbcc6333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:53.063 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[29cf5371-cb52-49a4-aa4a-dc4147115994]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:53.081 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[466a40d4-68e4-4bd1-80e8-122745cb153a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58d966e1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:c8:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610366, 'reachable_time': 35351, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282731, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:53.098 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[93b192bd-bcfa-4629-a5c7-a73fb5eb3a20]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap58d966e1-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 610378, 'tstamp': 610378}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282732, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap58d966e1-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 610381, 'tstamp': 610381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282732, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 09:59:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:53.100 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d966e1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.102 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.106 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:53.107 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58d966e1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:53.107 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:59:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:53.108 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58d966e1-40, col_values=(('external_ids', {'iface-id': '1623097d-35b0-4d71-9dc2-c4d659492102'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:53.108 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.178 225859 INFO nova.virt.libvirt.driver [-] [instance: b4c1468d-9914-426a-9464-c1167de53632] Instance destroyed successfully.#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.178 225859 DEBUG nova.objects.instance [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'resources' on Instance uuid b4c1468d-9914-426a-9464-c1167de53632 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.192 225859 DEBUG nova.virt.libvirt.vif [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:57:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-65714861',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-65714861',id=140,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBbRGX2xZT3D1ftqdKpZwTwb/ukXbRv/O5UyYYLjii3gk46qsw4SNMi6p0GpNIY5l/f9OSIg9UlRsUFQqLszBoQT2vJic2iOBlI6VLyxyg71obcHOZQEGpjfcTfqUsJeQ==',key_name='tempest-TestInstancesWithCinderVolumes-1812188149',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:57:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='107c1f3b5b7b413d9a389ca1166e331f',ramdisk_id='',reservation_id='r-8l7rw241',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',i
mage_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestInstancesWithCinderVolumes-1174033615',owner_user_name='tempest-TestInstancesWithCinderVolumes-1174033615-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:57:51Z,user_data=None,user_id='ed2c9bd268d1491fa3484d86bcdb9ec6',uuid=b4c1468d-9914-426a-9464-c1167de53632,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.193 225859 DEBUG nova.network.os_vif_util [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converting VIF {"id": "7c572239-9b2e-493c-8be5-632f27cc634a", "address": "fa:16:3e:d9:6a:1f", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c572239-9b", "ovs_interfaceid": "7c572239-9b2e-493c-8be5-632f27cc634a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.194 225859 DEBUG nova.network.os_vif_util [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d9:6a:1f,bridge_name='br-int',has_traffic_filtering=True,id=7c572239-9b2e-493c-8be5-632f27cc634a,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c572239-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.195 225859 DEBUG os_vif [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:6a:1f,bridge_name='br-int',has_traffic_filtering=True,id=7c572239-9b2e-493c-8be5-632f27cc634a,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c572239-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.197 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.197 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c572239-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.199 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.201 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.205 225859 INFO os_vif [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:6a:1f,bridge_name='br-int',has_traffic_filtering=True,id=7c572239-9b2e-493c-8be5-632f27cc634a,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c572239-9b')#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.414 225859 INFO nova.virt.libvirt.driver [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Deleting instance files /var/lib/nova/instances/b4c1468d-9914-426a-9464-c1167de53632_del#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.415 225859 INFO nova.virt.libvirt.driver [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Deletion of /var/lib/nova/instances/b4c1468d-9914-426a-9464-c1167de53632_del complete#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.527 225859 INFO nova.compute.manager [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.528 225859 DEBUG oslo.service.loopingcall [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.528 225859 DEBUG nova.compute.manager [-] [instance: b4c1468d-9914-426a-9464-c1167de53632] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.528 225859 DEBUG nova.network.neutron [-] [instance: b4c1468d-9914-426a-9464-c1167de53632] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:59:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:53.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.701 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.702 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.967 225859 DEBUG nova.compute.manager [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-vif-unplugged-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.967 225859 DEBUG oslo_concurrency.lockutils [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.968 225859 DEBUG oslo_concurrency.lockutils [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.968 225859 DEBUG oslo_concurrency.lockutils [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.968 225859 DEBUG nova.compute.manager [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] No waiting events found dispatching network-vif-unplugged-7c572239-9b2e-493c-8be5-632f27cc634a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.968 225859 DEBUG nova.compute.manager [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-vif-unplugged-7c572239-9b2e-493c-8be5-632f27cc634a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.969 225859 DEBUG nova.compute.manager [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-vif-plugged-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.969 225859 DEBUG oslo_concurrency.lockutils [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c1468d-9914-426a-9464-c1167de53632-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.969 225859 DEBUG oslo_concurrency.lockutils [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.969 225859 DEBUG oslo_concurrency.lockutils [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.969 225859 DEBUG nova.compute.manager [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] No waiting events found dispatching network-vif-plugged-7c572239-9b2e-493c-8be5-632f27cc634a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:59:53 np0005588919 nova_compute[225855]: 2026-01-20 14:59:53.970 225859 WARNING nova.compute.manager [req-b7e5455a-e464-4068-9a1a-67557e9ff05c req-d88290a0-b8f0-4294-b214-02717eb37506 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received unexpected event network-vif-plugged-7c572239-9b2e-493c-8be5-632f27cc634a for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:59:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:54.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:54 np0005588919 nova_compute[225855]: 2026-01-20 14:59:54.292 225859 DEBUG nova.network.neutron [-] [instance: b4c1468d-9914-426a-9464-c1167de53632] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:54 np0005588919 nova_compute[225855]: 2026-01-20 14:59:54.339 225859 INFO nova.compute.manager [-] [instance: b4c1468d-9914-426a-9464-c1167de53632] Took 0.81 seconds to deallocate network for instance.#033[00m
Jan 20 09:59:54 np0005588919 nova_compute[225855]: 2026-01-20 14:59:54.569 225859 INFO nova.compute.manager [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Took 0.23 seconds to detach 1 volumes for instance.#033[00m
Jan 20 09:59:54 np0005588919 nova_compute[225855]: 2026-01-20 14:59:54.615 225859 DEBUG oslo_concurrency.lockutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:54 np0005588919 nova_compute[225855]: 2026-01-20 14:59:54.616 225859 DEBUG oslo_concurrency.lockutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:54 np0005588919 nova_compute[225855]: 2026-01-20 14:59:54.685 225859 DEBUG oslo_concurrency.processutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 09:59:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 09:59:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:59:55 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2354938243' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:59:55 np0005588919 nova_compute[225855]: 2026-01-20 14:59:55.152 225859 DEBUG oslo_concurrency.processutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 09:59:55 np0005588919 nova_compute[225855]: 2026-01-20 14:59:55.159 225859 DEBUG nova.compute.provider_tree [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 09:59:55 np0005588919 nova_compute[225855]: 2026-01-20 14:59:55.668 225859 DEBUG nova.scheduler.client.report [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 09:59:55 np0005588919 nova_compute[225855]: 2026-01-20 14:59:55.693 225859 DEBUG oslo_concurrency.lockutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:59:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:55.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:55 np0005588919 nova_compute[225855]: 2026-01-20 14:59:55.762 225859 INFO nova.scheduler.client.report [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Deleted allocations for instance b4c1468d-9914-426a-9464-c1167de53632
Jan 20 09:59:55 np0005588919 nova_compute[225855]: 2026-01-20 14:59:55.840 225859 DEBUG oslo_concurrency.lockutils [None req-f281c6c5-7161-4cc0-ba81-18ff8176d767 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "b4c1468d-9914-426a-9464-c1167de53632" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:59:56 np0005588919 nova_compute[225855]: 2026-01-20 14:59:56.070 225859 DEBUG nova.compute.manager [req-faa8b653-d449-4ee6-956f-51f6df16183e req-a0f2553b-215e-4709-9226-9fb94f0a4751 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c1468d-9914-426a-9464-c1167de53632] Received event network-vif-deleted-7c572239-9b2e-493c-8be5-632f27cc634a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 09:59:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:56.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:56 np0005588919 nova_compute[225855]: 2026-01-20 14:59:56.777 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.202 225859 DEBUG oslo_concurrency.lockutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.202 225859 DEBUG oslo_concurrency.lockutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.203 225859 DEBUG oslo_concurrency.lockutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.203 225859 DEBUG oslo_concurrency.lockutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.203 225859 DEBUG oslo_concurrency.lockutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.204 225859 INFO nova.compute.manager [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Terminating instance
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.205 225859 DEBUG nova.compute.manager [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 09:59:57 np0005588919 kernel: tap3067803c-07 (unregistering): left promiscuous mode
Jan 20 09:59:57 np0005588919 NetworkManager[49104]: <info>  [1768921197.2598] device (tap3067803c-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 09:59:57 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:57Z|00576|binding|INFO|Releasing lport 3067803c-07f3-4a15-a5ee-47f9a770efca from this chassis (sb_readonly=0)
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.268 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:57 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:57Z|00577|binding|INFO|Setting lport 3067803c-07f3-4a15-a5ee-47f9a770efca down in Southbound
Jan 20 09:59:57 np0005588919 ovn_controller[130490]: 2026-01-20T14:59:57Z|00578|binding|INFO|Removing iface tap3067803c-07 ovn-installed in OVS
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.271 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.276 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:b7:b1 10.100.0.10'], port_security=['fa:16:3e:cd:b7:b1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1ebdefed-0903-4d72-b78d-912666c5ce61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58d966e1-4d26-414a-920e-0be2d77abb59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '107c1f3b5b7b413d9a389ca1166e331f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '207accdf-2d5c-48e9-bf02-5dfcc7d28063', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a2edf59-0338-43ad-aa77-d6a806c781a6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=3067803c-07f3-4a15-a5ee-47f9a770efca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.278 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 3067803c-07f3-4a15-a5ee-47f9a770efca in datapath 58d966e1-4d26-414a-920e-0be2d77abb59 unbound from our chassis
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.280 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58d966e1-4d26-414a-920e-0be2d77abb59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.281 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac2f8ae-617e-4341-b8d3-8330e01847a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.282 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59 namespace which is not needed anymore
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.286 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:57 np0005588919 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Jan 20 09:59:57 np0005588919 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008b.scope: Consumed 19.019s CPU time.
Jan 20 09:59:57 np0005588919 systemd-machined[194361]: Machine qemu-66-instance-0000008b terminated.
Jan 20 09:59:57 np0005588919 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[280188]: [NOTICE]   (280192) : haproxy version is 2.8.14-c23fe91
Jan 20 09:59:57 np0005588919 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[280188]: [NOTICE]   (280192) : path to executable is /usr/sbin/haproxy
Jan 20 09:59:57 np0005588919 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[280188]: [WARNING]  (280192) : Exiting Master process...
Jan 20 09:59:57 np0005588919 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[280188]: [WARNING]  (280192) : Exiting Master process...
Jan 20 09:59:57 np0005588919 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[280188]: [ALERT]    (280192) : Current worker (280194) exited with code 143 (Terminated)
Jan 20 09:59:57 np0005588919 neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59[280188]: [WARNING]  (280192) : All workers exited. Exiting... (0)
Jan 20 09:59:57 np0005588919 systemd[1]: libpod-f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc.scope: Deactivated successfully.
Jan 20 09:59:57 np0005588919 podman[283113]: 2026-01-20 14:59:57.427985928 +0000 UTC m=+0.050208738 container died f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.447 225859 INFO nova.virt.libvirt.driver [-] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Instance destroyed successfully.
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.448 225859 DEBUG nova.objects.instance [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lazy-loading 'resources' on Instance uuid 1ebdefed-0903-4d72-b78d-912666c5ce61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.465 225859 DEBUG nova.virt.libvirt.vif [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:57:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1983668831',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1983668831',id=139,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBbRGX2xZT3D1ftqdKpZwTwb/ukXbRv/O5UyYYLjii3gk46qsw4SNMi6p0GpNIY5l/f9OSIg9UlRsUFQqLszBoQT2vJic2iOBlI6VLyxyg71obcHOZQEGpjfcTfqUsJeQ==',key_name='tempest-TestInstancesWithCinderVolumes-1812188149',keypairs=<?>,launch_index=0,launched_at=2026-01-20T14:57:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='107c1f3b5b7b413d9a389ca1166e331f',ramdisk_id='',reservation_id='r-c4vqjrp4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestInstancesWithCinderVolumes-1174033615',owner_user_name='tempest-TestInstancesWithCinderVolumes-1174033615-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T14:57:45Z,user_data=None,user_id='ed2c9bd268d1491fa3484d86bcdb9ec6',uuid=1ebdefed-0903-4d72-b78d-912666c5ce61,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.467 225859 DEBUG nova.network.os_vif_util [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converting VIF {"id": "3067803c-07f3-4a15-a5ee-47f9a770efca", "address": "fa:16:3e:cd:b7:b1", "network": {"id": "58d966e1-4d26-414a-920e-0be2d77abb59", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1896990059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "107c1f3b5b7b413d9a389ca1166e331f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3067803c-07", "ovs_interfaceid": "3067803c-07f3-4a15-a5ee-47f9a770efca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.467 225859 DEBUG nova.network.os_vif_util [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cd:b7:b1,bridge_name='br-int',has_traffic_filtering=True,id=3067803c-07f3-4a15-a5ee-47f9a770efca,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3067803c-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 09:59:57 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc-userdata-shm.mount: Deactivated successfully.
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.472 225859 DEBUG os_vif [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:b7:b1,bridge_name='br-int',has_traffic_filtering=True,id=3067803c-07f3-4a15-a5ee-47f9a770efca,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3067803c-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 20 09:59:57 np0005588919 systemd[1]: var-lib-containers-storage-overlay-ad0bfac44a2e71e205ef5911174c9794e1609c1289dfb400ba4190926919b056-merged.mount: Deactivated successfully.
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.475 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.475 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3067803c-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.480 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.482 225859 INFO os_vif [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:b7:b1,bridge_name='br-int',has_traffic_filtering=True,id=3067803c-07f3-4a15-a5ee-47f9a770efca,network=Network(58d966e1-4d26-414a-920e-0be2d77abb59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3067803c-07')
Jan 20 09:59:57 np0005588919 podman[283113]: 2026-01-20 14:59:57.487154897 +0000 UTC m=+0.109377687 container cleanup f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 09:59:57 np0005588919 systemd[1]: libpod-conmon-f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc.scope: Deactivated successfully.
Jan 20 09:59:57 np0005588919 podman[283167]: 2026-01-20 14:59:57.56241252 +0000 UTC m=+0.052285216 container remove f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.570 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a5738345-d644-414a-91a2-b15530ef5cee]: (4, ('Tue Jan 20 02:59:57 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59 (f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc)\nf09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc\nTue Jan 20 02:59:57 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59 (f09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc)\nf09eb4085a157e25ca55ec28f9deac3d8ef6af0026f2f7a138323503b9f81ddc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.572 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e35ad46c-9cb7-41a4-88a8-f847de456a30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.573 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d966e1-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.574 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:57 np0005588919 kernel: tap58d966e1-40: left promiscuous mode
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.595 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.597 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[78583c0a-b613-42f5-ae10-fcf720178eee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.612 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c885642f-1836-4dda-bc7b-3d6dbe1afd67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.614 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eae6b91d-e263-4bc3-bd77-2ea2616f8139]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.635 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[46a8cd80-fc1b-4a9c-8bc3-da38303b05ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610359, 'reachable_time': 17831, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283186, 'error': None, 'target': 'ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.638 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58d966e1-4d26-414a-920e-0be2d77abb59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 20 09:59:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 14:59:57.639 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[dbdfd5d1-456e-42d7-83d4-39a54b000ca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 09:59:57 np0005588919 systemd[1]: run-netns-ovnmeta\x2d58d966e1\x2d4d26\x2d414a\x2d920e\x2d0be2d77abb59.mount: Deactivated successfully.
Jan 20 09:59:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 09:59:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:57.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.717 225859 INFO nova.virt.libvirt.driver [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Deleting instance files /var/lib/nova/instances/1ebdefed-0903-4d72-b78d-912666c5ce61_del#033[00m
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.718 225859 INFO nova.virt.libvirt.driver [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Deletion of /var/lib/nova/instances/1ebdefed-0903-4d72-b78d-912666c5ce61_del complete#033[00m
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.765 225859 INFO nova.compute.manager [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Took 0.56 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.766 225859 DEBUG oslo.service.loopingcall [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.766 225859 DEBUG nova.compute.manager [-] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 09:59:57 np0005588919 nova_compute[225855]: 2026-01-20 14:59:57.766 225859 DEBUG nova.network.neutron [-] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.177 225859 DEBUG nova.compute.manager [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-vif-unplugged-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.177 225859 DEBUG oslo_concurrency.lockutils [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.177 225859 DEBUG oslo_concurrency.lockutils [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.177 225859 DEBUG oslo_concurrency.lockutils [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.178 225859 DEBUG nova.compute.manager [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] No waiting events found dispatching network-vif-unplugged-3067803c-07f3-4a15-a5ee-47f9a770efca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.178 225859 DEBUG nova.compute.manager [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-vif-unplugged-3067803c-07f3-4a15-a5ee-47f9a770efca for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.178 225859 DEBUG nova.compute.manager [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-vif-plugged-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.178 225859 DEBUG oslo_concurrency.lockutils [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.179 225859 DEBUG oslo_concurrency.lockutils [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.179 225859 DEBUG oslo_concurrency.lockutils [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.179 225859 DEBUG nova.compute.manager [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] No waiting events found dispatching network-vif-plugged-3067803c-07f3-4a15-a5ee-47f9a770efca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.179 225859 WARNING nova.compute.manager [req-3061d5b3-2d7e-440f-a12c-b9cb0028654a req-7afca30d-572e-4290-ba3d-72108b581c02 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received unexpected event network-vif-plugged-3067803c-07f3-4a15-a5ee-47f9a770efca for instance with vm_state active and task_state deleting.#033[00m
Jan 20 09:59:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:14:59:58.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 09:59:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 09:59:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.920 225859 DEBUG nova.network.neutron [-] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 09:59:58 np0005588919 nova_compute[225855]: 2026-01-20 14:59:58.944 225859 INFO nova.compute.manager [-] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Took 1.18 seconds to deallocate network for instance.#033[00m
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.165 225859 INFO nova.compute.manager [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Took 0.22 seconds to detach 1 volumes for instance.#033[00m
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.220 225859 DEBUG oslo_concurrency.lockutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.221 225859 DEBUG oslo_concurrency.lockutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.305 225859 DEBUG oslo_concurrency.processutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.460 225859 DEBUG nova.compute.manager [req-3d061aff-00c5-406c-a159-5683fff8ec28 req-0dede328-7eb2-474f-8907-5c1968c939c6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Received event network-vif-deleted-3067803c-07f3-4a15-a5ee-47f9a770efca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 09:59:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 09:59:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 09:59:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:14:59:59.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 09:59:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 09:59:59 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1717918447' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.766 225859 DEBUG oslo_concurrency.processutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.773 225859 DEBUG nova.compute.provider_tree [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.789 225859 DEBUG nova.scheduler.client.report [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.810 225859 DEBUG oslo_concurrency.lockutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.853 225859 INFO nova.scheduler.client.report [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Deleted allocations for instance 1ebdefed-0903-4d72-b78d-912666c5ce61#033[00m
Jan 20 09:59:59 np0005588919 nova_compute[225855]: 2026-01-20 14:59:59.933 225859 DEBUG oslo_concurrency.lockutils [None req-ef91f40a-4b99-482f-a429-c9ca6ccd5151 ed2c9bd268d1491fa3484d86bcdb9ec6 107c1f3b5b7b413d9a389ca1166e331f - - default default] Lock "1ebdefed-0903-4d72-b78d-912666c5ce61" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:00.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 10:00:00 np0005588919 nova_compute[225855]: 2026-01-20 15:00:00.939 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Acquiring lock "f1757bed-1718-45e4-a731-11f1a3b4f068" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:00 np0005588919 nova_compute[225855]: 2026-01-20 15:00:00.940 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:00 np0005588919 nova_compute[225855]: 2026-01-20 15:00:00.956 225859 DEBUG nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.058 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.059 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.066 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.067 225859 INFO nova.compute.claims [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.209 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3486972496' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.677 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.684 225859 DEBUG nova.compute.provider_tree [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:00:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:01.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.709 225859 DEBUG nova.scheduler.client.report [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.735 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.736 225859 DEBUG nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.779 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.785 225859 DEBUG nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.786 225859 DEBUG nova.network.neutron [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.820 225859 INFO nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.842 225859 DEBUG nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.930 225859 DEBUG nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.932 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.932 225859 INFO nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Creating image(s)#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.959 225859 DEBUG nova.storage.rbd_utils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] rbd image f1757bed-1718-45e4-a731-11f1a3b4f068_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:01 np0005588919 nova_compute[225855]: 2026-01-20 15:00:01.990 225859 DEBUG nova.storage.rbd_utils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] rbd image f1757bed-1718-45e4-a731-11f1a3b4f068_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.021 225859 DEBUG nova.storage.rbd_utils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] rbd image f1757bed-1718-45e4-a731-11f1a3b4f068_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.026 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.053 225859 DEBUG nova.policy [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a3fc2ba2a08423eb2e0bd7cf0fd5cf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '994b02a8c0094d2daa7b775b1f86f394', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.096 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.097 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.097 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.098 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.125 225859 DEBUG nova.storage.rbd_utils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] rbd image f1757bed-1718-45e4-a731-11f1a3b4f068_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.130 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f1757bed-1718-45e4-a731-11f1a3b4f068_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:02.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.379 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 f1757bed-1718-45e4-a731-11f1a3b4f068_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.249s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.457 225859 DEBUG nova.storage.rbd_utils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] resizing rbd image f1757bed-1718-45e4-a731-11f1a3b4f068_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.488 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.564 225859 DEBUG nova.objects.instance [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lazy-loading 'migration_context' on Instance uuid f1757bed-1718-45e4-a731-11f1a3b4f068 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.602 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.602 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Ensure instance console log exists: /var/lib/nova/instances/f1757bed-1718-45e4-a731-11f1a3b4f068/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.603 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.603 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:02 np0005588919 nova_compute[225855]: 2026-01-20 15:00:02.603 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:03 np0005588919 podman[283401]: 2026-01-20 15:00:03.006793484 +0000 UTC m=+0.054297443 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:00:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:00:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:00:03 np0005588919 nova_compute[225855]: 2026-01-20 15:00:03.527 225859 DEBUG nova.network.neutron [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Successfully created port: f62f622f-1d0a-4a68-9540-d1a7f48a66d0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:00:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:03.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:04.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:00:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3429916275' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:00:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:00:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3429916275' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #103. Immutable memtables: 0.
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.017052) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 63] Flushing memtable with next log file: 103
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205017130, "job": 63, "event": "flush_started", "num_memtables": 1, "num_entries": 1918, "num_deletes": 259, "total_data_size": 4384204, "memory_usage": 4429136, "flush_reason": "Manual Compaction"}
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 63] Level-0 flush table #104: started
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205039284, "cf_name": "default", "job": 63, "event": "table_file_creation", "file_number": 104, "file_size": 2830046, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 52522, "largest_seqno": 54435, "table_properties": {"data_size": 2822148, "index_size": 4648, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17420, "raw_average_key_size": 20, "raw_value_size": 2805888, "raw_average_value_size": 3277, "num_data_blocks": 203, "num_entries": 856, "num_filter_entries": 856, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921060, "oldest_key_time": 1768921060, "file_creation_time": 1768921205, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 104, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 63] Flush lasted 22271 microseconds, and 6384 cpu microseconds.
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.039345) [db/flush_job.cc:967] [default] [JOB 63] Level-0 flush table #104: 2830046 bytes OK
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.039363) [db/memtable_list.cc:519] [default] Level-0 commit table #104 started
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.042709) [db/memtable_list.cc:722] [default] Level-0 commit table #104: memtable #1 done
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.042725) EVENT_LOG_v1 {"time_micros": 1768921205042720, "job": 63, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.042744) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 63] Try to delete WAL files size 4375403, prev total WAL file size 4375403, number of live WAL files 2.
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000100.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.043736) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031373634' seq:72057594037927935, type:22 .. '6C6F676D0032303138' seq:0, type:0; will stop at (end)
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 64] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 63 Base level 0, inputs: [104(2763KB)], [102(10108KB)]
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205043815, "job": 64, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [104], "files_L6": [102], "score": -1, "input_data_size": 13181361, "oldest_snapshot_seqno": -1}
Jan 20 10:00:05 np0005588919 nova_compute[225855]: 2026-01-20 15:00:05.105 225859 DEBUG nova.network.neutron [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Successfully updated port: f62f622f-1d0a-4a68-9540-d1a7f48a66d0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:00:05 np0005588919 nova_compute[225855]: 2026-01-20 15:00:05.121 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Acquiring lock "refresh_cache-f1757bed-1718-45e4-a731-11f1a3b4f068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:00:05 np0005588919 nova_compute[225855]: 2026-01-20 15:00:05.121 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Acquired lock "refresh_cache-f1757bed-1718-45e4-a731-11f1a3b4f068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:00:05 np0005588919 nova_compute[225855]: 2026-01-20 15:00:05.122 225859 DEBUG nova.network.neutron [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 64] Generated table #105: 8095 keys, 13023021 bytes, temperature: kUnknown
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205164068, "cf_name": "default", "job": 64, "event": "table_file_creation", "file_number": 105, "file_size": 13023021, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12967333, "index_size": 34328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20293, "raw_key_size": 209010, "raw_average_key_size": 25, "raw_value_size": 12821522, "raw_average_value_size": 1583, "num_data_blocks": 1357, "num_entries": 8095, "num_filter_entries": 8095, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921205, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 105, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.164314) [db/compaction/compaction_job.cc:1663] [default] [JOB 64] Compacted 1@0 + 1@6 files to L6 => 13023021 bytes
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.166394) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.5 rd, 108.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 9.9 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(9.3) write-amplify(4.6) OK, records in: 8635, records dropped: 540 output_compression: NoCompression
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.166440) EVENT_LOG_v1 {"time_micros": 1768921205166424, "job": 64, "event": "compaction_finished", "compaction_time_micros": 120328, "compaction_time_cpu_micros": 43342, "output_level": 6, "num_output_files": 1, "total_output_size": 13023021, "num_input_records": 8635, "num_output_records": 8095, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000104.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205167129, "job": 64, "event": "table_file_deletion", "file_number": 104}
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000102.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921205169009, "job": 64, "event": "table_file_deletion", "file_number": 102}
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.043590) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.169098) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.169104) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.169106) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.169107) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:05.169109) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:05 np0005588919 nova_compute[225855]: 2026-01-20 15:00:05.235 225859 DEBUG nova.compute.manager [req-eb198a38-ff06-4e43-bacb-710a317b1b28 req-77adfd5a-ef37-4760-ab9d-e4706e74e755 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Received event network-changed-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:05 np0005588919 nova_compute[225855]: 2026-01-20 15:00:05.235 225859 DEBUG nova.compute.manager [req-eb198a38-ff06-4e43-bacb-710a317b1b28 req-77adfd5a-ef37-4760-ab9d-e4706e74e755 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Refreshing instance network info cache due to event network-changed-f62f622f-1d0a-4a68-9540-d1a7f48a66d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:00:05 np0005588919 nova_compute[225855]: 2026-01-20 15:00:05.236 225859 DEBUG oslo_concurrency.lockutils [req-eb198a38-ff06-4e43-bacb-710a317b1b28 req-77adfd5a-ef37-4760-ab9d-e4706e74e755 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f1757bed-1718-45e4-a731-11f1a3b4f068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:00:05 np0005588919 nova_compute[225855]: 2026-01-20 15:00:05.469 225859 DEBUG nova.network.neutron [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:00:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:05.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:05 np0005588919 nova_compute[225855]: 2026-01-20 15:00:05.995 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:06.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.313 225859 DEBUG nova.network.neutron [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Updating instance_info_cache with network_info: [{"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": "fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.320 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.340 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Releasing lock "refresh_cache-f1757bed-1718-45e4-a731-11f1a3b4f068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.341 225859 DEBUG nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Instance network_info: |[{"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": "fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.341 225859 DEBUG oslo_concurrency.lockutils [req-eb198a38-ff06-4e43-bacb-710a317b1b28 req-77adfd5a-ef37-4760-ab9d-e4706e74e755 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f1757bed-1718-45e4-a731-11f1a3b4f068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.342 225859 DEBUG nova.network.neutron [req-eb198a38-ff06-4e43-bacb-710a317b1b28 req-77adfd5a-ef37-4760-ab9d-e4706e74e755 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Refreshing network info cache for port f62f622f-1d0a-4a68-9540-d1a7f48a66d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.345 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Start _get_guest_xml network_info=[{"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": "fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.349 225859 WARNING nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.356 225859 DEBUG nova.virt.libvirt.host [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.356 225859 DEBUG nova.virt.libvirt.host [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.361 225859 DEBUG nova.virt.libvirt.host [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.362 225859 DEBUG nova.virt.libvirt.host [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.363 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.363 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.364 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.364 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.364 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.365 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.365 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.365 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.365 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.366 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.366 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.366 225859 DEBUG nova.virt.hardware [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.369 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.781 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:00:06 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1652995315' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.809 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.834 225859 DEBUG nova.storage.rbd_utils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] rbd image f1757bed-1718-45e4-a731-11f1a3b4f068_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:06 np0005588919 nova_compute[225855]: 2026-01-20 15:00:06.838 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:00:06 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2825997578' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:00:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:00:06 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2825997578' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:00:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:00:07 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/407955871' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.287 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.289 225859 DEBUG nova.virt.libvirt.vif [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:59:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-2075964816',display_name='tempest-TestServerBasicOps-server-2075964816',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-2075964816',id=146,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCiCeGBNIIjQtIa5Udw9T4nKiS3sQpmIFto+Zj/ppiwHl3KoPC1ZwXSQfteIxtI2AuErtkRwyRat7WVpBCL4SK6jCl43k4+LHYwocVMfWmtSf2fkMge6nUPK98YTBKBV9g==',key_name='tempest-TestServerBasicOps-469960215',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='994b02a8c0094d2daa7b775b1f86f394',ramdisk_id='',reservation_id='r-c012j9n0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-2006244970',owner_user_name='tempest-TestServerBasicOps-2006244970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:00:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a3fc2ba2a08423eb2e0bd7cf0fd5cf7',uuid=f1757bed-1718-45e4-a731-11f1a3b4f068,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": "fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.290 225859 DEBUG nova.network.os_vif_util [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Converting VIF {"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": "fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.291 225859 DEBUG nova.network.os_vif_util [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:97:d3,bridge_name='br-int',has_traffic_filtering=True,id=f62f622f-1d0a-4a68-9540-d1a7f48a66d0,network=Network(347be9eb-2d7b-4b2d-b1e8-8ed5a063f269),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62f622f-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.292 225859 DEBUG nova.objects.instance [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lazy-loading 'pci_devices' on Instance uuid f1757bed-1718-45e4-a731-11f1a3b4f068 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.393 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <uuid>f1757bed-1718-45e4-a731-11f1a3b4f068</uuid>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <name>instance-00000092</name>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestServerBasicOps-server-2075964816</nova:name>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:00:06</nova:creationTime>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <nova:user uuid="2a3fc2ba2a08423eb2e0bd7cf0fd5cf7">tempest-TestServerBasicOps-2006244970-project-member</nova:user>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <nova:project uuid="994b02a8c0094d2daa7b775b1f86f394">tempest-TestServerBasicOps-2006244970</nova:project>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <nova:port uuid="f62f622f-1d0a-4a68-9540-d1a7f48a66d0">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <entry name="serial">f1757bed-1718-45e4-a731-11f1a3b4f068</entry>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <entry name="uuid">f1757bed-1718-45e4-a731-11f1a3b4f068</entry>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/f1757bed-1718-45e4-a731-11f1a3b4f068_disk">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/f1757bed-1718-45e4-a731-11f1a3b4f068_disk.config">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:cb:97:d3"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <target dev="tapf62f622f-1d"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/f1757bed-1718-45e4-a731-11f1a3b4f068/console.log" append="off"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:00:07 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:00:07 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:00:07 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:00:07 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.395 225859 DEBUG nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Preparing to wait for external event network-vif-plugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.395 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Acquiring lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.396 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.396 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.397 225859 DEBUG nova.virt.libvirt.vif [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:59:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-2075964816',display_name='tempest-TestServerBasicOps-server-2075964816',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-2075964816',id=146,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCiCeGBNIIjQtIa5Udw9T4nKiS3sQpmIFto+Zj/ppiwHl3KoPC1ZwXSQfteIxtI2AuErtkRwyRat7WVpBCL4SK6jCl43k4+LHYwocVMfWmtSf2fkMge6nUPK98YTBKBV9g==',key_name='tempest-TestServerBasicOps-469960215',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='994b02a8c0094d2daa7b775b1f86f394',ramdisk_id='',reservation_id='r-c012j9n0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-2006244970',owner_user_name='tempest-TestServerBasicOps-2006244970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:00:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a3fc2ba2a08423eb2e0bd7cf0fd5cf7',uuid=f1757bed-1718-45e4-a731-11f1a3b4f068,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": "fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.397 225859 DEBUG nova.network.os_vif_util [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Converting VIF {"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": "fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.397 225859 DEBUG nova.network.os_vif_util [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:97:d3,bridge_name='br-int',has_traffic_filtering=True,id=f62f622f-1d0a-4a68-9540-d1a7f48a66d0,network=Network(347be9eb-2d7b-4b2d-b1e8-8ed5a063f269),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62f622f-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.398 225859 DEBUG os_vif [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:97:d3,bridge_name='br-int',has_traffic_filtering=True,id=f62f622f-1d0a-4a68-9540-d1a7f48a66d0,network=Network(347be9eb-2d7b-4b2d-b1e8-8ed5a063f269),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62f622f-1d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.398 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.399 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.399 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.402 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.402 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf62f622f-1d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.403 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf62f622f-1d, col_values=(('external_ids', {'iface-id': 'f62f622f-1d0a-4a68-9540-d1a7f48a66d0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:97:d3', 'vm-uuid': 'f1757bed-1718-45e4-a731-11f1a3b4f068'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.404 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:07 np0005588919 NetworkManager[49104]: <info>  [1768921207.4052] manager: (tapf62f622f-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.406 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.410 225859 INFO os_vif [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:97:d3,bridge_name='br-int',has_traffic_filtering=True,id=f62f622f-1d0a-4a68-9540-d1a7f48a66d0,network=Network(347be9eb-2d7b-4b2d-b1e8-8ed5a063f269),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62f622f-1d')#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.464 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.465 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.465 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] No VIF found with MAC fa:16:3e:cb:97:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.465 225859 INFO nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Using config drive#033[00m
Jan 20 10:00:07 np0005588919 nova_compute[225855]: 2026-01-20 15:00:07.492 225859 DEBUG nova.storage.rbd_utils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] rbd image f1757bed-1718-45e4-a731-11f1a3b4f068_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:07.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:08 np0005588919 nova_compute[225855]: 2026-01-20 15:00:08.177 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921193.175826, b4c1468d-9914-426a-9464-c1167de53632 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:08 np0005588919 nova_compute[225855]: 2026-01-20 15:00:08.177 225859 INFO nova.compute.manager [-] [instance: b4c1468d-9914-426a-9464-c1167de53632] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:00:08 np0005588919 nova_compute[225855]: 2026-01-20 15:00:08.205 225859 DEBUG nova.compute.manager [None req-bf9d1724-5477-49ba-badc-82a806421648 - - - - - -] [instance: b4c1468d-9914-426a-9464-c1167de53632] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:08.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:08 np0005588919 nova_compute[225855]: 2026-01-20 15:00:08.964 225859 INFO nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Creating config drive at /var/lib/nova/instances/f1757bed-1718-45e4-a731-11f1a3b4f068/disk.config#033[00m
Jan 20 10:00:08 np0005588919 nova_compute[225855]: 2026-01-20 15:00:08.971 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f1757bed-1718-45e4-a731-11f1a3b4f068/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmq204j3q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.105 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f1757bed-1718-45e4-a731-11f1a3b4f068/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmq204j3q" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.134 225859 DEBUG nova.storage.rbd_utils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] rbd image f1757bed-1718-45e4-a731-11f1a3b4f068_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.139 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f1757bed-1718-45e4-a731-11f1a3b4f068/disk.config f1757bed-1718-45e4-a731-11f1a3b4f068_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.167 225859 DEBUG nova.network.neutron [req-eb198a38-ff06-4e43-bacb-710a317b1b28 req-77adfd5a-ef37-4760-ab9d-e4706e74e755 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Updated VIF entry in instance network info cache for port f62f622f-1d0a-4a68-9540-d1a7f48a66d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.168 225859 DEBUG nova.network.neutron [req-eb198a38-ff06-4e43-bacb-710a317b1b28 req-77adfd5a-ef37-4760-ab9d-e4706e74e755 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Updating instance_info_cache with network_info: [{"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": "fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.195 225859 DEBUG oslo_concurrency.lockutils [req-eb198a38-ff06-4e43-bacb-710a317b1b28 req-77adfd5a-ef37-4760-ab9d-e4706e74e755 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f1757bed-1718-45e4-a731-11f1a3b4f068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.305 225859 DEBUG oslo_concurrency.processutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f1757bed-1718-45e4-a731-11f1a3b4f068/disk.config f1757bed-1718-45e4-a731-11f1a3b4f068_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.167s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.306 225859 INFO nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Deleting local config drive /var/lib/nova/instances/f1757bed-1718-45e4-a731-11f1a3b4f068/disk.config because it was imported into RBD.#033[00m
Jan 20 10:00:09 np0005588919 kernel: tapf62f622f-1d: entered promiscuous mode
Jan 20 10:00:09 np0005588919 NetworkManager[49104]: <info>  [1768921209.3555] manager: (tapf62f622f-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/250)
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.356 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:09 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:09Z|00579|binding|INFO|Claiming lport f62f622f-1d0a-4a68-9540-d1a7f48a66d0 for this chassis.
Jan 20 10:00:09 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:09Z|00580|binding|INFO|f62f622f-1d0a-4a68-9540-d1a7f48a66d0: Claiming fa:16:3e:cb:97:d3 10.100.0.13
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.361 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.368 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:97:d3 10.100.0.13'], port_security=['fa:16:3e:cb:97:d3 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f1757bed-1718-45e4-a731-11f1a3b4f068', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '994b02a8c0094d2daa7b775b1f86f394', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6cdfb95a-0d3a-472b-8deb-06068f9edf9a c3b2b511-7001-4a00-abcb-ee7970518e80', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34e153e6-2244-443e-a2b3-4d18f8409d44, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f62f622f-1d0a-4a68-9540-d1a7f48a66d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.369 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f62f622f-1d0a-4a68-9540-d1a7f48a66d0 in datapath 347be9eb-2d7b-4b2d-b1e8-8ed5a063f269 bound to our chassis#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.370 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 347be9eb-2d7b-4b2d-b1e8-8ed5a063f269#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.380 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[22690c3c-8831-4d5b-b597-17e5ce6250e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.381 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap347be9eb-21 in ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.383 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap347be9eb-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.383 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c506447a-6b64-47c9-ade0-1f272883e060]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.383 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1eb5127c-52c7-466e-ab6b-a4260ab750b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.394 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[ff75f058-a56b-4cf2-a50a-5893033d7a8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 systemd-udevd[283611]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:00:09 np0005588919 systemd-machined[194361]: New machine qemu-68-instance-00000092.
Jan 20 10:00:09 np0005588919 NetworkManager[49104]: <info>  [1768921209.4129] device (tapf62f622f-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:00:09 np0005588919 NetworkManager[49104]: <info>  [1768921209.4137] device (tapf62f622f-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:00:09 np0005588919 systemd[1]: Started Virtual Machine qemu-68-instance-00000092.
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.422 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e048b386-0650-489c-bf84-31e4a06eee2c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.427 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:09 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:09Z|00581|binding|INFO|Setting lport f62f622f-1d0a-4a68-9540-d1a7f48a66d0 ovn-installed in OVS
Jan 20 10:00:09 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:09Z|00582|binding|INFO|Setting lport f62f622f-1d0a-4a68-9540-d1a7f48a66d0 up in Southbound
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.436 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.450 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[b014a32a-39c2-41a2-b0b9-894c14d59861]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.455 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2565c99d-bebe-4656-aa25-021ecb90076b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 NetworkManager[49104]: <info>  [1768921209.4571] manager: (tap347be9eb-20): new Veth device (/org/freedesktop/NetworkManager/Devices/251)
Jan 20 10:00:09 np0005588919 systemd-udevd[283615]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.489 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ede41b79-2134-4c96-b69c-4b0aa498e963]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.492 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[01e330ff-29d8-475e-ab81-82e03392e5f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 NetworkManager[49104]: <info>  [1768921209.5152] device (tap347be9eb-20): carrier: link connected
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.519 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[bd6686cf-9bbd-4e25-8464-211df3fa66dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.537 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[96d6ccc8-871c-4f6e-b698-cf013c2c6574]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap347be9eb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8a:7f:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624844, 'reachable_time': 18143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283643, 'error': None, 'target': 'ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.553 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2c731e-e364-4d95-93bc-b9d5edffde38]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8a:7f17'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 624844, 'tstamp': 624844}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283644, 'error': None, 'target': 'ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.571 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[55105227-023b-498b-a0db-5e01aea412e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap347be9eb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8a:7f:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624844, 'reachable_time': 18143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283645, 'error': None, 'target': 'ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.605 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0c21eb-b3c2-44a0-9279-22ae18ee6be3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.669 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[18665a3f-6c01-42ba-97a0-8d6e2fb8abbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.671 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap347be9eb-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.671 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.671 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap347be9eb-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.673 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:09 np0005588919 NetworkManager[49104]: <info>  [1768921209.6745] manager: (tap347be9eb-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Jan 20 10:00:09 np0005588919 kernel: tap347be9eb-20: entered promiscuous mode
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.677 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap347be9eb-20, col_values=(('external_ids', {'iface-id': '22c381cf-5510-43a8-bc2e-16e5bc5ac409'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.678 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:09 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:09Z|00583|binding|INFO|Releasing lport 22c381cf-5510-43a8-bc2e-16e5bc5ac409 from this chassis (sb_readonly=0)
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.695 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.696 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/347be9eb-2d7b-4b2d-b1e8-8ed5a063f269.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/347be9eb-2d7b-4b2d-b1e8-8ed5a063f269.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.697 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[70204339-f03a-4adc-8c0b-5b6245d27d14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.698 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/347be9eb-2d7b-4b2d-b1e8-8ed5a063f269.pid.haproxy
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 347be9eb-2d7b-4b2d-b1e8-8ed5a063f269
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:00:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:09.698 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269', 'env', 'PROCESS_TAG=haproxy-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/347be9eb-2d7b-4b2d-b1e8-8ed5a063f269.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:00:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:09.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.740 225859 DEBUG nova.compute.manager [req-cdd13de9-6c41-45a6-9770-5c2d74c12cc4 req-76ada829-2b14-4f79-834c-ed1e9b2a7d54 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Received event network-vif-plugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.741 225859 DEBUG oslo_concurrency.lockutils [req-cdd13de9-6c41-45a6-9770-5c2d74c12cc4 req-76ada829-2b14-4f79-834c-ed1e9b2a7d54 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.741 225859 DEBUG oslo_concurrency.lockutils [req-cdd13de9-6c41-45a6-9770-5c2d74c12cc4 req-76ada829-2b14-4f79-834c-ed1e9b2a7d54 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.741 225859 DEBUG oslo_concurrency.lockutils [req-cdd13de9-6c41-45a6-9770-5c2d74c12cc4 req-76ada829-2b14-4f79-834c-ed1e9b2a7d54 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.741 225859 DEBUG nova.compute.manager [req-cdd13de9-6c41-45a6-9770-5c2d74c12cc4 req-76ada829-2b14-4f79-834c-ed1e9b2a7d54 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Processing event network-vif-plugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:00:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e322 e322: 3 total, 3 up, 3 in
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.871 225859 DEBUG nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.871 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921209.8703406, f1757bed-1718-45e4-a731-11f1a3b4f068 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.872 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] VM Started (Lifecycle Event)#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.883 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.886 225859 INFO nova.virt.libvirt.driver [-] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Instance spawned successfully.#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.887 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.890 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.894 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.902 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.903 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.903 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.903 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.904 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.904 225859 DEBUG nova.virt.libvirt.driver [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.909 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.910 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921209.8776488, f1757bed-1718-45e4-a731-11f1a3b4f068 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.910 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.936 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.939 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921209.882652, f1757bed-1718-45e4-a731-11f1a3b4f068 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.939 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.958 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.962 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.967 225859 INFO nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Took 8.04 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:00:09 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.968 225859 DEBUG nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:10 np0005588919 nova_compute[225855]: 2026-01-20 15:00:09.999 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:00:10 np0005588919 nova_compute[225855]: 2026-01-20 15:00:10.043 225859 INFO nova.compute.manager [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Took 9.01 seconds to build instance.#033[00m
Jan 20 10:00:10 np0005588919 nova_compute[225855]: 2026-01-20 15:00:10.065 225859 DEBUG oslo_concurrency.lockutils [None req-bcfce5a4-4ed1-4cf4-a1c3-2572368c4e32 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:10 np0005588919 podman[283720]: 2026-01-20 15:00:10.09339982 +0000 UTC m=+0.059687095 container create 62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 20 10:00:10 np0005588919 systemd[1]: Started libpod-conmon-62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c.scope.
Jan 20 10:00:10 np0005588919 podman[283720]: 2026-01-20 15:00:10.058598478 +0000 UTC m=+0.024885773 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:00:10 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:00:10 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4b2c64753b02dd1a3ea792cfd3a0a537701996b7974d1a901afebdbb9cc4a05/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:00:10 np0005588919 podman[283720]: 2026-01-20 15:00:10.18061396 +0000 UTC m=+0.146901255 container init 62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 10:00:10 np0005588919 podman[283720]: 2026-01-20 15:00:10.186842866 +0000 UTC m=+0.153130141 container start 62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:00:10 np0005588919 neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269[283735]: [NOTICE]   (283739) : New worker (283741) forked
Jan 20 10:00:10 np0005588919 neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269[283735]: [NOTICE]   (283739) : Loading success.
Jan 20 10:00:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:10.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:11.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:11 np0005588919 nova_compute[225855]: 2026-01-20 15:00:11.785 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:12.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:12 np0005588919 nova_compute[225855]: 2026-01-20 15:00:12.404 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:12 np0005588919 nova_compute[225855]: 2026-01-20 15:00:12.445 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921197.443742, 1ebdefed-0903-4d72-b78d-912666c5ce61 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:12 np0005588919 nova_compute[225855]: 2026-01-20 15:00:12.445 225859 INFO nova.compute.manager [-] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:00:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:12 np0005588919 nova_compute[225855]: 2026-01-20 15:00:12.743 225859 DEBUG nova.compute.manager [req-0baa5c00-22bb-4a05-bb49-390b91d239c0 req-15eb2abd-93aa-4d8c-8ac4-1b40c998950b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Received event network-vif-plugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:12 np0005588919 nova_compute[225855]: 2026-01-20 15:00:12.744 225859 DEBUG oslo_concurrency.lockutils [req-0baa5c00-22bb-4a05-bb49-390b91d239c0 req-15eb2abd-93aa-4d8c-8ac4-1b40c998950b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:12 np0005588919 nova_compute[225855]: 2026-01-20 15:00:12.744 225859 DEBUG oslo_concurrency.lockutils [req-0baa5c00-22bb-4a05-bb49-390b91d239c0 req-15eb2abd-93aa-4d8c-8ac4-1b40c998950b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:12 np0005588919 nova_compute[225855]: 2026-01-20 15:00:12.745 225859 DEBUG oslo_concurrency.lockutils [req-0baa5c00-22bb-4a05-bb49-390b91d239c0 req-15eb2abd-93aa-4d8c-8ac4-1b40c998950b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:12 np0005588919 nova_compute[225855]: 2026-01-20 15:00:12.745 225859 DEBUG nova.compute.manager [req-0baa5c00-22bb-4a05-bb49-390b91d239c0 req-15eb2abd-93aa-4d8c-8ac4-1b40c998950b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] No waiting events found dispatching network-vif-plugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:00:12 np0005588919 nova_compute[225855]: 2026-01-20 15:00:12.745 225859 WARNING nova.compute.manager [req-0baa5c00-22bb-4a05-bb49-390b91d239c0 req-15eb2abd-93aa-4d8c-8ac4-1b40c998950b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Received unexpected event network-vif-plugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:00:12 np0005588919 nova_compute[225855]: 2026-01-20 15:00:12.799 225859 DEBUG nova.compute.manager [None req-5e119f5d-7f1c-41d0-8f91-7df112143d2c - - - - - -] [instance: 1ebdefed-0903-4d72-b78d-912666c5ce61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:13.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:14.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:14 np0005588919 nova_compute[225855]: 2026-01-20 15:00:14.522 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:14 np0005588919 NetworkManager[49104]: <info>  [1768921214.5260] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Jan 20 10:00:14 np0005588919 NetworkManager[49104]: <info>  [1768921214.5274] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Jan 20 10:00:14 np0005588919 nova_compute[225855]: 2026-01-20 15:00:14.715 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:14 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:14Z|00584|binding|INFO|Releasing lport 22c381cf-5510-43a8-bc2e-16e5bc5ac409 from this chassis (sb_readonly=0)
Jan 20 10:00:14 np0005588919 nova_compute[225855]: 2026-01-20 15:00:14.736 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:15 np0005588919 nova_compute[225855]: 2026-01-20 15:00:15.041 225859 DEBUG nova.compute.manager [req-1895007c-e922-4783-ba2f-19daac2a4a1c req-05ddc67f-95bd-499b-bca5-b9e77a62ade5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Received event network-changed-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:15 np0005588919 nova_compute[225855]: 2026-01-20 15:00:15.041 225859 DEBUG nova.compute.manager [req-1895007c-e922-4783-ba2f-19daac2a4a1c req-05ddc67f-95bd-499b-bca5-b9e77a62ade5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Refreshing instance network info cache due to event network-changed-f62f622f-1d0a-4a68-9540-d1a7f48a66d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:00:15 np0005588919 nova_compute[225855]: 2026-01-20 15:00:15.041 225859 DEBUG oslo_concurrency.lockutils [req-1895007c-e922-4783-ba2f-19daac2a4a1c req-05ddc67f-95bd-499b-bca5-b9e77a62ade5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f1757bed-1718-45e4-a731-11f1a3b4f068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:00:15 np0005588919 nova_compute[225855]: 2026-01-20 15:00:15.042 225859 DEBUG oslo_concurrency.lockutils [req-1895007c-e922-4783-ba2f-19daac2a4a1c req-05ddc67f-95bd-499b-bca5-b9e77a62ade5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f1757bed-1718-45e4-a731-11f1a3b4f068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:00:15 np0005588919 nova_compute[225855]: 2026-01-20 15:00:15.042 225859 DEBUG nova.network.neutron [req-1895007c-e922-4783-ba2f-19daac2a4a1c req-05ddc67f-95bd-499b-bca5-b9e77a62ade5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Refreshing network info cache for port f62f622f-1d0a-4a68-9540-d1a7f48a66d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:00:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e323 e323: 3 total, 3 up, 3 in
Jan 20 10:00:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:15.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:16.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:16 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:16Z|00585|binding|INFO|Releasing lport 22c381cf-5510-43a8-bc2e-16e5bc5ac409 from this chassis (sb_readonly=0)
Jan 20 10:00:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:16.421 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:16.421 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:16.422 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:16 np0005588919 nova_compute[225855]: 2026-01-20 15:00:16.452 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:16 np0005588919 nova_compute[225855]: 2026-01-20 15:00:16.606 225859 DEBUG nova.network.neutron [req-1895007c-e922-4783-ba2f-19daac2a4a1c req-05ddc67f-95bd-499b-bca5-b9e77a62ade5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Updated VIF entry in instance network info cache for port f62f622f-1d0a-4a68-9540-d1a7f48a66d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:00:16 np0005588919 nova_compute[225855]: 2026-01-20 15:00:16.607 225859 DEBUG nova.network.neutron [req-1895007c-e922-4783-ba2f-19daac2a4a1c req-05ddc67f-95bd-499b-bca5-b9e77a62ade5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Updating instance_info_cache with network_info: [{"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": "fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:00:16 np0005588919 nova_compute[225855]: 2026-01-20 15:00:16.629 225859 DEBUG oslo_concurrency.lockutils [req-1895007c-e922-4783-ba2f-19daac2a4a1c req-05ddc67f-95bd-499b-bca5-b9e77a62ade5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f1757bed-1718-45e4-a731-11f1a3b4f068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:00:16 np0005588919 nova_compute[225855]: 2026-01-20 15:00:16.787 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:17 np0005588919 nova_compute[225855]: 2026-01-20 15:00:17.405 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:17.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:17 np0005588919 nova_compute[225855]: 2026-01-20 15:00:17.980 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:18.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:19.084 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:00:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:19.085 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:00:19 np0005588919 nova_compute[225855]: 2026-01-20 15:00:19.087 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:19.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:20.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:21.087 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:21.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:21 np0005588919 nova_compute[225855]: 2026-01-20 15:00:21.788 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:22 np0005588919 podman[283808]: 2026-01-20 15:00:22.070167346 +0000 UTC m=+0.095773583 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 10:00:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:22.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:22 np0005588919 nova_compute[225855]: 2026-01-20 15:00:22.407 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:22Z|00586|binding|INFO|Releasing lport 22c381cf-5510-43a8-bc2e-16e5bc5ac409 from this chassis (sb_readonly=0)
Jan 20 10:00:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:22 np0005588919 nova_compute[225855]: 2026-01-20 15:00:22.670 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:23 np0005588919 nova_compute[225855]: 2026-01-20 15:00:23.399 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:23.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:23 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:23Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cb:97:d3 10.100.0.13
Jan 20 10:00:23 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:23Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cb:97:d3 10.100.0.13
Jan 20 10:00:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:24.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:25.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:26.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:26 np0005588919 nova_compute[225855]: 2026-01-20 15:00:26.790 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:27 np0005588919 nova_compute[225855]: 2026-01-20 15:00:27.409 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:27.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:28.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:29.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:30.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:31.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:31 np0005588919 nova_compute[225855]: 2026-01-20 15:00:31.792 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:32.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:32 np0005588919 nova_compute[225855]: 2026-01-20 15:00:32.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:32.944 140461 DEBUG eventlet.wsgi.server [-] (140461) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 20 10:00:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:32.946 140461 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0#015
Jan 20 10:00:32 np0005588919 ovn_metadata_agent[140349]: Accept: */*#015
Jan 20 10:00:32 np0005588919 ovn_metadata_agent[140349]: Connection: close#015
Jan 20 10:00:32 np0005588919 ovn_metadata_agent[140349]: Content-Type: text/plain#015
Jan 20 10:00:32 np0005588919 ovn_metadata_agent[140349]: Host: 169.254.169.254#015
Jan 20 10:00:32 np0005588919 ovn_metadata_agent[140349]: User-Agent: curl/7.84.0#015
Jan 20 10:00:32 np0005588919 ovn_metadata_agent[140349]: X-Forwarded-For: 10.100.0.13#015
Jan 20 10:00:32 np0005588919 ovn_metadata_agent[140349]: X-Ovn-Network-Id: 347be9eb-2d7b-4b2d-b1e8-8ed5a063f269 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:33.431 140461 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:33.432 140461 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 0.4856715#033[00m
Jan 20 10:00:33 np0005588919 haproxy-metadata-proxy-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269[283741]: 10.100.0.13:48962 [20/Jan/2026:15:00:32.943] listener listener/metadata 0/0/0/488/488 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:33.562 140461 DEBUG eventlet.wsgi.server [-] (140461) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:33.563 140461 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0#015
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: Accept: */*#015
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: Connection: close#015
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: Content-Length: 100#015
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: Content-Type: application/x-www-form-urlencoded#015
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: Host: 169.254.169.254#015
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: User-Agent: curl/7.84.0#015
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: X-Forwarded-For: 10.100.0.13#015
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: X-Ovn-Network-Id: 347be9eb-2d7b-4b2d-b1e8-8ed5a063f269#015
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: #015
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:33.734 140461 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 20 10:00:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:33.735 140461 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.1713569#033[00m
Jan 20 10:00:33 np0005588919 haproxy-metadata-proxy-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269[283741]: 10.100.0.13:60928 [20/Jan/2026:15:00:33.562] listener listener/metadata 0/0/0/172/172 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Jan 20 10:00:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:33.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:34 np0005588919 podman[283840]: 2026-01-20 15:00:34.018333674 +0000 UTC m=+0.060788136 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 20 10:00:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:34.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.712 225859 DEBUG oslo_concurrency.lockutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Acquiring lock "f1757bed-1718-45e4-a731-11f1a3b4f068" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.713 225859 DEBUG oslo_concurrency.lockutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.713 225859 DEBUG oslo_concurrency.lockutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Acquiring lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.714 225859 DEBUG oslo_concurrency.lockutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.714 225859 DEBUG oslo_concurrency.lockutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.715 225859 INFO nova.compute.manager [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Terminating instance#033[00m
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.716 225859 DEBUG nova.compute.manager [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:00:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:35.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:35 np0005588919 kernel: tapf62f622f-1d (unregistering): left promiscuous mode
Jan 20 10:00:35 np0005588919 NetworkManager[49104]: <info>  [1768921235.7785] device (tapf62f622f-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:00:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:35Z|00587|binding|INFO|Releasing lport f62f622f-1d0a-4a68-9540-d1a7f48a66d0 from this chassis (sb_readonly=0)
Jan 20 10:00:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:35Z|00588|binding|INFO|Setting lport f62f622f-1d0a-4a68-9540-d1a7f48a66d0 down in Southbound
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.789 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:35Z|00589|binding|INFO|Removing iface tapf62f622f-1d ovn-installed in OVS
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.791 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:35.795 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:97:d3 10.100.0.13'], port_security=['fa:16:3e:cb:97:d3 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f1757bed-1718-45e4-a731-11f1a3b4f068', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '994b02a8c0094d2daa7b775b1f86f394', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6cdfb95a-0d3a-472b-8deb-06068f9edf9a c3b2b511-7001-4a00-abcb-ee7970518e80', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34e153e6-2244-443e-a2b3-4d18f8409d44, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f62f622f-1d0a-4a68-9540-d1a7f48a66d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:00:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:35.796 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f62f622f-1d0a-4a68-9540-d1a7f48a66d0 in datapath 347be9eb-2d7b-4b2d-b1e8-8ed5a063f269 unbound from our chassis#033[00m
Jan 20 10:00:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:35.798 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 347be9eb-2d7b-4b2d-b1e8-8ed5a063f269, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:00:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:35.799 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c2bbb8-5f8c-4cfd-ab75-2dece8688209]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:35.799 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269 namespace which is not needed anymore#033[00m
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.811 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:35 np0005588919 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000092.scope: Deactivated successfully.
Jan 20 10:00:35 np0005588919 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000092.scope: Consumed 15.091s CPU time.
Jan 20 10:00:35 np0005588919 systemd-machined[194361]: Machine qemu-68-instance-00000092 terminated.
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.935 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.940 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:35 np0005588919 neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269[283735]: [NOTICE]   (283739) : haproxy version is 2.8.14-c23fe91
Jan 20 10:00:35 np0005588919 neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269[283735]: [NOTICE]   (283739) : path to executable is /usr/sbin/haproxy
Jan 20 10:00:35 np0005588919 neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269[283735]: [WARNING]  (283739) : Exiting Master process...
Jan 20 10:00:35 np0005588919 neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269[283735]: [ALERT]    (283739) : Current worker (283741) exited with code 143 (Terminated)
Jan 20 10:00:35 np0005588919 neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269[283735]: [WARNING]  (283739) : All workers exited. Exiting... (0)
Jan 20 10:00:35 np0005588919 systemd[1]: libpod-62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c.scope: Deactivated successfully.
Jan 20 10:00:35 np0005588919 podman[283934]: 2026-01-20 15:00:35.956623226 +0000 UTC m=+0.060924290 container died 62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.958 225859 INFO nova.virt.libvirt.driver [-] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Instance destroyed successfully.#033[00m
Jan 20 10:00:35 np0005588919 nova_compute[225855]: 2026-01-20 15:00:35.959 225859 DEBUG nova.objects.instance [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lazy-loading 'resources' on Instance uuid f1757bed-1718-45e4-a731-11f1a3b4f068 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:00:35 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c-userdata-shm.mount: Deactivated successfully.
Jan 20 10:00:35 np0005588919 systemd[1]: var-lib-containers-storage-overlay-f4b2c64753b02dd1a3ea792cfd3a0a537701996b7974d1a901afebdbb9cc4a05-merged.mount: Deactivated successfully.
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.011 225859 DEBUG nova.virt.libvirt.vif [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T14:59:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-2075964816',display_name='tempest-TestServerBasicOps-server-2075964816',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-2075964816',id=146,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCiCeGBNIIjQtIa5Udw9T4nKiS3sQpmIFto+Zj/ppiwHl3KoPC1ZwXSQfteIxtI2AuErtkRwyRat7WVpBCL4SK6jCl43k4+LHYwocVMfWmtSf2fkMge6nUPK98YTBKBV9g==',key_name='tempest-TestServerBasicOps-469960215',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:00:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='994b02a8c0094d2daa7b775b1f86f394',ramdisk_id='',reservation_id='r-c012j9n0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-2006244970',owner_user_name='tempest-TestServerBasicOps-2006244970-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:00:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2a3fc2ba2a08423eb2e0bd7cf0fd5cf7',uuid=f1757bed-1718-45e4-a731-11f1a3b4f068,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": 
"fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.012 225859 DEBUG nova.network.os_vif_util [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Converting VIF {"id": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "address": "fa:16:3e:cb:97:d3", "network": {"id": "347be9eb-2d7b-4b2d-b1e8-8ed5a063f269", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1057973257-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "994b02a8c0094d2daa7b775b1f86f394", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf62f622f-1d", "ovs_interfaceid": "f62f622f-1d0a-4a68-9540-d1a7f48a66d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.014 225859 DEBUG nova.network.os_vif_util [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:97:d3,bridge_name='br-int',has_traffic_filtering=True,id=f62f622f-1d0a-4a68-9540-d1a7f48a66d0,network=Network(347be9eb-2d7b-4b2d-b1e8-8ed5a063f269),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62f622f-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.014 225859 DEBUG os_vif [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:97:d3,bridge_name='br-int',has_traffic_filtering=True,id=f62f622f-1d0a-4a68-9540-d1a7f48a66d0,network=Network(347be9eb-2d7b-4b2d-b1e8-8ed5a063f269),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62f622f-1d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.017 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.017 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf62f622f-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.019 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.021 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:36 np0005588919 podman[283934]: 2026-01-20 15:00:36.025077197 +0000 UTC m=+0.129378221 container cleanup 62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.025 225859 INFO os_vif [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:97:d3,bridge_name='br-int',has_traffic_filtering=True,id=f62f622f-1d0a-4a68-9540-d1a7f48a66d0,network=Network(347be9eb-2d7b-4b2d-b1e8-8ed5a063f269),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf62f622f-1d')#033[00m
Jan 20 10:00:36 np0005588919 systemd[1]: libpod-conmon-62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c.scope: Deactivated successfully.
Jan 20 10:00:36 np0005588919 podman[283980]: 2026-01-20 15:00:36.095603627 +0000 UTC m=+0.047236534 container remove 62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:00:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:36.102 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4f35a6-71bf-4557-8a4a-dbf9407b6a26]: (4, ('Tue Jan 20 03:00:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269 (62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c)\n62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c\nTue Jan 20 03:00:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269 (62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c)\n62f09e6d34cd4e3fc9344987f57c02ddc6251d2f2f53944064b170b013df8b8c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:36.104 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb24972-bca1-4618-913d-40183a4dcf83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:36.105 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap347be9eb-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.107 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:36 np0005588919 kernel: tap347be9eb-20: left promiscuous mode
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.121 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:36.124 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6b1306bd-6e94-448d-9cf0-a00a079e4b05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:36.140 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c08823-abb7-4c69-b1c3-acc039a94c1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:36.142 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c1204772-b20a-45f6-a007-b4d97d81f4fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:36.159 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[55ea87be-b97d-4422-be04-295c757196c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 624837, 'reachable_time': 42436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284004, 'error': None, 'target': 'ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:36.162 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-347be9eb-2d7b-4b2d-b1e8-8ed5a063f269 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:00:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:36.163 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[0a10c0f8-f60c-4a29-aa5b-430c9330b316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:36 np0005588919 systemd[1]: run-netns-ovnmeta\x2d347be9eb\x2d2d7b\x2d4b2d\x2db1e8\x2d8ed5a063f269.mount: Deactivated successfully.
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.178 225859 DEBUG nova.compute.manager [req-fa54498c-5459-43d1-b64f-943bd2ba939e req-90679d2e-6ed8-4b9e-be4e-d6a68542b000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Received event network-vif-unplugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.179 225859 DEBUG oslo_concurrency.lockutils [req-fa54498c-5459-43d1-b64f-943bd2ba939e req-90679d2e-6ed8-4b9e-be4e-d6a68542b000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.179 225859 DEBUG oslo_concurrency.lockutils [req-fa54498c-5459-43d1-b64f-943bd2ba939e req-90679d2e-6ed8-4b9e-be4e-d6a68542b000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.180 225859 DEBUG oslo_concurrency.lockutils [req-fa54498c-5459-43d1-b64f-943bd2ba939e req-90679d2e-6ed8-4b9e-be4e-d6a68542b000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.180 225859 DEBUG nova.compute.manager [req-fa54498c-5459-43d1-b64f-943bd2ba939e req-90679d2e-6ed8-4b9e-be4e-d6a68542b000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] No waiting events found dispatching network-vif-unplugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.180 225859 DEBUG nova.compute.manager [req-fa54498c-5459-43d1-b64f-943bd2ba939e req-90679d2e-6ed8-4b9e-be4e-d6a68542b000 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Received event network-vif-unplugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:00:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:36.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.577 225859 INFO nova.virt.libvirt.driver [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Deleting instance files /var/lib/nova/instances/f1757bed-1718-45e4-a731-11f1a3b4f068_del
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.578 225859 INFO nova.virt.libvirt.driver [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Deletion of /var/lib/nova/instances/f1757bed-1718-45e4-a731-11f1a3b4f068_del complete
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.637 225859 INFO nova.compute.manager [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Took 0.92 seconds to destroy the instance on the hypervisor.
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.638 225859 DEBUG oslo.service.loopingcall [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.638 225859 DEBUG nova.compute.manager [-] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.638 225859 DEBUG nova.network.neutron [-] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 20 10:00:36 np0005588919 nova_compute[225855]: 2026-01-20 15:00:36.794 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:00:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:37.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:38.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.512 225859 DEBUG nova.network.neutron [-] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.556 225859 INFO nova.compute.manager [-] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Took 1.92 seconds to deallocate network for instance.
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.628 225859 DEBUG oslo_concurrency.lockutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.629 225859 DEBUG oslo_concurrency.lockutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.839 225859 DEBUG oslo_concurrency.processutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.874 225859 DEBUG nova.compute.manager [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Received event network-vif-plugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.874 225859 DEBUG oslo_concurrency.lockutils [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.874 225859 DEBUG oslo_concurrency.lockutils [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.875 225859 DEBUG oslo_concurrency.lockutils [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.875 225859 DEBUG nova.compute.manager [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] No waiting events found dispatching network-vif-plugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:00:38 np0005588919 nova_compute[225855]: 2026-01-20 15:00:38.875 225859 WARNING nova.compute.manager [req-522d8237-f10e-411b-b0c3-a305cee240f4 req-772d1398-e054-42b9-8153-4bf31d12d75c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Received unexpected event network-vif-plugged-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 for instance with vm_state deleted and task_state None.
Jan 20 10:00:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/842019597' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:39 np0005588919 nova_compute[225855]: 2026-01-20 15:00:39.303 225859 DEBUG oslo_concurrency.processutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:00:39 np0005588919 nova_compute[225855]: 2026-01-20 15:00:39.309 225859 DEBUG nova.compute.provider_tree [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:00:39 np0005588919 nova_compute[225855]: 2026-01-20 15:00:39.326 225859 DEBUG nova.scheduler.client.report [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:00:39 np0005588919 nova_compute[225855]: 2026-01-20 15:00:39.363 225859 DEBUG oslo_concurrency.lockutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:00:39 np0005588919 nova_compute[225855]: 2026-01-20 15:00:39.412 225859 INFO nova.scheduler.client.report [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Deleted allocations for instance f1757bed-1718-45e4-a731-11f1a3b4f068
Jan 20 10:00:39 np0005588919 nova_compute[225855]: 2026-01-20 15:00:39.469 225859 DEBUG oslo_concurrency.lockutils [None req-de380fa1-243d-40f5-9128-16bf99a9f662 2a3fc2ba2a08423eb2e0bd7cf0fd5cf7 994b02a8c0094d2daa7b775b1f86f394 - - default default] Lock "f1757bed-1718-45e4-a731-11f1a3b4f068" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:00:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:39.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:40.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:40 np0005588919 nova_compute[225855]: 2026-01-20 15:00:40.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.021 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.260 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.261 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.283 225859 DEBUG nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.379 225859 DEBUG nova.compute.manager [req-e3b24e08-7905-49db-a354-c3b2ec77f508 req-0fd169e9-9d1f-47de-b996-6fc8719ccac2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Received event network-vif-deleted-f62f622f-1d0a-4a68-9540-d1a7f48a66d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.381 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.381 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.387 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.387 225859 INFO nova.compute.claims [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Claim successful on node compute-1.ctlplane.example.com
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.505 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:00:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:41.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.796 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:00:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:41 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/292610090' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.975 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:00:41 np0005588919 nova_compute[225855]: 2026-01-20 15:00:41.984 225859 DEBUG nova.compute.provider_tree [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:00:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:42.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.617 225859 DEBUG nova.scheduler.client.report [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:00:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.671 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.672 225859 DEBUG nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.728 225859 DEBUG nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.729 225859 DEBUG nova.network.neutron [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.760 225859 INFO nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.776 225859 DEBUG nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.874 225859 DEBUG nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.875 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.876 225859 INFO nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Creating image(s)
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.903 225859 DEBUG nova.storage.rbd_utils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] rbd image 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.932 225859 DEBUG nova.storage.rbd_utils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] rbd image 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.961 225859 DEBUG nova.storage.rbd_utils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] rbd image 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:00:42 np0005588919 nova_compute[225855]: 2026-01-20 15:00:42.965 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.033 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.034 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.035 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.035 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.058 225859 DEBUG nova.storage.rbd_utils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] rbd image 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.061 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.091 225859 DEBUG nova.policy [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37466ba8c9504f1ca6cfbce8add0b52a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '41da7b7508634e869bbbe5203e7023cc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.334 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.272s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.363 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.364 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.364 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.406 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.406 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.414 225859 DEBUG nova.storage.rbd_utils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] resizing rbd image 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.523 225859 DEBUG nova.objects.instance [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'migration_context' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.563 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.564 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Ensure instance console log exists: /var/lib/nova/instances/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.565 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.565 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:00:43 np0005588919 nova_compute[225855]: 2026-01-20 15:00:43.565 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:00:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:43.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:44.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
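The recurring radosgw triplets are haproxy-style health probes (`HEAD / HTTP/1.0` from the two controller addresses, every second or so) hitting the beast frontend. A small parser for the beast access line, with the field order read off the samples above:

```python
import re

# beast access-log fields, as seen in this log:
#   beast: <ptr>: <addr> - <user> [<ts>] "<request>" <status> <bytes> ... latency=<sec>s
BEAST_RE = re.compile(
    r'beast: \S+: (?P<addr>\S+) - (?P<user>\S+) '
    r'\[(?P<ts>[^\]]+)\] "(?P<req>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) '
    r'.*latency=(?P<latency>[\d.]+)s'
)

def parse_beast(line):
    """Extract client address, request, status, and latency from one line."""
    m = BEAST_RE.search(line)
    return m.groupdict() if m else None

sample = ('beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous '
          '[20/Jan/2026:15:00:43.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
          'latency=0.000000000s')
```

Filtering out `req == "HEAD / HTTP/1.0"` with `user == "anonymous"` removes the probe noise and leaves only real S3/Swift traffic.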
Jan 20 10:00:44 np0005588919 nova_compute[225855]: 2026-01-20 15:00:44.390 225859 DEBUG nova.network.neutron [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Successfully created port: 086e4aee-1846-436c-8c93-dab333d31521 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:00:45 np0005588919 nova_compute[225855]: 2026-01-20 15:00:45.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:45.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:46 np0005588919 nova_compute[225855]: 2026-01-20 15:00:46.023 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:46.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:46 np0005588919 nova_compute[225855]: 2026-01-20 15:00:46.798 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:47 np0005588919 nova_compute[225855]: 2026-01-20 15:00:47.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e324 e324: 3 total, 3 up, 3 in
Jan 20 10:00:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:47.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:48 np0005588919 nova_compute[225855]: 2026-01-20 15:00:48.278 225859 DEBUG nova.network.neutron [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Successfully updated port: 086e4aee-1846-436c-8c93-dab333d31521 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:00:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:48.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:48 np0005588919 nova_compute[225855]: 2026-01-20 15:00:48.387 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:00:48 np0005588919 nova_compute[225855]: 2026-01-20 15:00:48.388 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquired lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:00:48 np0005588919 nova_compute[225855]: 2026-01-20 15:00:48.388 225859 DEBUG nova.network.neutron [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:00:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e325 e325: 3 total, 3 up, 3 in
Jan 20 10:00:48 np0005588919 nova_compute[225855]: 2026-01-20 15:00:48.638 225859 DEBUG nova.compute.manager [req-77b38338-1360-4fd3-9c19-63944101294d req-5d9fb130-7c6b-4c98-961a-5977116485f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-changed-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:48 np0005588919 nova_compute[225855]: 2026-01-20 15:00:48.638 225859 DEBUG nova.compute.manager [req-77b38338-1360-4fd3-9c19-63944101294d req-5d9fb130-7c6b-4c98-961a-5977116485f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Refreshing instance network info cache due to event network-changed-086e4aee-1846-436c-8c93-dab333d31521. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:00:48 np0005588919 nova_compute[225855]: 2026-01-20 15:00:48.639 225859 DEBUG oslo_concurrency.lockutils [req-77b38338-1360-4fd3-9c19-63944101294d req-5d9fb130-7c6b-4c98-961a-5977116485f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:00:48 np0005588919 nova_compute[225855]: 2026-01-20 15:00:48.672 225859 DEBUG nova.network.neutron [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:00:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e326 e326: 3 total, 3 up, 3 in
Jan 20 10:00:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:49.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:50.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.369 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.370 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.370 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.370 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.371 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.578 225859 DEBUG nova.network.neutron [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Updating instance_info_cache with network_info: [{"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.611 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Releasing lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.612 225859 DEBUG nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Instance network_info: |[{"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.612 225859 DEBUG oslo_concurrency.lockutils [req-77b38338-1360-4fd3-9c19-63944101294d req-5d9fb130-7c6b-4c98-961a-5977116485f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.613 225859 DEBUG nova.network.neutron [req-77b38338-1360-4fd3-9c19-63944101294d req-5d9fb130-7c6b-4c98-961a-5977116485f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Refreshing network info cache for port 086e4aee-1846-436c-8c93-dab333d31521 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.616 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Start _get_guest_xml network_info=[{"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.616 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Start _get_guest_xml network_info=[{"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.620 225859 WARNING nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.624 225859 DEBUG nova.virt.libvirt.host [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.625 225859 DEBUG nova.virt.libvirt.host [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.627 225859 DEBUG nova.virt.libvirt.host [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.628 225859 DEBUG nova.virt.libvirt.host [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.629 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.629 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.630 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.630 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.630 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.631 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.631 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.631 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.631 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.631 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.632 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.632 225859 DEBUG nova.virt.hardware [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
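The topology search logged above — one vCPU, flavor and image preferences all 0:0:0, limits 65536 each — reduces to enumerating factorizations of the vCPU count into sockets × cores × threads, which for 1 vCPU leaves only (1,1,1). A simplified version of that enumeration (this is an illustrative reimplementation, not nova.virt.hardware's code):

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals
    vcpus, within the given limits -- simplified from the search that
    nova.virt.hardware logs above when no preference is set."""
    tops = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % s:
            continue
        rem = vcpus // s
        for c in range(1, min(rem, max_cores) + 1):
            if rem % c:
                continue
            t = rem // c
            if t <= max_threads:
                tops.append((s, c, t))
    return tops
```

With the defaults, `possible_topologies(1)` returns the single `(1, 1, 1)` topology the log reports as "Got 1 possible topologies".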
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.635 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:50 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2685457807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.803 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
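The resource audit shells out to `ceph df --format=json` (0.432s here) and reads pool capacity from the JSON. A sketch of that parsing step — the payload below is trimmed and entirely made up, and the pool name "vms" is an assumption about this deployment's `images_rbd_pool`:

```python
import json

# Hypothetical, trimmed `ceph df --format=json` payload; the real output
# carries cluster-wide stats and many more per-pool fields.
CEPH_DF = json.dumps({
    "pools": [
        {"name": "vms", "stats": {"stored": 1073741824,
                                  "max_avail": 21474836480}},
    ],
})

def pool_free_gib(df_json, pool_name):
    """Report one pool's max_avail in GiB from `ceph df --format=json`
    output -- roughly the figure the resource tracker uses as free disk."""
    for pool in json.loads(df_json)["pools"]:
        if pool["name"] == pool_name:
            return pool["stats"]["max_avail"] / 1024 ** 3
    return None
```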
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.955 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921235.9552252, f1757bed-1718-45e4-a731-11f1a3b4f068 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.956 225859 INFO nova.compute.manager [-] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.963 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.964 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4355MB free_disk=20.921974182128906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.964 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4355MB free_disk=20.921974182128906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.964 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.965 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:50 np0005588919 nova_compute[225855]: 2026-01-20 15:00:50.983 225859 DEBUG nova.compute.manager [None req-afb7133f-56a8-4042-847e-bdfd3968424a - - - - - -] [instance: f1757bed-1718-45e4-a731-11f1a3b4f068] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.026 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.038 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.039 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.039 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.045 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.087 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1371259099' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.113 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.140 225859 DEBUG nova.storage.rbd_utils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] rbd image 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.144 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.273 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3543908004' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.518 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.524 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.553 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2231245816' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.581 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.582 225859 DEBUG nova.virt.libvirt.vif [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:00:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1699199838',display_name='tempest-TestServerAdvancedOps-server-1699199838',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1699199838',id=148,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='41da7b7508634e869bbbe5203e7023cc',ramdisk_id='',reservation_id='r-y4cbtvvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-1175826361',owner_user_name='tempest-TestServerAdvanced
Ops-1175826361-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:00:42Z,user_data=None,user_id='37466ba8c9504f1ca6cfbce8add0b52a',uuid=3b9ae6db-82fd-4f0d-96f4-92a09c1c1677,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.583 225859 DEBUG nova.network.os_vif_util [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Converting VIF {"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.583 225859 DEBUG nova.network.os_vif_util [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.584 225859 DEBUG nova.objects.instance [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.586 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.586 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.631 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <uuid>3b9ae6db-82fd-4f0d-96f4-92a09c1c1677</uuid>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <name>instance-00000094</name>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestServerAdvancedOps-server-1699199838</nova:name>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:00:50</nova:creationTime>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <nova:user uuid="37466ba8c9504f1ca6cfbce8add0b52a">tempest-TestServerAdvancedOps-1175826361-project-member</nova:user>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <nova:project uuid="41da7b7508634e869bbbe5203e7023cc">tempest-TestServerAdvancedOps-1175826361</nova:project>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <nova:port uuid="086e4aee-1846-436c-8c93-dab333d31521">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <entry name="serial">3b9ae6db-82fd-4f0d-96f4-92a09c1c1677</entry>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <entry name="uuid">3b9ae6db-82fd-4f0d-96f4-92a09c1c1677</entry>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk.config">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:f3:aa:10"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <target dev="tap086e4aee-18"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677/console.log" append="off"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:00:51 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:00:51 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:00:51 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:00:51 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.632 225859 DEBUG nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Preparing to wait for external event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.633 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.633 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.633 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.634 225859 DEBUG nova.virt.libvirt.vif [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:00:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1699199838',display_name='tempest-TestServerAdvancedOps-server-1699199838',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1699199838',id=148,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='41da7b7508634e869bbbe5203e7023cc',ramdisk_id='',reservation_id='r-y4cbtvvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-1175826361',owner_user_name='tempest-TestServ
erAdvancedOps-1175826361-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:00:42Z,user_data=None,user_id='37466ba8c9504f1ca6cfbce8add0b52a',uuid=3b9ae6db-82fd-4f0d-96f4-92a09c1c1677,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.634 225859 DEBUG nova.network.os_vif_util [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Converting VIF {"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.635 225859 DEBUG nova.network.os_vif_util [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.635 225859 DEBUG os_vif [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.636 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.636 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.636 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.639 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.639 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap086e4aee-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.639 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap086e4aee-18, col_values=(('external_ids', {'iface-id': '086e4aee-1846-436c-8c93-dab333d31521', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:aa:10', 'vm-uuid': '3b9ae6db-82fd-4f0d-96f4-92a09c1c1677'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #106. Immutable memtables: 0.
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.640435) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 65] Flushing memtable with next log file: 106
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251640552, "job": 65, "event": "flush_started", "num_memtables": 1, "num_entries": 772, "num_deletes": 253, "total_data_size": 1297546, "memory_usage": 1312440, "flush_reason": "Manual Compaction"}
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 65] Level-0 flush table #107: started
Jan 20 10:00:51 np0005588919 NetworkManager[49104]: <info>  [1768921251.6419] manager: (tap086e4aee-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.644 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.647 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251648708, "cf_name": "default", "job": 65, "event": "table_file_creation", "file_number": 107, "file_size": 855033, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 54440, "largest_seqno": 55207, "table_properties": {"data_size": 851339, "index_size": 1474, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8784, "raw_average_key_size": 19, "raw_value_size": 843815, "raw_average_value_size": 1904, "num_data_blocks": 66, "num_entries": 443, "num_filter_entries": 443, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921205, "oldest_key_time": 1768921205, "file_creation_time": 1768921251, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 107, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 65] Flush lasted 8315 microseconds, and 3777 cpu microseconds.
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.648 225859 INFO os_vif [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18')#033[00m
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.648763) [db/flush_job.cc:967] [default] [JOB 65] Level-0 flush table #107: 855033 bytes OK
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.648780) [db/memtable_list.cc:519] [default] Level-0 commit table #107 started
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.650063) [db/memtable_list.cc:722] [default] Level-0 commit table #107: memtable #1 done
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.650077) EVENT_LOG_v1 {"time_micros": 1768921251650073, "job": 65, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.650098) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 65] Try to delete WAL files size 1293437, prev total WAL file size 1293437, number of live WAL files 2.
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000103.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.650778) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034353138' seq:72057594037927935, type:22 .. '7061786F730034373730' seq:0, type:0; will stop at (end)
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 66] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 65 Base level 0, inputs: [107(834KB)], [105(12MB)]
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251650853, "job": 66, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [107], "files_L6": [105], "score": -1, "input_data_size": 13878054, "oldest_snapshot_seqno": -1}
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.704 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.705 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.705 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] No VIF found with MAC fa:16:3e:f3:aa:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.706 225859 INFO nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Using config drive#033[00m
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.733 225859 DEBUG nova.storage.rbd_utils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] rbd image 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:51.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 66] Generated table #108: 8020 keys, 11993455 bytes, temperature: kUnknown
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251780434, "cf_name": "default", "job": 66, "event": "table_file_creation", "file_number": 108, "file_size": 11993455, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11939269, "index_size": 33018, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20101, "raw_key_size": 208244, "raw_average_key_size": 25, "raw_value_size": 11795682, "raw_average_value_size": 1470, "num_data_blocks": 1295, "num_entries": 8020, "num_filter_entries": 8020, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921251, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 108, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.780698) [db/compaction/compaction_job.cc:1663] [default] [JOB 66] Compacted 1@0 + 1@6 files to L6 => 11993455 bytes
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.782102) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 107.0 rd, 92.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 12.4 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(30.3) write-amplify(14.0) OK, records in: 8538, records dropped: 518 output_compression: NoCompression
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.782122) EVENT_LOG_v1 {"time_micros": 1768921251782113, "job": 66, "event": "compaction_finished", "compaction_time_micros": 129641, "compaction_time_cpu_micros": 32241, "output_level": 6, "num_output_files": 1, "total_output_size": 11993455, "num_input_records": 8538, "num_output_records": 8020, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000107.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251782350, "job": 66, "event": "table_file_deletion", "file_number": 107}
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000105.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921251784279, "job": 66, "event": "table_file_deletion", "file_number": 105}
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.650657) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.784315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.784319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.784320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.784321) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:00:51.784323) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:00:51 np0005588919 nova_compute[225855]: 2026-01-20 15:00:51.800 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.067 225859 DEBUG nova.network.neutron [req-77b38338-1360-4fd3-9c19-63944101294d req-5d9fb130-7c6b-4c98-961a-5977116485f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Updated VIF entry in instance network info cache for port 086e4aee-1846-436c-8c93-dab333d31521. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.067 225859 DEBUG nova.network.neutron [req-77b38338-1360-4fd3-9c19-63944101294d req-5d9fb130-7c6b-4c98-961a-5977116485f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Updating instance_info_cache with network_info: [{"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.088 225859 DEBUG oslo_concurrency.lockutils [req-77b38338-1360-4fd3-9c19-63944101294d req-5d9fb130-7c6b-4c98-961a-5977116485f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.237 225859 INFO nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Creating config drive at /var/lib/nova/instances/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677/disk.config#033[00m
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.243 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpssu39_w3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:52.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.378 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpssu39_w3" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.405 225859 DEBUG nova.storage.rbd_utils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] rbd image 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.410 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677/disk.config 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.611 225859 DEBUG oslo_concurrency.processutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677/disk.config 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.612 225859 INFO nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Deleting local config drive /var/lib/nova/instances/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677/disk.config because it was imported into RBD.#033[00m
Jan 20 10:00:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e326 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:52 np0005588919 kernel: tap086e4aee-18: entered promiscuous mode
Jan 20 10:00:52 np0005588919 NetworkManager[49104]: <info>  [1768921252.6591] manager: (tap086e4aee-18): new Tun device (/org/freedesktop/NetworkManager/Devices/256)
Jan 20 10:00:52 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:52Z|00590|binding|INFO|Claiming lport 086e4aee-1846-436c-8c93-dab333d31521 for this chassis.
Jan 20 10:00:52 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:52Z|00591|binding|INFO|086e4aee-1846-436c-8c93-dab333d31521: Claiming fa:16:3e:f3:aa:10 10.100.0.13
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.662 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:52 np0005588919 systemd-machined[194361]: New machine qemu-69-instance-00000094.
Jan 20 10:00:52 np0005588919 systemd-udevd[284411]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:00:52 np0005588919 systemd[1]: Started Virtual Machine qemu-69-instance-00000094.
Jan 20 10:00:52 np0005588919 NetworkManager[49104]: <info>  [1768921252.7086] device (tap086e4aee-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:00:52 np0005588919 NetworkManager[49104]: <info>  [1768921252.7124] device (tap086e4aee-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:00:52 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:52Z|00592|binding|INFO|Setting lport 086e4aee-1846-436c-8c93-dab333d31521 ovn-installed in OVS
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.710 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:52 np0005588919 nova_compute[225855]: 2026-01-20 15:00:52.714 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:52 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:52Z|00593|binding|INFO|Setting lport 086e4aee-1846-436c-8c93-dab333d31521 up in Southbound
Jan 20 10:00:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:52.749 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:aa:10 10.100.0.13'], port_security=['fa:16:3e:f3:aa:10 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3b9ae6db-82fd-4f0d-96f4-92a09c1c1677', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e442dddb-90bf-46c8-b680-3f7b90171ffe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41da7b7508634e869bbbe5203e7023cc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9fe8372e-13b8-4476-ba27-8f6ac71e4da5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0874ec4-826b-4e92-aca5-efa961e93290, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=086e4aee-1846-436c-8c93-dab333d31521) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:00:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:52.751 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 086e4aee-1846-436c-8c93-dab333d31521 in datapath e442dddb-90bf-46c8-b680-3f7b90171ffe bound to our chassis#033[00m
Jan 20 10:00:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:52.751 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e442dddb-90bf-46c8-b680-3f7b90171ffe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:00:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:52.752 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[62fd0905-5e2d-421f-b9fe-7a6508ecdbc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:52 np0005588919 podman[284402]: 2026-01-20 15:00:52.772266042 +0000 UTC m=+0.090548035 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.206 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921253.2063863, 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.207 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] VM Started (Lifecycle Event)#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.446 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.451 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921253.2092814, 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.452 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.633 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.637 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.678 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:00:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:53.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.941 225859 DEBUG nova.compute.manager [req-577aa2c7-4203-4617-9e55-ff97da7b172e req-8670956c-2d1b-4bdc-b464-5f2482a6bab5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.942 225859 DEBUG oslo_concurrency.lockutils [req-577aa2c7-4203-4617-9e55-ff97da7b172e req-8670956c-2d1b-4bdc-b464-5f2482a6bab5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.942 225859 DEBUG oslo_concurrency.lockutils [req-577aa2c7-4203-4617-9e55-ff97da7b172e req-8670956c-2d1b-4bdc-b464-5f2482a6bab5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.942 225859 DEBUG oslo_concurrency.lockutils [req-577aa2c7-4203-4617-9e55-ff97da7b172e req-8670956c-2d1b-4bdc-b464-5f2482a6bab5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.942 225859 DEBUG nova.compute.manager [req-577aa2c7-4203-4617-9e55-ff97da7b172e req-8670956c-2d1b-4bdc-b464-5f2482a6bab5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Processing event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.943 225859 DEBUG nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.946 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921253.9464118, 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.946 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.948 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.951 225859 INFO nova.virt.libvirt.driver [-] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Instance spawned successfully.#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.951 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.966 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.972 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.975 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.976 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.976 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.977 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.977 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:53 np0005588919 nova_compute[225855]: 2026-01-20 15:00:53.977 225859 DEBUG nova.virt.libvirt.driver [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:00:54 np0005588919 nova_compute[225855]: 2026-01-20 15:00:54.018 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:00:54 np0005588919 nova_compute[225855]: 2026-01-20 15:00:54.048 225859 INFO nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Took 11.17 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:00:54 np0005588919 nova_compute[225855]: 2026-01-20 15:00:54.049 225859 DEBUG nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:54 np0005588919 nova_compute[225855]: 2026-01-20 15:00:54.140 225859 INFO nova.compute.manager [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Took 12.78 seconds to build instance.#033[00m
Jan 20 10:00:54 np0005588919 nova_compute[225855]: 2026-01-20 15:00:54.161 225859 DEBUG oslo_concurrency.lockutils [None req-6782266e-ea27-401c-a11a-7f70c6603bcc 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.900s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:54.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:54 np0005588919 nova_compute[225855]: 2026-01-20 15:00:54.582 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:54 np0005588919 nova_compute[225855]: 2026-01-20 15:00:54.582 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:00:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e327 e327: 3 total, 3 up, 3 in
Jan 20 10:00:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:00:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:55.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:00:56 np0005588919 nova_compute[225855]: 2026-01-20 15:00:56.183 225859 DEBUG nova.compute.manager [req-d501a09c-d195-4c14-8170-f1497ef13b7a req-3b29a097-c578-4f0d-b25a-7f2bdc104690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:00:56 np0005588919 nova_compute[225855]: 2026-01-20 15:00:56.184 225859 DEBUG oslo_concurrency.lockutils [req-d501a09c-d195-4c14-8170-f1497ef13b7a req-3b29a097-c578-4f0d-b25a-7f2bdc104690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:00:56 np0005588919 nova_compute[225855]: 2026-01-20 15:00:56.187 225859 DEBUG oslo_concurrency.lockutils [req-d501a09c-d195-4c14-8170-f1497ef13b7a req-3b29a097-c578-4f0d-b25a-7f2bdc104690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:00:56 np0005588919 nova_compute[225855]: 2026-01-20 15:00:56.188 225859 DEBUG oslo_concurrency.lockutils [req-d501a09c-d195-4c14-8170-f1497ef13b7a req-3b29a097-c578-4f0d-b25a-7f2bdc104690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:00:56 np0005588919 nova_compute[225855]: 2026-01-20 15:00:56.188 225859 DEBUG nova.compute.manager [req-d501a09c-d195-4c14-8170-f1497ef13b7a req-3b29a097-c578-4f0d-b25a-7f2bdc104690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:00:56 np0005588919 nova_compute[225855]: 2026-01-20 15:00:56.188 225859 WARNING nova.compute.manager [req-d501a09c-d195-4c14-8170-f1497ef13b7a req-3b29a097-c578-4f0d-b25a-7f2bdc104690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received unexpected event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:00:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:00:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:56.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:00:56 np0005588919 nova_compute[225855]: 2026-01-20 15:00:56.641 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:56 np0005588919 nova_compute[225855]: 2026-01-20 15:00:56.802 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:57 np0005588919 nova_compute[225855]: 2026-01-20 15:00:57.015 225859 DEBUG nova.objects.instance [None req-acfd2f9b-c323-4854-88e3-dd45ac87b1ba 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:00:57 np0005588919 nova_compute[225855]: 2026-01-20 15:00:57.042 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921257.0416844, 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:00:57 np0005588919 nova_compute[225855]: 2026-01-20 15:00:57.042 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:00:57 np0005588919 nova_compute[225855]: 2026-01-20 15:00:57.128 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:57 np0005588919 nova_compute[225855]: 2026-01-20 15:00:57.133 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:00:57 np0005588919 nova_compute[225855]: 2026-01-20 15:00:57.209 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 20 10:00:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:57.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:00:58 np0005588919 kernel: tap086e4aee-18 (unregistering): left promiscuous mode
Jan 20 10:00:58 np0005588919 NetworkManager[49104]: <info>  [1768921258.0392] device (tap086e4aee-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:00:58 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:58Z|00594|binding|INFO|Releasing lport 086e4aee-1846-436c-8c93-dab333d31521 from this chassis (sb_readonly=0)
Jan 20 10:00:58 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:58Z|00595|binding|INFO|Setting lport 086e4aee-1846-436c-8c93-dab333d31521 down in Southbound
Jan 20 10:00:58 np0005588919 ovn_controller[130490]: 2026-01-20T15:00:58Z|00596|binding|INFO|Removing iface tap086e4aee-18 ovn-installed in OVS
Jan 20 10:00:58 np0005588919 nova_compute[225855]: 2026-01-20 15:00:58.042 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:58 np0005588919 nova_compute[225855]: 2026-01-20 15:00:58.044 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:58 np0005588919 nova_compute[225855]: 2026-01-20 15:00:58.059 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:58 np0005588919 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000094.scope: Deactivated successfully.
Jan 20 10:00:58 np0005588919 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000094.scope: Consumed 3.781s CPU time.
Jan 20 10:00:58 np0005588919 systemd-machined[194361]: Machine qemu-69-instance-00000094 terminated.
Jan 20 10:00:58 np0005588919 kernel: tap086e4aee-18: entered promiscuous mode
Jan 20 10:00:58 np0005588919 kernel: tap086e4aee-18 (unregistering): left promiscuous mode
Jan 20 10:00:58 np0005588919 NetworkManager[49104]: <info>  [1768921258.1998] manager: (tap086e4aee-18): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Jan 20 10:00:58 np0005588919 nova_compute[225855]: 2026-01-20 15:00:58.204 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:00:58 np0005588919 nova_compute[225855]: 2026-01-20 15:00:58.219 225859 DEBUG nova.compute.manager [None req-acfd2f9b-c323-4854-88e3-dd45ac87b1ba 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:00:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:00:58.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:00:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:58.430 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:aa:10 10.100.0.13'], port_security=['fa:16:3e:f3:aa:10 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3b9ae6db-82fd-4f0d-96f4-92a09c1c1677', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e442dddb-90bf-46c8-b680-3f7b90171ffe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41da7b7508634e869bbbe5203e7023cc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9fe8372e-13b8-4476-ba27-8f6ac71e4da5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0874ec4-826b-4e92-aca5-efa961e93290, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=086e4aee-1846-436c-8c93-dab333d31521) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:00:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:58.431 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 086e4aee-1846-436c-8c93-dab333d31521 in datapath e442dddb-90bf-46c8-b680-3f7b90171ffe unbound from our chassis#033[00m
Jan 20 10:00:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:58.432 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e442dddb-90bf-46c8-b680-3f7b90171ffe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:00:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:00:58.432 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e7eb9be5-17c3-41e8-b5cd-ecf36e87347d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:00:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:00:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:00:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:00:59.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:00.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:00 np0005588919 nova_compute[225855]: 2026-01-20 15:01:00.766 225859 DEBUG nova.compute.manager [req-ee99b29b-8b3c-4f9e-bba9-14d13623edc5 req-a18aa099-4f65-4c5c-903a-7756c3beb7c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-unplugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:00 np0005588919 nova_compute[225855]: 2026-01-20 15:01:00.766 225859 DEBUG oslo_concurrency.lockutils [req-ee99b29b-8b3c-4f9e-bba9-14d13623edc5 req-a18aa099-4f65-4c5c-903a-7756c3beb7c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:00 np0005588919 nova_compute[225855]: 2026-01-20 15:01:00.767 225859 DEBUG oslo_concurrency.lockutils [req-ee99b29b-8b3c-4f9e-bba9-14d13623edc5 req-a18aa099-4f65-4c5c-903a-7756c3beb7c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:00 np0005588919 nova_compute[225855]: 2026-01-20 15:01:00.767 225859 DEBUG oslo_concurrency.lockutils [req-ee99b29b-8b3c-4f9e-bba9-14d13623edc5 req-a18aa099-4f65-4c5c-903a-7756c3beb7c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:00 np0005588919 nova_compute[225855]: 2026-01-20 15:01:00.767 225859 DEBUG nova.compute.manager [req-ee99b29b-8b3c-4f9e-bba9-14d13623edc5 req-a18aa099-4f65-4c5c-903a-7756c3beb7c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-unplugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:00 np0005588919 nova_compute[225855]: 2026-01-20 15:01:00.767 225859 WARNING nova.compute.manager [req-ee99b29b-8b3c-4f9e-bba9-14d13623edc5 req-a18aa099-4f65-4c5c-903a-7756c3beb7c4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received unexpected event network-vif-unplugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with vm_state suspended and task_state None.#033[00m
Jan 20 10:01:01 np0005588919 nova_compute[225855]: 2026-01-20 15:01:01.645 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:01.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:01 np0005588919 nova_compute[225855]: 2026-01-20 15:01:01.803 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:01 np0005588919 nova_compute[225855]: 2026-01-20 15:01:01.944 225859 INFO nova.compute.manager [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Resuming
Jan 20 10:01:01 np0005588919 nova_compute[225855]: 2026-01-20 15:01:01.945 225859 DEBUG nova.objects.instance [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'flavor' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:01:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:01.964 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:01:01 np0005588919 nova_compute[225855]: 2026-01-20 15:01:01.967 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:01.967 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 10:01:01 np0005588919 nova_compute[225855]: 2026-01-20 15:01:01.994 225859 DEBUG oslo_concurrency.lockutils [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:01:01 np0005588919 nova_compute[225855]: 2026-01-20 15:01:01.995 225859 DEBUG oslo_concurrency.lockutils [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquired lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:01:01 np0005588919 nova_compute[225855]: 2026-01-20 15:01:01.995 225859 DEBUG nova.network.neutron [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 10:01:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:02.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:03 np0005588919 nova_compute[225855]: 2026-01-20 15:01:03.160 225859 DEBUG nova.compute.manager [req-ee07a9b2-5bc5-4988-bee9-7bf6b5e67f1b req-630d6936-cad2-4ad4-b4d1-8879c5ed9727 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:01:03 np0005588919 nova_compute[225855]: 2026-01-20 15:01:03.161 225859 DEBUG oslo_concurrency.lockutils [req-ee07a9b2-5bc5-4988-bee9-7bf6b5e67f1b req-630d6936-cad2-4ad4-b4d1-8879c5ed9727 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:01:03 np0005588919 nova_compute[225855]: 2026-01-20 15:01:03.161 225859 DEBUG oslo_concurrency.lockutils [req-ee07a9b2-5bc5-4988-bee9-7bf6b5e67f1b req-630d6936-cad2-4ad4-b4d1-8879c5ed9727 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:01:03 np0005588919 nova_compute[225855]: 2026-01-20 15:01:03.161 225859 DEBUG oslo_concurrency.lockutils [req-ee07a9b2-5bc5-4988-bee9-7bf6b5e67f1b req-630d6936-cad2-4ad4-b4d1-8879c5ed9727 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:01:03 np0005588919 nova_compute[225855]: 2026-01-20 15:01:03.161 225859 DEBUG nova.compute.manager [req-ee07a9b2-5bc5-4988-bee9-7bf6b5e67f1b req-630d6936-cad2-4ad4-b4d1-8879c5ed9727 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:01:03 np0005588919 nova_compute[225855]: 2026-01-20 15:01:03.162 225859 WARNING nova.compute.manager [req-ee07a9b2-5bc5-4988-bee9-7bf6b5e67f1b req-630d6936-cad2-4ad4-b4d1-8879c5ed9727 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received unexpected event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with vm_state suspended and task_state resuming.
Jan 20 10:01:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:03.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:03 np0005588919 nova_compute[225855]: 2026-01-20 15:01:03.930 225859 DEBUG nova.network.neutron [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Updating instance_info_cache with network_info: [{"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.086 225859 DEBUG oslo_concurrency.lockutils [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Releasing lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.090 225859 DEBUG nova.virt.libvirt.vif [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:00:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1699199838',display_name='tempest-TestServerAdvancedOps-server-1699199838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1699199838',id=148,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:00:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='41da7b7508634e869bbbe5203e7023cc',ramdisk_id='',reservation_id='r-y4cbtvvj',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-1175826361',owner_user_name='tempest-TestServerAdvancedOps-1175826361-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:00:59Z,user_data=None,user_id='37466ba8c9504f1ca6cfbce8add0b52a',uuid=3b9ae6db-82fd-4f0d-96f4-92a09c1c1677,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.090 225859 DEBUG nova.network.os_vif_util [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Converting VIF {"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.091 225859 DEBUG nova.network.os_vif_util [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.092 225859 DEBUG os_vif [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.092 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.093 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.093 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.096 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.096 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap086e4aee-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.096 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap086e4aee-18, col_values=(('external_ids', {'iface-id': '086e4aee-1846-436c-8c93-dab333d31521', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:aa:10', 'vm-uuid': '3b9ae6db-82fd-4f0d-96f4-92a09c1c1677'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.097 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.097 225859 INFO os_vif [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18')
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.119 225859 DEBUG nova.objects.instance [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'numa_topology' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:01:04 np0005588919 podman[284701]: 2026-01-20 15:01:04.186712553 +0000 UTC m=+0.054649643 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:01:04 np0005588919 kernel: tap086e4aee-18: entered promiscuous mode
Jan 20 10:01:04 np0005588919 NetworkManager[49104]: <info>  [1768921264.2201] manager: (tap086e4aee-18): new Tun device (/org/freedesktop/NetworkManager/Devices/258)
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.220 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:04Z|00597|binding|INFO|Claiming lport 086e4aee-1846-436c-8c93-dab333d31521 for this chassis.
Jan 20 10:01:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:04Z|00598|binding|INFO|086e4aee-1846-436c-8c93-dab333d31521: Claiming fa:16:3e:f3:aa:10 10.100.0.13
Jan 20 10:01:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:04.227 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:aa:10 10.100.0.13'], port_security=['fa:16:3e:f3:aa:10 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3b9ae6db-82fd-4f0d-96f4-92a09c1c1677', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e442dddb-90bf-46c8-b680-3f7b90171ffe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41da7b7508634e869bbbe5203e7023cc', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9fe8372e-13b8-4476-ba27-8f6ac71e4da5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0874ec4-826b-4e92-aca5-efa961e93290, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=086e4aee-1846-436c-8c93-dab333d31521) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:01:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:04.228 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 086e4aee-1846-436c-8c93-dab333d31521 in datapath e442dddb-90bf-46c8-b680-3f7b90171ffe bound to our chassis
Jan 20 10:01:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:04.228 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e442dddb-90bf-46c8-b680-3f7b90171ffe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 20 10:01:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:04.229 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7f183e-27ee-4f28-8fcb-7eea23b11925]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:01:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:04Z|00599|binding|INFO|Setting lport 086e4aee-1846-436c-8c93-dab333d31521 up in Southbound
Jan 20 10:01:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:04Z|00600|binding|INFO|Setting lport 086e4aee-1846-436c-8c93-dab333d31521 ovn-installed in OVS
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.238 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.242 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:04 np0005588919 systemd-udevd[284734]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:01:04 np0005588919 systemd-machined[194361]: New machine qemu-70-instance-00000094.
Jan 20 10:01:04 np0005588919 NetworkManager[49104]: <info>  [1768921264.2645] device (tap086e4aee-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:01:04 np0005588919 NetworkManager[49104]: <info>  [1768921264.2657] device (tap086e4aee-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:01:04 np0005588919 systemd[1]: Started Virtual Machine qemu-70-instance-00000094.
Jan 20 10:01:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:01:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:01:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:01:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:01:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:04.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.745 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.746 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921264.7451344, 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.746 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] VM Started (Lifecycle Event)
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.762 225859 DEBUG nova.compute.manager [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 10:01:04 np0005588919 nova_compute[225855]: 2026-01-20 15:01:04.762 225859 DEBUG nova.objects.instance [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.754 225859 DEBUG nova.compute.manager [req-0ea6a4cc-828e-4437-9e6f-8499645d0a88 req-bf8f50ec-897d-492a-bf84-c95d8346946a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.754 225859 DEBUG oslo_concurrency.lockutils [req-0ea6a4cc-828e-4437-9e6f-8499645d0a88 req-bf8f50ec-897d-492a-bf84-c95d8346946a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.754 225859 DEBUG oslo_concurrency.lockutils [req-0ea6a4cc-828e-4437-9e6f-8499645d0a88 req-bf8f50ec-897d-492a-bf84-c95d8346946a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.755 225859 DEBUG oslo_concurrency.lockutils [req-0ea6a4cc-828e-4437-9e6f-8499645d0a88 req-bf8f50ec-897d-492a-bf84-c95d8346946a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.755 225859 DEBUG nova.compute.manager [req-0ea6a4cc-828e-4437-9e6f-8499645d0a88 req-bf8f50ec-897d-492a-bf84-c95d8346946a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.755 225859 WARNING nova.compute.manager [req-0ea6a4cc-828e-4437-9e6f-8499645d0a88 req-bf8f50ec-897d-492a-bf84-c95d8346946a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received unexpected event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with vm_state suspended and task_state resuming.
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.770 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.772 225859 INFO nova.virt.libvirt.driver [-] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Instance running successfully.
Jan 20 10:01:05 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.776 225859 DEBUG nova.virt.libvirt.guest [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.776 225859 DEBUG nova.compute.manager [None req-db0cc92f-8ea4-468c-9d7a-2b3c1a447a80 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.777 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:01:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:05.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.844 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.845 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921264.748487, 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:05 np0005588919 nova_compute[225855]: 2026-01-20 15:01:05.845 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:01:06 np0005588919 nova_compute[225855]: 2026-01-20 15:01:06.079 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:06 np0005588919 nova_compute[225855]: 2026-01-20 15:01:06.083 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:01:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:06.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:06 np0005588919 nova_compute[225855]: 2026-01-20 15:01:06.647 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:06 np0005588919 nova_compute[225855]: 2026-01-20 15:01:06.805 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:06.969 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:07.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:07 np0005588919 nova_compute[225855]: 2026-01-20 15:01:07.873 225859 DEBUG nova.compute.manager [req-7f785eed-1da4-4bf4-888e-37973a098707 req-a4ba6737-a010-498e-917e-1248fd07739a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:07 np0005588919 nova_compute[225855]: 2026-01-20 15:01:07.873 225859 DEBUG oslo_concurrency.lockutils [req-7f785eed-1da4-4bf4-888e-37973a098707 req-a4ba6737-a010-498e-917e-1248fd07739a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:07 np0005588919 nova_compute[225855]: 2026-01-20 15:01:07.874 225859 DEBUG oslo_concurrency.lockutils [req-7f785eed-1da4-4bf4-888e-37973a098707 req-a4ba6737-a010-498e-917e-1248fd07739a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:07 np0005588919 nova_compute[225855]: 2026-01-20 15:01:07.874 225859 DEBUG oslo_concurrency.lockutils [req-7f785eed-1da4-4bf4-888e-37973a098707 req-a4ba6737-a010-498e-917e-1248fd07739a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:07 np0005588919 nova_compute[225855]: 2026-01-20 15:01:07.874 225859 DEBUG nova.compute.manager [req-7f785eed-1da4-4bf4-888e-37973a098707 req-a4ba6737-a010-498e-917e-1248fd07739a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:07 np0005588919 nova_compute[225855]: 2026-01-20 15:01:07.875 225859 WARNING nova.compute.manager [req-7f785eed-1da4-4bf4-888e-37973a098707 req-a4ba6737-a010-498e-917e-1248fd07739a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received unexpected event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:01:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:08.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:09.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:09 np0005588919 nova_compute[225855]: 2026-01-20 15:01:09.937 225859 DEBUG nova.objects.instance [None req-7149fc9e-4a24-474f-bf7d-a9deb298197c 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:09 np0005588919 nova_compute[225855]: 2026-01-20 15:01:09.957 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921269.957014, 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:09 np0005588919 nova_compute[225855]: 2026-01-20 15:01:09.958 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:01:09 np0005588919 nova_compute[225855]: 2026-01-20 15:01:09.984 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:09 np0005588919 nova_compute[225855]: 2026-01-20 15:01:09.988 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:01:10 np0005588919 nova_compute[225855]: 2026-01-20 15:01:10.021 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 20 10:01:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:10.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:10 np0005588919 kernel: tap086e4aee-18 (unregistering): left promiscuous mode
Jan 20 10:01:10 np0005588919 NetworkManager[49104]: <info>  [1768921270.6107] device (tap086e4aee-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:01:10 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:10Z|00601|binding|INFO|Releasing lport 086e4aee-1846-436c-8c93-dab333d31521 from this chassis (sb_readonly=0)
Jan 20 10:01:10 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:10Z|00602|binding|INFO|Setting lport 086e4aee-1846-436c-8c93-dab333d31521 down in Southbound
Jan 20 10:01:10 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:10Z|00603|binding|INFO|Removing iface tap086e4aee-18 ovn-installed in OVS
Jan 20 10:01:10 np0005588919 nova_compute[225855]: 2026-01-20 15:01:10.616 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:10 np0005588919 nova_compute[225855]: 2026-01-20 15:01:10.619 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:10.623 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:aa:10 10.100.0.13'], port_security=['fa:16:3e:f3:aa:10 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3b9ae6db-82fd-4f0d-96f4-92a09c1c1677', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e442dddb-90bf-46c8-b680-3f7b90171ffe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41da7b7508634e869bbbe5203e7023cc', 'neutron:revision_number': '6', 'neutron:security_group_ids': '9fe8372e-13b8-4476-ba27-8f6ac71e4da5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0874ec4-826b-4e92-aca5-efa961e93290, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=086e4aee-1846-436c-8c93-dab333d31521) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:10.624 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 086e4aee-1846-436c-8c93-dab333d31521 in datapath e442dddb-90bf-46c8-b680-3f7b90171ffe unbound from our chassis#033[00m
Jan 20 10:01:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:10.624 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e442dddb-90bf-46c8-b680-3f7b90171ffe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:01:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:10.625 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[43e0694b-53d3-4f1f-8dc4-96a7341432fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:10 np0005588919 nova_compute[225855]: 2026-01-20 15:01:10.635 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:10 np0005588919 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000094.scope: Deactivated successfully.
Jan 20 10:01:10 np0005588919 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000094.scope: Consumed 5.782s CPU time.
Jan 20 10:01:10 np0005588919 systemd-machined[194361]: Machine qemu-70-instance-00000094 terminated.
Jan 20 10:01:10 np0005588919 nova_compute[225855]: 2026-01-20 15:01:10.790 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:10 np0005588919 nova_compute[225855]: 2026-01-20 15:01:10.795 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:10 np0005588919 nova_compute[225855]: 2026-01-20 15:01:10.807 225859 DEBUG nova.compute.manager [None req-7149fc9e-4a24-474f-bf7d-a9deb298197c 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:11 np0005588919 nova_compute[225855]: 2026-01-20 15:01:11.649 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:11.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:11 np0005588919 nova_compute[225855]: 2026-01-20 15:01:11.807 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:12.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:13.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:14 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:01:14 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:01:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:14.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:14 np0005588919 nova_compute[225855]: 2026-01-20 15:01:14.420 225859 INFO nova.compute.manager [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Resuming#033[00m
Jan 20 10:01:14 np0005588919 nova_compute[225855]: 2026-01-20 15:01:14.420 225859 DEBUG nova.objects.instance [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'flavor' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:14 np0005588919 nova_compute[225855]: 2026-01-20 15:01:14.545 225859 DEBUG oslo_concurrency.lockutils [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:01:14 np0005588919 nova_compute[225855]: 2026-01-20 15:01:14.545 225859 DEBUG oslo_concurrency.lockutils [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquired lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:01:14 np0005588919 nova_compute[225855]: 2026-01-20 15:01:14.546 225859 DEBUG nova.network.neutron [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:01:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:15.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:16.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:16.421 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:16.422 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:16.422 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.615 225859 DEBUG nova.network.neutron [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Updating instance_info_cache with network_info: [{"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.626 225859 DEBUG nova.compute.manager [req-ecc20836-0a95-46f5-9121-0e05ea05b3bb req-30cfcb20-0d16-4e32-9e9c-e261c0c2fb0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-unplugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.627 225859 DEBUG oslo_concurrency.lockutils [req-ecc20836-0a95-46f5-9121-0e05ea05b3bb req-30cfcb20-0d16-4e32-9e9c-e261c0c2fb0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.627 225859 DEBUG oslo_concurrency.lockutils [req-ecc20836-0a95-46f5-9121-0e05ea05b3bb req-30cfcb20-0d16-4e32-9e9c-e261c0c2fb0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.627 225859 DEBUG oslo_concurrency.lockutils [req-ecc20836-0a95-46f5-9121-0e05ea05b3bb req-30cfcb20-0d16-4e32-9e9c-e261c0c2fb0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.627 225859 DEBUG nova.compute.manager [req-ecc20836-0a95-46f5-9121-0e05ea05b3bb req-30cfcb20-0d16-4e32-9e9c-e261c0c2fb0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-unplugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.627 225859 WARNING nova.compute.manager [req-ecc20836-0a95-46f5-9121-0e05ea05b3bb req-30cfcb20-0d16-4e32-9e9c-e261c0c2fb0c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received unexpected event network-vif-unplugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.636 225859 DEBUG oslo_concurrency.lockutils [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Releasing lock "refresh_cache-3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.641 225859 DEBUG nova.virt.libvirt.vif [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:00:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1699199838',display_name='tempest-TestServerAdvancedOps-server-1699199838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1699199838',id=148,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:00:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='41da7b7508634e869bbbe5203e7023cc',ramdisk_id='',reservation_id='r-y4cbtvvj',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-1175826361',owner_user_name='tempest-TestServerAdvancedOps-1175826361-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:01:10Z,user_data=None,user_id='37466ba8c9504f1ca6cfbce8add0b52a',uuid=3b9ae6db-82fd-4f0d-96f4-92a09c1c1677,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.642 225859 DEBUG nova.network.os_vif_util [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Converting VIF {"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.643 225859 DEBUG nova.network.os_vif_util [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.643 225859 DEBUG os_vif [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.643 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.644 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.644 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.647 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.648 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap086e4aee-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.648 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap086e4aee-18, col_values=(('external_ids', {'iface-id': '086e4aee-1846-436c-8c93-dab333d31521', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:aa:10', 'vm-uuid': '3b9ae6db-82fd-4f0d-96f4-92a09c1c1677'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.649 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.650 225859 INFO os_vif [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18')#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.652 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.670 225859 DEBUG nova.objects.instance [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'numa_topology' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:16 np0005588919 kernel: tap086e4aee-18: entered promiscuous mode
Jan 20 10:01:16 np0005588919 NetworkManager[49104]: <info>  [1768921276.7315] manager: (tap086e4aee-18): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Jan 20 10:01:16 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:16Z|00604|binding|INFO|Claiming lport 086e4aee-1846-436c-8c93-dab333d31521 for this chassis.
Jan 20 10:01:16 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:16Z|00605|binding|INFO|086e4aee-1846-436c-8c93-dab333d31521: Claiming fa:16:3e:f3:aa:10 10.100.0.13
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.733 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:16.742 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:aa:10 10.100.0.13'], port_security=['fa:16:3e:f3:aa:10 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3b9ae6db-82fd-4f0d-96f4-92a09c1c1677', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e442dddb-90bf-46c8-b680-3f7b90171ffe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41da7b7508634e869bbbe5203e7023cc', 'neutron:revision_number': '7', 'neutron:security_group_ids': '9fe8372e-13b8-4476-ba27-8f6ac71e4da5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0874ec4-826b-4e92-aca5-efa961e93290, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=086e4aee-1846-436c-8c93-dab333d31521) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:16.743 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 086e4aee-1846-436c-8c93-dab333d31521 in datapath e442dddb-90bf-46c8-b680-3f7b90171ffe bound to our chassis#033[00m
Jan 20 10:01:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:16.744 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e442dddb-90bf-46c8-b680-3f7b90171ffe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:01:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:16.745 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2e9832d3-5794-462c-86de-3bfcfb2d3620]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:16 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:16Z|00606|binding|INFO|Setting lport 086e4aee-1846-436c-8c93-dab333d31521 up in Southbound
Jan 20 10:01:16 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:16Z|00607|binding|INFO|Setting lport 086e4aee-1846-436c-8c93-dab333d31521 ovn-installed in OVS
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.758 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:16 np0005588919 systemd-udevd[284924]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.763 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:16 np0005588919 systemd-machined[194361]: New machine qemu-71-instance-00000094.
Jan 20 10:01:16 np0005588919 NetworkManager[49104]: <info>  [1768921276.7741] device (tap086e4aee-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:01:16 np0005588919 NetworkManager[49104]: <info>  [1768921276.7753] device (tap086e4aee-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:01:16 np0005588919 systemd[1]: Started Virtual Machine qemu-71-instance-00000094.
Jan 20 10:01:16 np0005588919 nova_compute[225855]: 2026-01-20 15:01:16.809 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.558 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.558 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921277.5577242, 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.558 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] VM Started (Lifecycle Event)#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.577 225859 DEBUG nova.compute.manager [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.578 225859 DEBUG nova.objects.instance [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.584 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.588 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.595 225859 INFO nova.virt.libvirt.driver [-] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Instance running successfully.#033[00m
Jan 20 10:01:17 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.597 225859 DEBUG nova.virt.libvirt.guest [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.597 225859 DEBUG nova.compute.manager [None req-c40f9667-ac01-4bfc-952d-90e51ac5e7f5 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.631 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.632 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921277.5636852, 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.632 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.658 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:17 np0005588919 nova_compute[225855]: 2026-01-20 15:01:17.667 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:01:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:17.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:01:17 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1919300651' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:01:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:18.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.738 225859 DEBUG nova.compute.manager [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.739 225859 DEBUG oslo_concurrency.lockutils [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.739 225859 DEBUG oslo_concurrency.lockutils [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.739 225859 DEBUG oslo_concurrency.lockutils [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.739 225859 DEBUG nova.compute.manager [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.740 225859 WARNING nova.compute.manager [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received unexpected event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.740 225859 DEBUG nova.compute.manager [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.740 225859 DEBUG oslo_concurrency.lockutils [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.740 225859 DEBUG oslo_concurrency.lockutils [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.740 225859 DEBUG oslo_concurrency.lockutils [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.741 225859 DEBUG nova.compute.manager [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.741 225859 WARNING nova.compute.manager [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received unexpected event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.741 225859 DEBUG nova.compute.manager [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.741 225859 DEBUG oslo_concurrency.lockutils [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.742 225859 DEBUG oslo_concurrency.lockutils [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.742 225859 DEBUG oslo_concurrency.lockutils [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.742 225859 DEBUG nova.compute.manager [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.742 225859 WARNING nova.compute.manager [req-e08989d6-c841-449f-9dc4-7bacc87b7f5e req-90f136ed-61af-40cb-b507-138ff93265a2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received unexpected event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.953 225859 DEBUG oslo_concurrency.lockutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.954 225859 DEBUG oslo_concurrency.lockutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.954 225859 DEBUG oslo_concurrency.lockutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.955 225859 DEBUG oslo_concurrency.lockutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.955 225859 DEBUG oslo_concurrency.lockutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.956 225859 INFO nova.compute.manager [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Terminating instance#033[00m
Jan 20 10:01:18 np0005588919 nova_compute[225855]: 2026-01-20 15:01:18.958 225859 DEBUG nova.compute.manager [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:01:18 np0005588919 kernel: tap086e4aee-18 (unregistering): left promiscuous mode
Jan 20 10:01:18 np0005588919 NetworkManager[49104]: <info>  [1768921278.9975] device (tap086e4aee-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:01:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:19Z|00608|binding|INFO|Releasing lport 086e4aee-1846-436c-8c93-dab333d31521 from this chassis (sb_readonly=0)
Jan 20 10:01:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:19Z|00609|binding|INFO|Setting lport 086e4aee-1846-436c-8c93-dab333d31521 down in Southbound
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.007 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:19Z|00610|binding|INFO|Removing iface tap086e4aee-18 ovn-installed in OVS
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.008 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:19.013 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:aa:10 10.100.0.13'], port_security=['fa:16:3e:f3:aa:10 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3b9ae6db-82fd-4f0d-96f4-92a09c1c1677', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e442dddb-90bf-46c8-b680-3f7b90171ffe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41da7b7508634e869bbbe5203e7023cc', 'neutron:revision_number': '8', 'neutron:security_group_ids': '9fe8372e-13b8-4476-ba27-8f6ac71e4da5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0874ec4-826b-4e92-aca5-efa961e93290, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=086e4aee-1846-436c-8c93-dab333d31521) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:19.014 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 086e4aee-1846-436c-8c93-dab333d31521 in datapath e442dddb-90bf-46c8-b680-3f7b90171ffe unbound from our chassis#033[00m
Jan 20 10:01:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:19.015 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e442dddb-90bf-46c8-b680-3f7b90171ffe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:01:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:19.015 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c077dda9-8c32-40d7-9225-79a6b2ff0b79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.025 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588919 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000094.scope: Deactivated successfully.
Jan 20 10:01:19 np0005588919 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000094.scope: Consumed 2.122s CPU time.
Jan 20 10:01:19 np0005588919 systemd-machined[194361]: Machine qemu-71-instance-00000094 terminated.
Jan 20 10:01:19 np0005588919 NetworkManager[49104]: <info>  [1768921279.1731] manager: (tap086e4aee-18): new Tun device (/org/freedesktop/NetworkManager/Devices/260)
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.175 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.181 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.192 225859 INFO nova.virt.libvirt.driver [-] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Instance destroyed successfully.#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.192 225859 DEBUG nova.objects.instance [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lazy-loading 'resources' on Instance uuid 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.213 225859 DEBUG nova.virt.libvirt.vif [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:00:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1699199838',display_name='tempest-TestServerAdvancedOps-server-1699199838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1699199838',id=148,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:00:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='41da7b7508634e869bbbe5203e7023cc',ramdisk_id='',reservation_id='r-y4cbtvvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerAdvancedOps-1175826361',owner_user_name='tempest-TestServerAdvancedOps-1175826361-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:01:17Z,user_data=None,user_id='37466ba8c9504f1ca6cfbce8add0b52a',uuid=3b9ae6db-82fd-4f0d-96f4-92a09c1c1677,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.213 225859 DEBUG nova.network.os_vif_util [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Converting VIF {"id": "086e4aee-1846-436c-8c93-dab333d31521", "address": "fa:16:3e:f3:aa:10", "network": {"id": "e442dddb-90bf-46c8-b680-3f7b90171ffe", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1981769764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "41da7b7508634e869bbbe5203e7023cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086e4aee-18", "ovs_interfaceid": "086e4aee-1846-436c-8c93-dab333d31521", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.214 225859 DEBUG nova.network.os_vif_util [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.214 225859 DEBUG os_vif [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.216 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.216 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap086e4aee-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.218 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.219 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:19 np0005588919 nova_compute[225855]: 2026-01-20 15:01:19.222 225859 INFO os_vif [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:aa:10,bridge_name='br-int',has_traffic_filtering=True,id=086e4aee-1846-436c-8c93-dab333d31521,network=Network(e442dddb-90bf-46c8-b680-3f7b90171ffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086e4aee-18')#033[00m
Jan 20 10:01:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:19.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:20.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.882 225859 DEBUG nova.compute.manager [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-unplugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.882 225859 DEBUG oslo_concurrency.lockutils [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.883 225859 DEBUG oslo_concurrency.lockutils [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.883 225859 DEBUG oslo_concurrency.lockutils [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.883 225859 DEBUG nova.compute.manager [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-unplugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.883 225859 DEBUG nova.compute.manager [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-unplugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.883 225859 DEBUG nova.compute.manager [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.884 225859 DEBUG oslo_concurrency.lockutils [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.884 225859 DEBUG oslo_concurrency.lockutils [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.884 225859 DEBUG oslo_concurrency.lockutils [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.884 225859 DEBUG nova.compute.manager [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] No waiting events found dispatching network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:20 np0005588919 nova_compute[225855]: 2026-01-20 15:01:20.884 225859 WARNING nova.compute.manager [req-12c895fc-f6d0-45e9-b45a-eaedac3f0c2f req-422b6e7f-7cf2-4729-9e64-f4a924479861 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received unexpected event network-vif-plugged-086e4aee-1846-436c-8c93-dab333d31521 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:01:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:21.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:21 np0005588919 nova_compute[225855]: 2026-01-20 15:01:21.803 225859 INFO nova.virt.libvirt.driver [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Deleting instance files /var/lib/nova/instances/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_del#033[00m
Jan 20 10:01:21 np0005588919 nova_compute[225855]: 2026-01-20 15:01:21.804 225859 INFO nova.virt.libvirt.driver [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Deletion of /var/lib/nova/instances/3b9ae6db-82fd-4f0d-96f4-92a09c1c1677_del complete#033[00m
Jan 20 10:01:21 np0005588919 nova_compute[225855]: 2026-01-20 15:01:21.811 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:21 np0005588919 nova_compute[225855]: 2026-01-20 15:01:21.872 225859 INFO nova.compute.manager [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Took 2.91 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:01:21 np0005588919 nova_compute[225855]: 2026-01-20 15:01:21.873 225859 DEBUG oslo.service.loopingcall [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:01:21 np0005588919 nova_compute[225855]: 2026-01-20 15:01:21.873 225859 DEBUG nova.compute.manager [-] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:01:21 np0005588919 nova_compute[225855]: 2026-01-20 15:01:21.873 225859 DEBUG nova.network.neutron [-] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:01:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:22.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e328 e328: 3 total, 3 up, 3 in
Jan 20 10:01:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:23 np0005588919 podman[285012]: 2026-01-20 15:01:23.04862288 +0000 UTC m=+0.086093240 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.177 225859 DEBUG nova.network.neutron [-] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.201 225859 INFO nova.compute.manager [-] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Took 1.33 seconds to deallocate network for instance.#033[00m
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.263 225859 DEBUG oslo_concurrency.lockutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.263 225859 DEBUG oslo_concurrency.lockutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.285 225859 DEBUG nova.compute.manager [req-d3ea38bc-5b53-4be2-a4cb-a2443637b348 req-74bca486-55c1-4583-a28d-8f87d8eadfd6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Received event network-vif-deleted-086e4aee-1846-436c-8c93-dab333d31521 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.328 225859 DEBUG oslo_concurrency.processutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:01:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4191032683' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:01:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:23.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.803 225859 DEBUG oslo_concurrency.processutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.810 225859 DEBUG nova.compute.provider_tree [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.825 225859 DEBUG nova.scheduler.client.report [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.847 225859 DEBUG oslo_concurrency.lockutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.885 225859 INFO nova.scheduler.client.report [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Deleted allocations for instance 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677#033[00m
Jan 20 10:01:23 np0005588919 nova_compute[225855]: 2026-01-20 15:01:23.966 225859 DEBUG oslo_concurrency.lockutils [None req-b57706f9-218c-42cb-8795-8e80f5b2bba6 37466ba8c9504f1ca6cfbce8add0b52a 41da7b7508634e869bbbe5203e7023cc - - default default] Lock "3b9ae6db-82fd-4f0d-96f4-92a09c1c1677" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:24 np0005588919 nova_compute[225855]: 2026-01-20 15:01:24.218 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:24.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:25 np0005588919 nova_compute[225855]: 2026-01-20 15:01:25.455 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:25.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:26.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:26 np0005588919 nova_compute[225855]: 2026-01-20 15:01:26.813 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:27.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:28.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:29 np0005588919 nova_compute[225855]: 2026-01-20 15:01:29.222 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:29.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e329 e329: 3 total, 3 up, 3 in
Jan 20 10:01:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:30.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:31.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:31 np0005588919 nova_compute[225855]: 2026-01-20 15:01:31.859 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:32.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e330 e330: 3 total, 3 up, 3 in
Jan 20 10:01:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:33.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:34 np0005588919 nova_compute[225855]: 2026-01-20 15:01:34.190 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921279.1881814, 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:34 np0005588919 nova_compute[225855]: 2026-01-20 15:01:34.190 225859 INFO nova.compute.manager [-] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:01:34 np0005588919 nova_compute[225855]: 2026-01-20 15:01:34.222 225859 DEBUG nova.compute.manager [None req-248dfd82-efa5-48ad-9e2c-6f8acd429ecb - - - - - -] [instance: 3b9ae6db-82fd-4f0d-96f4-92a09c1c1677] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:34 np0005588919 nova_compute[225855]: 2026-01-20 15:01:34.226 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:34.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:34 np0005588919 podman[285067]: 2026-01-20 15:01:34.998289812 +0000 UTC m=+0.045720491 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 20 10:01:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e331 e331: 3 total, 3 up, 3 in
Jan 20 10:01:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:35.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:36.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e332 e332: 3 total, 3 up, 3 in
Jan 20 10:01:36 np0005588919 nova_compute[225855]: 2026-01-20 15:01:36.861 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:37.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:38.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:39 np0005588919 nova_compute[225855]: 2026-01-20 15:01:39.229 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:39.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:40.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:41.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:41 np0005588919 nova_compute[225855]: 2026-01-20 15:01:41.862 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:42 np0005588919 nova_compute[225855]: 2026-01-20 15:01:42.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:42 np0005588919 nova_compute[225855]: 2026-01-20 15:01:42.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:42 np0005588919 nova_compute[225855]: 2026-01-20 15:01:42.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:01:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:42.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:43 np0005588919 nova_compute[225855]: 2026-01-20 15:01:43.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:43.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.233 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.355 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:01:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:44.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.382 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.382 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.401 225859 DEBUG nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.480 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.481 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.488 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.488 225859 INFO nova.compute.claims [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.675 225859 DEBUG oslo_concurrency.processutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.888 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.888 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.903 225859 DEBUG nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:01:44 np0005588919 nova_compute[225855]: 2026-01-20 15:01:44.956 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e333 e333: 3 total, 3 up, 3 in
Jan 20 10:01:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:01:45 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3249063162' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.164 225859 DEBUG oslo_concurrency.processutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.169 225859 DEBUG nova.compute.provider_tree [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.188 225859 DEBUG nova.scheduler.client.report [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.209 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.210 225859 DEBUG nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.212 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.217 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.218 225859 INFO nova.compute.claims [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.283 225859 DEBUG nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.283 225859 DEBUG nova.network.neutron [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.300 225859 INFO nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.314 225859 DEBUG nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.347 225859 INFO nova.virt.block_device [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Booting with blank volume at /dev/vda#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.375 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.557 225859 DEBUG nova.policy [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2446e8399b344b29986c1aaf8bf73adf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63555e5851564db08c6429231d264f2c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:01:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:01:45 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1401472554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.785 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.792 225859 DEBUG nova.compute.provider_tree [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.813 225859 DEBUG nova.scheduler.client.report [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:01:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:45.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.841 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.842 225859 DEBUG nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.907 225859 DEBUG nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.908 225859 DEBUG nova.network.neutron [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.928 225859 INFO nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:01:45 np0005588919 nova_compute[225855]: 2026-01-20 15:01:45.952 225859 DEBUG nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.087 225859 DEBUG nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.088 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.089 225859 INFO nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Creating image(s)#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.114 225859 DEBUG nova.storage.rbd_utils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.144 225859 DEBUG nova.storage.rbd_utils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.174 225859 DEBUG nova.storage.rbd_utils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.178 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.205 225859 DEBUG nova.network.neutron [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Successfully created port: 244332ba-1b58-4d42-98b0-245f9460c50f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.214 225859 DEBUG nova.policy [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1654794111844ca88666b3529173e9a7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3a1d679d5c954662a271e842fe2f2c05', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.241 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.242 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.243 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.243 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.298 225859 DEBUG nova.storage.rbd_utils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.302 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:46.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.754 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.821 225859 DEBUG os_brick.utils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.823 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.827 225859 DEBUG nova.storage.rbd_utils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] resizing rbd image 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.835 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.835 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[5e313ee8-c423-428e-8988-2ec8c1e58971]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.862 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.863 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.871 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.872 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[981660ef-7e3d-4cc2-bd78-766703893b6f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.874 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.884 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.885 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[09a1c65b-2645-4706-9cae-90bb459c0d67]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.886 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[26bb15ab-c521-4600-b569-a0cd2146e509]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.887 225859 DEBUG oslo_concurrency.processutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.919 225859 DEBUG oslo_concurrency.processutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "nvme version" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.921 225859 DEBUG os_brick.initiator.connectors.lightos [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.922 225859 DEBUG os_brick.initiator.connectors.lightos [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.922 225859 DEBUG os_brick.initiator.connectors.lightos [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.922 225859 DEBUG os_brick.utils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] <== get_connector_properties: return (100ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.923 225859 DEBUG nova.virt.block_device [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updating existing volume attachment record: 20658306-e0e7-4d9c-a904-24cfdd1b82ee _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.967 225859 DEBUG nova.objects.instance [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lazy-loading 'migration_context' on Instance uuid 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.980 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.980 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Ensure instance console log exists: /var/lib/nova/instances/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.981 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.981 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:46 np0005588919 nova_compute[225855]: 2026-01-20 15:01:46.982 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:47 np0005588919 nova_compute[225855]: 2026-01-20 15:01:47.327 225859 DEBUG nova.network.neutron [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Successfully created port: 6216baae-337d-44a3-aa38-60c2afb5d13f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:01:47 np0005588919 nova_compute[225855]: 2026-01-20 15:01:47.333 225859 DEBUG nova.network.neutron [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Successfully updated port: 244332ba-1b58-4d42-98b0-245f9460c50f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:01:47 np0005588919 nova_compute[225855]: 2026-01-20 15:01:47.354 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:01:47 np0005588919 nova_compute[225855]: 2026-01-20 15:01:47.354 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquired lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:01:47 np0005588919 nova_compute[225855]: 2026-01-20 15:01:47.355 225859 DEBUG nova.network.neutron [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:01:47 np0005588919 nova_compute[225855]: 2026-01-20 15:01:47.444 225859 DEBUG nova.compute.manager [req-caaf922f-48c2-4bec-a8fb-33fc70f25712 req-2f67e3bb-0a0e-44ee-b55e-0afae93f40cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-changed-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:47 np0005588919 nova_compute[225855]: 2026-01-20 15:01:47.445 225859 DEBUG nova.compute.manager [req-caaf922f-48c2-4bec-a8fb-33fc70f25712 req-2f67e3bb-0a0e-44ee-b55e-0afae93f40cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Refreshing instance network info cache due to event network-changed-244332ba-1b58-4d42-98b0-245f9460c50f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:01:47 np0005588919 nova_compute[225855]: 2026-01-20 15:01:47.445 225859 DEBUG oslo_concurrency.lockutils [req-caaf922f-48c2-4bec-a8fb-33fc70f25712 req-2f67e3bb-0a0e-44ee-b55e-0afae93f40cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:01:47 np0005588919 nova_compute[225855]: 2026-01-20 15:01:47.645 225859 DEBUG nova.network.neutron [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:01:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:47.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.033 225859 DEBUG nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.035 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.035 225859 INFO nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Creating image(s)#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.035 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.036 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Ensure instance console log exists: /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.036 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.036 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.037 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:01:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:48.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.776 225859 DEBUG nova.network.neutron [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Successfully updated port: 6216baae-337d-44a3-aa38-60c2afb5d13f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.803 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.804 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquired lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:01:48 np0005588919 nova_compute[225855]: 2026-01-20 15:01:48.804 225859 DEBUG nova.network.neutron [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.029 225859 DEBUG nova.compute.manager [req-9a937e69-f6fd-46bc-9bbf-4370a0ca8765 req-61a44dde-7864-4631-8912-f6885a52cef2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received event network-changed-6216baae-337d-44a3-aa38-60c2afb5d13f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.029 225859 DEBUG nova.compute.manager [req-9a937e69-f6fd-46bc-9bbf-4370a0ca8765 req-61a44dde-7864-4631-8912-f6885a52cef2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Refreshing instance network info cache due to event network-changed-6216baae-337d-44a3-aa38-60c2afb5d13f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.029 225859 DEBUG oslo_concurrency.lockutils [req-9a937e69-f6fd-46bc-9bbf-4370a0ca8765 req-61a44dde-7864-4631-8912-f6885a52cef2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.073 225859 DEBUG nova.network.neutron [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.218 225859 DEBUG nova.network.neutron [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updating instance_info_cache with network_info: [{"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.257 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Releasing lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.257 225859 DEBUG nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance network_info: |[{"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.258 225859 DEBUG oslo_concurrency.lockutils [req-caaf922f-48c2-4bec-a8fb-33fc70f25712 req-2f67e3bb-0a0e-44ee-b55e-0afae93f40cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.258 225859 DEBUG nova.network.neutron [req-caaf922f-48c2-4bec-a8fb-33fc70f25712 req-2f67e3bb-0a0e-44ee-b55e-0afae93f40cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Refreshing network info cache for port 244332ba-1b58-4d42-98b0-245f9460c50f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.262 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Start _get_guest_xml network_info=[{"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-41743468-7add-45cb-bc94-02eb6f850278', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '41743468-7add-45cb-bc94-02eb6f850278', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '474cec75-3b01-411a-9074-75859d2a9ddf', 'attached_at': '', 'detached_at': '', 'volume_id': '41743468-7add-45cb-bc94-02eb6f850278', 'serial': '41743468-7add-45cb-bc94-02eb6f850278'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '20658306-e0e7-4d9c-a904-24cfdd1b82ee', 'disk_bus': 'virtio', 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.274 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.279 225859 WARNING nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.283 225859 DEBUG nova.virt.libvirt.host [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.283 225859 DEBUG nova.virt.libvirt.host [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.291 225859 DEBUG nova.virt.libvirt.host [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.292 225859 DEBUG nova.virt.libvirt.host [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.293 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.293 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.293 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.294 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.294 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.294 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.294 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.294 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.295 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.295 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.295 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.295 225859 DEBUG nova.virt.hardware [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.324 225859 DEBUG nova.storage.rbd_utils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image 474cec75-3b01-411a-9074-75859d2a9ddf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.329 225859 DEBUG oslo_concurrency.processutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:01:49 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/193634755' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.760 225859 DEBUG oslo_concurrency.processutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.786 225859 DEBUG nova.virt.libvirt.vif [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:01:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-254746207',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-254746207',id=150,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63555e5851564db08c6429231d264f2c',ramdisk_id='',reservation_id='r-solng1yz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328',owner_user_name='tempest-ServerBootFromVolumeS
tableRescueTest-1871371328-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:01:45Z,user_data=None,user_id='2446e8399b344b29986c1aaf8bf73adf',uuid=474cec75-3b01-411a-9074-75859d2a9ddf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.787 225859 DEBUG nova.network.os_vif_util [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converting VIF {"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.788 225859 DEBUG nova.network.os_vif_util [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:36:24,bridge_name='br-int',has_traffic_filtering=True,id=244332ba-1b58-4d42-98b0-245f9460c50f,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap244332ba-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.789 225859 DEBUG nova.objects.instance [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'pci_devices' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.803 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <uuid>474cec75-3b01-411a-9074-75859d2a9ddf</uuid>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <name>instance-00000096</name>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-254746207</nova:name>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:01:49</nova:creationTime>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <nova:user uuid="2446e8399b344b29986c1aaf8bf73adf">tempest-ServerBootFromVolumeStableRescueTest-1871371328-project-member</nova:user>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <nova:project uuid="63555e5851564db08c6429231d264f2c">tempest-ServerBootFromVolumeStableRescueTest-1871371328</nova:project>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <nova:port uuid="244332ba-1b58-4d42-98b0-245f9460c50f">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <entry name="serial">474cec75-3b01-411a-9074-75859d2a9ddf</entry>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <entry name="uuid">474cec75-3b01-411a-9074-75859d2a9ddf</entry>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/474cec75-3b01-411a-9074-75859d2a9ddf_disk.config">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-41743468-7add-45cb-bc94-02eb6f850278">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <serial>41743468-7add-45cb-bc94-02eb6f850278</serial>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:6f:36:24"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <target dev="tap244332ba-1b"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/console.log" append="off"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:01:49 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:01:49 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:01:49 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:01:49 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.804 225859 DEBUG nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Preparing to wait for external event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.805 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.805 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.805 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.806 225859 DEBUG nova.virt.libvirt.vif [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:01:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-254746207',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-254746207',id=150,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63555e5851564db08c6429231d264f2c',ramdisk_id='',reservation_id='r-solng1yz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328',owner_user_name='tempest-ServerBootF
romVolumeStableRescueTest-1871371328-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:01:45Z,user_data=None,user_id='2446e8399b344b29986c1aaf8bf73adf',uuid=474cec75-3b01-411a-9074-75859d2a9ddf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.806 225859 DEBUG nova.network.os_vif_util [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converting VIF {"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.807 225859 DEBUG nova.network.os_vif_util [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:36:24,bridge_name='br-int',has_traffic_filtering=True,id=244332ba-1b58-4d42-98b0-245f9460c50f,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap244332ba-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.807 225859 DEBUG os_vif [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:36:24,bridge_name='br-int',has_traffic_filtering=True,id=244332ba-1b58-4d42-98b0-245f9460c50f,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap244332ba-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.808 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.808 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.809 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.811 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.812 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap244332ba-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.812 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap244332ba-1b, col_values=(('external_ids', {'iface-id': '244332ba-1b58-4d42-98b0-245f9460c50f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6f:36:24', 'vm-uuid': '474cec75-3b01-411a-9074-75859d2a9ddf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:49 np0005588919 NetworkManager[49104]: <info>  [1768921309.8149] manager: (tap244332ba-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.816 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.819 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.820 225859 INFO os_vif [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:36:24,bridge_name='br-int',has_traffic_filtering=True,id=244332ba-1b58-4d42-98b0-245f9460c50f,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap244332ba-1b')#033[00m
Jan 20 10:01:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:49.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.898 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.899 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.899 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No VIF found with MAC fa:16:3e:6f:36:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.899 225859 INFO nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Using config drive#033[00m
Jan 20 10:01:49 np0005588919 nova_compute[225855]: 2026-01-20 15:01:49.923 225859 DEBUG nova.storage.rbd_utils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image 474cec75-3b01-411a-9074-75859d2a9ddf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.366 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.367 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.367 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.367 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.368 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:50.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.629 225859 DEBUG nova.network.neutron [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updating instance_info_cache with network_info: [{"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.653 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Releasing lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.654 225859 DEBUG nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Instance network_info: |[{"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.654 225859 DEBUG oslo_concurrency.lockutils [req-9a937e69-f6fd-46bc-9bbf-4370a0ca8765 req-61a44dde-7864-4631-8912-f6885a52cef2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.655 225859 DEBUG nova.network.neutron [req-9a937e69-f6fd-46bc-9bbf-4370a0ca8765 req-61a44dde-7864-4631-8912-f6885a52cef2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Refreshing network info cache for port 6216baae-337d-44a3-aa38-60c2afb5d13f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.658 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Start _get_guest_xml network_info=[{"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.665 225859 WARNING nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.670 225859 DEBUG nova.virt.libvirt.host [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.671 225859 DEBUG nova.virt.libvirt.host [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.680 225859 DEBUG nova.virt.libvirt.host [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.680 225859 DEBUG nova.virt.libvirt.host [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.681 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.682 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.682 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.682 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.683 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.683 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.683 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.683 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.683 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.684 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.684 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.684 225859 DEBUG nova.virt.hardware [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.687 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:01:50 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3373454363' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.803 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.836 225859 INFO nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Creating config drive at /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.842 225859 DEBUG oslo_concurrency.processutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvpwqoz6p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.894 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000096 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.895 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000096 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:01:50 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.976 225859 DEBUG oslo_concurrency.processutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvpwqoz6p" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:50.999 225859 DEBUG nova.storage.rbd_utils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image 474cec75-3b01-411a-9074-75859d2a9ddf_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.002 225859 DEBUG oslo_concurrency.processutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config 474cec75-3b01-411a-9074-75859d2a9ddf_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.118 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.120 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4311MB free_disk=20.897380828857422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.120 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.121 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:01:51 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2575874763' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.145 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.168 225859 DEBUG nova.storage.rbd_utils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.172 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.197 225859 DEBUG oslo_concurrency.processutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config 474cec75-3b01-411a-9074-75859d2a9ddf_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.195s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.198 225859 INFO nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Deleting local config drive /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config because it was imported into RBD.#033[00m
Jan 20 10:01:51 np0005588919 NetworkManager[49104]: <info>  [1768921311.2472] manager: (tap244332ba-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/262)
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.247 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 474cec75-3b01-411a-9074-75859d2a9ddf actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.247 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.248 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.248 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:01:51 np0005588919 kernel: tap244332ba-1b: entered promiscuous mode
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.251 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:51 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:51Z|00611|binding|INFO|Claiming lport 244332ba-1b58-4d42-98b0-245f9460c50f for this chassis.
Jan 20 10:01:51 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:51Z|00612|binding|INFO|244332ba-1b58-4d42-98b0-245f9460c50f: Claiming fa:16:3e:6f:36:24 10.100.0.4
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.260 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.268 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:36:24 10.100.0.4'], port_security=['fa:16:3e:6f:36:24 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '474cec75-3b01-411a-9074-75859d2a9ddf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=244332ba-1b58-4d42-98b0-245f9460c50f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.269 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 244332ba-1b58-4d42-98b0-245f9460c50f in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec bound to our chassis#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.271 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec#033[00m
Jan 20 10:01:51 np0005588919 systemd-machined[194361]: New machine qemu-72-instance-00000096.
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.290 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c35fa35e-a517-484c-bc39-a153017c50c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.290 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap671e28d0-01 in ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.292 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap671e28d0-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.292 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d60af356-92cd-4c0f-a067-5257859fa9ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.293 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[85bb7cfd-85ec-44f4-a337-f3968852c99d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 systemd[1]: Started Virtual Machine qemu-72-instance-00000096.
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.305 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[aaaa6497-d0c9-4e02-8084-89dc648d7ea5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:51Z|00613|binding|INFO|Setting lport 244332ba-1b58-4d42-98b0-245f9460c50f ovn-installed in OVS
Jan 20 10:01:51 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:51Z|00614|binding|INFO|Setting lport 244332ba-1b58-4d42-98b0-245f9460c50f up in Southbound
Jan 20 10:01:51 np0005588919 systemd-udevd[285562]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.333 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6ee3a330-a29b-461f-8368-7c2e48a598a9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.336 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:51 np0005588919 NetworkManager[49104]: <info>  [1768921311.3492] device (tap244332ba-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:01:51 np0005588919 NetworkManager[49104]: <info>  [1768921311.3499] device (tap244332ba-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.376 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[72b66347-e393-42d0-8c90-fe7cf75d0aed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.376 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.382 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[19cb985a-867d-4650-88ee-2a13d9f3fe9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 NetworkManager[49104]: <info>  [1768921311.3832] manager: (tap671e28d0-00): new Veth device (/org/freedesktop/NetworkManager/Devices/263)
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.421 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e65acb-9db3-4634-850d-76b483aadeec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.424 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c01aab0c-7532-4d57-aadb-7e3e7123376c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 NetworkManager[49104]: <info>  [1768921311.4457] device (tap671e28d0-00): carrier: link connected
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.451 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6cba133a-e9c1-4e5f-9575-ca4478f82c15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.467 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ec9eb4c9-9ed4-4fb0-bca0-6206d4a45e03]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 173], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635037, 'reachable_time': 30729, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285595, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.481 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef09764-ac6e-4e3a-825f-2dfc24dd8625]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2b:4e69'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635037, 'tstamp': 635037}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285596, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.496 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0a047f33-f554-4486-a6c0-31ff79179bd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 173], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635037, 'reachable_time': 30729, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285598, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.524 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3a5f4f-6eb6-4479-8673-4e7bfbb287e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.529 225859 DEBUG nova.network.neutron [req-caaf922f-48c2-4bec-a8fb-33fc70f25712 req-2f67e3bb-0a0e-44ee-b55e-0afae93f40cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updated VIF entry in instance network info cache for port 244332ba-1b58-4d42-98b0-245f9460c50f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.531 225859 DEBUG nova.network.neutron [req-caaf922f-48c2-4bec-a8fb-33fc70f25712 req-2f67e3bb-0a0e-44ee-b55e-0afae93f40cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updating instance_info_cache with network_info: [{"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.561 225859 DEBUG oslo_concurrency.lockutils [req-caaf922f-48c2-4bec-a8fb-33fc70f25712 req-2f67e3bb-0a0e-44ee-b55e-0afae93f40cf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.587 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ce5ed1a5-5f5a-460b-9d86-384f6e5043a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.589 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.589 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.590 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671e28d0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:01:51 np0005588919 kernel: tap671e28d0-00: entered promiscuous mode
Jan 20 10:01:51 np0005588919 NetworkManager[49104]: <info>  [1768921311.6376] manager: (tap671e28d0-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/264)
Jan 20 10:01:51 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/120102328' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.637 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.640 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671e28d0-00, col_values=(('external_ids', {'iface-id': 'a8628d9e-196f-4b84-89fd-d3a41792b8a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.641 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:51 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:51Z|00615|binding|INFO|Releasing lport a8628d9e-196f-4b84-89fd-d3a41792b8a0 from this chassis (sb_readonly=0)
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.660 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.660 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.661 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3e5480-4f76-4bdd-a24f-d53ad9a2b5f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.662 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.pid.haproxy
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:01:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:51.663 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'env', 'PROCESS_TAG=haproxy-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.672 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.674 225859 DEBUG nova.virt.libvirt.vif [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:01:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-2070424486',display_name='tempest-TestSnapshotPattern-server-2070424486',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-2070424486',id=151,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHt2Pjp5fO1h9ikmCXDj2fSFlpzjIfjh7jCgXMa0An0AiWgQhFRQBExuSvqHDwsNMcN7FUPQzPGoYvUkqz0I21jbk9kMja07pP6W664P26WxVinBA8YoIkVl5tlHownM8g==',key_name='tempest-TestSnapshotPattern-503298877',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a1d679d5c954662a271e842fe2f2c05',ramdisk_id='',reservation_id='r-4u8oxks9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1341092631',owner_user_name='tempest-TestSnapshotPattern-1341092631-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:01:45Z,user_data=None,user_id='1654794111844ca88666b3529173e9a7',uuid=2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.674 225859 DEBUG nova.network.os_vif_util [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converting VIF {"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.675 225859 DEBUG nova.network.os_vif_util [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:b9:ea,bridge_name='br-int',has_traffic_filtering=True,id=6216baae-337d-44a3-aa38-60c2afb5d13f,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6216baae-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.677 225859 DEBUG nova.objects.instance [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.701 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <uuid>2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1</uuid>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <name>instance-00000097</name>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestSnapshotPattern-server-2070424486</nova:name>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:01:50</nova:creationTime>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <nova:user uuid="1654794111844ca88666b3529173e9a7">tempest-TestSnapshotPattern-1341092631-project-member</nova:user>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <nova:project uuid="3a1d679d5c954662a271e842fe2f2c05">tempest-TestSnapshotPattern-1341092631</nova:project>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <nova:port uuid="6216baae-337d-44a3-aa38-60c2afb5d13f">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <entry name="serial">2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1</entry>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <entry name="uuid">2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1</entry>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk.config">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:87:b9:ea"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <target dev="tap6216baae-33"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1/console.log" append="off"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:01:51 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:01:51 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:01:51 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:01:51 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.708 225859 DEBUG nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Preparing to wait for external event network-vif-plugged-6216baae-337d-44a3-aa38-60c2afb5d13f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.709 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.709 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.709 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.710 225859 DEBUG nova.virt.libvirt.vif [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:01:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-2070424486',display_name='tempest-TestSnapshotPattern-server-2070424486',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-2070424486',id=151,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHt2Pjp5fO1h9ikmCXDj2fSFlpzjIfjh7jCgXMa0An0AiWgQhFRQBExuSvqHDwsNMcN7FUPQzPGoYvUkqz0I21jbk9kMja07pP6W664P26WxVinBA8YoIkVl5tlHownM8g==',key_name='tempest-TestSnapshotPattern-503298877',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a1d679d5c954662a271e842fe2f2c05',ramdisk_id='',reservation_id='r-4u8oxks9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1341092631',owner_user_name='tempest-TestSnapshotPattern-1341092631-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:01:45Z,user_data=None,user_id='1654794111844ca88666b3529173e9a7',uuid=2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.711 225859 DEBUG nova.network.os_vif_util [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converting VIF {"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.712 225859 DEBUG nova.network.os_vif_util [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:b9:ea,bridge_name='br-int',has_traffic_filtering=True,id=6216baae-337d-44a3-aa38-60c2afb5d13f,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6216baae-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.712 225859 DEBUG os_vif [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:b9:ea,bridge_name='br-int',has_traffic_filtering=True,id=6216baae-337d-44a3-aa38-60c2afb5d13f,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6216baae-33') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.713 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.713 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.714 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.716 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.717 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6216baae-33, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.717 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6216baae-33, col_values=(('external_ids', {'iface-id': '6216baae-337d-44a3-aa38-60c2afb5d13f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:b9:ea', 'vm-uuid': '2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.718 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:51 np0005588919 NetworkManager[49104]: <info>  [1768921311.7197] manager: (tap6216baae-33): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.721 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.723 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.724 225859 INFO os_vif [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:b9:ea,bridge_name='br-int',has_traffic_filtering=True,id=6216baae-337d-44a3-aa38-60c2afb5d13f,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6216baae-33')
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.751 225859 DEBUG nova.compute.manager [req-7c410de6-4154-4108-a488-cda6e89a65b3 req-cfec6579-19b4-4958-b792-98efc3293a0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.753 225859 DEBUG oslo_concurrency.lockutils [req-7c410de6-4154-4108-a488-cda6e89a65b3 req-cfec6579-19b4-4958-b792-98efc3293a0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.753 225859 DEBUG oslo_concurrency.lockutils [req-7c410de6-4154-4108-a488-cda6e89a65b3 req-cfec6579-19b4-4958-b792-98efc3293a0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.753 225859 DEBUG oslo_concurrency.lockutils [req-7c410de6-4154-4108-a488-cda6e89a65b3 req-cfec6579-19b4-4958-b792-98efc3293a0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.754 225859 DEBUG nova.compute.manager [req-7c410de6-4154-4108-a488-cda6e89a65b3 req-cfec6579-19b4-4958-b792-98efc3293a0b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Processing event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.818 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.826 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:01:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:51.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.859 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.867 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.873 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.873 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.874 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] No VIF found with MAC fa:16:3e:87:b9:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.874 225859 INFO nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Using config drive
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.908 225859 DEBUG nova.storage.rbd_utils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.916 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 10:01:51 np0005588919 nova_compute[225855]: 2026-01-20 15:01:51.916 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:01:52 np0005588919 podman[285672]: 2026-01-20 15:01:52.041055805 +0000 UTC m=+0.051337249 container create c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:01:52 np0005588919 systemd[1]: Started libpod-conmon-c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84.scope.
Jan 20 10:01:52 np0005588919 podman[285672]: 2026-01-20 15:01:52.015733101 +0000 UTC m=+0.026014575 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:01:52 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:01:52 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9583a611064af7f6e72be93b7a47c8363e51f876af054788b7c0ed954b9e3b3d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:01:52 np0005588919 podman[285672]: 2026-01-20 15:01:52.135674755 +0000 UTC m=+0.145956209 container init c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:01:52 np0005588919 podman[285672]: 2026-01-20 15:01:52.141248412 +0000 UTC m=+0.151529866 container start c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:01:52 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[285687]: [NOTICE]   (285691) : New worker (285693) forked
Jan 20 10:01:52 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[285687]: [NOTICE]   (285691) : Loading success.
Jan 20 10:01:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:52.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.646 225859 DEBUG nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.648 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921312.646165, 474cec75-3b01-411a-9074-75859d2a9ddf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.648 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] VM Started (Lifecycle Event)
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.652 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.655 225859 INFO nova.virt.libvirt.driver [-] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance spawned successfully.
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.655 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.675 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.681 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.684 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.684 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.685 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.685 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.686 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.686 225859 DEBUG nova.virt.libvirt.driver [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.714 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.715 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921312.647177, 474cec75-3b01-411a-9074-75859d2a9ddf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.715 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] VM Paused (Lifecycle Event)
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.728 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:01:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:52.729 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:01:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:52.730 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.764 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.768 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921312.6512923, 474cec75-3b01-411a-9074-75859d2a9ddf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.768 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] VM Resumed (Lifecycle Event)
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.772 225859 INFO nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Took 4.74 seconds to spawn the instance on the hypervisor.
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.772 225859 DEBUG nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.825 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.828 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.841 225859 INFO nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Creating config drive at /var/lib/nova/instances/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1/disk.config
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.847 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp28zsa7s7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.882 225859 INFO nova.compute.manager [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Took 8.43 seconds to build instance.
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.908 225859 DEBUG nova.network.neutron [req-9a937e69-f6fd-46bc-9bbf-4370a0ca8765 req-61a44dde-7864-4631-8912-f6885a52cef2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updated VIF entry in instance network info cache for port 6216baae-337d-44a3-aa38-60c2afb5d13f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.909 225859 DEBUG nova.network.neutron [req-9a937e69-f6fd-46bc-9bbf-4370a0ca8765 req-61a44dde-7864-4631-8912-f6885a52cef2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updating instance_info_cache with network_info: [{"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.917 225859 DEBUG oslo_concurrency.lockutils [None req-de5e8f40-ac38-43be-94d3-f6d29767939e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.928 225859 DEBUG oslo_concurrency.lockutils [req-9a937e69-f6fd-46bc-9bbf-4370a0ca8765 req-61a44dde-7864-4631-8912-f6885a52cef2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:01:52 np0005588919 nova_compute[225855]: 2026-01-20 15:01:52.979 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp28zsa7s7" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:01:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.005 225859 DEBUG nova.storage.rbd_utils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] rbd image 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.011 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1/disk.config 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:01:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:01:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2483513834' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:01:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:01:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2483513834' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.371 225859 DEBUG oslo_concurrency.processutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1/disk.config 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.359s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.372 225859 INFO nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Deleting local config drive /var/lib/nova/instances/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1/disk.config because it was imported into RBD.#033[00m
Jan 20 10:01:53 np0005588919 kernel: tap6216baae-33: entered promiscuous mode
Jan 20 10:01:53 np0005588919 NetworkManager[49104]: <info>  [1768921313.4277] manager: (tap6216baae-33): new Tun device (/org/freedesktop/NetworkManager/Devices/266)
Jan 20 10:01:53 np0005588919 systemd-udevd[285590]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:01:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:53Z|00616|binding|INFO|Claiming lport 6216baae-337d-44a3-aa38-60c2afb5d13f for this chassis.
Jan 20 10:01:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:53Z|00617|binding|INFO|6216baae-337d-44a3-aa38-60c2afb5d13f: Claiming fa:16:3e:87:b9:ea 10.100.0.12
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.431 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.439 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:b9:ea 10.100.0.12'], port_security=['fa:16:3e:87:b9:ea 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a1d679d5c954662a271e842fe2f2c05', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f11f0ae2-6b78-4d57-a9ea-5a7c52439262', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=773a665f-440e-445e-8ca6-20a8b67e017a, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6216baae-337d-44a3-aa38-60c2afb5d13f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.440 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6216baae-337d-44a3-aa38-60c2afb5d13f in datapath 43d3be8f-9be1-4892-bbfe-d0ba2d7157ad bound to our chassis#033[00m
Jan 20 10:01:53 np0005588919 NetworkManager[49104]: <info>  [1768921313.4420] device (tap6216baae-33): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:01:53 np0005588919 NetworkManager[49104]: <info>  [1768921313.4437] device (tap6216baae-33): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.442 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 43d3be8f-9be1-4892-bbfe-d0ba2d7157ad#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.454 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[62dc41d6-e889-498a-ac25-a0c5aedb243c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.454 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap43d3be8f-91 in ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.456 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap43d3be8f-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.456 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff27907-c7b1-49ae-9871-a0e06379458f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.457 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[30c16653-92f6-454a-bb43-1e9306e68414]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 systemd-machined[194361]: New machine qemu-73-instance-00000097.
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.469 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[ccbf268c-2b3b-4387-a301-9ce44c22c0a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 systemd[1]: Started Virtual Machine qemu-73-instance-00000097.
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.492 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cda80882-3d91-4356-b3e7-596bcb6a6bd7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.502 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:53 np0005588919 podman[285785]: 2026-01-20 15:01:53.505610173 +0000 UTC m=+0.100210888 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Jan 20 10:01:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:53Z|00618|binding|INFO|Setting lport 6216baae-337d-44a3-aa38-60c2afb5d13f ovn-installed in OVS
Jan 20 10:01:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:53Z|00619|binding|INFO|Setting lport 6216baae-337d-44a3-aa38-60c2afb5d13f up in Southbound
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.507 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.521 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[333e9fa2-a0bd-4deb-9f9c-e8d1da93c7e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.527 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f616764e-ce11-4c90-8916-30011c814a9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 NetworkManager[49104]: <info>  [1768921313.5281] manager: (tap43d3be8f-90): new Veth device (/org/freedesktop/NetworkManager/Devices/267)
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.553 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a931b3b2-d0af-48ed-aa37-d60b1dd822da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.556 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a6c064ae-e4ef-4791-b977-f776965bda85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 NetworkManager[49104]: <info>  [1768921313.5774] device (tap43d3be8f-90): carrier: link connected
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.582 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6bb4ae05-029a-4287-be41-7ba79d3f0ef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.598 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b9707558-b6ab-4857-b8dc-191ba160e05d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap43d3be8f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:0f:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635250, 'reachable_time': 42674, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285839, 'error': None, 'target': 'ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.613 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b2dce26e-757e-4827-9cc0-effe7210a464]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0d:f60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635250, 'tstamp': 635250}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285840, 'error': None, 'target': 'ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.633 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e2310d5a-4b73-42b7-821b-67a07e4900f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap43d3be8f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:0f:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635250, 'reachable_time': 42674, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285841, 'error': None, 'target': 'ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.667 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3676eb-fc98-41f8-b886-db4858a3f3c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.726 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6b22ca45-0f00-48ba-86de-edb90bf97fa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.727 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43d3be8f-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.727 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.728 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43d3be8f-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:53 np0005588919 NetworkManager[49104]: <info>  [1768921313.7317] manager: (tap43d3be8f-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Jan 20 10:01:53 np0005588919 kernel: tap43d3be8f-90: entered promiscuous mode
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.732 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.733 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap43d3be8f-90, col_values=(('external_ids', {'iface-id': '32afa112-2ec4-4d59-b6eb-a77db2858bd4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:01:53Z|00620|binding|INFO|Releasing lport 32afa112-2ec4-4d59-b6eb-a77db2858bd4 from this chassis (sb_readonly=0)
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.749 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.751 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/43d3be8f-9be1-4892-bbfe-d0ba2d7157ad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/43d3be8f-9be1-4892-bbfe-d0ba2d7157ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.752 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f07ac62f-6b1e-4e0f-873f-d94a7414a6e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.753 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/43d3be8f-9be1-4892-bbfe-d0ba2d7157ad.pid.haproxy
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 43d3be8f-9be1-4892-bbfe-d0ba2d7157ad
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:01:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:53.755 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'env', 'PROCESS_TAG=haproxy-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/43d3be8f-9be1-4892-bbfe-d0ba2d7157ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:01:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:53.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.856 225859 DEBUG nova.compute.manager [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.857 225859 DEBUG oslo_concurrency.lockutils [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.857 225859 DEBUG oslo_concurrency.lockutils [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.857 225859 DEBUG oslo_concurrency.lockutils [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.857 225859 DEBUG nova.compute.manager [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.858 225859 WARNING nova.compute.manager [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received unexpected event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with vm_state active and task_state None.#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.858 225859 DEBUG nova.compute.manager [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received event network-vif-plugged-6216baae-337d-44a3-aa38-60c2afb5d13f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.858 225859 DEBUG oslo_concurrency.lockutils [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.858 225859 DEBUG oslo_concurrency.lockutils [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.859 225859 DEBUG oslo_concurrency.lockutils [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:53 np0005588919 nova_compute[225855]: 2026-01-20 15:01:53.859 225859 DEBUG nova.compute.manager [req-d0939939-3372-4503-a492-d19bf543f0f5 req-9e42d97d-b2bc-49f3-948f-0ff2ef99cd8d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Processing event network-vif-plugged-6216baae-337d-44a3-aa38-60c2afb5d13f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:01:54 np0005588919 podman[285874]: 2026-01-20 15:01:54.11805894 +0000 UTC m=+0.049114656 container create 858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:01:54 np0005588919 systemd[1]: Started libpod-conmon-858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70.scope.
Jan 20 10:01:54 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:01:54 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a768a9de3a5bcf02ee70e03b9ac0d04f5f61e6aa1b57b9c32ed35cc799999e46/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:01:54 np0005588919 podman[285874]: 2026-01-20 15:01:54.09392861 +0000 UTC m=+0.024984346 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:01:54 np0005588919 podman[285874]: 2026-01-20 15:01:54.203942433 +0000 UTC m=+0.134998169 container init 858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 20 10:01:54 np0005588919 podman[285874]: 2026-01-20 15:01:54.211583619 +0000 UTC m=+0.142639345 container start 858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:01:54 np0005588919 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[285889]: [NOTICE]   (285893) : New worker (285895) forked
Jan 20 10:01:54 np0005588919 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[285889]: [NOTICE]   (285893) : Loading success.
Jan 20 10:01:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:54.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.573 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921314.5728111, 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.575 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] VM Started (Lifecycle Event)#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.578 225859 DEBUG nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.584 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.588 225859 INFO nova.virt.libvirt.driver [-] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Instance spawned successfully.#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.589 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.599 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.605 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.610 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.610 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.611 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.611 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.611 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.612 225859 DEBUG nova.virt.libvirt.driver [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.627 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.627 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921314.5729702, 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.627 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.655 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.659 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921314.5828328, 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.659 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.677 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.680 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.693 225859 INFO nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Took 8.61 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.693 225859 DEBUG nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.722 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.785 225859 INFO nova.compute.manager [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Took 9.84 seconds to build instance.#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.813 225859 DEBUG oslo_concurrency.lockutils [None req-a557ce2c-7da9-48a8-8108-5744558daadb 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:54 np0005588919 nova_compute[225855]: 2026-01-20 15:01:54.911 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:55.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:01:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:01:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:56.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.720 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.778 225859 DEBUG nova.compute.manager [req-a2d79d48-0e15-4bd9-b3b1-3e6e70fbfb25 req-87cc4941-192f-4471-b3cc-949d4b17a459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received event network-vif-plugged-6216baae-337d-44a3-aa38-60c2afb5d13f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.778 225859 DEBUG oslo_concurrency.lockutils [req-a2d79d48-0e15-4bd9-b3b1-3e6e70fbfb25 req-87cc4941-192f-4471-b3cc-949d4b17a459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.779 225859 DEBUG oslo_concurrency.lockutils [req-a2d79d48-0e15-4bd9-b3b1-3e6e70fbfb25 req-87cc4941-192f-4471-b3cc-949d4b17a459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.779 225859 DEBUG oslo_concurrency.lockutils [req-a2d79d48-0e15-4bd9-b3b1-3e6e70fbfb25 req-87cc4941-192f-4471-b3cc-949d4b17a459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.779 225859 DEBUG nova.compute.manager [req-a2d79d48-0e15-4bd9-b3b1-3e6e70fbfb25 req-87cc4941-192f-4471-b3cc-949d4b17a459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] No waiting events found dispatching network-vif-plugged-6216baae-337d-44a3-aa38-60c2afb5d13f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.780 225859 WARNING nova.compute.manager [req-a2d79d48-0e15-4bd9-b3b1-3e6e70fbfb25 req-87cc4941-192f-4471-b3cc-949d4b17a459 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received unexpected event network-vif-plugged-6216baae-337d-44a3-aa38-60c2afb5d13f for instance with vm_state active and task_state None.#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.856 225859 INFO nova.compute.manager [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Rescuing#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.857 225859 DEBUG oslo_concurrency.lockutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.858 225859 DEBUG oslo_concurrency.lockutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquired lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.858 225859 DEBUG nova.network.neutron [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:01:56 np0005588919 nova_compute[225855]: 2026-01-20 15:01:56.867 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:01:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:57.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:01:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:01:58.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:01:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:01:58.732 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:01:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:01:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:01:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:01:59.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:00 np0005588919 nova_compute[225855]: 2026-01-20 15:02:00.041 225859 DEBUG nova.network.neutron [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updating instance_info_cache with network_info: [{"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:00 np0005588919 nova_compute[225855]: 2026-01-20 15:02:00.063 225859 DEBUG oslo_concurrency.lockutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Releasing lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:00.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:00 np0005588919 nova_compute[225855]: 2026-01-20 15:02:00.418 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:02:01 np0005588919 nova_compute[225855]: 2026-01-20 15:02:01.769 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:01 np0005588919 nova_compute[225855]: 2026-01-20 15:02:01.772 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:01 np0005588919 NetworkManager[49104]: <info>  [1768921321.7731] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Jan 20 10:02:01 np0005588919 NetworkManager[49104]: <info>  [1768921321.7740] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Jan 20 10:02:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:02:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:01.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:02:01 np0005588919 nova_compute[225855]: 2026-01-20 15:02:01.921 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:01 np0005588919 nova_compute[225855]: 2026-01-20 15:02:01.923 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:01 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:01Z|00621|binding|INFO|Releasing lport 32afa112-2ec4-4d59-b6eb-a77db2858bd4 from this chassis (sb_readonly=0)
Jan 20 10:02:01 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:01Z|00622|binding|INFO|Releasing lport a8628d9e-196f-4b84-89fd-d3a41792b8a0 from this chassis (sb_readonly=0)
Jan 20 10:02:01 np0005588919 nova_compute[225855]: 2026-01-20 15:02:01.947 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:02 np0005588919 nova_compute[225855]: 2026-01-20 15:02:02.137 225859 DEBUG nova.compute.manager [req-313b9c23-99a1-49e6-8300-b14e5e036c76 req-8f48e61e-58ab-4153-b2e8-67936b63ae4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received event network-changed-6216baae-337d-44a3-aa38-60c2afb5d13f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:02 np0005588919 nova_compute[225855]: 2026-01-20 15:02:02.137 225859 DEBUG nova.compute.manager [req-313b9c23-99a1-49e6-8300-b14e5e036c76 req-8f48e61e-58ab-4153-b2e8-67936b63ae4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Refreshing instance network info cache due to event network-changed-6216baae-337d-44a3-aa38-60c2afb5d13f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:02:02 np0005588919 nova_compute[225855]: 2026-01-20 15:02:02.137 225859 DEBUG oslo_concurrency.lockutils [req-313b9c23-99a1-49e6-8300-b14e5e036c76 req-8f48e61e-58ab-4153-b2e8-67936b63ae4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:02 np0005588919 nova_compute[225855]: 2026-01-20 15:02:02.137 225859 DEBUG oslo_concurrency.lockutils [req-313b9c23-99a1-49e6-8300-b14e5e036c76 req-8f48e61e-58ab-4153-b2e8-67936b63ae4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:02 np0005588919 nova_compute[225855]: 2026-01-20 15:02:02.138 225859 DEBUG nova.network.neutron [req-313b9c23-99a1-49e6-8300-b14e5e036c76 req-8f48e61e-58ab-4153-b2e8-67936b63ae4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Refreshing network info cache for port 6216baae-337d-44a3-aa38-60c2afb5d13f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:02:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:02.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:03.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:04.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:04 np0005588919 nova_compute[225855]: 2026-01-20 15:02:04.515 225859 DEBUG nova.network.neutron [req-313b9c23-99a1-49e6-8300-b14e5e036c76 req-8f48e61e-58ab-4153-b2e8-67936b63ae4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updated VIF entry in instance network info cache for port 6216baae-337d-44a3-aa38-60c2afb5d13f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:02:04 np0005588919 nova_compute[225855]: 2026-01-20 15:02:04.516 225859 DEBUG nova.network.neutron [req-313b9c23-99a1-49e6-8300-b14e5e036c76 req-8f48e61e-58ab-4153-b2e8-67936b63ae4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updating instance_info_cache with network_info: [{"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:04Z|00623|binding|INFO|Releasing lport 32afa112-2ec4-4d59-b6eb-a77db2858bd4 from this chassis (sb_readonly=0)
Jan 20 10:02:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:04Z|00624|binding|INFO|Releasing lport a8628d9e-196f-4b84-89fd-d3a41792b8a0 from this chassis (sb_readonly=0)
Jan 20 10:02:04 np0005588919 nova_compute[225855]: 2026-01-20 15:02:04.565 225859 DEBUG oslo_concurrency.lockutils [req-313b9c23-99a1-49e6-8300-b14e5e036c76 req-8f48e61e-58ab-4153-b2e8-67936b63ae4c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:04 np0005588919 nova_compute[225855]: 2026-01-20 15:02:04.567 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:05.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:06 np0005588919 podman[286004]: 2026-01-20 15:02:06.016702402 +0000 UTC m=+0.053405957 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 20 10:02:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:02:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:06.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:02:06 np0005588919 nova_compute[225855]: 2026-01-20 15:02:06.774 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:06 np0005588919 nova_compute[225855]: 2026-01-20 15:02:06.925 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:06 np0005588919 nova_compute[225855]: 2026-01-20 15:02:06.992 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:06 np0005588919 nova_compute[225855]: 2026-01-20 15:02:06.992 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.009 225859 DEBUG nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.072 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.073 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.087 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.088 225859 INFO nova.compute.claims [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.237 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:02:07 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2601699422' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.670 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.676 225859 DEBUG nova.compute.provider_tree [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.695 225859 DEBUG nova.scheduler.client.report [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.725 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.726 225859 DEBUG nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.790 225859 DEBUG nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.790 225859 DEBUG nova.network.neutron [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.820 225859 INFO nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.841 225859 DEBUG nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:02:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:07.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.929 225859 DEBUG nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.930 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.931 225859 INFO nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Creating image(s)#033[00m
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.960 225859 DEBUG nova.storage.rbd_utils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:07 np0005588919 nova_compute[225855]: 2026-01-20 15:02:07.988 225859 DEBUG nova.storage.rbd_utils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.013 225859 DEBUG nova.storage.rbd_utils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.017 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.089 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.091 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.092 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.092 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.122 225859 DEBUG nova.storage.rbd_utils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.126 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:08.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.697 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.571s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.760 225859 DEBUG nova.storage.rbd_utils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] resizing rbd image 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.864 225859 DEBUG nova.objects.instance [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lazy-loading 'migration_context' on Instance uuid 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.879 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.880 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Ensure instance console log exists: /var/lib/nova/instances/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.880 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.881 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:08 np0005588919 nova_compute[225855]: 2026-01-20 15:02:08.881 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:09 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:09Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:87:b9:ea 10.100.0.12
Jan 20 10:02:09 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:09Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:87:b9:ea 10.100.0.12
Jan 20 10:02:09 np0005588919 nova_compute[225855]: 2026-01-20 15:02:09.715 225859 DEBUG nova.network.neutron [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Successfully created port: d9897519-3517-45da-be53-d342192fa380 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:02:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:09.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:10.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:10 np0005588919 nova_compute[225855]: 2026-01-20 15:02:10.462 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:02:10 np0005588919 nova_compute[225855]: 2026-01-20 15:02:10.723 225859 DEBUG nova.network.neutron [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Successfully updated port: d9897519-3517-45da-be53-d342192fa380 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:02:10 np0005588919 nova_compute[225855]: 2026-01-20 15:02:10.742 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "refresh_cache-0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:10 np0005588919 nova_compute[225855]: 2026-01-20 15:02:10.743 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquired lock "refresh_cache-0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:10 np0005588919 nova_compute[225855]: 2026-01-20 15:02:10.743 225859 DEBUG nova.network.neutron [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:02:10 np0005588919 nova_compute[225855]: 2026-01-20 15:02:10.871 225859 DEBUG nova.compute.manager [req-4004eee4-5a31-423e-8a25-56d4197964f0 req-8aacd456-145e-49ed-9d29-a40ed8f1ce23 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Received event network-changed-d9897519-3517-45da-be53-d342192fa380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:10 np0005588919 nova_compute[225855]: 2026-01-20 15:02:10.871 225859 DEBUG nova.compute.manager [req-4004eee4-5a31-423e-8a25-56d4197964f0 req-8aacd456-145e-49ed-9d29-a40ed8f1ce23 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Refreshing instance network info cache due to event network-changed-d9897519-3517-45da-be53-d342192fa380. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:02:10 np0005588919 nova_compute[225855]: 2026-01-20 15:02:10.872 225859 DEBUG oslo_concurrency.lockutils [req-4004eee4-5a31-423e-8a25-56d4197964f0 req-8aacd456-145e-49ed-9d29-a40ed8f1ce23 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:11 np0005588919 nova_compute[225855]: 2026-01-20 15:02:11.572 225859 DEBUG nova.network.neutron [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:02:11 np0005588919 nova_compute[225855]: 2026-01-20 15:02:11.778 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:11.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:11 np0005588919 nova_compute[225855]: 2026-01-20 15:02:11.982 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:12.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.379 225859 DEBUG nova.network.neutron [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Updating instance_info_cache with network_info: [{"id": "d9897519-3517-45da-be53-d342192fa380", "address": "fa:16:3e:7a:5f:fd", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9897519-35", "ovs_interfaceid": "d9897519-3517-45da-be53-d342192fa380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.411 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Releasing lock "refresh_cache-0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.411 225859 DEBUG nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Instance network_info: |[{"id": "d9897519-3517-45da-be53-d342192fa380", "address": "fa:16:3e:7a:5f:fd", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9897519-35", "ovs_interfaceid": "d9897519-3517-45da-be53-d342192fa380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.412 225859 DEBUG oslo_concurrency.lockutils [req-4004eee4-5a31-423e-8a25-56d4197964f0 req-8aacd456-145e-49ed-9d29-a40ed8f1ce23 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.412 225859 DEBUG nova.network.neutron [req-4004eee4-5a31-423e-8a25-56d4197964f0 req-8aacd456-145e-49ed-9d29-a40ed8f1ce23 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Refreshing network info cache for port d9897519-3517-45da-be53-d342192fa380 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.415 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Start _get_guest_xml network_info=[{"id": "d9897519-3517-45da-be53-d342192fa380", "address": "fa:16:3e:7a:5f:fd", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9897519-35", "ovs_interfaceid": "d9897519-3517-45da-be53-d342192fa380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.420 225859 WARNING nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.426 225859 DEBUG nova.virt.libvirt.host [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.427 225859 DEBUG nova.virt.libvirt.host [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.430 225859 DEBUG nova.virt.libvirt.host [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.431 225859 DEBUG nova.virt.libvirt.host [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.432 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.432 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.432 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.433 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.433 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.433 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.433 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.434 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.434 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.434 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.434 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.435 225859 DEBUG nova.virt.hardware [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.437 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:13.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:02:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3696145664' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.959 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:13 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.995 225859 DEBUG nova.storage.rbd_utils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:13.999 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:14.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:02:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/295816442' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.437 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.439 225859 DEBUG nova.virt.libvirt.vif [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:02:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1200545801',display_name='tempest-TestServerMultinode-server-1200545801',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1200545801',id=154,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='654b3ce7b3644fc58f8dc9f60529320b',ramdisk_id='',reservation_id='r-nu0b3t9m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1071973011',owner_user_name='tempest-TestServerMultinode-
1071973011-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:02:07Z,user_data=None,user_id='158563a99d4a420890aaa00b05c8bb57',uuid=0a74eb9c-7f01-437d-a0c8-c01696fc8f9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d9897519-3517-45da-be53-d342192fa380", "address": "fa:16:3e:7a:5f:fd", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9897519-35", "ovs_interfaceid": "d9897519-3517-45da-be53-d342192fa380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.439 225859 DEBUG nova.network.os_vif_util [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converting VIF {"id": "d9897519-3517-45da-be53-d342192fa380", "address": "fa:16:3e:7a:5f:fd", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9897519-35", "ovs_interfaceid": "d9897519-3517-45da-be53-d342192fa380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.440 225859 DEBUG nova.network.os_vif_util [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5f:fd,bridge_name='br-int',has_traffic_filtering=True,id=d9897519-3517-45da-be53-d342192fa380,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9897519-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.441 225859 DEBUG nova.objects.instance [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lazy-loading 'pci_devices' on Instance uuid 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.460 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <uuid>0a74eb9c-7f01-437d-a0c8-c01696fc8f9d</uuid>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <name>instance-0000009a</name>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestServerMultinode-server-1200545801</nova:name>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:02:13</nova:creationTime>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <nova:user uuid="158563a99d4a420890aaa00b05c8bb57">tempest-TestServerMultinode-1071973011-project-admin</nova:user>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <nova:project uuid="654b3ce7b3644fc58f8dc9f60529320b">tempest-TestServerMultinode-1071973011</nova:project>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <nova:port uuid="d9897519-3517-45da-be53-d342192fa380">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <entry name="serial">0a74eb9c-7f01-437d-a0c8-c01696fc8f9d</entry>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <entry name="uuid">0a74eb9c-7f01-437d-a0c8-c01696fc8f9d</entry>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk.config">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:7a:5f:fd"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <target dev="tapd9897519-35"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d/console.log" append="off"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:02:14 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:02:14 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:02:14 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:02:14 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.460 225859 DEBUG nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Preparing to wait for external event network-vif-plugged-d9897519-3517-45da-be53-d342192fa380 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.461 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.461 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.461 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.462 225859 DEBUG nova.virt.libvirt.vif [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:02:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1200545801',display_name='tempest-TestServerMultinode-server-1200545801',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1200545801',id=154,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='654b3ce7b3644fc58f8dc9f60529320b',ramdisk_id='',reservation_id='r-nu0b3t9m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1071973011',owner_user_name='tempest-TestServerMultinode-1071973011-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:02:07Z,user_data=None,user_id='158563a99d4a420890aaa00b05c8bb57',uuid=0a74eb9c-7f01-437d-a0c8-c01696fc8f9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d9897519-3517-45da-be53-d342192fa380", "address": "fa:16:3e:7a:5f:fd", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9897519-35", "ovs_interfaceid": "d9897519-3517-45da-be53-d342192fa380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.462 225859 DEBUG nova.network.os_vif_util [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converting VIF {"id": "d9897519-3517-45da-be53-d342192fa380", "address": "fa:16:3e:7a:5f:fd", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9897519-35", "ovs_interfaceid": "d9897519-3517-45da-be53-d342192fa380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.463 225859 DEBUG nova.network.os_vif_util [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5f:fd,bridge_name='br-int',has_traffic_filtering=True,id=d9897519-3517-45da-be53-d342192fa380,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9897519-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.463 225859 DEBUG os_vif [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5f:fd,bridge_name='br-int',has_traffic_filtering=True,id=d9897519-3517-45da-be53-d342192fa380,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9897519-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.467 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.467 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.467 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.471 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.471 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9897519-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.472 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd9897519-35, col_values=(('external_ids', {'iface-id': 'd9897519-3517-45da-be53-d342192fa380', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:5f:fd', 'vm-uuid': '0a74eb9c-7f01-437d-a0c8-c01696fc8f9d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.473 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:14 np0005588919 NetworkManager[49104]: <info>  [1768921334.4747] manager: (tapd9897519-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.475 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.481 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.483 225859 INFO os_vif [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5f:fd,bridge_name='br-int',has_traffic_filtering=True,id=d9897519-3517-45da-be53-d342192fa380,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9897519-35')#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.603 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.603 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.604 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] No VIF found with MAC fa:16:3e:7a:5f:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.604 225859 INFO nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Using config drive#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.636 225859 DEBUG nova.storage.rbd_utils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:14 np0005588919 podman[286450]: 2026-01-20 15:02:14.638047526 +0000 UTC m=+0.085607856 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.739 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:14 np0005588919 podman[286450]: 2026-01-20 15:02:14.754346077 +0000 UTC m=+0.201906377 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.868 225859 DEBUG nova.network.neutron [req-4004eee4-5a31-423e-8a25-56d4197964f0 req-8aacd456-145e-49ed-9d29-a40ed8f1ce23 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Updated VIF entry in instance network info cache for port d9897519-3517-45da-be53-d342192fa380. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.868 225859 DEBUG nova.network.neutron [req-4004eee4-5a31-423e-8a25-56d4197964f0 req-8aacd456-145e-49ed-9d29-a40ed8f1ce23 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Updating instance_info_cache with network_info: [{"id": "d9897519-3517-45da-be53-d342192fa380", "address": "fa:16:3e:7a:5f:fd", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9897519-35", "ovs_interfaceid": "d9897519-3517-45da-be53-d342192fa380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:14 np0005588919 nova_compute[225855]: 2026-01-20 15:02:14.886 225859 DEBUG oslo_concurrency.lockutils [req-4004eee4-5a31-423e-8a25-56d4197964f0 req-8aacd456-145e-49ed-9d29-a40ed8f1ce23 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.131 225859 INFO nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Creating config drive at /var/lib/nova/instances/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d/disk.config#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.136 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp57ra_zsl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.270 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp57ra_zsl" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.305 225859 DEBUG nova.storage.rbd_utils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] rbd image 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.312 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d/disk.config 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:15 np0005588919 podman[286633]: 2026-01-20 15:02:15.362708299 +0000 UTC m=+0.057535215 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:02:15 np0005588919 podman[286665]: 2026-01-20 15:02:15.434547685 +0000 UTC m=+0.052411089 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:02:15 np0005588919 podman[286633]: 2026-01-20 15:02:15.441247794 +0000 UTC m=+0.136074700 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.539 225859 DEBUG oslo_concurrency.processutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d/disk.config 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.227s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.540 225859 INFO nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Deleting local config drive /var/lib/nova/instances/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d/disk.config because it was imported into RBD.#033[00m
Jan 20 10:02:15 np0005588919 kernel: tapd9897519-35: entered promiscuous mode
Jan 20 10:02:15 np0005588919 NetworkManager[49104]: <info>  [1768921335.5935] manager: (tapd9897519-35): new Tun device (/org/freedesktop/NetworkManager/Devices/272)
Jan 20 10:02:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:15Z|00625|binding|INFO|Claiming lport d9897519-3517-45da-be53-d342192fa380 for this chassis.
Jan 20 10:02:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:15Z|00626|binding|INFO|d9897519-3517-45da-be53-d342192fa380: Claiming fa:16:3e:7a:5f:fd 10.100.0.8
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.597 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.607 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:5f:fd 10.100.0.8'], port_security=['fa:16:3e:7a:5f:fd 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0a74eb9c-7f01-437d-a0c8-c01696fc8f9d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '654b3ce7b3644fc58f8dc9f60529320b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ff1c5b6a-5ab6-401e-b333-7f359193e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a3d5928-255d-4c0c-af70-f26be5196416, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=d9897519-3517-45da-be53-d342192fa380) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.608 140354 INFO neutron.agent.ovn.metadata.agent [-] Port d9897519-3517-45da-be53-d342192fa380 in datapath 0296a21f-6ec4-43a7-8731-1d3692a5de4a bound to our chassis#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.610 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0296a21f-6ec4-43a7-8731-1d3692a5de4a#033[00m
Jan 20 10:02:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:15Z|00627|binding|INFO|Setting lport d9897519-3517-45da-be53-d342192fa380 ovn-installed in OVS
Jan 20 10:02:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:15Z|00628|binding|INFO|Setting lport d9897519-3517-45da-be53-d342192fa380 up in Southbound
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.620 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.621 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.625 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c18260e8-537b-4ba6-aed5-3ee695d04ce5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.625 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0296a21f-61 in ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.627 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0296a21f-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.627 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[97389464-2adb-483a-988e-b9da3ce46685]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.628 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2e07438e-a020-4b9f-8fc1-1c4595d10dcd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 systemd-udevd[286790]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:02:15 np0005588919 systemd-machined[194361]: New machine qemu-74-instance-0000009a.
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.640 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[34627889-b4e2-43f5-b432-abbbef51ca41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 NetworkManager[49104]: <info>  [1768921335.6482] device (tapd9897519-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:02:15 np0005588919 NetworkManager[49104]: <info>  [1768921335.6492] device (tapd9897519-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:02:15 np0005588919 systemd[1]: Started Virtual Machine qemu-74-instance-0000009a.
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.666 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[99c7638f-b672-4f08-8c2c-8b223a636b4a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.694 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3fba78a1-6fed-4eef-87b9-51d458270c6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.700 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[439ed9da-6dd1-4a71-b52e-8d96d578ea82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 NetworkManager[49104]: <info>  [1768921335.7012] manager: (tap0296a21f-60): new Veth device (/org/freedesktop/NetworkManager/Devices/273)
Jan 20 10:02:15 np0005588919 podman[286780]: 2026-01-20 15:02:15.705297814 +0000 UTC m=+0.075051899 container exec e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, version=2.2.4, io.openshift.tags=Ceph keepalived, distribution-scope=public, release=1793, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.733 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e4ae46-22db-47c1-9875-441c9c24ce59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.736 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3459ccfc-0f4f-4e98-a323-7e1aa525197b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 podman[286780]: 2026-01-20 15:02:15.747546986 +0000 UTC m=+0.117301061 container exec_died e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, distribution-scope=public, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.openshift.expose-services=)
Jan 20 10:02:15 np0005588919 NetworkManager[49104]: <info>  [1768921335.7639] device (tap0296a21f-60): carrier: link connected
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.770 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[9949f421-0e25-4148-8c78-ca53df948026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.790 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c95526a3-8f22-4bbe-9847-dcc9dfcaabfd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0296a21f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e3:1c:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637469, 'reachable_time': 44936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286846, 'error': None, 'target': 'ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.808 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[19f646ca-58ff-4d95-8183-888b0aee162c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee3:1c68'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637469, 'tstamp': 637469}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286847, 'error': None, 'target': 'ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.827 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f89cf3d1-12e7-4278-8afd-7d630919e20d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0296a21f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e3:1c:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637469, 'reachable_time': 44936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 286848, 'error': None, 'target': 'ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.859 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ae739682-f2f3-4825-84ff-07abfabf369f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:15.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.916 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a41f7a57-ad33-4b24-b228-2d954f974e3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.920 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0296a21f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.920 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.921 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0296a21f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:15 np0005588919 kernel: tap0296a21f-60: entered promiscuous mode
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.922 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:15 np0005588919 NetworkManager[49104]: <info>  [1768921335.9239] manager: (tap0296a21f-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.925 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.926 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0296a21f-60, col_values=(('external_ids', {'iface-id': 'a6fccd00-2fdb-4d49-8d76-4860c81e4a5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.928 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:15Z|00629|binding|INFO|Releasing lport a6fccd00-2fdb-4d49-8d76-4860c81e4a5f from this chassis (sb_readonly=0)
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.944 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.945 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0296a21f-6ec4-43a7-8731-1d3692a5de4a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0296a21f-6ec4-43a7-8731-1d3692a5de4a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.946 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7f451004-0463-41c4-8bed-7814188c7799]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.947 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-0296a21f-6ec4-43a7-8731-1d3692a5de4a
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/0296a21f-6ec4-43a7-8731-1d3692a5de4a.pid.haproxy
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 0296a21f-6ec4-43a7-8731-1d3692a5de4a
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:02:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:15.948 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'env', 'PROCESS_TAG=haproxy-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0296a21f-6ec4-43a7-8731-1d3692a5de4a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.998 225859 DEBUG nova.compute.manager [req-4f27abd5-f123-410c-8c71-c1bce10dde58 req-73e39e3a-2b6d-4748-a151-f8dc15115a6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Received event network-vif-plugged-d9897519-3517-45da-be53-d342192fa380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.998 225859 DEBUG oslo_concurrency.lockutils [req-4f27abd5-f123-410c-8c71-c1bce10dde58 req-73e39e3a-2b6d-4748-a151-f8dc15115a6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.999 225859 DEBUG oslo_concurrency.lockutils [req-4f27abd5-f123-410c-8c71-c1bce10dde58 req-73e39e3a-2b6d-4748-a151-f8dc15115a6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.999 225859 DEBUG oslo_concurrency.lockutils [req-4f27abd5-f123-410c-8c71-c1bce10dde58 req-73e39e3a-2b6d-4748-a151-f8dc15115a6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:15 np0005588919 nova_compute[225855]: 2026-01-20 15:02:15.999 225859 DEBUG nova.compute.manager [req-4f27abd5-f123-410c-8c71-c1bce10dde58 req-73e39e3a-2b6d-4748-a151-f8dc15115a6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Processing event network-vif-plugged-d9897519-3517-45da-be53-d342192fa380 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.094 225859 DEBUG nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.096 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921336.0955298, 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.096 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] VM Started (Lifecycle Event)#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.100 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.105 225859 INFO nova.virt.libvirt.driver [-] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Instance spawned successfully.#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.105 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.150 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.150 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.151 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.151 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.152 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.152 225859 DEBUG nova.virt.libvirt.driver [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.164 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.168 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.201 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.202 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921336.0957355, 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.209 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.237 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.243 225859 INFO nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Took 8.31 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.244 225859 DEBUG nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.245 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921336.0983849, 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.245 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.279 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.283 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:02:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.319 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.321 225859 INFO nova.compute.manager [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Took 9.27 seconds to build instance.#033[00m
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.345 225859 DEBUG oslo_concurrency.lockutils [None req-4994abe5-feb5-4df3-8b7a-ac887128508a 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.353s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:16 np0005588919 podman[287033]: 2026-01-20 15:02:16.371735945 +0000 UTC m=+0.077153557 container create 2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:02:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:16.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:16.422 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:16.422 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:16.423 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:16 np0005588919 podman[287033]: 2026-01-20 15:02:16.328107285 +0000 UTC m=+0.033524917 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:02:16 np0005588919 systemd[1]: Started libpod-conmon-2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7.scope.
Jan 20 10:02:16 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:02:16 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b849f5b9085b963f8484cdd30046286b99fb772010029aac19f41105e734dffc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:02:16 np0005588919 podman[287033]: 2026-01-20 15:02:16.477052407 +0000 UTC m=+0.182470039 container init 2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:02:16 np0005588919 podman[287033]: 2026-01-20 15:02:16.482626164 +0000 UTC m=+0.188043776 container start 2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 10:02:16 np0005588919 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[287052]: [NOTICE]   (287056) : New worker (287060) forked
Jan 20 10:02:16 np0005588919 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[287052]: [NOTICE]   (287056) : Loading success.
Jan 20 10:02:16 np0005588919 nova_compute[225855]: 2026-01-20 15:02:16.984 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:02:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:17 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:02:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:17.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e333 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:18 np0005588919 nova_compute[225855]: 2026-01-20 15:02:18.123 225859 DEBUG nova.compute.manager [req-e08ff055-5bec-44fa-8e5c-ccc422962091 req-a32039a5-6bd8-49b2-80b2-58406b13dfb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Received event network-vif-plugged-d9897519-3517-45da-be53-d342192fa380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:18 np0005588919 nova_compute[225855]: 2026-01-20 15:02:18.124 225859 DEBUG oslo_concurrency.lockutils [req-e08ff055-5bec-44fa-8e5c-ccc422962091 req-a32039a5-6bd8-49b2-80b2-58406b13dfb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:18 np0005588919 nova_compute[225855]: 2026-01-20 15:02:18.125 225859 DEBUG oslo_concurrency.lockutils [req-e08ff055-5bec-44fa-8e5c-ccc422962091 req-a32039a5-6bd8-49b2-80b2-58406b13dfb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:18 np0005588919 nova_compute[225855]: 2026-01-20 15:02:18.125 225859 DEBUG oslo_concurrency.lockutils [req-e08ff055-5bec-44fa-8e5c-ccc422962091 req-a32039a5-6bd8-49b2-80b2-58406b13dfb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:18 np0005588919 nova_compute[225855]: 2026-01-20 15:02:18.126 225859 DEBUG nova.compute.manager [req-e08ff055-5bec-44fa-8e5c-ccc422962091 req-a32039a5-6bd8-49b2-80b2-58406b13dfb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] No waiting events found dispatching network-vif-plugged-d9897519-3517-45da-be53-d342192fa380 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:02:18 np0005588919 nova_compute[225855]: 2026-01-20 15:02:18.126 225859 WARNING nova.compute.manager [req-e08ff055-5bec-44fa-8e5c-ccc422962091 req-a32039a5-6bd8-49b2-80b2-58406b13dfb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Received unexpected event network-vif-plugged-d9897519-3517-45da-be53-d342192fa380 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:02:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:18.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:18 np0005588919 nova_compute[225855]: 2026-01-20 15:02:18.795 225859 DEBUG nova.compute.manager [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:18 np0005588919 nova_compute[225855]: 2026-01-20 15:02:18.859 225859 INFO nova.compute.manager [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] instance snapshotting#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.042 225859 DEBUG oslo_concurrency.lockutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.043 225859 DEBUG oslo_concurrency.lockutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.043 225859 DEBUG oslo_concurrency.lockutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.043 225859 DEBUG oslo_concurrency.lockutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.043 225859 DEBUG oslo_concurrency.lockutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.044 225859 INFO nova.compute.manager [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Terminating instance#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.046 225859 DEBUG nova.compute.manager [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.465 225859 INFO nova.virt.libvirt.driver [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Beginning live snapshot process#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.476 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:19 np0005588919 kernel: tapd9897519-35 (unregistering): left promiscuous mode
Jan 20 10:02:19 np0005588919 NetworkManager[49104]: <info>  [1768921339.5113] device (tapd9897519-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:02:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:19Z|00630|binding|INFO|Releasing lport d9897519-3517-45da-be53-d342192fa380 from this chassis (sb_readonly=0)
Jan 20 10:02:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:19Z|00631|binding|INFO|Setting lport d9897519-3517-45da-be53-d342192fa380 down in Southbound
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.567 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:19Z|00632|binding|INFO|Removing iface tapd9897519-35 ovn-installed in OVS
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.569 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:19.574 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:5f:fd 10.100.0.8'], port_security=['fa:16:3e:7a:5f:fd 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0a74eb9c-7f01-437d-a0c8-c01696fc8f9d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '654b3ce7b3644fc58f8dc9f60529320b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ff1c5b6a-5ab6-401e-b333-7f359193e012', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a3d5928-255d-4c0c-af70-f26be5196416, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=d9897519-3517-45da-be53-d342192fa380) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:02:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:19.575 140354 INFO neutron.agent.ovn.metadata.agent [-] Port d9897519-3517-45da-be53-d342192fa380 in datapath 0296a21f-6ec4-43a7-8731-1d3692a5de4a unbound from our chassis#033[00m
Jan 20 10:02:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:19.577 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0296a21f-6ec4-43a7-8731-1d3692a5de4a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:02:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:19.578 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7357c450-9ff7-49f5-b4c7-9a6099266654]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:19.578 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a namespace which is not needed anymore#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.584 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:19 np0005588919 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009a.scope: Deactivated successfully.
Jan 20 10:02:19 np0005588919 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009a.scope: Consumed 3.440s CPU time.
Jan 20 10:02:19 np0005588919 systemd-machined[194361]: Machine qemu-74-instance-0000009a terminated.
Jan 20 10:02:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:19.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.892 225859 DEBUG nova.virt.libvirt.imagebackend [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.899 225859 INFO nova.virt.libvirt.driver [-] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Instance destroyed successfully.#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.899 225859 DEBUG nova.objects.instance [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lazy-loading 'resources' on Instance uuid 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.923 225859 DEBUG nova.virt.libvirt.vif [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:02:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1200545801',display_name='tempest-TestServerMultinode-server-1200545801',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1200545801',id=154,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:02:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='654b3ce7b3644fc58f8dc9f60529320b',ramdisk_id='',reservation_id='r-nu0b3t9m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1071973011',owner_user_name='tempest-TestServerMultinode-1071973011-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:02:16Z,user_data=None,user_id='158563a99d4a420890aaa00b05c8bb57',uuid=0a74eb9c-7f01-437d-a0c8-c01696fc8f9d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d9897519-3517-45da-be53-d342192fa380", "address": "fa:16:3e:7a:5f:fd", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9897519-35", "ovs_interfaceid": "d9897519-3517-45da-be53-d342192fa380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.923 225859 DEBUG nova.network.os_vif_util [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converting VIF {"id": "d9897519-3517-45da-be53-d342192fa380", "address": "fa:16:3e:7a:5f:fd", "network": {"id": "0296a21f-6ec4-43a7-8731-1d3692a5de4a", "bridge": "br-int", "label": "tempest-TestServerMultinode-1878354210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "908b5ba217ab458e8c9aa0e5a471c194", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9897519-35", "ovs_interfaceid": "d9897519-3517-45da-be53-d342192fa380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.924 225859 DEBUG nova.network.os_vif_util [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5f:fd,bridge_name='br-int',has_traffic_filtering=True,id=d9897519-3517-45da-be53-d342192fa380,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9897519-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.924 225859 DEBUG os_vif [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5f:fd,bridge_name='br-int',has_traffic_filtering=True,id=d9897519-3517-45da-be53-d342192fa380,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9897519-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.926 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.926 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9897519-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.929 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:19 np0005588919 nova_compute[225855]: 2026-01-20 15:02:19.932 225859 INFO os_vif [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5f:fd,bridge_name='br-int',has_traffic_filtering=True,id=d9897519-3517-45da-be53-d342192fa380,network=Network(0296a21f-6ec4-43a7-8731-1d3692a5de4a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9897519-35')#033[00m
Jan 20 10:02:20 np0005588919 nova_compute[225855]: 2026-01-20 15:02:20.097 225859 DEBUG nova.storage.rbd_utils [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] creating snapshot(ff4322e85df1493480d9bf54ecc676ab) on rbd image(2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:02:20 np0005588919 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[287052]: [NOTICE]   (287056) : haproxy version is 2.8.14-c23fe91
Jan 20 10:02:20 np0005588919 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[287052]: [NOTICE]   (287056) : path to executable is /usr/sbin/haproxy
Jan 20 10:02:20 np0005588919 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[287052]: [WARNING]  (287056) : Exiting Master process...
Jan 20 10:02:20 np0005588919 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[287052]: [ALERT]    (287056) : Current worker (287060) exited with code 143 (Terminated)
Jan 20 10:02:20 np0005588919 neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a[287052]: [WARNING]  (287056) : All workers exited. Exiting... (0)
Jan 20 10:02:20 np0005588919 systemd[1]: libpod-2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7.scope: Deactivated successfully.
Jan 20 10:02:20 np0005588919 conmon[287052]: conmon 2d3a4bd0692f59e12e25 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7.scope/container/memory.events
Jan 20 10:02:20 np0005588919 podman[287113]: 2026-01-20 15:02:20.184640384 +0000 UTC m=+0.519672772 container died 2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:02:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:20.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:20 np0005588919 nova_compute[225855]: 2026-01-20 15:02:20.678 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:20 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7-userdata-shm.mount: Deactivated successfully.
Jan 20 10:02:20 np0005588919 systemd[1]: var-lib-containers-storage-overlay-b849f5b9085b963f8484cdd30046286b99fb772010029aac19f41105e734dffc-merged.mount: Deactivated successfully.
Jan 20 10:02:20 np0005588919 podman[287113]: 2026-01-20 15:02:20.712976919 +0000 UTC m=+1.048009307 container cleanup 2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:02:20 np0005588919 systemd[1]: libpod-conmon-2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7.scope: Deactivated successfully.
Jan 20 10:02:20 np0005588919 podman[287213]: 2026-01-20 15:02:20.781573854 +0000 UTC m=+0.042198361 container remove 2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:02:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:20.788 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6c340738-3b21-4f98-8537-5e4eb94048c7]: (4, ('Tue Jan 20 03:02:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a (2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7)\n2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7\nTue Jan 20 03:02:20 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a (2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7)\n2d3a4bd0692f59e12e25a197a7fb7fc22e2f341a10b58391cee2862341e502e7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:20.790 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7191af5d-3ed5-41aa-b391-4b77d840a9d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:20.791 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0296a21f-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:20 np0005588919 nova_compute[225855]: 2026-01-20 15:02:20.792 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:20 np0005588919 kernel: tap0296a21f-60: left promiscuous mode
Jan 20 10:02:20 np0005588919 nova_compute[225855]: 2026-01-20 15:02:20.813 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:20.816 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8c679a-333f-438c-97fb-f1f8ffa71024]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:20.828 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b07ca782-dac1-4090-8914-d7684b22e78b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:20.829 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a59ce970-9931-4673-9aef-35d4858dceb4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:20.846 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bb280cf7-0b8a-43b3-89e4-e1f93af9e20b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637461, 'reachable_time': 43720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287229, 'error': None, 'target': 'ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:20.849 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0296a21f-6ec4-43a7-8731-1d3692a5de4a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:02:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:20.849 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9fa55a-8ef8-4d3f-80d3-6c165cce7813]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:02:20 np0005588919 systemd[1]: run-netns-ovnmeta\x2d0296a21f\x2d6ec4\x2d43a7\x2d8731\x2d1d3692a5de4a.mount: Deactivated successfully.
Jan 20 10:02:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e334 e334: 3 total, 3 up, 3 in
Jan 20 10:02:21 np0005588919 nova_compute[225855]: 2026-01-20 15:02:21.456 225859 DEBUG nova.storage.rbd_utils [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] cloning vms/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk@ff4322e85df1493480d9bf54ecc676ab to images/97fb0fa0-6803-480b-96d2-4a219153376d clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 10:02:21 np0005588919 nova_compute[225855]: 2026-01-20 15:02:21.524 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:02:21 np0005588919 nova_compute[225855]: 2026-01-20 15:02:21.577 225859 DEBUG nova.storage.rbd_utils [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] flattening images/97fb0fa0-6803-480b-96d2-4a219153376d flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 10:02:21 np0005588919 nova_compute[225855]: 2026-01-20 15:02:21.625 225859 INFO nova.virt.libvirt.driver [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Deleting instance files /var/lib/nova/instances/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_del#033[00m
Jan 20 10:02:21 np0005588919 nova_compute[225855]: 2026-01-20 15:02:21.626 225859 INFO nova.virt.libvirt.driver [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Deletion of /var/lib/nova/instances/0a74eb9c-7f01-437d-a0c8-c01696fc8f9d_del complete#033[00m
Jan 20 10:02:21 np0005588919 nova_compute[225855]: 2026-01-20 15:02:21.753 225859 INFO nova.compute.manager [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Took 2.71 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:02:21 np0005588919 nova_compute[225855]: 2026-01-20 15:02:21.754 225859 DEBUG oslo.service.loopingcall [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:02:21 np0005588919 nova_compute[225855]: 2026-01-20 15:02:21.755 225859 DEBUG nova.compute.manager [-] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:02:21 np0005588919 nova_compute[225855]: 2026-01-20 15:02:21.755 225859 DEBUG nova.network.neutron [-] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:02:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:21.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:21 np0005588919 nova_compute[225855]: 2026-01-20 15:02:21.986 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:22 np0005588919 nova_compute[225855]: 2026-01-20 15:02:22.079 225859 DEBUG nova.storage.rbd_utils [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] removing snapshot(ff4322e85df1493480d9bf54ecc676ab) on rbd image(2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 10:02:22 np0005588919 nova_compute[225855]: 2026-01-20 15:02:22.341 225859 DEBUG nova.network.neutron [-] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:22 np0005588919 nova_compute[225855]: 2026-01-20 15:02:22.364 225859 INFO nova.compute.manager [-] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Took 0.61 seconds to deallocate network for instance.#033[00m
Jan 20 10:02:22 np0005588919 nova_compute[225855]: 2026-01-20 15:02:22.408 225859 DEBUG nova.compute.manager [req-1abd9743-9543-4f0b-a76e-26bfdf3f3bbb req-8301eff1-6a0a-4a63-b185-14fc78858c09 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Received event network-vif-deleted-d9897519-3517-45da-be53-d342192fa380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:02:22 np0005588919 nova_compute[225855]: 2026-01-20 15:02:22.410 225859 DEBUG oslo_concurrency.lockutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:22 np0005588919 nova_compute[225855]: 2026-01-20 15:02:22.411 225859 DEBUG oslo_concurrency.lockutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:22.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e335 e335: 3 total, 3 up, 3 in
Jan 20 10:02:22 np0005588919 nova_compute[225855]: 2026-01-20 15:02:22.454 225859 DEBUG nova.storage.rbd_utils [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] creating snapshot(snap) on rbd image(97fb0fa0-6803-480b-96d2-4a219153376d) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:02:22 np0005588919 nova_compute[225855]: 2026-01-20 15:02:22.590 225859 DEBUG oslo_concurrency.processutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:02:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/33524825' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:02:23 np0005588919 nova_compute[225855]: 2026-01-20 15:02:23.032 225859 DEBUG oslo_concurrency.processutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:23 np0005588919 nova_compute[225855]: 2026-01-20 15:02:23.039 225859 DEBUG nova.compute.provider_tree [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:02:23 np0005588919 nova_compute[225855]: 2026-01-20 15:02:23.058 225859 DEBUG nova.scheduler.client.report [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:02:23 np0005588919 nova_compute[225855]: 2026-01-20 15:02:23.092 225859 DEBUG oslo_concurrency.lockutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:23 np0005588919 nova_compute[225855]: 2026-01-20 15:02:23.126 225859 INFO nova.scheduler.client.report [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Deleted allocations for instance 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d#033[00m
Jan 20 10:02:23 np0005588919 nova_compute[225855]: 2026-01-20 15:02:23.187 225859 DEBUG oslo_concurrency.lockutils [None req-c0d13fe6-9e14-49ab-958a-aa1da6889b1e 158563a99d4a420890aaa00b05c8bb57 654b3ce7b3644fc58f8dc9f60529320b - - default default] Lock "0a74eb9c-7f01-437d-a0c8-c01696fc8f9d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e336 e336: 3 total, 3 up, 3 in
Jan 20 10:02:23 np0005588919 podman[287370]: 2026-01-20 15:02:23.774977064 +0000 UTC m=+0.084798183 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:02:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:23.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:24.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:24 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:02:24 np0005588919 nova_compute[225855]: 2026-01-20 15:02:24.977 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:25 np0005588919 nova_compute[225855]: 2026-01-20 15:02:25.594 225859 INFO nova.virt.libvirt.driver [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Snapshot image upload complete#033[00m
Jan 20 10:02:25 np0005588919 nova_compute[225855]: 2026-01-20 15:02:25.595 225859 INFO nova.compute.manager [None req-08ef4085-d599-4c96-ac53-ce83fd2a6586 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Took 6.73 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 20 10:02:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:25.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:02:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:26.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:02:26 np0005588919 nova_compute[225855]: 2026-01-20 15:02:26.988 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:27.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e336 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:28.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:29.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:29 np0005588919 nova_compute[225855]: 2026-01-20 15:02:29.979 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e337 e337: 3 total, 3 up, 3 in
Jan 20 10:02:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:30.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:31.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:31 np0005588919 nova_compute[225855]: 2026-01-20 15:02:31.989 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:32.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:32 np0005588919 nova_compute[225855]: 2026-01-20 15:02:32.592 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance in state 1 after 32 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:02:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:33.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:34.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:34 np0005588919 nova_compute[225855]: 2026-01-20 15:02:34.887 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921339.68305, 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:02:34 np0005588919 nova_compute[225855]: 2026-01-20 15:02:34.888 225859 INFO nova.compute.manager [-] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:02:34 np0005588919 nova_compute[225855]: 2026-01-20 15:02:34.926 225859 DEBUG nova.compute.manager [None req-4bb4e7bb-29ad-4602-8b5b-d205e42cebca - - - - - -] [instance: 0a74eb9c-7f01-437d-a0c8-c01696fc8f9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:02:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:34.941 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:02:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:34.942 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:02:34 np0005588919 nova_compute[225855]: 2026-01-20 15:02:34.954 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:34 np0005588919 nova_compute[225855]: 2026-01-20 15:02:34.981 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:35.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:36 np0005588919 podman[287478]: 2026-01-20 15:02:36.249145813 +0000 UTC m=+0.048333034 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 20 10:02:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:36.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:36 np0005588919 nova_compute[225855]: 2026-01-20 15:02:36.990 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:37.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:02:37.944 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:02:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:38.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:39.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:39 np0005588919 nova_compute[225855]: 2026-01-20 15:02:39.984 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:40.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:41.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:41 np0005588919 nova_compute[225855]: 2026-01-20 15:02:41.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:42.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:42 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:42Z|00633|binding|INFO|Releasing lport 32afa112-2ec4-4d59-b6eb-a77db2858bd4 from this chassis (sb_readonly=0)
Jan 20 10:02:42 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:42Z|00634|binding|INFO|Releasing lport a8628d9e-196f-4b84-89fd-d3a41792b8a0 from this chassis (sb_readonly=0)
Jan 20 10:02:42 np0005588919 nova_compute[225855]: 2026-01-20 15:02:42.655 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:43 np0005588919 nova_compute[225855]: 2026-01-20 15:02:43.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:43 np0005588919 nova_compute[225855]: 2026-01-20 15:02:43.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:02:43 np0005588919 nova_compute[225855]: 2026-01-20 15:02:43.633 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance in state 1 after 43 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:02:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:43.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:44 np0005588919 nova_compute[225855]: 2026-01-20 15:02:44.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:44.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:44 np0005588919 nova_compute[225855]: 2026-01-20 15:02:44.988 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:45 np0005588919 nova_compute[225855]: 2026-01-20 15:02:45.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:45 np0005588919 nova_compute[225855]: 2026-01-20 15:02:45.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:45.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:46 np0005588919 nova_compute[225855]: 2026-01-20 15:02:46.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:46 np0005588919 nova_compute[225855]: 2026-01-20 15:02:46.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:02:46 np0005588919 nova_compute[225855]: 2026-01-20 15:02:46.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:02:46 np0005588919 nova_compute[225855]: 2026-01-20 15:02:46.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:02:46 np0005588919 nova_compute[225855]: 2026-01-20 15:02:46.377 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:02:46 np0005588919 nova_compute[225855]: 2026-01-20 15:02:46.377 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:02:46 np0005588919 nova_compute[225855]: 2026-01-20 15:02:46.377 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:02:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:46.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:46 np0005588919 nova_compute[225855]: 2026-01-20 15:02:46.996 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:47 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:47Z|00635|binding|INFO|Releasing lport 32afa112-2ec4-4d59-b6eb-a77db2858bd4 from this chassis (sb_readonly=0)
Jan 20 10:02:47 np0005588919 ovn_controller[130490]: 2026-01-20T15:02:47Z|00636|binding|INFO|Releasing lport a8628d9e-196f-4b84-89fd-d3a41792b8a0 from this chassis (sb_readonly=0)
Jan 20 10:02:47 np0005588919 nova_compute[225855]: 2026-01-20 15:02:47.522 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:47 np0005588919 nova_compute[225855]: 2026-01-20 15:02:47.529 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updating instance_info_cache with network_info: [{"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:02:47 np0005588919 nova_compute[225855]: 2026-01-20 15:02:47.546 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:02:47 np0005588919 nova_compute[225855]: 2026-01-20 15:02:47.546 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:02:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:47.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:48 np0005588919 nova_compute[225855]: 2026-01-20 15:02:48.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:48.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:49.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:49 np0005588919 nova_compute[225855]: 2026-01-20 15:02:49.991 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:50.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:51.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:52 np0005588919 nova_compute[225855]: 2026-01-20 15:02:52.039 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:52 np0005588919 nova_compute[225855]: 2026-01-20 15:02:52.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:52 np0005588919 nova_compute[225855]: 2026-01-20 15:02:52.394 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:52 np0005588919 nova_compute[225855]: 2026-01-20 15:02:52.394 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:52 np0005588919 nova_compute[225855]: 2026-01-20 15:02:52.395 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:52 np0005588919 nova_compute[225855]: 2026-01-20 15:02:52.395 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:02:52 np0005588919 nova_compute[225855]: 2026-01-20 15:02:52.395 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:52.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:02:52 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3186788731' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:02:52 np0005588919 nova_compute[225855]: 2026-01-20 15:02:52.837 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.025 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000096 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.026 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000096 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.029 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000097 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.030 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000097 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.211 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.212 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4067MB free_disk=20.851417541503906GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.212 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.213 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.286 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 474cec75-3b01-411a-9074-75859d2a9ddf actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.286 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.287 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.287 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.344 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:02:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:02:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4019807425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.790 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.795 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:02:53 np0005588919 nova_compute[225855]: 2026-01-20 15:02:53.811 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:02:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:53.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:54 np0005588919 podman[287551]: 2026-01-20 15:02:54.036428001 +0000 UTC m=+0.088315303 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:02:54 np0005588919 nova_compute[225855]: 2026-01-20 15:02:54.090 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:02:54 np0005588919 nova_compute[225855]: 2026-01-20 15:02:54.091 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.878s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:02:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:54.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:54 np0005588919 nova_compute[225855]: 2026-01-20 15:02:54.676 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance in state 1 after 54 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:02:54 np0005588919 nova_compute[225855]: 2026-01-20 15:02:54.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:55.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:56.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:57 np0005588919 nova_compute[225855]: 2026-01-20 15:02:57.050 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:57 np0005588919 nova_compute[225855]: 2026-01-20 15:02:57.086 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:02:57 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2084056353' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:02:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:57.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:57 np0005588919 nova_compute[225855]: 2026-01-20 15:02:57.958 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:02:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:02:58 np0005588919 nova_compute[225855]: 2026-01-20 15:02:58.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:02:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:02:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:02:58.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:02:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:02:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:02:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:02:59.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:02:59 np0005588919 nova_compute[225855]: 2026-01-20 15:02:59.995 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:00.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:00 np0005588919 nova_compute[225855]: 2026-01-20 15:03:00.700 225859 INFO nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance failed to shutdown in 60 seconds.#033[00m
Jan 20 10:03:00 np0005588919 kernel: tap244332ba-1b (unregistering): left promiscuous mode
Jan 20 10:03:00 np0005588919 NetworkManager[49104]: <info>  [1768921380.8029] device (tap244332ba-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:03:00 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:00Z|00637|binding|INFO|Releasing lport 244332ba-1b58-4d42-98b0-245f9460c50f from this chassis (sb_readonly=0)
Jan 20 10:03:00 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:00Z|00638|binding|INFO|Setting lport 244332ba-1b58-4d42-98b0-245f9460c50f down in Southbound
Jan 20 10:03:00 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:00Z|00639|binding|INFO|Removing iface tap244332ba-1b ovn-installed in OVS
Jan 20 10:03:00 np0005588919 nova_compute[225855]: 2026-01-20 15:03:00.817 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:00 np0005588919 nova_compute[225855]: 2026-01-20 15:03:00.819 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:00.825 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:36:24 10.100.0.4'], port_security=['fa:16:3e:6f:36:24 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '474cec75-3b01-411a-9074-75859d2a9ddf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=244332ba-1b58-4d42-98b0-245f9460c50f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:00.827 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 244332ba-1b58-4d42-98b0-245f9460c50f in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec unbound from our chassis#033[00m
Jan 20 10:03:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:00.831 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:03:00 np0005588919 nova_compute[225855]: 2026-01-20 15:03:00.834 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:00.834 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e43fef-8746-4384-b951-6ad0be9c3680]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:00 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:00.836 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec namespace which is not needed anymore#033[00m
Jan 20 10:03:00 np0005588919 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000096.scope: Deactivated successfully.
Jan 20 10:03:00 np0005588919 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000096.scope: Consumed 2.067s CPU time.
Jan 20 10:03:00 np0005588919 systemd-machined[194361]: Machine qemu-72-instance-00000096 terminated.
Jan 20 10:03:00 np0005588919 nova_compute[225855]: 2026-01-20 15:03:00.933 225859 INFO nova.virt.libvirt.driver [-] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance destroyed successfully.#033[00m
Jan 20 10:03:00 np0005588919 nova_compute[225855]: 2026-01-20 15:03:00.934 225859 DEBUG nova.objects.instance [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'numa_topology' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:00 np0005588919 nova_compute[225855]: 2026-01-20 15:03:00.951 225859 INFO nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Attempting a stable device rescue#033[00m
Jan 20 10:03:00 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[285687]: [NOTICE]   (285691) : haproxy version is 2.8.14-c23fe91
Jan 20 10:03:00 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[285687]: [NOTICE]   (285691) : path to executable is /usr/sbin/haproxy
Jan 20 10:03:00 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[285687]: [WARNING]  (285691) : Exiting Master process...
Jan 20 10:03:00 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[285687]: [WARNING]  (285691) : Exiting Master process...
Jan 20 10:03:00 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[285687]: [ALERT]    (285691) : Current worker (285693) exited with code 143 (Terminated)
Jan 20 10:03:00 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[285687]: [WARNING]  (285691) : All workers exited. Exiting... (0)
Jan 20 10:03:00 np0005588919 systemd[1]: libpod-c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84.scope: Deactivated successfully.
Jan 20 10:03:00 np0005588919 podman[287658]: 2026-01-20 15:03:00.988706915 +0000 UTC m=+0.053451189 container died c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 10:03:01 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84-userdata-shm.mount: Deactivated successfully.
Jan 20 10:03:01 np0005588919 systemd[1]: var-lib-containers-storage-overlay-9583a611064af7f6e72be93b7a47c8363e51f876af054788b7c0ed954b9e3b3d-merged.mount: Deactivated successfully.
Jan 20 10:03:01 np0005588919 podman[287658]: 2026-01-20 15:03:01.025158753 +0000 UTC m=+0.089903057 container cleanup c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:03:01 np0005588919 systemd[1]: libpod-conmon-c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84.scope: Deactivated successfully.
Jan 20 10:03:01 np0005588919 podman[287692]: 2026-01-20 15:03:01.113554097 +0000 UTC m=+0.063988707 container remove c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:03:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:01.121 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a186a4d1-8063-463c-a368-1181aaf39d7f]: (4, ('Tue Jan 20 03:03:00 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec (c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84)\nc3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84\nTue Jan 20 03:03:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec (c3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84)\nc3195ecf870ff81516979fba2a40a24375f9afe5dc36f02bd45e63cf475acd84\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:01.123 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3db5cfad-c551-4141-b656-9b4aabae0bfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:01.123 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.125 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:01 np0005588919 kernel: tap671e28d0-00: left promiscuous mode
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.144 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:01.148 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[678f65af-d52b-4c0e-b9b9-e623bb6c8fd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:01.167 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fa196537-0594-4955-b06e-58ce0c409a62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:01.168 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[41599865-25eb-47e5-b4d0-4195e257bcf6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:01.184 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ac02020c-a52b-425d-bd6b-e89c82c4732f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635029, 'reachable_time': 43720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287709, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:01.187 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:03:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:01.187 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[36d5aa68-b993-4d9a-a226-5c8476dc36c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:01 np0005588919 systemd[1]: run-netns-ovnmeta\x2d671e28d0\x2d0b9e\x2d41e0\x2db5e0\x2ddb1ccd4717ec.mount: Deactivated successfully.
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.341 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.347 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.348 225859 INFO nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Creating image(s)#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.379 225859 DEBUG nova.storage.rbd_utils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image 474cec75-3b01-411a-9074-75859d2a9ddf_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.384 225859 DEBUG nova.objects.instance [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.631 225859 DEBUG nova.storage.rbd_utils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image 474cec75-3b01-411a-9074-75859d2a9ddf_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.659 225859 DEBUG nova.storage.rbd_utils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image 474cec75-3b01-411a-9074-75859d2a9ddf_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.663 225859 DEBUG oslo_concurrency.lockutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "79f6afbb8111f4bd3cacc8182575e32c185fa390" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.664 225859 DEBUG oslo_concurrency.lockutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "79f6afbb8111f4bd3cacc8182575e32c185fa390" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.819 225859 DEBUG nova.compute.manager [req-aa6982d4-ac7c-4fa5-b983-76cc5d59e4cf req-6912e038-dc88-4ebe-b2f8-d2fe8adc5c1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-unplugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.819 225859 DEBUG oslo_concurrency.lockutils [req-aa6982d4-ac7c-4fa5-b983-76cc5d59e4cf req-6912e038-dc88-4ebe-b2f8-d2fe8adc5c1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.819 225859 DEBUG oslo_concurrency.lockutils [req-aa6982d4-ac7c-4fa5-b983-76cc5d59e4cf req-6912e038-dc88-4ebe-b2f8-d2fe8adc5c1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.820 225859 DEBUG oslo_concurrency.lockutils [req-aa6982d4-ac7c-4fa5-b983-76cc5d59e4cf req-6912e038-dc88-4ebe-b2f8-d2fe8adc5c1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.820 225859 DEBUG nova.compute.manager [req-aa6982d4-ac7c-4fa5-b983-76cc5d59e4cf req-6912e038-dc88-4ebe-b2f8-d2fe8adc5c1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-unplugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.820 225859 WARNING nova.compute.manager [req-aa6982d4-ac7c-4fa5-b983-76cc5d59e4cf req-6912e038-dc88-4ebe-b2f8-d2fe8adc5c1f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received unexpected event network-vif-unplugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:03:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:01.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:01 np0005588919 nova_compute[225855]: 2026-01-20 15:03:01.994 225859 DEBUG nova.virt.libvirt.imagebackend [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/9c1c8ad1-376e-4dd8-93d8-70f0aa412977/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/9c1c8ad1-376e-4dd8-93d8-70f0aa412977/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.048 225859 DEBUG nova.virt.libvirt.imagebackend [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/9c1c8ad1-376e-4dd8-93d8-70f0aa412977/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.049 225859 DEBUG nova.storage.rbd_utils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] cloning images/9c1c8ad1-376e-4dd8-93d8-70f0aa412977@snap to None/474cec75-3b01-411a-9074-75859d2a9ddf_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.077 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.287 225859 DEBUG oslo_concurrency.lockutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "79f6afbb8111f4bd3cacc8182575e32c185fa390" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.338 225859 DEBUG nova.objects.instance [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'migration_context' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.379 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.382 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Start _get_guest_xml network_info=[{"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "vif_mac": "fa:16:3e:6f:36:24"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '9c1c8ad1-376e-4dd8-93d8-70f0aa412977', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-41743468-7add-45cb-bc94-02eb6f850278', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '41743468-7add-45cb-bc94-02eb6f850278', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '474cec75-3b01-411a-9074-75859d2a9ddf', 'attached_at': '', 'detached_at': '', 'volume_id': '41743468-7add-45cb-bc94-02eb6f850278', 'serial': '41743468-7add-45cb-bc94-02eb6f850278'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '20658306-e0e7-4d9c-a904-24cfdd1b82ee', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.382 225859 DEBUG nova.objects.instance [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'resources' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.407 225859 WARNING nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.413 225859 DEBUG nova.virt.libvirt.host [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.413 225859 DEBUG nova.virt.libvirt.host [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.416 225859 DEBUG nova.virt.libvirt.host [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.417 225859 DEBUG nova.virt.libvirt.host [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.418 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.418 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.419 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.419 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.419 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.419 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.420 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.420 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.420 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.420 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.421 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.421 225859 DEBUG nova.virt.hardware [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.421 225859 DEBUG nova.objects.instance [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:02.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.478 225859 DEBUG oslo_concurrency.processutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:03:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2658445181' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:03:02 np0005588919 nova_compute[225855]: 2026-01-20 15:03:02.969 225859 DEBUG oslo_concurrency.processutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.004 225859 DEBUG oslo_concurrency.processutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:03:03 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2503648110' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.509 225859 DEBUG oslo_concurrency.processutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.511 225859 DEBUG nova.virt.libvirt.vif [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:01:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-254746207',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-254746207',id=150,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:01:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63555e5851564db08c6429231d264f2c',ramdisk_id='',reservation_id='r-solng1yz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:01:52Z,user_data=None,user_id='2446e8399b344b29986c1aaf8bf73adf',uuid=474cec75-3b01-411a-9074-75859d2a9ddf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "vif_mac": "fa:16:3e:6f:36:24"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.511 225859 DEBUG nova.network.os_vif_util [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converting VIF {"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "vif_mac": "fa:16:3e:6f:36:24"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.512 225859 DEBUG nova.network.os_vif_util [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6f:36:24,bridge_name='br-int',has_traffic_filtering=True,id=244332ba-1b58-4d42-98b0-245f9460c50f,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap244332ba-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.513 225859 DEBUG nova.objects.instance [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'pci_devices' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.527 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <uuid>474cec75-3b01-411a-9074-75859d2a9ddf</uuid>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <name>instance-00000096</name>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-254746207</nova:name>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:03:02</nova:creationTime>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <nova:user uuid="2446e8399b344b29986c1aaf8bf73adf">tempest-ServerBootFromVolumeStableRescueTest-1871371328-project-member</nova:user>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <nova:project uuid="63555e5851564db08c6429231d264f2c">tempest-ServerBootFromVolumeStableRescueTest-1871371328</nova:project>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <nova:port uuid="244332ba-1b58-4d42-98b0-245f9460c50f">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <entry name="serial">474cec75-3b01-411a-9074-75859d2a9ddf</entry>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <entry name="uuid">474cec75-3b01-411a-9074-75859d2a9ddf</entry>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/474cec75-3b01-411a-9074-75859d2a9ddf_disk.config">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-41743468-7add-45cb-bc94-02eb6f850278">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <serial>41743468-7add-45cb-bc94-02eb6f850278</serial>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/474cec75-3b01-411a-9074-75859d2a9ddf_disk.rescue">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <target dev="vdb" bus="virtio"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <boot order="1"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:6f:36:24"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <target dev="tap244332ba-1b"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/console.log" append="off"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:03:03 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:03:03 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:03:03 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:03:03 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.534 225859 INFO nova.virt.libvirt.driver [-] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance destroyed successfully.
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.587 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.588 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.588 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.588 225859 DEBUG nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No VIF found with MAC fa:16:3e:6f:36:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.589 225859 INFO nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Using config drive
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.614 225859 DEBUG nova.storage.rbd_utils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image 474cec75-3b01-411a-9074-75859d2a9ddf_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.634 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.662 225859 DEBUG nova.objects.instance [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'ec2_ids' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.686 225859 DEBUG nova.objects.instance [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'keypairs' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:03:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:03.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.954 225859 DEBUG nova.compute.manager [req-276513a8-845b-4dc6-9d8b-a89548569679 req-34f7f5d1-04a0-4687-b1c2-423cde9a631c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.955 225859 DEBUG oslo_concurrency.lockutils [req-276513a8-845b-4dc6-9d8b-a89548569679 req-34f7f5d1-04a0-4687-b1c2-423cde9a631c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.955 225859 DEBUG oslo_concurrency.lockutils [req-276513a8-845b-4dc6-9d8b-a89548569679 req-34f7f5d1-04a0-4687-b1c2-423cde9a631c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.955 225859 DEBUG oslo_concurrency.lockutils [req-276513a8-845b-4dc6-9d8b-a89548569679 req-34f7f5d1-04a0-4687-b1c2-423cde9a631c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.956 225859 DEBUG nova.compute.manager [req-276513a8-845b-4dc6-9d8b-a89548569679 req-34f7f5d1-04a0-4687-b1c2-423cde9a631c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:03:03 np0005588919 nova_compute[225855]: 2026-01-20 15:03:03.956 225859 WARNING nova.compute.manager [req-276513a8-845b-4dc6-9d8b-a89548569679 req-34f7f5d1-04a0-4687-b1c2-423cde9a631c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received unexpected event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with vm_state active and task_state rescuing.
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.125 225859 INFO nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Creating config drive at /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config.rescue
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.130 225859 DEBUG oslo_concurrency.processutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppn39s63x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.278 225859 DEBUG oslo_concurrency.processutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppn39s63x" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.307 225859 DEBUG nova.storage.rbd_utils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image 474cec75-3b01-411a-9074-75859d2a9ddf_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.311 225859 DEBUG oslo_concurrency.processutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config.rescue 474cec75-3b01-411a-9074-75859d2a9ddf_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:03:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:04.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.482 225859 DEBUG oslo_concurrency.processutils [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config.rescue 474cec75-3b01-411a-9074-75859d2a9ddf_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.483 225859 INFO nova.virt.libvirt.driver [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Deleting local config drive /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf/disk.config.rescue because it was imported into RBD.
Jan 20 10:03:04 np0005588919 kernel: tap244332ba-1b: entered promiscuous mode
Jan 20 10:03:04 np0005588919 NetworkManager[49104]: <info>  [1768921384.5255] manager: (tap244332ba-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Jan 20 10:03:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:04Z|00640|binding|INFO|Claiming lport 244332ba-1b58-4d42-98b0-245f9460c50f for this chassis.
Jan 20 10:03:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:04Z|00641|binding|INFO|244332ba-1b58-4d42-98b0-245f9460c50f: Claiming fa:16:3e:6f:36:24 10.100.0.4
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.525 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.534 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:36:24 10.100.0.4'], port_security=['fa:16:3e:6f:36:24 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '474cec75-3b01-411a-9074-75859d2a9ddf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=244332ba-1b58-4d42-98b0-245f9460c50f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.535 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 244332ba-1b58-4d42-98b0-245f9460c50f in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec bound to our chassis
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.537 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec
Jan 20 10:03:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:04Z|00642|binding|INFO|Setting lport 244332ba-1b58-4d42-98b0-245f9460c50f ovn-installed in OVS
Jan 20 10:03:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:04Z|00643|binding|INFO|Setting lport 244332ba-1b58-4d42-98b0-245f9460c50f up in Southbound
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.545 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.549 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.549 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dca39067-7502-49ec-a772-0f10fcd9d440]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.550 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap671e28d0-01 in ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.552 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap671e28d0-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.552 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[406d8972-37ff-4fb8-abd6-2619ea60ce28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.553 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a1911f-ad54-44c2-99f0-12ac0cc7f217]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:03:04 np0005588919 systemd-machined[194361]: New machine qemu-75-instance-00000096.
Jan 20 10:03:04 np0005588919 systemd-udevd[287988]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.563 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[01c36c84-113a-41d4-b148-d22c00fe3c0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:03:04 np0005588919 systemd[1]: Started Virtual Machine qemu-75-instance-00000096.
Jan 20 10:03:04 np0005588919 NetworkManager[49104]: <info>  [1768921384.5717] device (tap244332ba-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:03:04 np0005588919 NetworkManager[49104]: <info>  [1768921384.5726] device (tap244332ba-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.587 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c56093f2-92a3-4c2e-9daa-500c8f3319c7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.612 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac0ba3c-c9bc-4b8b-8e86-257b7678f775]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:03:04 np0005588919 systemd-udevd[287992]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:03:04 np0005588919 NetworkManager[49104]: <info>  [1768921384.6201] manager: (tap671e28d0-00): new Veth device (/org/freedesktop/NetworkManager/Devices/276)
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.619 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b6481005-23f9-4359-8707-a7da677b0289]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.651 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[91ab9a8a-cd21-41ba-99e1-497d2c23ffa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.654 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[2071ac7c-02ce-4fc5-bb92-2c30d756fcbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:03:04 np0005588919 NetworkManager[49104]: <info>  [1768921384.6747] device (tap671e28d0-00): carrier: link connected
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.679 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ff64cb72-eb1a-4861-82d5-e0eb57aa7709]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.694 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0d02a7b5-6ab9-4c06-903c-411dca8daf50]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642360, 'reachable_time': 36035, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288020, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.711 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1841b4-3b5b-4c04-ad05-aa17f964075c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2b:4e69'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642360, 'tstamp': 642360}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288021, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.728 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2a7f3951-1249-4c91-b054-09b9dca2a8e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642360, 'reachable_time': 36035, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 288022, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.764 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4a4ab88d-90a7-43f5-a737-f05f02e21c5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.823 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[de624ca4-6d36-4d52-8c25-f1454eae3efa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.824 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.825 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.825 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671e28d0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.826 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:04 np0005588919 NetworkManager[49104]: <info>  [1768921384.8273] manager: (tap671e28d0-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Jan 20 10:03:04 np0005588919 kernel: tap671e28d0-00: entered promiscuous mode
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.828 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.829 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671e28d0-00, col_values=(('external_ids', {'iface-id': 'a8628d9e-196f-4b84-89fd-d3a41792b8a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.830 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:04Z|00644|binding|INFO|Releasing lport a8628d9e-196f-4b84-89fd-d3a41792b8a0 from this chassis (sb_readonly=0)
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.844 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.846 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.846 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[745d6a8b-68d1-4d32-933d-8ecfea7d3c93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.847 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.pid.haproxy
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:03:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:04.848 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'env', 'PROCESS_TAG=haproxy-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.984 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 474cec75-3b01-411a-9074-75859d2a9ddf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.985 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921384.9840426, 474cec75-3b01-411a-9074-75859d2a9ddf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.985 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:03:04 np0005588919 nova_compute[225855]: 2026-01-20 15:03:04.992 225859 DEBUG nova.compute.manager [None req-615f4080-5b92-4f81-a031-8b89a6db13e8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.000 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.006 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.010 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.035 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.035 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921384.988022, 474cec75-3b01-411a-9074-75859d2a9ddf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.036 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] VM Started (Lifecycle Event)#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.069 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.073 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:03:05 np0005588919 podman[288114]: 2026-01-20 15:03:05.236513802 +0000 UTC m=+0.062998588 container create 7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 10:03:05 np0005588919 systemd[1]: Started libpod-conmon-7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7.scope.
Jan 20 10:03:05 np0005588919 podman[288114]: 2026-01-20 15:03:05.201785873 +0000 UTC m=+0.028270739 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:03:05 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:03:05 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dee5eb1c0534cfb4c70b46823a0b5c9d68a80b01a6c011e78131e499ba71b19d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:03:05 np0005588919 podman[288114]: 2026-01-20 15:03:05.315660675 +0000 UTC m=+0.142145471 container init 7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:03:05 np0005588919 podman[288114]: 2026-01-20 15:03:05.321401346 +0000 UTC m=+0.147886132 container start 7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:03:05 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288129]: [NOTICE]   (288133) : New worker (288135) forked
Jan 20 10:03:05 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288129]: [NOTICE]   (288133) : Loading success.
Jan 20 10:03:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:05.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.953 225859 INFO nova.compute.manager [None req-bdcbe869-5a54-47f5-8e01-6c8bd5494045 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Unrescuing#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.953 225859 DEBUG oslo_concurrency.lockutils [None req-bdcbe869-5a54-47f5-8e01-6c8bd5494045 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.953 225859 DEBUG oslo_concurrency.lockutils [None req-bdcbe869-5a54-47f5-8e01-6c8bd5494045 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquired lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:03:05 np0005588919 nova_compute[225855]: 2026-01-20 15:03:05.954 225859 DEBUG nova.network.neutron [None req-bdcbe869-5a54-47f5-8e01-6c8bd5494045 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.084 225859 DEBUG nova.compute.manager [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.085 225859 DEBUG oslo_concurrency.lockutils [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.085 225859 DEBUG oslo_concurrency.lockutils [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.085 225859 DEBUG oslo_concurrency.lockutils [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.085 225859 DEBUG nova.compute.manager [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.086 225859 WARNING nova.compute.manager [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received unexpected event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.086 225859 DEBUG nova.compute.manager [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.086 225859 DEBUG oslo_concurrency.lockutils [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.086 225859 DEBUG oslo_concurrency.lockutils [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.086 225859 DEBUG oslo_concurrency.lockutils [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.086 225859 DEBUG nova.compute.manager [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:06 np0005588919 nova_compute[225855]: 2026-01-20 15:03:06.087 225859 WARNING nova.compute.manager [req-3351e714-3409-457d-9ef4-48b173385f5c req-46eb7f1d-2331-456f-95fb-2e9cfb0f490b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received unexpected event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 10:03:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:06.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:07 np0005588919 podman[288145]: 2026-01-20 15:03:07.022604731 +0000 UTC m=+0.067551927 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 20 10:03:07 np0005588919 nova_compute[225855]: 2026-01-20 15:03:07.059 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:07.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:07 np0005588919 nova_compute[225855]: 2026-01-20 15:03:07.980 225859 DEBUG nova.network.neutron [None req-bdcbe869-5a54-47f5-8e01-6c8bd5494045 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updating instance_info_cache with network_info: [{"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.010 225859 DEBUG oslo_concurrency.lockutils [None req-bdcbe869-5a54-47f5-8e01-6c8bd5494045 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Releasing lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.011 225859 DEBUG nova.objects.instance [None req-bdcbe869-5a54-47f5-8e01-6c8bd5494045 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'flavor' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:08 np0005588919 kernel: tap244332ba-1b (unregistering): left promiscuous mode
Jan 20 10:03:08 np0005588919 NetworkManager[49104]: <info>  [1768921388.1736] device (tap244332ba-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:03:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:08Z|00645|binding|INFO|Releasing lport 244332ba-1b58-4d42-98b0-245f9460c50f from this chassis (sb_readonly=0)
Jan 20 10:03:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:08Z|00646|binding|INFO|Setting lport 244332ba-1b58-4d42-98b0-245f9460c50f down in Southbound
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.184 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:08Z|00647|binding|INFO|Removing iface tap244332ba-1b ovn-installed in OVS
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.185 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.191 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:36:24 10.100.0.4'], port_security=['fa:16:3e:6f:36:24 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '474cec75-3b01-411a-9074-75859d2a9ddf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=244332ba-1b58-4d42-98b0-245f9460c50f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.193 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 244332ba-1b58-4d42-98b0-245f9460c50f in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec unbound from our chassis#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.195 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.195 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5039fbe9-0b56-4713-a74c-fab2f4882032]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.196 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec namespace which is not needed anymore#033[00m
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.201 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000096.scope: Deactivated successfully.
Jan 20 10:03:08 np0005588919 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000096.scope: Consumed 3.695s CPU time.
Jan 20 10:03:08 np0005588919 systemd-machined[194361]: Machine qemu-75-instance-00000096 terminated.
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.280 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.285 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.294 225859 INFO nova.virt.libvirt.driver [-] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance destroyed successfully.#033[00m
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.295 225859 DEBUG nova.objects.instance [None req-bdcbe869-5a54-47f5-8e01-6c8bd5494045 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'numa_topology' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:08 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288129]: [NOTICE]   (288133) : haproxy version is 2.8.14-c23fe91
Jan 20 10:03:08 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288129]: [NOTICE]   (288133) : path to executable is /usr/sbin/haproxy
Jan 20 10:03:08 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288129]: [WARNING]  (288133) : Exiting Master process...
Jan 20 10:03:08 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288129]: [ALERT]    (288133) : Current worker (288135) exited with code 143 (Terminated)
Jan 20 10:03:08 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288129]: [WARNING]  (288133) : All workers exited. Exiting... (0)
Jan 20 10:03:08 np0005588919 systemd[1]: libpod-7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7.scope: Deactivated successfully.
Jan 20 10:03:08 np0005588919 podman[288192]: 2026-01-20 15:03:08.337168697 +0000 UTC m=+0.050506796 container died 7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:03:08 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7-userdata-shm.mount: Deactivated successfully.
Jan 20 10:03:08 np0005588919 systemd[1]: var-lib-containers-storage-overlay-dee5eb1c0534cfb4c70b46823a0b5c9d68a80b01a6c011e78131e499ba71b19d-merged.mount: Deactivated successfully.
Jan 20 10:03:08 np0005588919 podman[288192]: 2026-01-20 15:03:08.376364633 +0000 UTC m=+0.089702732 container cleanup 7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 10:03:08 np0005588919 kernel: tap244332ba-1b: entered promiscuous mode
Jan 20 10:03:08 np0005588919 NetworkManager[49104]: <info>  [1768921388.3834] manager: (tap244332ba-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/278)
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.383 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 systemd[1]: libpod-conmon-7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7.scope: Deactivated successfully.
Jan 20 10:03:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:08Z|00648|binding|INFO|Claiming lport 244332ba-1b58-4d42-98b0-245f9460c50f for this chassis.
Jan 20 10:03:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:08Z|00649|binding|INFO|244332ba-1b58-4d42-98b0-245f9460c50f: Claiming fa:16:3e:6f:36:24 10.100.0.4
Jan 20 10:03:08 np0005588919 systemd-udevd[288169]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.393 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:36:24 10.100.0.4'], port_security=['fa:16:3e:6f:36:24 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '474cec75-3b01-411a-9074-75859d2a9ddf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=244332ba-1b58-4d42-98b0-245f9460c50f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:08 np0005588919 NetworkManager[49104]: <info>  [1768921388.3960] device (tap244332ba-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:03:08 np0005588919 NetworkManager[49104]: <info>  [1768921388.3968] device (tap244332ba-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:03:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:08Z|00650|binding|INFO|Setting lport 244332ba-1b58-4d42-98b0-245f9460c50f ovn-installed in OVS
Jan 20 10:03:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:08Z|00651|binding|INFO|Setting lport 244332ba-1b58-4d42-98b0-245f9460c50f up in Southbound
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.403 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.406 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 systemd-machined[194361]: New machine qemu-76-instance-00000096.
Jan 20 10:03:08 np0005588919 systemd[1]: Started Virtual Machine qemu-76-instance-00000096.
Jan 20 10:03:08 np0005588919 podman[288241]: 2026-01-20 15:03:08.447196571 +0000 UTC m=+0.044592959 container remove 7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.455 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b1b5d68e-a44d-4c04-a179-18b7495e279f]: (4, ('Tue Jan 20 03:03:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec (7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7)\n7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7\nTue Jan 20 03:03:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec (7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7)\n7e9df3d86ba0ee1b1089109f5e9eeba2832f1bd1a9b80703602546b20ab605e7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.458 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6654953d-a13f-487d-8011-d9d6468a3798]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.459 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:08 np0005588919 kernel: tap671e28d0-00: left promiscuous mode
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.461 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:08.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.475 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.478 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7cd23492-45f6-4226-ab63-1066c0f9ff35]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.494 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cd110acc-2fb1-492d-a51e-33e24f8215b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.495 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5cb0e61e-36e6-41c5-9bbe-58a3db1eb1f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.512 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ea691813-0599-46bd-a88f-d5b7d2318912]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642353, 'reachable_time': 36789, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288263, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 systemd[1]: run-netns-ovnmeta\x2d671e28d0\x2d0b9e\x2d41e0\x2db5e0\x2ddb1ccd4717ec.mount: Deactivated successfully.
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.517 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.517 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[98e7e985-caa6-4fcc-9101-9c74d8eeac0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.518 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 244332ba-1b58-4d42-98b0-245f9460c50f in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec unbound from our chassis#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.520 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.530 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9210d910-ee55-405e-aaad-eddbc90e8e25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.531 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap671e28d0-01 in ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.534 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap671e28d0-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.534 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[52129f3f-6ab2-4537-b331-89f0f2612650]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.535 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[51617bc7-118d-483b-a2aa-e17d9eabb404]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.547 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[a82d2654-bc40-452d-b927-87dc4cae4c14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.567 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d6325685-53bb-451d-8b08-f3404f376c0f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.595 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[769ff5de-fc8d-47bc-85ef-97250f8529bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 NetworkManager[49104]: <info>  [1768921388.6014] manager: (tap671e28d0-00): new Veth device (/org/freedesktop/NetworkManager/Devices/279)
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.600 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[37e19214-88f3-4c00-83ea-36e610a70e5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.641 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[26503e07-036e-49f2-b6b7-2424cf2d8f29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.644 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c98acd26-bb15-4fc6-af44-a5aa848fc080]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 NetworkManager[49104]: <info>  [1768921388.6645] device (tap671e28d0-00): carrier: link connected
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.670 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[54477511-f7d1-4f8f-8167-15bb69b12dfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.685 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dc64bb97-3066-4a1e-9215-2d6350f8128a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642759, 'reachable_time': 32213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288288, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.703 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ddfd07-21a4-4a2d-b89c-e73711c27b7a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2b:4e69'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642759, 'tstamp': 642759}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288289, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.723 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5b96edec-c371-4b21-9b81-fd9401907627]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642759, 'reachable_time': 32213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 288290, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.752 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c2d831-e70e-46bd-9c59-d02218f1cda1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.809 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1b56bc6b-a821-44c4-b585-9bb1a9238f43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.811 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.811 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.812 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671e28d0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.813 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 NetworkManager[49104]: <info>  [1768921388.8144] manager: (tap671e28d0-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Jan 20 10:03:08 np0005588919 kernel: tap671e28d0-00: entered promiscuous mode
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.815 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.816 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671e28d0-00, col_values=(('external_ids', {'iface-id': 'a8628d9e-196f-4b84-89fd-d3a41792b8a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.817 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:08Z|00652|binding|INFO|Releasing lport a8628d9e-196f-4b84-89fd-d3a41792b8a0 from this chassis (sb_readonly=0)
Jan 20 10:03:08 np0005588919 nova_compute[225855]: 2026-01-20 15:03:08.832 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.833 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.834 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5d862f45-9651-4460-917a-999facffe7c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.835 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.pid.haproxy
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:03:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:08.835 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'env', 'PROCESS_TAG=haproxy-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/671e28d0-0b9e-41e0-b5e0-db1ccd4717ec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.067 225859 DEBUG nova.compute.manager [req-c3a5acde-fb08-461b-94cb-65a7665133c6 req-e8c26888-f98d-4b0c-93dc-2cd72015a302 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-unplugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.068 225859 DEBUG oslo_concurrency.lockutils [req-c3a5acde-fb08-461b-94cb-65a7665133c6 req-e8c26888-f98d-4b0c-93dc-2cd72015a302 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.068 225859 DEBUG oslo_concurrency.lockutils [req-c3a5acde-fb08-461b-94cb-65a7665133c6 req-e8c26888-f98d-4b0c-93dc-2cd72015a302 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.068 225859 DEBUG oslo_concurrency.lockutils [req-c3a5acde-fb08-461b-94cb-65a7665133c6 req-e8c26888-f98d-4b0c-93dc-2cd72015a302 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.068 225859 DEBUG nova.compute.manager [req-c3a5acde-fb08-461b-94cb-65a7665133c6 req-e8c26888-f98d-4b0c-93dc-2cd72015a302 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-unplugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.068 225859 WARNING nova.compute.manager [req-c3a5acde-fb08-461b-94cb-65a7665133c6 req-e8c26888-f98d-4b0c-93dc-2cd72015a302 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received unexpected event network-vif-unplugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 10:03:09 np0005588919 podman[288363]: 2026-01-20 15:03:09.218584443 +0000 UTC m=+0.045524096 container create 791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.219 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 474cec75-3b01-411a-9074-75859d2a9ddf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.220 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921389.2193475, 474cec75-3b01-411a-9074-75859d2a9ddf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.220 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.250 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:09 np0005588919 systemd[1]: Started libpod-conmon-791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e.scope.
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.255 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:03:09 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.280 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.280 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921389.2208128, 474cec75-3b01-411a-9074-75859d2a9ddf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.280 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] VM Started (Lifecycle Event)#033[00m
Jan 20 10:03:09 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a9b0c9e2a81eedae01229880b31ef78cadc4986b7f57eb0b4c053b0733a83f7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:03:09 np0005588919 podman[288363]: 2026-01-20 15:03:09.197308552 +0000 UTC m=+0.024248225 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:03:09 np0005588919 podman[288363]: 2026-01-20 15:03:09.29893747 +0000 UTC m=+0.125877133 container init 791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.302 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:09 np0005588919 podman[288363]: 2026-01-20 15:03:09.304756174 +0000 UTC m=+0.131695827 container start 791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.306 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.327 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 10:03:09 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288392]: [NOTICE]   (288401) : New worker (288403) forked
Jan 20 10:03:09 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288392]: [NOTICE]   (288401) : Loading success.
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:09 np0005588919 nova_compute[225855]: 2026-01-20 15:03:09.652 225859 DEBUG nova.compute.manager [None req-bdcbe869-5a54-47f5-8e01-6c8bd5494045 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:09.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:10 np0005588919 nova_compute[225855]: 2026-01-20 15:03:10.001 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:10.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:10 np0005588919 nova_compute[225855]: 2026-01-20 15:03:10.716 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.211 225859 DEBUG nova.compute.manager [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.212 225859 DEBUG oslo_concurrency.lockutils [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.212 225859 DEBUG oslo_concurrency.lockutils [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.212 225859 DEBUG oslo_concurrency.lockutils [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.212 225859 DEBUG nova.compute.manager [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.213 225859 WARNING nova.compute.manager [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received unexpected event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with vm_state active and task_state None.#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.213 225859 DEBUG nova.compute.manager [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.213 225859 DEBUG oslo_concurrency.lockutils [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.213 225859 DEBUG oslo_concurrency.lockutils [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.213 225859 DEBUG oslo_concurrency.lockutils [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.214 225859 DEBUG nova.compute.manager [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.214 225859 WARNING nova.compute.manager [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received unexpected event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with vm_state active and task_state None.#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.214 225859 DEBUG nova.compute.manager [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.214 225859 DEBUG oslo_concurrency.lockutils [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.214 225859 DEBUG oslo_concurrency.lockutils [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.215 225859 DEBUG oslo_concurrency.lockutils [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.215 225859 DEBUG nova.compute.manager [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:11 np0005588919 nova_compute[225855]: 2026-01-20 15:03:11.215 225859 WARNING nova.compute.manager [req-eacff317-3504-4bd7-b253-ff0c6d8b0780 req-5de96166-8d49-4003-bd2a-5fcace0cd512 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received unexpected event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with vm_state active and task_state None.#033[00m
Jan 20 10:03:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:11.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:12 np0005588919 nova_compute[225855]: 2026-01-20 15:03:12.105 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:12.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e337 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:13.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:14.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:15 np0005588919 nova_compute[225855]: 2026-01-20 15:03:15.003 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:15 np0005588919 nova_compute[225855]: 2026-01-20 15:03:15.414 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:15.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:16.423 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:16.424 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:16.424 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:16.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:17 np0005588919 nova_compute[225855]: 2026-01-20 15:03:17.107 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e338 e338: 3 total, 3 up, 3 in
Jan 20 10:03:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:17.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e338 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:18.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e339 e339: 3 total, 3 up, 3 in
Jan 20 10:03:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e340 e340: 3 total, 3 up, 3 in
Jan 20 10:03:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:19.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:20 np0005588919 nova_compute[225855]: 2026-01-20 15:03:20.005 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:20.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:21.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:22 np0005588919 nova_compute[225855]: 2026-01-20 15:03:22.109 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:22.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e341 e341: 3 total, 3 up, 3 in
Jan 20 10:03:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:23.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:24 np0005588919 podman[288521]: 2026-01-20 15:03:24.168806405 +0000 UTC m=+0.087088138 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:03:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:24.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:25 np0005588919 nova_compute[225855]: 2026-01-20 15:03:25.006 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e342 e342: 3 total, 3 up, 3 in
Jan 20 10:03:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:03:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:03:25 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:03:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:25.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:26.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:27 np0005588919 nova_compute[225855]: 2026-01-20 15:03:27.111 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:27.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:28.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:29.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:30 np0005588919 nova_compute[225855]: 2026-01-20 15:03:30.008 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:30.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e343 e343: 3 total, 3 up, 3 in
Jan 20 10:03:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:03:31 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:03:31 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #52. Immutable memtables: 8.
Jan 20 10:03:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:31.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:32 np0005588919 nova_compute[225855]: 2026-01-20 15:03:32.113 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e344 e344: 3 total, 3 up, 3 in
Jan 20 10:03:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:32.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e345 e345: 3 total, 3 up, 3 in
Jan 20 10:03:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:33.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:34.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:35 np0005588919 nova_compute[225855]: 2026-01-20 15:03:35.010 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e346 e346: 3 total, 3 up, 3 in
Jan 20 10:03:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:35.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:36.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:37 np0005588919 nova_compute[225855]: 2026-01-20 15:03:37.039 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:37.042 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:37.043 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:03:37 np0005588919 nova_compute[225855]: 2026-01-20 15:03:37.115 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:37.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e346 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:38 np0005588919 podman[288736]: 2026-01-20 15:03:38.011751066 +0000 UTC m=+0.049912189 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 20 10:03:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:38.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e347 e347: 3 total, 3 up, 3 in
Jan 20 10:03:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:39.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:40 np0005588919 nova_compute[225855]: 2026-01-20 15:03:40.012 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e348 e348: 3 total, 3 up, 3 in
Jan 20 10:03:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:40.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:41.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:42 np0005588919 nova_compute[225855]: 2026-01-20 15:03:42.118 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:42.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:42 np0005588919 nova_compute[225855]: 2026-01-20 15:03:42.876 225859 DEBUG oslo_concurrency.lockutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:42 np0005588919 nova_compute[225855]: 2026-01-20 15:03:42.877 225859 DEBUG oslo_concurrency.lockutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:42 np0005588919 nova_compute[225855]: 2026-01-20 15:03:42.877 225859 DEBUG oslo_concurrency.lockutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:42 np0005588919 nova_compute[225855]: 2026-01-20 15:03:42.878 225859 DEBUG oslo_concurrency.lockutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:42 np0005588919 nova_compute[225855]: 2026-01-20 15:03:42.878 225859 DEBUG oslo_concurrency.lockutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:42 np0005588919 nova_compute[225855]: 2026-01-20 15:03:42.879 225859 INFO nova.compute.manager [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Terminating instance#033[00m
Jan 20 10:03:42 np0005588919 nova_compute[225855]: 2026-01-20 15:03:42.880 225859 DEBUG nova.compute.manager [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:03:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e348 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:43 np0005588919 kernel: tap6216baae-33 (unregistering): left promiscuous mode
Jan 20 10:03:43 np0005588919 NetworkManager[49104]: <info>  [1768921423.1911] device (tap6216baae-33): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:03:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:43Z|00653|binding|INFO|Releasing lport 6216baae-337d-44a3-aa38-60c2afb5d13f from this chassis (sb_readonly=0)
Jan 20 10:03:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:43Z|00654|binding|INFO|Setting lport 6216baae-337d-44a3-aa38-60c2afb5d13f down in Southbound
Jan 20 10:03:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:03:43Z|00655|binding|INFO|Removing iface tap6216baae-33 ovn-installed in OVS
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.200 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.203 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.216 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:43 np0005588919 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000097.scope: Deactivated successfully.
Jan 20 10:03:43 np0005588919 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000097.scope: Consumed 17.836s CPU time.
Jan 20 10:03:43 np0005588919 systemd-machined[194361]: Machine qemu-73-instance-00000097 terminated.
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.321 225859 INFO nova.virt.libvirt.driver [-] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Instance destroyed successfully.
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.322 225859 DEBUG nova.objects.instance [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lazy-loading 'resources' on Instance uuid 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.430 225859 DEBUG nova.compute.manager [req-a3481e62-fec8-4292-9bb1-45fb97d9aaf5 req-9a2cfc97-eb2c-4f0a-ae47-39363ba4a923 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received event network-changed-6216baae-337d-44a3-aa38-60c2afb5d13f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.430 225859 DEBUG nova.compute.manager [req-a3481e62-fec8-4292-9bb1-45fb97d9aaf5 req-9a2cfc97-eb2c-4f0a-ae47-39363ba4a923 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Refreshing instance network info cache due to event network-changed-6216baae-337d-44a3-aa38-60c2afb5d13f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.430 225859 DEBUG oslo_concurrency.lockutils [req-a3481e62-fec8-4292-9bb1-45fb97d9aaf5 req-9a2cfc97-eb2c-4f0a-ae47-39363ba4a923 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.430 225859 DEBUG oslo_concurrency.lockutils [req-a3481e62-fec8-4292-9bb1-45fb97d9aaf5 req-9a2cfc97-eb2c-4f0a-ae47-39363ba4a923 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.430 225859 DEBUG nova.network.neutron [req-a3481e62-fec8-4292-9bb1-45fb97d9aaf5 req-9a2cfc97-eb2c-4f0a-ae47-39363ba4a923 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Refreshing network info cache for port 6216baae-337d-44a3-aa38-60c2afb5d13f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.440 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:b9:ea 10.100.0.12'], port_security=['fa:16:3e:87:b9:ea 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a1d679d5c954662a271e842fe2f2c05', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f11f0ae2-6b78-4d57-a9ea-5a7c52439262', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=773a665f-440e-445e-8ca6-20a8b67e017a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6216baae-337d-44a3-aa38-60c2afb5d13f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.441 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6216baae-337d-44a3-aa38-60c2afb5d13f in datapath 43d3be8f-9be1-4892-bbfe-d0ba2d7157ad unbound from our chassis
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.443 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.444 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5922f7-0d49-4696-89a2-60eaa06cfaa1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.445 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad namespace which is not needed anymore
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.490 225859 DEBUG nova.virt.libvirt.vif [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:01:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-2070424486',display_name='tempest-TestSnapshotPattern-server-2070424486',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-2070424486',id=151,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHt2Pjp5fO1h9ikmCXDj2fSFlpzjIfjh7jCgXMa0An0AiWgQhFRQBExuSvqHDwsNMcN7FUPQzPGoYvUkqz0I21jbk9kMja07pP6W664P26WxVinBA8YoIkVl5tlHownM8g==',key_name='tempest-TestSnapshotPattern-503298877',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:01:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a1d679d5c954662a271e842fe2f2c05',ramdisk_id='',reservation_id='r-4u8oxks9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-1341092631',owner_user_name='tempest-TestSnapshotPattern-1341092631-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:02:25Z,user_data=None,user_id='1654794111844ca88666b3529173e9a7',uuid=2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.490 225859 DEBUG nova.network.os_vif_util [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converting VIF {"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.491 225859 DEBUG nova.network.os_vif_util [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:b9:ea,bridge_name='br-int',has_traffic_filtering=True,id=6216baae-337d-44a3-aa38-60c2afb5d13f,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6216baae-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.492 225859 DEBUG os_vif [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:b9:ea,bridge_name='br-int',has_traffic_filtering=True,id=6216baae-337d-44a3-aa38-60c2afb5d13f,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6216baae-33') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.494 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.495 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6216baae-33, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.498 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.507 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.510 225859 INFO os_vif [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:b9:ea,bridge_name='br-int',has_traffic_filtering=True,id=6216baae-337d-44a3-aa38-60c2afb5d13f,network=Network(43d3be8f-9be1-4892-bbfe-d0ba2d7157ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6216baae-33')#033[00m
Jan 20 10:03:43 np0005588919 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[285889]: [NOTICE]   (285893) : haproxy version is 2.8.14-c23fe91
Jan 20 10:03:43 np0005588919 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[285889]: [NOTICE]   (285893) : path to executable is /usr/sbin/haproxy
Jan 20 10:03:43 np0005588919 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[285889]: [WARNING]  (285893) : Exiting Master process...
Jan 20 10:03:43 np0005588919 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[285889]: [WARNING]  (285893) : Exiting Master process...
Jan 20 10:03:43 np0005588919 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[285889]: [ALERT]    (285893) : Current worker (285895) exited with code 143 (Terminated)
Jan 20 10:03:43 np0005588919 neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad[285889]: [WARNING]  (285893) : All workers exited. Exiting... (0)
Jan 20 10:03:43 np0005588919 systemd[1]: libpod-858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70.scope: Deactivated successfully.
Jan 20 10:03:43 np0005588919 podman[288808]: 2026-01-20 15:03:43.587980371 +0000 UTC m=+0.048347205 container died 858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:03:43 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70-userdata-shm.mount: Deactivated successfully.
Jan 20 10:03:43 np0005588919 systemd[1]: var-lib-containers-storage-overlay-a768a9de3a5bcf02ee70e03b9ac0d04f5f61e6aa1b57b9c32ed35cc799999e46-merged.mount: Deactivated successfully.
Jan 20 10:03:43 np0005588919 podman[288808]: 2026-01-20 15:03:43.629381379 +0000 UTC m=+0.089748203 container cleanup 858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 20 10:03:43 np0005588919 systemd[1]: libpod-conmon-858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70.scope: Deactivated successfully.
Jan 20 10:03:43 np0005588919 podman[288843]: 2026-01-20 15:03:43.700145585 +0000 UTC m=+0.046805001 container remove 858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.706 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1f4cfe09-3e74-4d9a-9ac9-57502967576d]: (4, ('Tue Jan 20 03:03:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad (858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70)\n858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70\nTue Jan 20 03:03:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad (858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70)\n858464b04d58a5cb3b3a1293894336ed3fa1c40b3e83ce1971c3bfe8ff0e8d70\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.707 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[45ac42af-7062-4d46-8341-588b36dfb22b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.708 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43d3be8f-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.710 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:43 np0005588919 kernel: tap43d3be8f-90: left promiscuous mode
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.725 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.727 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8f47a1-2a73-456f-83c8-315d28dd7289]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.754 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bec636d5-ee60-45a4-badd-6ae9e6cf72d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.756 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[20f645c2-31b8-4c56-be7e-6f7094c078c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.771 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[28a9a447-3c6e-420a-80b3-b7caeed14da1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635244, 'reachable_time': 33822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288858, 'error': None, 'target': 'ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:43 np0005588919 systemd[1]: run-netns-ovnmeta\x2d43d3be8f\x2d9be1\x2d4892\x2dbbfe\x2dd0ba2d7157ad.mount: Deactivated successfully.
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.775 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-43d3be8f-9be1-4892-bbfe-d0ba2d7157ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:03:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:43.776 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[3b466783-0f2f-4fcc-a02b-9eca4f4b49e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:03:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:43.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.985 225859 INFO nova.virt.libvirt.driver [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Deleting instance files /var/lib/nova/instances/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_del
Jan 20 10:03:43 np0005588919 nova_compute[225855]: 2026-01-20 15:03:43.986 225859 INFO nova.virt.libvirt.driver [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Deletion of /var/lib/nova/instances/2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1_del complete
Jan 20 10:03:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:44.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.119 225859 DEBUG nova.compute.manager [req-43928d7a-a92d-489b-b123-cc4e9622df67 req-2e5744cb-58da-4641-96fb-7b6c44ba0358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received event network-vif-unplugged-6216baae-337d-44a3-aa38-60c2afb5d13f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.120 225859 DEBUG oslo_concurrency.lockutils [req-43928d7a-a92d-489b-b123-cc4e9622df67 req-2e5744cb-58da-4641-96fb-7b6c44ba0358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.120 225859 DEBUG oslo_concurrency.lockutils [req-43928d7a-a92d-489b-b123-cc4e9622df67 req-2e5744cb-58da-4641-96fb-7b6c44ba0358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.120 225859 DEBUG oslo_concurrency.lockutils [req-43928d7a-a92d-489b-b123-cc4e9622df67 req-2e5744cb-58da-4641-96fb-7b6c44ba0358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.121 225859 DEBUG nova.compute.manager [req-43928d7a-a92d-489b-b123-cc4e9622df67 req-2e5744cb-58da-4641-96fb-7b6c44ba0358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] No waiting events found dispatching network-vif-unplugged-6216baae-337d-44a3-aa38-60c2afb5d13f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.121 225859 DEBUG nova.compute.manager [req-43928d7a-a92d-489b-b123-cc4e9622df67 req-2e5744cb-58da-4641-96fb-7b6c44ba0358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received event network-vif-unplugged-6216baae-337d-44a3-aa38-60c2afb5d13f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.212 225859 INFO nova.compute.manager [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Took 2.33 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.213 225859 DEBUG oslo.service.loopingcall [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.213 225859 DEBUG nova.compute.manager [-] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.214 225859 DEBUG nova.network.neutron [-] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:03:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e349 e349: 3 total, 3 up, 3 in
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.918 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.918 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:45 np0005588919 nova_compute[225855]: 2026-01-20 15:03:45.957 225859 DEBUG nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:03:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:45.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:46 np0005588919 nova_compute[225855]: 2026-01-20 15:03:46.115 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:46 np0005588919 nova_compute[225855]: 2026-01-20 15:03:46.115 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:46 np0005588919 nova_compute[225855]: 2026-01-20 15:03:46.123 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:03:46 np0005588919 nova_compute[225855]: 2026-01-20 15:03:46.123 225859 INFO nova.compute.claims [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:03:46 np0005588919 nova_compute[225855]: 2026-01-20 15:03:46.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:46 np0005588919 nova_compute[225855]: 2026-01-20 15:03:46.409 225859 DEBUG oslo_concurrency.processutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:46.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:46 np0005588919 nova_compute[225855]: 2026-01-20 15:03:46.753 225859 DEBUG nova.network.neutron [-] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:03:46 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/375679175' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:03:46 np0005588919 nova_compute[225855]: 2026-01-20 15:03:46.856 225859 DEBUG oslo_concurrency.processutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:46 np0005588919 nova_compute[225855]: 2026-01-20 15:03:46.862 225859 DEBUG nova.compute.provider_tree [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.008 225859 DEBUG nova.network.neutron [req-a3481e62-fec8-4292-9bb1-45fb97d9aaf5 req-9a2cfc97-eb2c-4f0a-ae47-39363ba4a923 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updated VIF entry in instance network info cache for port 6216baae-337d-44a3-aa38-60c2afb5d13f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.009 225859 DEBUG nova.network.neutron [req-a3481e62-fec8-4292-9bb1-45fb97d9aaf5 req-9a2cfc97-eb2c-4f0a-ae47-39363ba4a923 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updating instance_info_cache with network_info: [{"id": "6216baae-337d-44a3-aa38-60c2afb5d13f", "address": "fa:16:3e:87:b9:ea", "network": {"id": "43d3be8f-9be1-4892-bbfe-d0ba2d7157ad", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1740636070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a1d679d5c954662a271e842fe2f2c05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6216baae-33", "ovs_interfaceid": "6216baae-337d-44a3-aa38-60c2afb5d13f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:03:47.044 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.120 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.682 225859 DEBUG nova.compute.manager [req-d6b6cc3b-38a7-4d76-bf05-85c0e612f3d3 req-c11dd991-7154-4c0d-b404-a77db5b06fb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received event network-vif-deleted-6216baae-337d-44a3-aa38-60c2afb5d13f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.682 225859 INFO nova.compute.manager [req-d6b6cc3b-38a7-4d76-bf05-85c0e612f3d3 req-c11dd991-7154-4c0d-b404-a77db5b06fb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Neutron deleted interface 6216baae-337d-44a3-aa38-60c2afb5d13f; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.683 225859 DEBUG nova.network.neutron [req-d6b6cc3b-38a7-4d76-bf05-85c0e612f3d3 req-c11dd991-7154-4c0d-b404-a77db5b06fb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.706 225859 DEBUG nova.scheduler.client.report [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.762 225859 DEBUG oslo_concurrency.lockutils [req-a3481e62-fec8-4292-9bb1-45fb97d9aaf5 req-9a2cfc97-eb2c-4f0a-ae47-39363ba4a923 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.768 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.769 225859 DEBUG nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.772 225859 INFO nova.compute.manager [-] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Took 2.56 seconds to deallocate network for instance.#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.778 225859 DEBUG nova.compute.manager [req-d6b6cc3b-38a7-4d76-bf05-85c0e612f3d3 req-c11dd991-7154-4c0d-b404-a77db5b06fb2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Detach interface failed, port_id=6216baae-337d-44a3-aa38-60c2afb5d13f, reason: Instance 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.832 225859 DEBUG nova.compute.manager [req-387e23a1-1366-456c-a24c-59ecf75e57ce req-c33055b8-0320-4986-bfee-d8f334dd9690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received event network-vif-plugged-6216baae-337d-44a3-aa38-60c2afb5d13f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.833 225859 DEBUG oslo_concurrency.lockutils [req-387e23a1-1366-456c-a24c-59ecf75e57ce req-c33055b8-0320-4986-bfee-d8f334dd9690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.833 225859 DEBUG oslo_concurrency.lockutils [req-387e23a1-1366-456c-a24c-59ecf75e57ce req-c33055b8-0320-4986-bfee-d8f334dd9690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.833 225859 DEBUG oslo_concurrency.lockutils [req-387e23a1-1366-456c-a24c-59ecf75e57ce req-c33055b8-0320-4986-bfee-d8f334dd9690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.834 225859 DEBUG nova.compute.manager [req-387e23a1-1366-456c-a24c-59ecf75e57ce req-c33055b8-0320-4986-bfee-d8f334dd9690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] No waiting events found dispatching network-vif-plugged-6216baae-337d-44a3-aa38-60c2afb5d13f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.834 225859 WARNING nova.compute.manager [req-387e23a1-1366-456c-a24c-59ecf75e57ce req-c33055b8-0320-4986-bfee-d8f334dd9690 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Received unexpected event network-vif-plugged-6216baae-337d-44a3-aa38-60c2afb5d13f for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.878 225859 DEBUG nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.879 225859 DEBUG nova.network.neutron [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.913 225859 DEBUG oslo_concurrency.lockutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.914 225859 DEBUG oslo_concurrency.lockutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.915 225859 INFO nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:03:47 np0005588919 nova_compute[225855]: 2026-01-20 15:03:47.933 225859 DEBUG nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:03:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:47.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:48 np0005588919 nova_compute[225855]: 2026-01-20 15:03:48.026 225859 INFO nova.virt.block_device [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Booting with volume-backed-image a32b3e07-16d8-46fd-9a7b-c242c432fcf9 at /dev/vda#033[00m
Jan 20 10:03:48 np0005588919 nova_compute[225855]: 2026-01-20 15:03:48.106 225859 DEBUG oslo_concurrency.processutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:48 np0005588919 nova_compute[225855]: 2026-01-20 15:03:48.424 225859 DEBUG nova.policy [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2446e8399b344b29986c1aaf8bf73adf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63555e5851564db08c6429231d264f2c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:03:48 np0005588919 nova_compute[225855]: 2026-01-20 15:03:48.498 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:48.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:03:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1811257346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:03:48 np0005588919 nova_compute[225855]: 2026-01-20 15:03:48.569 225859 DEBUG oslo_concurrency.processutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:48 np0005588919 nova_compute[225855]: 2026-01-20 15:03:48.575 225859 DEBUG nova.compute.provider_tree [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:03:48 np0005588919 nova_compute[225855]: 2026-01-20 15:03:48.712 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:48 np0005588919 nova_compute[225855]: 2026-01-20 15:03:48.712 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:03:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:49.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:50 np0005588919 nova_compute[225855]: 2026-01-20 15:03:50.450 225859 DEBUG nova.scheduler.client.report [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:03:50 np0005588919 nova_compute[225855]: 2026-01-20 15:03:50.495 225859 DEBUG oslo_concurrency.lockutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:50.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:50 np0005588919 nova_compute[225855]: 2026-01-20 15:03:50.708 225859 INFO nova.scheduler.client.report [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Deleted allocations for instance 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1#033[00m
Jan 20 10:03:50 np0005588919 nova_compute[225855]: 2026-01-20 15:03:50.794 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:50 np0005588919 nova_compute[225855]: 2026-01-20 15:03:50.794 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:03:50 np0005588919 nova_compute[225855]: 2026-01-20 15:03:50.794 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:03:50 np0005588919 nova_compute[225855]: 2026-01-20 15:03:50.831 225859 DEBUG oslo_concurrency.lockutils [None req-b9235182-8291-45c2-a2b2-fc1ae2b3ea60 1654794111844ca88666b3529173e9a7 3a1d679d5c954662a271e842fe2f2c05 - - default default] Lock "2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.954s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:51 np0005588919 nova_compute[225855]: 2026-01-20 15:03:51.836 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:03:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:51.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:52 np0005588919 nova_compute[225855]: 2026-01-20 15:03:52.122 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:03:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:52.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:03:52 np0005588919 nova_compute[225855]: 2026-01-20 15:03:52.607 225859 DEBUG nova.network.neutron [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Successfully created port: 22663aa0-a7f4-431c-b5a9-4433da2dff09 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:03:52 np0005588919 nova_compute[225855]: 2026-01-20 15:03:52.978 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:03:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.094 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.095 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.096 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:03:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3930603493' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.396 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.396 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.396 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.397 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.397 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.500 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:03:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1886119184' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:03:53 np0005588919 nova_compute[225855]: 2026-01-20 15:03:53.864 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:53.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:03:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:54.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:03:55 np0005588919 podman[288932]: 2026-01-20 15:03:55.073684442 +0000 UTC m=+0.119617936 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:03:55 np0005588919 nova_compute[225855]: 2026-01-20 15:03:55.469 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000096 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:03:55 np0005588919 nova_compute[225855]: 2026-01-20 15:03:55.470 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000096 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:03:55 np0005588919 nova_compute[225855]: 2026-01-20 15:03:55.634 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:03:55 np0005588919 nova_compute[225855]: 2026-01-20 15:03:55.636 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4295MB free_disk=20.830543518066406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:03:55 np0005588919 nova_compute[225855]: 2026-01-20 15:03:55.636 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:03:55 np0005588919 nova_compute[225855]: 2026-01-20 15:03:55.636 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:03:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:55.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:56.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #109. Immutable memtables: 0.
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.521013) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 67] Flushing memtable with next log file: 109
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436521040, "job": 67, "event": "flush_started", "num_memtables": 1, "num_entries": 2502, "num_deletes": 257, "total_data_size": 5667805, "memory_usage": 5746816, "flush_reason": "Manual Compaction"}
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 67] Level-0 flush table #110: started
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436559362, "cf_name": "default", "job": 67, "event": "table_file_creation", "file_number": 110, "file_size": 3702117, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 55213, "largest_seqno": 57709, "table_properties": {"data_size": 3691912, "index_size": 6507, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21936, "raw_average_key_size": 21, "raw_value_size": 3671190, "raw_average_value_size": 3519, "num_data_blocks": 281, "num_entries": 1043, "num_filter_entries": 1043, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921251, "oldest_key_time": 1768921251, "file_creation_time": 1768921436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 110, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 67] Flush lasted 38426 microseconds, and 7615 cpu microseconds.
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.559430) [db/flush_job.cc:967] [default] [JOB 67] Level-0 flush table #110: 3702117 bytes OK
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.559459) [db/memtable_list.cc:519] [default] Level-0 commit table #110 started
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.562840) [db/memtable_list.cc:722] [default] Level-0 commit table #110: memtable #1 done
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.562920) EVENT_LOG_v1 {"time_micros": 1768921436562907, "job": 67, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.562949) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 67] Try to delete WAL files size 5656670, prev total WAL file size 5656670, number of live WAL files 2.
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000106.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.565562) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034373639' seq:72057594037927935, type:22 .. '7061786F730035303231' seq:0, type:0; will stop at (end)
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 68] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 67 Base level 0, inputs: [110(3615KB)], [108(11MB)]
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436565647, "job": 68, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [110], "files_L6": [108], "score": -1, "input_data_size": 15695572, "oldest_snapshot_seqno": -1}
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 68] Generated table #111: 8531 keys, 13825127 bytes, temperature: kUnknown
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436787564, "cf_name": "default", "job": 68, "event": "table_file_creation", "file_number": 111, "file_size": 13825127, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13765827, "index_size": 36852, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21381, "raw_key_size": 219971, "raw_average_key_size": 25, "raw_value_size": 13611672, "raw_average_value_size": 1595, "num_data_blocks": 1451, "num_entries": 8531, "num_filter_entries": 8531, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 111, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.787817) [db/compaction/compaction_job.cc:1663] [default] [JOB 68] Compacted 1@0 + 1@6 files to L6 => 13825127 bytes
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.790102) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 70.7 rd, 62.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 11.4 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 9063, records dropped: 532 output_compression: NoCompression
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.790126) EVENT_LOG_v1 {"time_micros": 1768921436790117, "job": 68, "event": "compaction_finished", "compaction_time_micros": 221983, "compaction_time_cpu_micros": 35279, "output_level": 6, "num_output_files": 1, "total_output_size": 13825127, "num_input_records": 9063, "num_output_records": 8531, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000110.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436790836, "job": 68, "event": "table_file_deletion", "file_number": 110}
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000108.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921436793131, "job": 68, "event": "table_file_deletion", "file_number": 108}
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.565418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.793223) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.793230) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.793233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.793237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:03:56.793240) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:03:56 np0005588919 nova_compute[225855]: 2026-01-20 15:03:56.796 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 474cec75-3b01-411a-9074-75859d2a9ddf actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:03:56 np0005588919 nova_compute[225855]: 2026-01-20 15:03:56.797 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance c1db561e-0c8b-4cfb-97bb-55f8d4731b87 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:03:56 np0005588919 nova_compute[225855]: 2026-01-20 15:03:56.797 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:03:56 np0005588919 nova_compute[225855]: 2026-01-20 15:03:56.798 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:03:56 np0005588919 nova_compute[225855]: 2026-01-20 15:03:56.843 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:03:56 np0005588919 nova_compute[225855]: 2026-01-20 15:03:56.910 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:03:56 np0005588919 nova_compute[225855]: 2026-01-20 15:03:56.911 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:03:56 np0005588919 nova_compute[225855]: 2026-01-20 15:03:56.930 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:03:56 np0005588919 nova_compute[225855]: 2026-01-20 15:03:56.988 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:03:57 np0005588919 nova_compute[225855]: 2026-01-20 15:03:57.124 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:57 np0005588919 nova_compute[225855]: 2026-01-20 15:03:57.127 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:03:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:03:57 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3864538602' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:03:57 np0005588919 nova_compute[225855]: 2026-01-20 15:03:57.662 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.535s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:03:57 np0005588919 nova_compute[225855]: 2026-01-20 15:03:57.668 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:03:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:03:57.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:03:58 np0005588919 nova_compute[225855]: 2026-01-20 15:03:58.320 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921423.3193264, 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:03:58 np0005588919 nova_compute[225855]: 2026-01-20 15:03:58.321 225859 INFO nova.compute.manager [-] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:03:58 np0005588919 nova_compute[225855]: 2026-01-20 15:03:58.502 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:03:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:03:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:03:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:03:58.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:03:58 np0005588919 nova_compute[225855]: 2026-01-20 15:03:58.536 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:03:58 np0005588919 nova_compute[225855]: 2026-01-20 15:03:58.574 225859 DEBUG nova.compute.manager [None req-7cde4017-05a0-4550-b14a-1a1c814fc584 - - - - - -] [instance: 2b9353e4-2bd0-4a4a-b9a1-dd24929a4af1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:03:58 np0005588919 nova_compute[225855]: 2026-01-20 15:03:58.629 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:03:58 np0005588919 nova_compute[225855]: 2026-01-20 15:03:58.629 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.993s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:03:59 np0005588919 nova_compute[225855]: 2026-01-20 15:03:59.036 225859 DEBUG nova.network.neutron [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Successfully updated port: 22663aa0-a7f4-431c-b5a9-4433da2dff09 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:03:59 np0005588919 nova_compute[225855]: 2026-01-20 15:03:59.104 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:59 np0005588919 nova_compute[225855]: 2026-01-20 15:03:59.104 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquired lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:03:59 np0005588919 nova_compute[225855]: 2026-01-20 15:03:59.104 225859 DEBUG nova.network.neutron [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:03:59 np0005588919 nova_compute[225855]: 2026-01-20 15:03:59.591 225859 DEBUG nova.compute.manager [req-fe260f39-b2f0-46ba-b8ac-a1e47eb5575d req-97a01128-eebf-4fec-8a5c-24cede52db50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-changed-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:03:59 np0005588919 nova_compute[225855]: 2026-01-20 15:03:59.591 225859 DEBUG nova.compute.manager [req-fe260f39-b2f0-46ba-b8ac-a1e47eb5575d req-97a01128-eebf-4fec-8a5c-24cede52db50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Refreshing instance network info cache due to event network-changed-22663aa0-a7f4-431c-b5a9-4433da2dff09. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:03:59 np0005588919 nova_compute[225855]: 2026-01-20 15:03:59.592 225859 DEBUG oslo_concurrency.lockutils [req-fe260f39-b2f0-46ba-b8ac-a1e47eb5575d req-97a01128-eebf-4fec-8a5c-24cede52db50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:03:59 np0005588919 nova_compute[225855]: 2026-01-20 15:03:59.941 225859 DEBUG nova.network.neutron [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:04:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:00.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:00.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:02.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:02 np0005588919 nova_compute[225855]: 2026-01-20 15:04:02.127 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:02.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:02 np0005588919 nova_compute[225855]: 2026-01-20 15:04:02.625 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:02 np0005588919 nova_compute[225855]: 2026-01-20 15:04:02.625 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:03 np0005588919 nova_compute[225855]: 2026-01-20 15:04:03.298 225859 DEBUG nova.network.neutron [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Updating instance_info_cache with network_info: [{"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:03 np0005588919 nova_compute[225855]: 2026-01-20 15:04:03.504 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:04.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:04 np0005588919 nova_compute[225855]: 2026-01-20 15:04:04.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:04 np0005588919 nova_compute[225855]: 2026-01-20 15:04:04.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:04:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:04.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:04 np0005588919 nova_compute[225855]: 2026-01-20 15:04:04.701 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Releasing lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:04 np0005588919 nova_compute[225855]: 2026-01-20 15:04:04.701 225859 DEBUG nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance network_info: |[{"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:04:04 np0005588919 nova_compute[225855]: 2026-01-20 15:04:04.702 225859 DEBUG oslo_concurrency.lockutils [req-fe260f39-b2f0-46ba-b8ac-a1e47eb5575d req-97a01128-eebf-4fec-8a5c-24cede52db50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:04:04 np0005588919 nova_compute[225855]: 2026-01-20 15:04:04.702 225859 DEBUG nova.network.neutron [req-fe260f39-b2f0-46ba-b8ac-a1e47eb5575d req-97a01128-eebf-4fec-8a5c-24cede52db50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Refreshing network info cache for port 22663aa0-a7f4-431c-b5a9-4433da2dff09 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:04:04 np0005588919 nova_compute[225855]: 2026-01-20 15:04:04.718 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:04:05 np0005588919 nova_compute[225855]: 2026-01-20 15:04:05.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:05 np0005588919 nova_compute[225855]: 2026-01-20 15:04:05.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:04:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:06.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:06.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:07 np0005588919 nova_compute[225855]: 2026-01-20 15:04:07.128 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:08.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:08 np0005588919 nova_compute[225855]: 2026-01-20 15:04:08.506 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:08.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:08Z|00656|binding|INFO|Releasing lport a8628d9e-196f-4b84-89fd-d3a41792b8a0 from this chassis (sb_readonly=0)
Jan 20 10:04:08 np0005588919 nova_compute[225855]: 2026-01-20 15:04:08.608 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:08Z|00657|binding|INFO|Releasing lport a8628d9e-196f-4b84-89fd-d3a41792b8a0 from this chassis (sb_readonly=0)
Jan 20 10:04:08 np0005588919 nova_compute[225855]: 2026-01-20 15:04:08.889 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:09 np0005588919 podman[289038]: 2026-01-20 15:04:09.028783139 +0000 UTC m=+0.055238629 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.072 225859 DEBUG os_brick.utils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.073 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.084 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.084 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[cf56d54c-1c88-450a-9219-528da2357c16]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.085 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.093 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.093 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[24ec3c1a-4d08-4507-99dd-2a0eadffc8d3]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.094 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.102 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.103 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[9271998a-6cb6-453a-8ac5-b955a9f9fc55]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.104 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d8197c-0c04-4c92-bf48-7d1c0298c337]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.104 225859 DEBUG oslo_concurrency.processutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.130 225859 DEBUG oslo_concurrency.processutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "nvme version" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.132 225859 DEBUG os_brick.initiator.connectors.lightos [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.133 225859 DEBUG os_brick.initiator.connectors.lightos [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.133 225859 DEBUG os_brick.initiator.connectors.lightos [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.134 225859 DEBUG os_brick.utils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] <== get_connector_properties: return (61ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:04:09 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.134 225859 DEBUG nova.virt.block_device [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Updating existing volume attachment record: 2e3b09a8-1ad6-45d8-9497-37288b706f5c _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:04:10 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.998 225859 DEBUG nova.network.neutron [req-fe260f39-b2f0-46ba-b8ac-a1e47eb5575d req-97a01128-eebf-4fec-8a5c-24cede52db50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Updated VIF entry in instance network info cache for port 22663aa0-a7f4-431c-b5a9-4433da2dff09. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:04:10 np0005588919 nova_compute[225855]: 2026-01-20 15:04:09.999 225859 DEBUG nova.network.neutron [req-fe260f39-b2f0-46ba-b8ac-a1e47eb5575d req-97a01128-eebf-4fec-8a5c-24cede52db50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Updating instance_info_cache with network_info: [{"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:10.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:10 np0005588919 nova_compute[225855]: 2026-01-20 15:04:10.018 225859 DEBUG oslo_concurrency.lockutils [req-fe260f39-b2f0-46ba-b8ac-a1e47eb5575d req-97a01128-eebf-4fec-8a5c-24cede52db50 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:10.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:12.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:12 np0005588919 nova_compute[225855]: 2026-01-20 15:04:12.196 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:12.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.196 225859 DEBUG nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.198 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.198 225859 INFO nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Creating image(s)#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.199 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.199 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Ensure instance console log exists: /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.199 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.199 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.200 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.202 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Start _get_guest_xml network_info=[{"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-9d237554-9581-4577-897a-3907d38a0cb3', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '9d237554-9581-4577-897a-3907d38a0cb3', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c1db561e-0c8b-4cfb-97bb-55f8d4731b87', 'attached_at': '', 'detached_at': '', 'volume_id': '9d237554-9581-4577-897a-3907d38a0cb3', 'serial': '9d237554-9581-4577-897a-3907d38a0cb3'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '2e3b09a8-1ad6-45d8-9497-37288b706f5c', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.206 225859 WARNING nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.212 225859 DEBUG nova.virt.libvirt.host [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.213 225859 DEBUG nova.virt.libvirt.host [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.216 225859 DEBUG nova.virt.libvirt.host [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.216 225859 DEBUG nova.virt.libvirt.host [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.217 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.217 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.218 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.218 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.218 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.218 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.219 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.219 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.219 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.219 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.220 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.220 225859 DEBUG nova.virt.hardware [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.245 225859 DEBUG nova.storage.rbd_utils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.249 225859 DEBUG oslo_concurrency.processutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.509 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:04:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/950145876' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.744 225859 DEBUG oslo_concurrency.processutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.794 225859 DEBUG nova.virt.libvirt.vif [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:03:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1670904176',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-1670904176',id=159,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63555e5851564db08c6429231d264f2c',ramdisk_id='',reservation_id='r-qovaqapn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:03:47Z,user_data=None,user_id='2446e8399b344b29986c1aaf8bf73adf',uuid=c1db561e-0c8b-4cfb-97bb-55f8d4731b87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.795 225859 DEBUG nova.network.os_vif_util [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converting VIF {"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.795 225859 DEBUG nova.network.os_vif_util [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ec:9e,bridge_name='br-int',has_traffic_filtering=True,id=22663aa0-a7f4-431c-b5a9-4433da2dff09,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22663aa0-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.797 225859 DEBUG nova.objects.instance [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'pci_devices' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.820 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <uuid>c1db561e-0c8b-4cfb-97bb-55f8d4731b87</uuid>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <name>instance-0000009f</name>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-1670904176</nova:name>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:04:13</nova:creationTime>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <nova:user uuid="2446e8399b344b29986c1aaf8bf73adf">tempest-ServerBootFromVolumeStableRescueTest-1871371328-project-member</nova:user>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <nova:project uuid="63555e5851564db08c6429231d264f2c">tempest-ServerBootFromVolumeStableRescueTest-1871371328</nova:project>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <nova:port uuid="22663aa0-a7f4-431c-b5a9-4433da2dff09">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <entry name="serial">c1db561e-0c8b-4cfb-97bb-55f8d4731b87</entry>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <entry name="uuid">c1db561e-0c8b-4cfb-97bb-55f8d4731b87</entry>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-9d237554-9581-4577-897a-3907d38a0cb3">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <serial>9d237554-9581-4577-897a-3907d38a0cb3</serial>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:2e:ec:9e"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <target dev="tap22663aa0-a7"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/console.log" append="off"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:04:13 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:04:13 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:04:13 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:04:13 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.821 225859 DEBUG nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Preparing to wait for external event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.821 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.822 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.822 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.822 225859 DEBUG nova.virt.libvirt.vif [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:03:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1670904176',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-1670904176',id=159,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63555e5851564db08c6429231d264f2c',ramdisk_id='',reservation_id='r-qovaqapn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328',owner_user_name='tempest-ServerBoo
tFromVolumeStableRescueTest-1871371328-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:03:47Z,user_data=None,user_id='2446e8399b344b29986c1aaf8bf73adf',uuid=c1db561e-0c8b-4cfb-97bb-55f8d4731b87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.823 225859 DEBUG nova.network.os_vif_util [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converting VIF {"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.823 225859 DEBUG nova.network.os_vif_util [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ec:9e,bridge_name='br-int',has_traffic_filtering=True,id=22663aa0-a7f4-431c-b5a9-4433da2dff09,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22663aa0-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.824 225859 DEBUG os_vif [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ec:9e,bridge_name='br-int',has_traffic_filtering=True,id=22663aa0-a7f4-431c-b5a9-4433da2dff09,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22663aa0-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.824 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.825 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.825 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.827 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.827 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22663aa0-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.828 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap22663aa0-a7, col_values=(('external_ids', {'iface-id': '22663aa0-a7f4-431c-b5a9-4433da2dff09', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:ec:9e', 'vm-uuid': 'c1db561e-0c8b-4cfb-97bb-55f8d4731b87'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.829 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:13 np0005588919 NetworkManager[49104]: <info>  [1768921453.8298] manager: (tap22663aa0-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.831 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.835 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.836 225859 INFO os_vif [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:ec:9e,bridge_name='br-int',has_traffic_filtering=True,id=22663aa0-a7f4-431c-b5a9-4433da2dff09,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22663aa0-a7')#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.946 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.947 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.947 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No VIF found with MAC fa:16:3e:2e:ec:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.947 225859 INFO nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Using config drive#033[00m
Jan 20 10:04:13 np0005588919 nova_compute[225855]: 2026-01-20 15:04:13.970 225859 DEBUG nova.storage.rbd_utils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:14.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e350 e350: 3 total, 3 up, 3 in
Jan 20 10:04:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:14.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.008 225859 INFO nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Creating config drive at /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config#033[00m
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.013 225859 DEBUG oslo_concurrency.processutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvy95zk4h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.143 225859 DEBUG oslo_concurrency.processutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvy95zk4h" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.169 225859 DEBUG nova.storage.rbd_utils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.173 225859 DEBUG oslo_concurrency.processutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #112. Immutable memtables: 0.
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.242519) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 69] Flushing memtable with next log file: 112
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455242542, "job": 69, "event": "flush_started", "num_memtables": 1, "num_entries": 430, "num_deletes": 250, "total_data_size": 493335, "memory_usage": 501616, "flush_reason": "Manual Compaction"}
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 69] Level-0 flush table #113: started
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455245688, "cf_name": "default", "job": 69, "event": "table_file_creation", "file_number": 113, "file_size": 278919, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 57714, "largest_seqno": 58139, "table_properties": {"data_size": 276551, "index_size": 468, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6502, "raw_average_key_size": 20, "raw_value_size": 271740, "raw_average_value_size": 854, "num_data_blocks": 21, "num_entries": 318, "num_filter_entries": 318, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921437, "oldest_key_time": 1768921437, "file_creation_time": 1768921455, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 113, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 69] Flush lasted 3204 microseconds, and 1151 cpu microseconds.
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.245720) [db/flush_job.cc:967] [default] [JOB 69] Level-0 flush table #113: 278919 bytes OK
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.245738) [db/memtable_list.cc:519] [default] Level-0 commit table #113 started
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.248143) [db/memtable_list.cc:722] [default] Level-0 commit table #113: memtable #1 done
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.248161) EVENT_LOG_v1 {"time_micros": 1768921455248155, "job": 69, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.248180) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 69] Try to delete WAL files size 490627, prev total WAL file size 490627, number of live WAL files 2.
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000109.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.248686) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031373535' seq:72057594037927935, type:22 .. '6D6772737461740032303036' seq:0, type:0; will stop at (end)
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 70] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 69 Base level 0, inputs: [113(272KB)], [111(13MB)]
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455248788, "job": 70, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [113], "files_L6": [111], "score": -1, "input_data_size": 14104046, "oldest_snapshot_seqno": -1}
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 70] Generated table #114: 8340 keys, 10313447 bytes, temperature: kUnknown
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455369998, "cf_name": "default", "job": 70, "event": "table_file_creation", "file_number": 114, "file_size": 10313447, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10260102, "index_size": 31409, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20869, "raw_key_size": 216205, "raw_average_key_size": 25, "raw_value_size": 10113916, "raw_average_value_size": 1212, "num_data_blocks": 1224, "num_entries": 8340, "num_filter_entries": 8340, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921455, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 114, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.370250) [db/compaction/compaction_job.cc:1663] [default] [JOB 70] Compacted 1@0 + 1@6 files to L6 => 10313447 bytes
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.372023) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 116.3 rd, 85.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 13.2 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(87.5) write-amplify(37.0) OK, records in: 8849, records dropped: 509 output_compression: NoCompression
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.372040) EVENT_LOG_v1 {"time_micros": 1768921455372032, "job": 70, "event": "compaction_finished", "compaction_time_micros": 121263, "compaction_time_cpu_micros": 29564, "output_level": 6, "num_output_files": 1, "total_output_size": 10313447, "num_input_records": 8849, "num_output_records": 8340, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000113.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455372194, "job": 70, "event": "table_file_deletion", "file_number": 113}
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000111.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921455374653, "job": 70, "event": "table_file_deletion", "file_number": 111}
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.248578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.374718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.374723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.374724) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.374726) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:04:15.374727) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:04:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e351 e351: 3 total, 3 up, 3 in
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.589 225859 DEBUG oslo_concurrency.processutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.590 225859 INFO nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Deleting local config drive /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config because it was imported into RBD.#033[00m
Jan 20 10:04:15 np0005588919 kernel: tap22663aa0-a7: entered promiscuous mode
Jan 20 10:04:15 np0005588919 NetworkManager[49104]: <info>  [1768921455.6482] manager: (tap22663aa0-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/282)
Jan 20 10:04:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:15Z|00658|binding|INFO|Claiming lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 for this chassis.
Jan 20 10:04:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:15Z|00659|binding|INFO|22663aa0-a7f4-431c-b5a9-4433da2dff09: Claiming fa:16:3e:2e:ec:9e 10.100.0.10
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.651 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:15Z|00660|binding|INFO|Setting lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 ovn-installed in OVS
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.670 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.674 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:15Z|00661|binding|INFO|Setting lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 up in Southbound
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.676 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:ec:9e 10.100.0.10'], port_security=['fa:16:3e:2e:ec:9e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c1db561e-0c8b-4cfb-97bb-55f8d4731b87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=22663aa0-a7f4-431c-b5a9-4433da2dff09) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.677 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 22663aa0-a7f4-431c-b5a9-4433da2dff09 in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec bound to our chassis#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.679 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec#033[00m
Jan 20 10:04:15 np0005588919 systemd-udevd[289182]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:04:15 np0005588919 systemd-machined[194361]: New machine qemu-77-instance-0000009f.
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.693 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[14a84b44-a527-4225-8b4d-ed1026d75d95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:15 np0005588919 NetworkManager[49104]: <info>  [1768921455.7026] device (tap22663aa0-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:04:15 np0005588919 NetworkManager[49104]: <info>  [1768921455.7041] device (tap22663aa0-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:04:15 np0005588919 systemd[1]: Started Virtual Machine qemu-77-instance-0000009f.
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.725 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[46eec561-ca79-4607-91ce-1588f5cef2fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.727 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[697850e5-b75b-42ad-92e3-e3ad7fad0b04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.754 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[53640314-522d-42e7-96eb-79ce183f01ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.770 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d851606a-5a46-426a-a5c1-3d9e6450a6c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642759, 'reachable_time': 32213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289196, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.783 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[df115565-6db8-44d9-9369-71b35a7dc5ed]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642770, 'tstamp': 642770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289197, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642773, 'tstamp': 642773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289197, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.785 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.786 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.789 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671e28d0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:15 np0005588919 nova_compute[225855]: 2026-01-20 15:04:15.788 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.789 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.789 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671e28d0-00, col_values=(('external_ids', {'iface-id': 'a8628d9e-196f-4b84-89fd-d3a41792b8a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:15.790 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:16.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.342 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921456.3416662, c1db561e-0c8b-4cfb-97bb-55f8d4731b87 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.342 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] VM Started (Lifecycle Event)#033[00m
Jan 20 10:04:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:16.424 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:16.425 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:16.425 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.436 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.440 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921456.3420632, c1db561e-0c8b-4cfb-97bb-55f8d4731b87 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.440 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.528 225859 DEBUG nova.compute.manager [req-5cacb33e-42b8-4f8d-be61-9c70f4181505 req-9d7d69e4-b163-4f73-b02d-c2c5cbce688e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.529 225859 DEBUG oslo_concurrency.lockutils [req-5cacb33e-42b8-4f8d-be61-9c70f4181505 req-9d7d69e4-b163-4f73-b02d-c2c5cbce688e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.529 225859 DEBUG oslo_concurrency.lockutils [req-5cacb33e-42b8-4f8d-be61-9c70f4181505 req-9d7d69e4-b163-4f73-b02d-c2c5cbce688e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.529 225859 DEBUG oslo_concurrency.lockutils [req-5cacb33e-42b8-4f8d-be61-9c70f4181505 req-9d7d69e4-b163-4f73-b02d-c2c5cbce688e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.530 225859 DEBUG nova.compute.manager [req-5cacb33e-42b8-4f8d-be61-9c70f4181505 req-9d7d69e4-b163-4f73-b02d-c2c5cbce688e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Processing event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.531 225859 DEBUG nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.532 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:16.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.537 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.540 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921456.53564, c1db561e-0c8b-4cfb-97bb-55f8d4731b87 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.541 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.545 225859 INFO nova.virt.libvirt.driver [-] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance spawned successfully.#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.546 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.591 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.591 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.592 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.593 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.593 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.593 225859 DEBUG nova.virt.libvirt.driver [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.599 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.602 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.642 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.671 225859 INFO nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Took 3.47 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.671 225859 DEBUG nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:16 np0005588919 nova_compute[225855]: 2026-01-20 15:04:16.729 225859 INFO nova.compute.manager [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Took 30.64 seconds to build instance.#033[00m
Jan 20 10:04:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e352 e352: 3 total, 3 up, 3 in
Jan 20 10:04:17 np0005588919 nova_compute[225855]: 2026-01-20 15:04:17.199 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:17 np0005588919 nova_compute[225855]: 2026-01-20 15:04:17.458 225859 DEBUG oslo_concurrency.lockutils [None req-9f6f69f3-98d5-40ce-bed7-1eadbfe8563e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 31.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:18.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:18.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:18 np0005588919 nova_compute[225855]: 2026-01-20 15:04:18.758 225859 DEBUG nova.compute.manager [req-593c411e-5cd2-4029-84df-63a51fc14b21 req-2511cc59-080a-437c-b22c-20af8e5eac96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:18 np0005588919 nova_compute[225855]: 2026-01-20 15:04:18.759 225859 DEBUG oslo_concurrency.lockutils [req-593c411e-5cd2-4029-84df-63a51fc14b21 req-2511cc59-080a-437c-b22c-20af8e5eac96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:18 np0005588919 nova_compute[225855]: 2026-01-20 15:04:18.759 225859 DEBUG oslo_concurrency.lockutils [req-593c411e-5cd2-4029-84df-63a51fc14b21 req-2511cc59-080a-437c-b22c-20af8e5eac96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:18 np0005588919 nova_compute[225855]: 2026-01-20 15:04:18.759 225859 DEBUG oslo_concurrency.lockutils [req-593c411e-5cd2-4029-84df-63a51fc14b21 req-2511cc59-080a-437c-b22c-20af8e5eac96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:18 np0005588919 nova_compute[225855]: 2026-01-20 15:04:18.760 225859 DEBUG nova.compute.manager [req-593c411e-5cd2-4029-84df-63a51fc14b21 req-2511cc59-080a-437c-b22c-20af8e5eac96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:18 np0005588919 nova_compute[225855]: 2026-01-20 15:04:18.760 225859 WARNING nova.compute.manager [req-593c411e-5cd2-4029-84df-63a51fc14b21 req-2511cc59-080a-437c-b22c-20af8e5eac96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received unexpected event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:04:18 np0005588919 nova_compute[225855]: 2026-01-20 15:04:18.830 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:20.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:20 np0005588919 nova_compute[225855]: 2026-01-20 15:04:20.242 225859 INFO nova.compute.manager [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Rescuing#033[00m
Jan 20 10:04:20 np0005588919 nova_compute[225855]: 2026-01-20 15:04:20.242 225859 DEBUG oslo_concurrency.lockutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:04:20 np0005588919 nova_compute[225855]: 2026-01-20 15:04:20.243 225859 DEBUG oslo_concurrency.lockutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquired lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:04:20 np0005588919 nova_compute[225855]: 2026-01-20 15:04:20.243 225859 DEBUG nova.network.neutron [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:04:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:20.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:22.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:22 np0005588919 nova_compute[225855]: 2026-01-20 15:04:22.200 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:22.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:22 np0005588919 nova_compute[225855]: 2026-01-20 15:04:22.917 225859 DEBUG nova.network.neutron [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Updating instance_info_cache with network_info: [{"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:22 np0005588919 nova_compute[225855]: 2026-01-20 15:04:22.972 225859 DEBUG oslo_concurrency.lockutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Releasing lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e352 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:23 np0005588919 nova_compute[225855]: 2026-01-20 15:04:23.641 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:04:23 np0005588919 nova_compute[225855]: 2026-01-20 15:04:23.832 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:24.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:24.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e353 e353: 3 total, 3 up, 3 in
Jan 20 10:04:26 np0005588919 podman[289297]: 2026-01-20 15:04:26.037882476 +0000 UTC m=+0.083066685 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:04:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:26.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:26.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:27 np0005588919 nova_compute[225855]: 2026-01-20 15:04:27.202 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:28.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:28.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:28 np0005588919 nova_compute[225855]: 2026-01-20 15:04:28.835 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:30.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:30 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:30Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2e:ec:9e 10.100.0.10
Jan 20 10:04:30 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:30Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2e:ec:9e 10.100.0.10
Jan 20 10:04:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:30.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:32.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:32 np0005588919 nova_compute[225855]: 2026-01-20 15:04:32.206 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:04:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:04:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:04:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:32.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:33 np0005588919 nova_compute[225855]: 2026-01-20 15:04:33.690 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:04:33 np0005588919 nova_compute[225855]: 2026-01-20 15:04:33.838 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:34.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:34.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:36.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:36 np0005588919 kernel: tap22663aa0-a7 (unregistering): left promiscuous mode
Jan 20 10:04:36 np0005588919 NetworkManager[49104]: <info>  [1768921476.2500] device (tap22663aa0-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:04:36 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:36Z|00662|binding|INFO|Releasing lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 from this chassis (sb_readonly=0)
Jan 20 10:04:36 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:36Z|00663|binding|INFO|Setting lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 down in Southbound
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.257 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:36 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:36Z|00664|binding|INFO|Removing iface tap22663aa0-a7 ovn-installed in OVS
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.259 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.268 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:ec:9e 10.100.0.10'], port_security=['fa:16:3e:2e:ec:9e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c1db561e-0c8b-4cfb-97bb-55f8d4731b87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=22663aa0-a7f4-431c-b5a9-4433da2dff09) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.269 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 22663aa0-a7f4-431c-b5a9-4433da2dff09 in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec unbound from our chassis#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.272 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.273 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.295 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6d12d7f8-b71d-4a86-b219-39a3a6099672]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.322 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[72aa32b0-dae8-4728-91a7-d5a3f795d925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:36 np0005588919 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d0000009f.scope: Deactivated successfully.
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.326 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[04b0ca05-35ea-4dc9-ae3d-2f55f8f3e429]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:36 np0005588919 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d0000009f.scope: Consumed 13.682s CPU time.
Jan 20 10:04:36 np0005588919 systemd-machined[194361]: Machine qemu-77-instance-0000009f terminated.
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.355 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c11b4ec7-9106-4a47-bdb2-991c6f9a35ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.372 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[58b7452c-c3e2-4330-9405-b99d83d0db03]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642759, 'reachable_time': 32213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289472, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.385 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b463cf19-aad8-45bb-9305-4af88fb68f7c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642770, 'tstamp': 642770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289473, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642773, 'tstamp': 642773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289473, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.387 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.389 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.392 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.393 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671e28d0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.394 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.394 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671e28d0-00, col_values=(('external_ids', {'iface-id': 'a8628d9e-196f-4b84-89fd-d3a41792b8a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:36.394 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:36.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.702 225859 INFO nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.708 225859 INFO nova.virt.libvirt.driver [-] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance destroyed successfully.#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.708 225859 DEBUG nova.objects.instance [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'numa_topology' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.799 225859 INFO nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Attempting a stable device rescue#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.914 225859 DEBUG nova.compute.manager [req-1c14e547-8f6a-4df8-9f37-5cda5a7abeb1 req-ddf6430c-e623-4953-918f-9d7ef43e199c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-unplugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.915 225859 DEBUG oslo_concurrency.lockutils [req-1c14e547-8f6a-4df8-9f37-5cda5a7abeb1 req-ddf6430c-e623-4953-918f-9d7ef43e199c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.915 225859 DEBUG oslo_concurrency.lockutils [req-1c14e547-8f6a-4df8-9f37-5cda5a7abeb1 req-ddf6430c-e623-4953-918f-9d7ef43e199c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.915 225859 DEBUG oslo_concurrency.lockutils [req-1c14e547-8f6a-4df8-9f37-5cda5a7abeb1 req-ddf6430c-e623-4953-918f-9d7ef43e199c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.916 225859 DEBUG nova.compute.manager [req-1c14e547-8f6a-4df8-9f37-5cda5a7abeb1 req-ddf6430c-e623-4953-918f-9d7ef43e199c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-unplugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:36 np0005588919 nova_compute[225855]: 2026-01-20 15:04:36.916 225859 WARNING nova.compute.manager [req-1c14e547-8f6a-4df8-9f37-5cda5a7abeb1 req-ddf6430c-e623-4953-918f-9d7ef43e199c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received unexpected event network-vif-unplugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:04:37 np0005588919 nova_compute[225855]: 2026-01-20 15:04:37.208 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:37 np0005588919 nova_compute[225855]: 2026-01-20 15:04:37.286 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 10:04:37 np0005588919 nova_compute[225855]: 2026-01-20 15:04:37.290 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 10:04:37 np0005588919 nova_compute[225855]: 2026-01-20 15:04:37.290 225859 INFO nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Creating image(s)#033[00m
Jan 20 10:04:37 np0005588919 nova_compute[225855]: 2026-01-20 15:04:37.314 225859 DEBUG nova.storage.rbd_utils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:37 np0005588919 nova_compute[225855]: 2026-01-20 15:04:37.318 225859 DEBUG nova.objects.instance [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'trusted_certs' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:37 np0005588919 nova_compute[225855]: 2026-01-20 15:04:37.383 225859 DEBUG nova.storage.rbd_utils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:37 np0005588919 nova_compute[225855]: 2026-01-20 15:04:37.405 225859 DEBUG nova.storage.rbd_utils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:37 np0005588919 nova_compute[225855]: 2026-01-20 15:04:37.409 225859 DEBUG oslo_concurrency.lockutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "132c88f1a4a6a63e2a2024a3c1506ff21c276bf0" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:37 np0005588919 nova_compute[225855]: 2026-01-20 15:04:37.410 225859 DEBUG oslo_concurrency.lockutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "132c88f1a4a6a63e2a2024a3c1506ff21c276bf0" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:38.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.123 225859 DEBUG nova.virt.libvirt.imagebackend [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/7f0d068e-5d2b-485d-b65c-7244508ab6b6/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/7f0d068e-5d2b-485d-b65c-7244508ab6b6/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.270 225859 DEBUG nova.virt.libvirt.imagebackend [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Selected location: {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/7f0d068e-5d2b-485d-b65c-7244508ab6b6/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.272 225859 DEBUG nova.storage.rbd_utils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] cloning images/7f0d068e-5d2b-485d-b65c-7244508ab6b6@snap to None/c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.390 225859 DEBUG oslo_concurrency.lockutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "132c88f1a4a6a63e2a2024a3c1506ff21c276bf0" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.980s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.437 225859 DEBUG nova.objects.instance [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'migration_context' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.454 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.457 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Start _get_guest_xml network_info=[{"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "vif_mac": "fa:16:3e:2e:ec:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '7f0d068e-5d2b-485d-b65c-7244508ab6b6', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-9d237554-9581-4577-897a-3907d38a0cb3', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '9d237554-9581-4577-897a-3907d38a0cb3', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c1db561e-0c8b-4cfb-97bb-55f8d4731b87', 'attached_at': '', 'detached_at': '', 'volume_id': '9d237554-9581-4577-897a-3907d38a0cb3', 'serial': '9d237554-9581-4577-897a-3907d38a0cb3'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '2e3b09a8-1ad6-45d8-9497-37288b706f5c', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.457 225859 DEBUG nova.objects.instance [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'resources' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.488 225859 WARNING nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.504 225859 DEBUG nova.virt.libvirt.host [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.505 225859 DEBUG nova.virt.libvirt.host [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.516 225859 DEBUG nova.virt.libvirt.host [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.516 225859 DEBUG nova.virt.libvirt.host [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.518 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.518 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.518 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.519 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.519 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.519 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.519 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.520 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.520 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.520 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.520 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.520 225859 DEBUG nova.virt.hardware [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.521 225859 DEBUG nova.objects.instance [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'vcpu_model' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:38.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.589 225859 DEBUG oslo_concurrency.processutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:38.828 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:38.830 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.830 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:38 np0005588919 nova_compute[225855]: 2026-01-20 15:04:38.838 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:04:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3218525619' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.057 225859 DEBUG oslo_concurrency.processutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.077 225859 DEBUG nova.compute.manager [req-e89e0823-e6ca-46e1-8a61-9d860589d0f9 req-541487a6-4201-47e8-b49f-1e3d8e6a9fee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.077 225859 DEBUG oslo_concurrency.lockutils [req-e89e0823-e6ca-46e1-8a61-9d860589d0f9 req-541487a6-4201-47e8-b49f-1e3d8e6a9fee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.077 225859 DEBUG oslo_concurrency.lockutils [req-e89e0823-e6ca-46e1-8a61-9d860589d0f9 req-541487a6-4201-47e8-b49f-1e3d8e6a9fee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.078 225859 DEBUG oslo_concurrency.lockutils [req-e89e0823-e6ca-46e1-8a61-9d860589d0f9 req-541487a6-4201-47e8-b49f-1e3d8e6a9fee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.078 225859 DEBUG nova.compute.manager [req-e89e0823-e6ca-46e1-8a61-9d860589d0f9 req-541487a6-4201-47e8-b49f-1e3d8e6a9fee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.078 225859 WARNING nova.compute.manager [req-e89e0823-e6ca-46e1-8a61-9d860589d0f9 req-541487a6-4201-47e8-b49f-1e3d8e6a9fee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received unexpected event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.085 225859 DEBUG oslo_concurrency.processutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:04:39 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:04:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:04:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1912797042' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.727 225859 DEBUG oslo_concurrency.processutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.643s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.729 225859 DEBUG nova.virt.libvirt.vif [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:03:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1670904176',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-1670904176',id=159,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:04:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63555e5851564db08c6429231d264f2c',ramdisk_id='',reservation_id='r-qovaqapn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:04:16Z,user_data=None,user_id='2446e8399b344b29986c1aaf8bf73adf',uuid=c1db561e-0c8b-4cfb-97bb-55f8d4731b87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "vif_mac": "fa:16:3e:2e:ec:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.730 225859 DEBUG nova.network.os_vif_util [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converting VIF {"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "vif_mac": "fa:16:3e:2e:ec:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.731 225859 DEBUG nova.network.os_vif_util [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2e:ec:9e,bridge_name='br-int',has_traffic_filtering=True,id=22663aa0-a7f4-431c-b5a9-4433da2dff09,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22663aa0-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.732 225859 DEBUG nova.objects.instance [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'pci_devices' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.760 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <uuid>c1db561e-0c8b-4cfb-97bb-55f8d4731b87</uuid>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <name>instance-0000009f</name>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-1670904176</nova:name>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:04:38</nova:creationTime>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <nova:user uuid="2446e8399b344b29986c1aaf8bf73adf">tempest-ServerBootFromVolumeStableRescueTest-1871371328-project-member</nova:user>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <nova:project uuid="63555e5851564db08c6429231d264f2c">tempest-ServerBootFromVolumeStableRescueTest-1871371328</nova:project>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <nova:port uuid="22663aa0-a7f4-431c-b5a9-4433da2dff09">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <entry name="serial">c1db561e-0c8b-4cfb-97bb-55f8d4731b87</entry>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <entry name="uuid">c1db561e-0c8b-4cfb-97bb-55f8d4731b87</entry>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-9d237554-9581-4577-897a-3907d38a0cb3">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <serial>9d237554-9581-4577-897a-3907d38a0cb3</serial>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.rescue">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <target dev="vdb" bus="virtio"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <boot order="1"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:2e:ec:9e"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <target dev="tap22663aa0-a7"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/console.log" append="off"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:04:39 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:04:39 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:04:39 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:04:39 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.768 225859 INFO nova.virt.libvirt.driver [-] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance destroyed successfully.#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.829 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.829 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.829 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.830 225859 DEBUG nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] No VIF found with MAC fa:16:3e:2e:ec:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.830 225859 INFO nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Using config drive#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.853 225859 DEBUG nova.storage.rbd_utils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:39 np0005588919 podman[289791]: 2026-01-20 15:04:39.87100559 +0000 UTC m=+0.064550682 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.889 225859 DEBUG nova.objects.instance [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'ec2_ids' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:39 np0005588919 nova_compute[225855]: 2026-01-20 15:04:39.924 225859 DEBUG nova.objects.instance [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'keypairs' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:40.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:40.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:40 np0005588919 nova_compute[225855]: 2026-01-20 15:04:40.649 225859 INFO nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Creating config drive at /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config.rescue#033[00m
Jan 20 10:04:40 np0005588919 nova_compute[225855]: 2026-01-20 15:04:40.654 225859 DEBUG oslo_concurrency.processutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczuypwxj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:40 np0005588919 nova_compute[225855]: 2026-01-20 15:04:40.797 225859 DEBUG oslo_concurrency.processutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczuypwxj" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:40 np0005588919 nova_compute[225855]: 2026-01-20 15:04:40.827 225859 DEBUG nova.storage.rbd_utils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] rbd image c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:04:40 np0005588919 nova_compute[225855]: 2026-01-20 15:04:40.831 225859 DEBUG oslo_concurrency.processutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config.rescue c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.007 225859 DEBUG oslo_concurrency.processutils [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config.rescue c1db561e-0c8b-4cfb-97bb-55f8d4731b87_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.007 225859 INFO nova.virt.libvirt.driver [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Deleting local config drive /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87/disk.config.rescue because it was imported into RBD.#033[00m
Jan 20 10:04:41 np0005588919 kernel: tap22663aa0-a7: entered promiscuous mode
Jan 20 10:04:41 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:41Z|00665|binding|INFO|Claiming lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 for this chassis.
Jan 20 10:04:41 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:41Z|00666|binding|INFO|22663aa0-a7f4-431c-b5a9-4433da2dff09: Claiming fa:16:3e:2e:ec:9e 10.100.0.10
Jan 20 10:04:41 np0005588919 NetworkManager[49104]: <info>  [1768921481.0575] manager: (tap22663aa0-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.056 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:41 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:41Z|00667|binding|INFO|Setting lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 ovn-installed in OVS
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.075 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:41 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:41Z|00668|binding|INFO|Setting lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 up in Southbound
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.078 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:ec:9e 10.100.0.10'], port_security=['fa:16:3e:2e:ec:9e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c1db561e-0c8b-4cfb-97bb-55f8d4731b87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=22663aa0-a7f4-431c-b5a9-4433da2dff09) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.079 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 22663aa0-a7f4-431c-b5a9-4433da2dff09 in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec bound to our chassis#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.081 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.081 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec#033[00m
Jan 20 10:04:41 np0005588919 systemd-udevd[289879]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.097 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d7d3c5dc-7ec4-4ae0-ac7f-97026790497e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:41 np0005588919 systemd-machined[194361]: New machine qemu-78-instance-0000009f.
Jan 20 10:04:41 np0005588919 NetworkManager[49104]: <info>  [1768921481.1031] device (tap22663aa0-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:04:41 np0005588919 NetworkManager[49104]: <info>  [1768921481.1039] device (tap22663aa0-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:04:41 np0005588919 systemd[1]: Started Virtual Machine qemu-78-instance-0000009f.
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.131 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[0280643d-c3e9-4448-b83d-b1fefcc47c93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.136 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3189c29a-9f92-42d4-bbfd-aaf06066af3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.163 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4e2d02-ac89-4db5-96b6-b96835e79aaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.179 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[035f5a6d-aee7-4df9-8efa-f9f4ff085b3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 9, 'rx_bytes': 532, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 9, 'rx_bytes': 532, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642759, 'reachable_time': 32213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289892, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.199 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a2364a94-4972-471a-8fc6-8d141c72fc63]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642770, 'tstamp': 642770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289894, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642773, 'tstamp': 642773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289894, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.201 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.203 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.204 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.204 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671e28d0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.205 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.205 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671e28d0-00, col_values=(('external_ids', {'iface-id': 'a8628d9e-196f-4b84-89fd-d3a41792b8a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:41.205 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.666 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for c1db561e-0c8b-4cfb-97bb-55f8d4731b87 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.668 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921481.665639, c1db561e-0c8b-4cfb-97bb-55f8d4731b87 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.668 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.677 225859 DEBUG nova.compute.manager [None req-b0daa2a6-7c5c-4d0f-9607-8a8fbe9e85a0 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.722 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.726 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.755 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.756 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921481.6660094, c1db561e-0c8b-4cfb-97bb-55f8d4731b87 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.756 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] VM Started (Lifecycle Event)#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.812 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:41 np0005588919 nova_compute[225855]: 2026-01-20 15:04:41.816 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:42.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:42 np0005588919 nova_compute[225855]: 2026-01-20 15:04:42.210 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:42 np0005588919 nova_compute[225855]: 2026-01-20 15:04:42.329 225859 DEBUG nova.compute.manager [req-fe92bba3-7e00-4163-b400-ede3e41863e4 req-9cff9c6f-ae26-433d-8db1-e5a1a6137130 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:42 np0005588919 nova_compute[225855]: 2026-01-20 15:04:42.330 225859 DEBUG oslo_concurrency.lockutils [req-fe92bba3-7e00-4163-b400-ede3e41863e4 req-9cff9c6f-ae26-433d-8db1-e5a1a6137130 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:42 np0005588919 nova_compute[225855]: 2026-01-20 15:04:42.331 225859 DEBUG oslo_concurrency.lockutils [req-fe92bba3-7e00-4163-b400-ede3e41863e4 req-9cff9c6f-ae26-433d-8db1-e5a1a6137130 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:42 np0005588919 nova_compute[225855]: 2026-01-20 15:04:42.331 225859 DEBUG oslo_concurrency.lockutils [req-fe92bba3-7e00-4163-b400-ede3e41863e4 req-9cff9c6f-ae26-433d-8db1-e5a1a6137130 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:42 np0005588919 nova_compute[225855]: 2026-01-20 15:04:42.331 225859 DEBUG nova.compute.manager [req-fe92bba3-7e00-4163-b400-ede3e41863e4 req-9cff9c6f-ae26-433d-8db1-e5a1a6137130 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:42 np0005588919 nova_compute[225855]: 2026-01-20 15:04:42.331 225859 WARNING nova.compute.manager [req-fe92bba3-7e00-4163-b400-ede3e41863e4 req-9cff9c6f-ae26-433d-8db1-e5a1a6137130 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received unexpected event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with vm_state rescued and task_state None.#033[00m
Jan 20 10:04:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:42.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:43 np0005588919 nova_compute[225855]: 2026-01-20 15:04:43.840 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:44.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:44 np0005588919 nova_compute[225855]: 2026-01-20 15:04:44.403 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:44 np0005588919 nova_compute[225855]: 2026-01-20 15:04:44.404 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:04:44 np0005588919 nova_compute[225855]: 2026-01-20 15:04:44.515 225859 DEBUG nova.compute.manager [req-0d3777e8-a9ea-4f8a-8a76-b2b5aa3c1f40 req-d260426f-7eec-42b0-b134-23d162b144d4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:44 np0005588919 nova_compute[225855]: 2026-01-20 15:04:44.517 225859 DEBUG oslo_concurrency.lockutils [req-0d3777e8-a9ea-4f8a-8a76-b2b5aa3c1f40 req-d260426f-7eec-42b0-b134-23d162b144d4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:44 np0005588919 nova_compute[225855]: 2026-01-20 15:04:44.517 225859 DEBUG oslo_concurrency.lockutils [req-0d3777e8-a9ea-4f8a-8a76-b2b5aa3c1f40 req-d260426f-7eec-42b0-b134-23d162b144d4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:44 np0005588919 nova_compute[225855]: 2026-01-20 15:04:44.517 225859 DEBUG oslo_concurrency.lockutils [req-0d3777e8-a9ea-4f8a-8a76-b2b5aa3c1f40 req-d260426f-7eec-42b0-b134-23d162b144d4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:44 np0005588919 nova_compute[225855]: 2026-01-20 15:04:44.518 225859 DEBUG nova.compute.manager [req-0d3777e8-a9ea-4f8a-8a76-b2b5aa3c1f40 req-d260426f-7eec-42b0-b134-23d162b144d4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:44 np0005588919 nova_compute[225855]: 2026-01-20 15:04:44.518 225859 WARNING nova.compute.manager [req-0d3777e8-a9ea-4f8a-8a76-b2b5aa3c1f40 req-d260426f-7eec-42b0-b134-23d162b144d4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received unexpected event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with vm_state rescued and task_state None.#033[00m
Jan 20 10:04:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:44.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:44.832 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:45 np0005588919 nova_compute[225855]: 2026-01-20 15:04:45.104 225859 INFO nova.compute.manager [None req-06ed0cbf-d946-4a7b-81ed-059b0fd3853e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Unrescuing#033[00m
Jan 20 10:04:45 np0005588919 nova_compute[225855]: 2026-01-20 15:04:45.105 225859 DEBUG oslo_concurrency.lockutils [None req-06ed0cbf-d946-4a7b-81ed-059b0fd3853e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:04:45 np0005588919 nova_compute[225855]: 2026-01-20 15:04:45.105 225859 DEBUG oslo_concurrency.lockutils [None req-06ed0cbf-d946-4a7b-81ed-059b0fd3853e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquired lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:04:45 np0005588919 nova_compute[225855]: 2026-01-20 15:04:45.105 225859 DEBUG nova.network.neutron [None req-06ed0cbf-d946-4a7b-81ed-059b0fd3853e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:04:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:46.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:46 np0005588919 nova_compute[225855]: 2026-01-20 15:04:46.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e354 e354: 3 total, 3 up, 3 in
Jan 20 10:04:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:46.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:47 np0005588919 nova_compute[225855]: 2026-01-20 15:04:47.212 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:47 np0005588919 nova_compute[225855]: 2026-01-20 15:04:47.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:48.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:48 np0005588919 nova_compute[225855]: 2026-01-20 15:04:48.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:48.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:48 np0005588919 nova_compute[225855]: 2026-01-20 15:04:48.844 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.579 225859 DEBUG nova.network.neutron [None req-06ed0cbf-d946-4a7b-81ed-059b0fd3853e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Updating instance_info_cache with network_info: [{"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.637 225859 DEBUG oslo_concurrency.lockutils [None req-06ed0cbf-d946-4a7b-81ed-059b0fd3853e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Releasing lock "refresh_cache-c1db561e-0c8b-4cfb-97bb-55f8d4731b87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.638 225859 DEBUG nova.objects.instance [None req-06ed0cbf-d946-4a7b-81ed-059b0fd3853e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'flavor' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:49 np0005588919 kernel: tap22663aa0-a7 (unregistering): left promiscuous mode
Jan 20 10:04:49 np0005588919 NetworkManager[49104]: <info>  [1768921489.7742] device (tap22663aa0-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.781 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:49 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:49Z|00669|binding|INFO|Releasing lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 from this chassis (sb_readonly=0)
Jan 20 10:04:49 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:49Z|00670|binding|INFO|Setting lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 down in Southbound
Jan 20 10:04:49 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:49Z|00671|binding|INFO|Removing iface tap22663aa0-a7 ovn-installed in OVS
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.783 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.801 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.822 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:ec:9e 10.100.0.10'], port_security=['fa:16:3e:2e:ec:9e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c1db561e-0c8b-4cfb-97bb-55f8d4731b87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=22663aa0-a7f4-431c-b5a9-4433da2dff09) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.823 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 22663aa0-a7f4-431c-b5a9-4433da2dff09 in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec unbound from our chassis#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.825 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec#033[00m
Jan 20 10:04:49 np0005588919 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d0000009f.scope: Deactivated successfully.
Jan 20 10:04:49 np0005588919 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d0000009f.scope: Consumed 8.934s CPU time.
Jan 20 10:04:49 np0005588919 systemd-machined[194361]: Machine qemu-78-instance-0000009f terminated.
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.842 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7d478294-3839-46f8-a893-125bc908d6ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.873 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e16032-8a1f-4a46-8fbf-5e883b325846]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.878 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e70a0e25-5834-4b2e-8683-15cea42b974f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.905 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5e74f77e-676c-4090-9f4f-68fc33f90579]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.931 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4035c115-9c83-4ef3-be3f-49db93cf76ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 11, 'rx_bytes': 532, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 11, 'rx_bytes': 532, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642759, 'reachable_time': 32213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289971, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.954 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4bd608-1131-4728-a579-d110351a63f5]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642770, 'tstamp': 642770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289974, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642773, 'tstamp': 642773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289974, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.956 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.958 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.962 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671e28d0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.961 225859 INFO nova.virt.libvirt.driver [-] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance destroyed successfully.#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.961 225859 DEBUG nova.objects.instance [None req-06ed0cbf-d946-4a7b-81ed-059b0fd3853e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'numa_topology' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.962 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.962 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671e28d0-00, col_values=(('external_ids', {'iface-id': 'a8628d9e-196f-4b84-89fd-d3a41792b8a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:49.963 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:49 np0005588919 nova_compute[225855]: 2026-01-20 15:04:49.963 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:50.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:50 np0005588919 kernel: tap22663aa0-a7: entered promiscuous mode
Jan 20 10:04:50 np0005588919 systemd-udevd[289962]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:04:50 np0005588919 NetworkManager[49104]: <info>  [1768921490.1803] manager: (tap22663aa0-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/284)
Jan 20 10:04:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:50Z|00672|binding|INFO|Claiming lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 for this chassis.
Jan 20 10:04:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:50Z|00673|binding|INFO|22663aa0-a7f4-431c-b5a9-4433da2dff09: Claiming fa:16:3e:2e:ec:9e 10.100.0.10
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.180 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:50 np0005588919 NetworkManager[49104]: <info>  [1768921490.1903] device (tap22663aa0-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:04:50 np0005588919 NetworkManager[49104]: <info>  [1768921490.1910] device (tap22663aa0-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:04:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:50Z|00674|binding|INFO|Setting lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 ovn-installed in OVS
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.199 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.199 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.199 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.199 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.201 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:50 np0005588919 systemd-machined[194361]: New machine qemu-79-instance-0000009f.
Jan 20 10:04:50 np0005588919 systemd[1]: Started Virtual Machine qemu-79-instance-0000009f.
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.346 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:ec:9e 10.100.0.10'], port_security=['fa:16:3e:2e:ec:9e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c1db561e-0c8b-4cfb-97bb-55f8d4731b87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=22663aa0-a7f4-431c-b5a9-4433da2dff09) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:50Z|00675|binding|INFO|Setting lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 up in Southbound
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.348 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 22663aa0-a7f4-431c-b5a9-4433da2dff09 in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec bound to our chassis#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.349 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.363 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7806279d-da7b-4a6b-9088-aac96a7f29eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.396 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d47c37ba-5f9f-41a8-bd48-0218b4a063bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.399 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad1a253-4891-43fc-932e-33c44affd2ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.423 225859 DEBUG nova.compute.manager [req-c15dff0f-234c-4e23-9290-c6a4e8977476 req-5ed0313a-c8ae-4859-b39f-c33d37781ff4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-unplugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.424 225859 DEBUG oslo_concurrency.lockutils [req-c15dff0f-234c-4e23-9290-c6a4e8977476 req-5ed0313a-c8ae-4859-b39f-c33d37781ff4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.424 225859 DEBUG oslo_concurrency.lockutils [req-c15dff0f-234c-4e23-9290-c6a4e8977476 req-5ed0313a-c8ae-4859-b39f-c33d37781ff4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.425 225859 DEBUG oslo_concurrency.lockutils [req-c15dff0f-234c-4e23-9290-c6a4e8977476 req-5ed0313a-c8ae-4859-b39f-c33d37781ff4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.425 225859 DEBUG nova.compute.manager [req-c15dff0f-234c-4e23-9290-c6a4e8977476 req-5ed0313a-c8ae-4859-b39f-c33d37781ff4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-unplugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.425 225859 WARNING nova.compute.manager [req-c15dff0f-234c-4e23-9290-c6a4e8977476 req-5ed0313a-c8ae-4859-b39f-c33d37781ff4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received unexpected event network-vif-unplugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.431 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[bc7ea5a0-60c4-46ca-8f14-4805ebb31171]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.447 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[833de69c-2d3b-42c2-89fb-51e61b7f9468]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 13, 'rx_bytes': 532, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 13, 'rx_bytes': 532, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642759, 'reachable_time': 32213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290004, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.471 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0e4051-9f4f-45c9-b2b2-95e804491e49]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642770, 'tstamp': 642770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290005, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642773, 'tstamp': 642773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290005, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.472 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.473 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.475 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.475 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671e28d0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.475 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.476 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671e28d0-00, col_values=(('external_ids', {'iface-id': 'a8628d9e-196f-4b84-89fd-d3a41792b8a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:50.476 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:04:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:50.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.682 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for c1db561e-0c8b-4cfb-97bb-55f8d4731b87 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.682 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921490.6819882, c1db561e-0c8b-4cfb-97bb-55f8d4731b87 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.682 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.729 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.734 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.776 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.776 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921490.6841216, c1db561e-0c8b-4cfb-97bb-55f8d4731b87 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.776 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] VM Started (Lifecycle Event)#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.799 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.803 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:04:50 np0005588919 nova_compute[225855]: 2026-01-20 15:04:50.854 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 10:04:51 np0005588919 nova_compute[225855]: 2026-01-20 15:04:51.044 225859 DEBUG nova.compute.manager [None req-06ed0cbf-d946-4a7b-81ed-059b0fd3853e 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:04:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:52.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.214 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:52.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.700 225859 DEBUG nova.compute.manager [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.701 225859 DEBUG oslo_concurrency.lockutils [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.701 225859 DEBUG oslo_concurrency.lockutils [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.701 225859 DEBUG oslo_concurrency.lockutils [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.702 225859 DEBUG nova.compute.manager [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.702 225859 WARNING nova.compute.manager [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received unexpected event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.702 225859 DEBUG nova.compute.manager [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.703 225859 DEBUG oslo_concurrency.lockutils [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.703 225859 DEBUG oslo_concurrency.lockutils [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.703 225859 DEBUG oslo_concurrency.lockutils [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.703 225859 DEBUG nova.compute.manager [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.704 225859 WARNING nova.compute.manager [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received unexpected event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.704 225859 DEBUG nova.compute.manager [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.704 225859 DEBUG oslo_concurrency.lockutils [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.704 225859 DEBUG oslo_concurrency.lockutils [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.705 225859 DEBUG oslo_concurrency.lockutils [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.705 225859 DEBUG nova.compute.manager [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:52 np0005588919 nova_compute[225855]: 2026-01-20 15:04:52.705 225859 WARNING nova.compute.manager [req-76a19a51-5bb1-4e29-b020-777903b3b7e1 req-d500da47-6b0a-482c-a946-31c3b5c4f34c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received unexpected event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:04:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.012 225859 DEBUG oslo_concurrency.lockutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.013 225859 DEBUG oslo_concurrency.lockutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.014 225859 DEBUG oslo_concurrency.lockutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.014 225859 DEBUG oslo_concurrency.lockutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.015 225859 DEBUG oslo_concurrency.lockutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.016 225859 INFO nova.compute.manager [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Terminating instance#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.018 225859 DEBUG nova.compute.manager [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:04:53 np0005588919 kernel: tap22663aa0-a7 (unregistering): left promiscuous mode
Jan 20 10:04:53 np0005588919 NetworkManager[49104]: <info>  [1768921493.0698] device (tap22663aa0-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:04:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:53Z|00676|binding|INFO|Releasing lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 from this chassis (sb_readonly=0)
Jan 20 10:04:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:53Z|00677|binding|INFO|Setting lport 22663aa0-a7f4-431c-b5a9-4433da2dff09 down in Southbound
Jan 20 10:04:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:04:53Z|00678|binding|INFO|Removing iface tap22663aa0-a7 ovn-installed in OVS
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.080 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.094 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:53 np0005588919 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d0000009f.scope: Deactivated successfully.
Jan 20 10:04:53 np0005588919 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d0000009f.scope: Consumed 2.941s CPU time.
Jan 20 10:04:53 np0005588919 systemd-machined[194361]: Machine qemu-79-instance-0000009f terminated.
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.263 225859 INFO nova.virt.libvirt.driver [-] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Instance destroyed successfully.#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.264 225859 DEBUG nova.objects.instance [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'resources' on Instance uuid c1db561e-0c8b-4cfb-97bb-55f8d4731b87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.269 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:ec:9e 10.100.0.10'], port_security=['fa:16:3e:2e:ec:9e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c1db561e-0c8b-4cfb-97bb-55f8d4731b87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=22663aa0-a7f4-431c-b5a9-4433da2dff09) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.270 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 22663aa0-a7f4-431c-b5a9-4433da2dff09 in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec unbound from our chassis#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.271 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.285 225859 DEBUG nova.virt.libvirt.vif [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:03:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1670904176',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-1670904176',id=159,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63555e5851564db08c6429231d264f2c',ramdisk_id='',reservation_id='r-qovaqapn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',i
mage_min_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:04:51Z,user_data=None,user_id='2446e8399b344b29986c1aaf8bf73adf',uuid=c1db561e-0c8b-4cfb-97bb-55f8d4731b87,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.285 225859 DEBUG nova.network.os_vif_util [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converting VIF {"id": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "address": "fa:16:3e:2e:ec:9e", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22663aa0-a7", "ovs_interfaceid": "22663aa0-a7f4-431c-b5a9-4433da2dff09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.286 225859 DEBUG nova.network.os_vif_util [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2e:ec:9e,bridge_name='br-int',has_traffic_filtering=True,id=22663aa0-a7f4-431c-b5a9-4433da2dff09,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22663aa0-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.286 225859 DEBUG os_vif [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:ec:9e,bridge_name='br-int',has_traffic_filtering=True,id=22663aa0-a7f4-431c-b5a9-4433da2dff09,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22663aa0-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.288 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.288 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22663aa0-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.290 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.291 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3757cdd5-4f79-4fbb-9658-6b15b462645b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.292 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.294 225859 INFO os_vif [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:ec:9e,bridge_name='br-int',has_traffic_filtering=True,id=22663aa0-a7f4-431c-b5a9-4433da2dff09,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22663aa0-a7')#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.319 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[735b1d35-28f0-49aa-b18f-b124c7de9ed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.322 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c9414c-a0c5-4291-adf5-3c638fe2ce2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.350 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[52adab2b-dd6b-4da9-b098-57ebd4177e90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.371 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1c79c956-8d9b-4f46-b81b-149736630759]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap671e28d0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:4e:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 15, 'rx_bytes': 532, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 15, 'rx_bytes': 532, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642759, 'reachable_time': 32213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290108, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.388 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[37ca0fce-47d3-4f6b-bae7-dc1213df13b1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642770, 'tstamp': 642770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290109, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap671e28d0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642773, 'tstamp': 642773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290109, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.389 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.392 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.392 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap671e28d0-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.392 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.392 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap671e28d0-00, col_values=(('external_ids', {'iface-id': 'a8628d9e-196f-4b84-89fd-d3a41792b8a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:04:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:04:53.393 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.398 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updating instance_info_cache with network_info: [{"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.490 225859 INFO nova.virt.libvirt.driver [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Deleting instance files /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87_del#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.491 225859 INFO nova.virt.libvirt.driver [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Deletion of /var/lib/nova/instances/c1db561e-0c8b-4cfb-97bb-55f8d4731b87_del complete#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.508 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-474cec75-3b01-411a-9074-75859d2a9ddf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.509 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.509 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.798 225859 INFO nova.compute.manager [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.799 225859 DEBUG oslo.service.loopingcall [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.799 225859 DEBUG nova.compute.manager [-] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:04:53 np0005588919 nova_compute[225855]: 2026-01-20 15:04:53.799 225859 DEBUG nova.network.neutron [-] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:04:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:54.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:54 np0005588919 nova_compute[225855]: 2026-01-20 15:04:54.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:04:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:54.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:54 np0005588919 nova_compute[225855]: 2026-01-20 15:04:54.870 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:54 np0005588919 nova_compute[225855]: 2026-01-20 15:04:54.871 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:54 np0005588919 nova_compute[225855]: 2026-01-20 15:04:54.871 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:54 np0005588919 nova_compute[225855]: 2026-01-20 15:04:54.872 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:04:54 np0005588919 nova_compute[225855]: 2026-01-20 15:04:54.872 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.320 225859 DEBUG nova.compute.manager [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-unplugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.320 225859 DEBUG oslo_concurrency.lockutils [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.321 225859 DEBUG oslo_concurrency.lockutils [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.321 225859 DEBUG oslo_concurrency.lockutils [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.321 225859 DEBUG nova.compute.manager [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-unplugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.321 225859 DEBUG nova.compute.manager [req-1e46f14a-5589-42a6-8996-75241031ac4c req-9593386c-83c8-4ecf-9426-67566f273212 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-unplugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:04:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:04:55 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1672774260' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.378 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e355 e355: 3 total, 3 up, 3 in
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.544 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000096 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.545 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-00000096 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.705 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.706 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4237MB free_disk=20.830379486083984GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.706 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:55 np0005588919 nova_compute[225855]: 2026-01-20 15:04:55.706 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:56.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:56 np0005588919 nova_compute[225855]: 2026-01-20 15:04:56.355 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 474cec75-3b01-411a-9074-75859d2a9ddf actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:04:56 np0005588919 nova_compute[225855]: 2026-01-20 15:04:56.355 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance c1db561e-0c8b-4cfb-97bb-55f8d4731b87 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:04:56 np0005588919 nova_compute[225855]: 2026-01-20 15:04:56.355 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:04:56 np0005588919 nova_compute[225855]: 2026-01-20 15:04:56.356 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:04:56 np0005588919 nova_compute[225855]: 2026-01-20 15:04:56.524 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:56.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:56 np0005588919 podman[290180]: 2026-01-20 15:04:56.8158216 +0000 UTC m=+0.099043255 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:04:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:04:56 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2547310442' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:04:56 np0005588919 nova_compute[225855]: 2026-01-20 15:04:56.993 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.005 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.054 225859 DEBUG nova.network.neutron [-] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.058 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.105 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.106 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.119 225859 INFO nova.compute.manager [-] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Took 3.32 seconds to deallocate network for instance.#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.216 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.502 225859 DEBUG nova.compute.manager [req-cbf0bcec-d0ea-487f-aadb-10f340d606d0 req-5c866872-954c-4508-a6c1-10fbbdc4edd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.502 225859 DEBUG oslo_concurrency.lockutils [req-cbf0bcec-d0ea-487f-aadb-10f340d606d0 req-5c866872-954c-4508-a6c1-10fbbdc4edd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.503 225859 DEBUG oslo_concurrency.lockutils [req-cbf0bcec-d0ea-487f-aadb-10f340d606d0 req-5c866872-954c-4508-a6c1-10fbbdc4edd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.503 225859 DEBUG oslo_concurrency.lockutils [req-cbf0bcec-d0ea-487f-aadb-10f340d606d0 req-5c866872-954c-4508-a6c1-10fbbdc4edd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.503 225859 DEBUG nova.compute.manager [req-cbf0bcec-d0ea-487f-aadb-10f340d606d0 req-5c866872-954c-4508-a6c1-10fbbdc4edd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] No waiting events found dispatching network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.503 225859 WARNING nova.compute.manager [req-cbf0bcec-d0ea-487f-aadb-10f340d606d0 req-5c866872-954c-4508-a6c1-10fbbdc4edd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received unexpected event network-vif-plugged-22663aa0-a7f4-431c-b5a9-4433da2dff09 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.503 225859 DEBUG nova.compute.manager [req-cbf0bcec-d0ea-487f-aadb-10f340d606d0 req-5c866872-954c-4508-a6c1-10fbbdc4edd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Received event network-vif-deleted-22663aa0-a7f4-431c-b5a9-4433da2dff09 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.691 225859 INFO nova.compute.manager [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Took 0.57 seconds to detach 1 volumes for instance.#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.744 225859 DEBUG oslo_concurrency.lockutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.745 225859 DEBUG oslo_concurrency.lockutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:04:57 np0005588919 nova_compute[225855]: 2026-01-20 15:04:57.877 225859 DEBUG oslo_concurrency.processutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:04:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e355 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:04:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:04:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:04:58.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:04:58 np0005588919 nova_compute[225855]: 2026-01-20 15:04:58.324 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:04:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:04:58 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2668567201' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:04:58 np0005588919 nova_compute[225855]: 2026-01-20 15:04:58.361 225859 DEBUG oslo_concurrency.processutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:04:58 np0005588919 nova_compute[225855]: 2026-01-20 15:04:58.366 225859 DEBUG nova.compute.provider_tree [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:04:58 np0005588919 nova_compute[225855]: 2026-01-20 15:04:58.390 225859 DEBUG nova.scheduler.client.report [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:04:58 np0005588919 nova_compute[225855]: 2026-01-20 15:04:58.448 225859 DEBUG oslo_concurrency.lockutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:04:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:04:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:04:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:04:58.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:04:58 np0005588919 nova_compute[225855]: 2026-01-20 15:04:58.601 225859 INFO nova.scheduler.client.report [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Deleted allocations for instance c1db561e-0c8b-4cfb-97bb-55f8d4731b87#033[00m
Jan 20 10:04:59 np0005588919 nova_compute[225855]: 2026-01-20 15:04:59.098 225859 DEBUG oslo_concurrency.lockutils [None req-c4974407-f15e-4815-90b7-11ad883cd1a8 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "c1db561e-0c8b-4cfb-97bb-55f8d4731b87" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:00.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:00 np0005588919 nova_compute[225855]: 2026-01-20 15:05:00.101 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:00 np0005588919 nova_compute[225855]: 2026-01-20 15:05:00.101 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e356 e356: 3 total, 3 up, 3 in
Jan 20 10:05:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:00.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:02.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:02 np0005588919 nova_compute[225855]: 2026-01-20 15:05:02.218 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:02.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:03 np0005588919 nova_compute[225855]: 2026-01-20 15:05:03.328 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:04.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:04.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:06.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:06.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.641 225859 DEBUG oslo_concurrency.lockutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.642 225859 DEBUG oslo_concurrency.lockutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.642 225859 DEBUG oslo_concurrency.lockutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.642 225859 DEBUG oslo_concurrency.lockutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.642 225859 DEBUG oslo_concurrency.lockutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.644 225859 INFO nova.compute.manager [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Terminating instance#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.645 225859 DEBUG nova.compute.manager [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:05:06 np0005588919 kernel: tap244332ba-1b (unregistering): left promiscuous mode
Jan 20 10:05:06 np0005588919 NetworkManager[49104]: <info>  [1768921506.7547] device (tap244332ba-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.763 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:06 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:06Z|00679|binding|INFO|Releasing lport 244332ba-1b58-4d42-98b0-245f9460c50f from this chassis (sb_readonly=0)
Jan 20 10:05:06 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:06Z|00680|binding|INFO|Setting lport 244332ba-1b58-4d42-98b0-245f9460c50f down in Southbound
Jan 20 10:05:06 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:06Z|00681|binding|INFO|Removing iface tap244332ba-1b ovn-installed in OVS
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.764 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.784 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:06 np0005588919 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000096.scope: Deactivated successfully.
Jan 20 10:05:06 np0005588919 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000096.scope: Consumed 1.842s CPU time.
Jan 20 10:05:06 np0005588919 systemd-machined[194361]: Machine qemu-76-instance-00000096 terminated.
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.878 225859 INFO nova.virt.libvirt.driver [-] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Instance destroyed successfully.#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.879 225859 DEBUG nova.objects.instance [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lazy-loading 'resources' on Instance uuid 474cec75-3b01-411a-9074-75859d2a9ddf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:06.911 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:36:24 10.100.0.4'], port_security=['fa:16:3e:6f:36:24 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '474cec75-3b01-411a-9074-75859d2a9ddf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63555e5851564db08c6429231d264f2c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7e54c470-6a6f-454e-ae01-9d2d59b2c74d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=248fa32c-94be-4e1b-b4d3-cb9fac0ec155, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=244332ba-1b58-4d42-98b0-245f9460c50f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:05:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:06.912 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 244332ba-1b58-4d42-98b0-245f9460c50f in datapath 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec unbound from our chassis#033[00m
Jan 20 10:05:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:06.914 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:05:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:06.915 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ff6cac15-53dd-42a0-b1c1-33f2a5a12ec3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:06.915 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec namespace which is not needed anymore#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.917 225859 DEBUG nova.virt.libvirt.vif [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:01:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-254746207',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-254746207',id=150,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:03:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63555e5851564db08c6429231d264f2c',ramdisk_id='',reservation_id='r-solng1yz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1871371328-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:03:09Z,user_data=None,user_id='2446e8399b344b29986c1aaf8bf73adf',uuid=474cec75-3b01-411a-9074-75859d2a9ddf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.917 225859 DEBUG nova.network.os_vif_util [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converting VIF {"id": "244332ba-1b58-4d42-98b0-245f9460c50f", "address": "fa:16:3e:6f:36:24", "network": {"id": "671e28d0-0b9e-41e0-b5e0-db1ccd4717ec", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-884777184-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63555e5851564db08c6429231d264f2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap244332ba-1b", "ovs_interfaceid": "244332ba-1b58-4d42-98b0-245f9460c50f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.918 225859 DEBUG nova.network.os_vif_util [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6f:36:24,bridge_name='br-int',has_traffic_filtering=True,id=244332ba-1b58-4d42-98b0-245f9460c50f,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap244332ba-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.918 225859 DEBUG os_vif [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:36:24,bridge_name='br-int',has_traffic_filtering=True,id=244332ba-1b58-4d42-98b0-245f9460c50f,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap244332ba-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.919 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.920 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap244332ba-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.921 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.923 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:06 np0005588919 nova_compute[225855]: 2026-01-20 15:05:06.925 225859 INFO os_vif [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:36:24,bridge_name='br-int',has_traffic_filtering=True,id=244332ba-1b58-4d42-98b0-245f9460c50f,network=Network(671e28d0-0b9e-41e0-b5e0-db1ccd4717ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap244332ba-1b')#033[00m
Jan 20 10:05:07 np0005588919 nova_compute[225855]: 2026-01-20 15:05:07.221 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:07 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288392]: [NOTICE]   (288401) : haproxy version is 2.8.14-c23fe91
Jan 20 10:05:07 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288392]: [NOTICE]   (288401) : path to executable is /usr/sbin/haproxy
Jan 20 10:05:07 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288392]: [WARNING]  (288401) : Exiting Master process...
Jan 20 10:05:07 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288392]: [WARNING]  (288401) : Exiting Master process...
Jan 20 10:05:07 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288392]: [ALERT]    (288401) : Current worker (288403) exited with code 143 (Terminated)
Jan 20 10:05:07 np0005588919 neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec[288392]: [WARNING]  (288401) : All workers exited. Exiting... (0)
Jan 20 10:05:07 np0005588919 systemd[1]: libpod-791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e.scope: Deactivated successfully.
Jan 20 10:05:07 np0005588919 podman[290311]: 2026-01-20 15:05:07.387924778 +0000 UTC m=+0.371769319 container died 791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:05:07 np0005588919 systemd[1]: var-lib-containers-storage-overlay-9a9b0c9e2a81eedae01229880b31ef78cadc4986b7f57eb0b4c053b0733a83f7-merged.mount: Deactivated successfully.
Jan 20 10:05:07 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e-userdata-shm.mount: Deactivated successfully.
Jan 20 10:05:07 np0005588919 podman[290311]: 2026-01-20 15:05:07.690033171 +0000 UTC m=+0.673877712 container cleanup 791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:05:07 np0005588919 systemd[1]: libpod-conmon-791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e.scope: Deactivated successfully.
Jan 20 10:05:07 np0005588919 podman[290343]: 2026-01-20 15:05:07.915701878 +0000 UTC m=+0.206228459 container remove 791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:05:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:07.921 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[930bdca2-9146-42c5-b6e5-3e9a047c1a5e]: (4, ('Tue Jan 20 03:05:07 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec (791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e)\n791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e\nTue Jan 20 03:05:07 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec (791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e)\n791837c27285c68cb685dfa82b8474123f452a4f4d63910113cfbdc8f564544e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:07.923 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1a50cb63-198b-4301-9a28-9ca0f4ed76c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:07.924 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap671e28d0-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:07 np0005588919 nova_compute[225855]: 2026-01-20 15:05:07.925 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:07 np0005588919 kernel: tap671e28d0-00: left promiscuous mode
Jan 20 10:05:07 np0005588919 nova_compute[225855]: 2026-01-20 15:05:07.940 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:07.943 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b37b3464-9155-4d12-865f-13079a4ec34e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:07.966 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[487c3fb9-a856-42fb-b63e-038ab23a58bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:07.967 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aa79f33b-b49b-4354-9ac6-d106c7bba49a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:07.982 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[41271373-5ba2-4734-81ab-a4e05a69d876]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642751, 'reachable_time': 33787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290359, 'error': None, 'target': 'ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:07.984 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-671e28d0-0b9e-41e0-b5e0-db1ccd4717ec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:05:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:07.984 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[69982be6-ec57-4487-b2f4-6b352544f001]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:07 np0005588919 systemd[1]: run-netns-ovnmeta\x2d671e28d0\x2d0b9e\x2d41e0\x2db5e0\x2ddb1ccd4717ec.mount: Deactivated successfully.
Jan 20 10:05:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:08.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:08 np0005588919 nova_compute[225855]: 2026-01-20 15:05:08.262 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921493.2610023, c1db561e-0c8b-4cfb-97bb-55f8d4731b87 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:05:08 np0005588919 nova_compute[225855]: 2026-01-20 15:05:08.262 225859 INFO nova.compute.manager [-] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:05:08 np0005588919 nova_compute[225855]: 2026-01-20 15:05:08.413 225859 DEBUG nova.compute.manager [None req-46b383e3-2bfb-481c-bda9-895b17668af2 - - - - - -] [instance: c1db561e-0c8b-4cfb-97bb-55f8d4731b87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:05:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:08.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:08 np0005588919 nova_compute[225855]: 2026-01-20 15:05:08.843 225859 INFO nova.virt.libvirt.driver [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Deleting instance files /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf_del#033[00m
Jan 20 10:05:08 np0005588919 nova_compute[225855]: 2026-01-20 15:05:08.844 225859 INFO nova.virt.libvirt.driver [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Deletion of /var/lib/nova/instances/474cec75-3b01-411a-9074-75859d2a9ddf_del complete#033[00m
Jan 20 10:05:08 np0005588919 nova_compute[225855]: 2026-01-20 15:05:08.939 225859 INFO nova.compute.manager [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Took 2.29 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:05:08 np0005588919 nova_compute[225855]: 2026-01-20 15:05:08.940 225859 DEBUG oslo.service.loopingcall [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:05:08 np0005588919 nova_compute[225855]: 2026-01-20 15:05:08.940 225859 DEBUG nova.compute.manager [-] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:05:08 np0005588919 nova_compute[225855]: 2026-01-20 15:05:08.941 225859 DEBUG nova.network.neutron [-] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.761 225859 DEBUG nova.compute.manager [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-unplugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.761 225859 DEBUG oslo_concurrency.lockutils [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.762 225859 DEBUG oslo_concurrency.lockutils [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.762 225859 DEBUG oslo_concurrency.lockutils [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.762 225859 DEBUG nova.compute.manager [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-unplugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.762 225859 DEBUG nova.compute.manager [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-unplugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.762 225859 DEBUG nova.compute.manager [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.763 225859 DEBUG oslo_concurrency.lockutils [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.763 225859 DEBUG oslo_concurrency.lockutils [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.763 225859 DEBUG oslo_concurrency.lockutils [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.763 225859 DEBUG nova.compute.manager [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] No waiting events found dispatching network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:05:09 np0005588919 nova_compute[225855]: 2026-01-20 15:05:09.764 225859 WARNING nova.compute.manager [req-3ae0ac1f-18f0-4c98-9981-9895698cdca3 req-63ad801d-8872-47c1-906e-0b2e118e5358 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received unexpected event network-vif-plugged-244332ba-1b58-4d42-98b0-245f9460c50f for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:05:10 np0005588919 podman[290362]: 2026-01-20 15:05:10.022698341 +0000 UTC m=+0.060870759 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:05:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:10.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:10 np0005588919 nova_compute[225855]: 2026-01-20 15:05:10.280 225859 DEBUG nova.network.neutron [-] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:05:10 np0005588919 nova_compute[225855]: 2026-01-20 15:05:10.341 225859 INFO nova.compute.manager [-] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Took 1.40 seconds to deallocate network for instance.#033[00m
Jan 20 10:05:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e357 e357: 3 total, 3 up, 3 in
Jan 20 10:05:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:10.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:10 np0005588919 nova_compute[225855]: 2026-01-20 15:05:10.720 225859 INFO nova.compute.manager [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Took 0.38 seconds to detach 1 volumes for instance.#033[00m
Jan 20 10:05:10 np0005588919 nova_compute[225855]: 2026-01-20 15:05:10.807 225859 DEBUG nova.compute.manager [req-a8b5f4c0-2810-44ec-a395-79d27527a7b8 req-1dc026ef-cd92-4298-ae18-b7c0a8c3b387 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Received event network-vif-deleted-244332ba-1b58-4d42-98b0-245f9460c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:10 np0005588919 nova_compute[225855]: 2026-01-20 15:05:10.811 225859 DEBUG oslo_concurrency.lockutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:10 np0005588919 nova_compute[225855]: 2026-01-20 15:05:10.812 225859 DEBUG oslo_concurrency.lockutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:10 np0005588919 nova_compute[225855]: 2026-01-20 15:05:10.883 225859 DEBUG oslo_concurrency.processutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:05:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1033225945' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:05:11 np0005588919 nova_compute[225855]: 2026-01-20 15:05:11.306 225859 DEBUG oslo_concurrency.processutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:11 np0005588919 nova_compute[225855]: 2026-01-20 15:05:11.313 225859 DEBUG nova.compute.provider_tree [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:05:11 np0005588919 nova_compute[225855]: 2026-01-20 15:05:11.347 225859 DEBUG nova.scheduler.client.report [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:05:11 np0005588919 nova_compute[225855]: 2026-01-20 15:05:11.401 225859 DEBUG oslo_concurrency.lockutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:11 np0005588919 nova_compute[225855]: 2026-01-20 15:05:11.483 225859 INFO nova.scheduler.client.report [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Deleted allocations for instance 474cec75-3b01-411a-9074-75859d2a9ddf#033[00m
Jan 20 10:05:11 np0005588919 nova_compute[225855]: 2026-01-20 15:05:11.923 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:11 np0005588919 nova_compute[225855]: 2026-01-20 15:05:11.959 225859 DEBUG oslo_concurrency.lockutils [None req-34cd6af6-9069-4587-949a-6da0bc77cc0c 2446e8399b344b29986c1aaf8bf73adf 63555e5851564db08c6429231d264f2c - - default default] Lock "474cec75-3b01-411a-9074-75859d2a9ddf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:12.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:12 np0005588919 nova_compute[225855]: 2026-01-20 15:05:12.223 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:12.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e357 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e358 e358: 3 total, 3 up, 3 in
Jan 20 10:05:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:14.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:14.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:16.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:16.425 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:16.425 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:16.425 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:16.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:16 np0005588919 nova_compute[225855]: 2026-01-20 15:05:16.926 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:17 np0005588919 nova_compute[225855]: 2026-01-20 15:05:17.225 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:18.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:18.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:19 np0005588919 nova_compute[225855]: 2026-01-20 15:05:19.933 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:19 np0005588919 nova_compute[225855]: 2026-01-20 15:05:19.934 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:19 np0005588919 nova_compute[225855]: 2026-01-20 15:05:19.960 225859 DEBUG nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:05:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:20.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.140 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.141 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.151 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.152 225859 INFO nova.compute.claims [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.377 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:20.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:05:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1800558050' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.826 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.833 225859 DEBUG nova.compute.provider_tree [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.856 225859 DEBUG nova.scheduler.client.report [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.892 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.894 225859 DEBUG nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.960 225859 DEBUG nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.961 225859 DEBUG nova.network.neutron [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:05:20 np0005588919 nova_compute[225855]: 2026-01-20 15:05:20.981 225859 INFO nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.002 225859 DEBUG nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.130 225859 DEBUG nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.132 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.133 225859 INFO nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Creating image(s)#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.164 225859 DEBUG nova.storage.rbd_utils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] rbd image 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.195 225859 DEBUG nova.storage.rbd_utils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] rbd image 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.219 225859 DEBUG nova.storage.rbd_utils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] rbd image 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.223 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.286 225859 DEBUG nova.policy [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bc554998e71a4322bdd27ac727a9044c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e142d118583b4f9ba3531bcf3838e256', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.289 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.290 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.290 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.291 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.314 225859 DEBUG nova.storage.rbd_utils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] rbd image 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.318 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.878 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921506.8769155, 474cec75-3b01-411a-9074-75859d2a9ddf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.879 225859 INFO nova.compute.manager [-] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.910 225859 DEBUG nova.compute.manager [None req-6475071c-9006-46db-8395-65b257e850e3 - - - - - -] [instance: 474cec75-3b01-411a-9074-75859d2a9ddf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:05:21 np0005588919 nova_compute[225855]: 2026-01-20 15:05:21.929 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:22.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:22 np0005588919 nova_compute[225855]: 2026-01-20 15:05:22.256 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:22 np0005588919 nova_compute[225855]: 2026-01-20 15:05:22.336 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:22 np0005588919 nova_compute[225855]: 2026-01-20 15:05:22.404 225859 DEBUG nova.storage.rbd_utils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] resizing rbd image 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:05:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:22.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:22 np0005588919 nova_compute[225855]: 2026-01-20 15:05:22.658 225859 DEBUG nova.network.neutron [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Successfully created port: 070862f1-1db2-45c2-9787-752e6d88449a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:05:22 np0005588919 nova_compute[225855]: 2026-01-20 15:05:22.754 225859 DEBUG nova.objects.instance [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lazy-loading 'migration_context' on Instance uuid 33ba7a73-3233-40a3-a49a-e5bbd604dc3c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:22 np0005588919 nova_compute[225855]: 2026-01-20 15:05:22.768 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:05:22 np0005588919 nova_compute[225855]: 2026-01-20 15:05:22.768 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Ensure instance console log exists: /var/lib/nova/instances/33ba7a73-3233-40a3-a49a-e5bbd604dc3c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:05:22 np0005588919 nova_compute[225855]: 2026-01-20 15:05:22.769 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:22 np0005588919 nova_compute[225855]: 2026-01-20 15:05:22.769 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:22 np0005588919 nova_compute[225855]: 2026-01-20 15:05:22.769 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:24 np0005588919 nova_compute[225855]: 2026-01-20 15:05:24.095 225859 DEBUG nova.network.neutron [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Successfully updated port: 070862f1-1db2-45c2-9787-752e6d88449a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:05:24 np0005588919 nova_compute[225855]: 2026-01-20 15:05:24.114 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:05:24 np0005588919 nova_compute[225855]: 2026-01-20 15:05:24.114 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquired lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:05:24 np0005588919 nova_compute[225855]: 2026-01-20 15:05:24.115 225859 DEBUG nova.network.neutron [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:05:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:24.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:24 np0005588919 nova_compute[225855]: 2026-01-20 15:05:24.286 225859 DEBUG nova.compute.manager [req-1181a5ab-d1d8-40d2-adc7-b4d6ad28e428 req-723d4bed-c818-47d7-abad-c5f2d60f4592 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received event network-changed-070862f1-1db2-45c2-9787-752e6d88449a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:24 np0005588919 nova_compute[225855]: 2026-01-20 15:05:24.287 225859 DEBUG nova.compute.manager [req-1181a5ab-d1d8-40d2-adc7-b4d6ad28e428 req-723d4bed-c818-47d7-abad-c5f2d60f4592 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Refreshing instance network info cache due to event network-changed-070862f1-1db2-45c2-9787-752e6d88449a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:05:24 np0005588919 nova_compute[225855]: 2026-01-20 15:05:24.287 225859 DEBUG oslo_concurrency.lockutils [req-1181a5ab-d1d8-40d2-adc7-b4d6ad28e428 req-723d4bed-c818-47d7-abad-c5f2d60f4592 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:05:24 np0005588919 nova_compute[225855]: 2026-01-20 15:05:24.525 225859 DEBUG nova.network.neutron [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:05:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:24.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e359 e359: 3 total, 3 up, 3 in
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.020 225859 DEBUG nova.network.neutron [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updating instance_info_cache with network_info: [{"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:05:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:26.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:26.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.638 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Releasing lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.639 225859 DEBUG nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Instance network_info: |[{"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.639 225859 DEBUG oslo_concurrency.lockutils [req-1181a5ab-d1d8-40d2-adc7-b4d6ad28e428 req-723d4bed-c818-47d7-abad-c5f2d60f4592 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.639 225859 DEBUG nova.network.neutron [req-1181a5ab-d1d8-40d2-adc7-b4d6ad28e428 req-723d4bed-c818-47d7-abad-c5f2d60f4592 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Refreshing network info cache for port 070862f1-1db2-45c2-9787-752e6d88449a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.642 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Start _get_guest_xml network_info=[{"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.645 225859 WARNING nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.650 225859 DEBUG nova.virt.libvirt.host [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.650 225859 DEBUG nova.virt.libvirt.host [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.653 225859 DEBUG nova.virt.libvirt.host [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.653 225859 DEBUG nova.virt.libvirt.host [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.654 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.655 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.655 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.655 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.655 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.655 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.655 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.656 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.656 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.656 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.656 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.657 225859 DEBUG nova.virt.hardware [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.659 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:26 np0005588919 nova_compute[225855]: 2026-01-20 15:05:26.933 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:27 np0005588919 podman[290673]: 2026-01-20 15:05:27.037959855 +0000 UTC m=+0.083163601 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 10:05:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:05:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2898267171' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.113 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.143 225859 DEBUG nova.storage.rbd_utils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] rbd image 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.148 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.259 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:05:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3990956425' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.594 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.595 225859 DEBUG nova.virt.libvirt.vif [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:05:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestStampPattern-server-2111868448',display_name='tempest-TestStampPattern-server-2111868448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-teststamppattern-server-2111868448',id=161,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHW99EAKkcMHbb6foGeGxm9beD/C9AeSuQLW3fqIuoocya0hep1/utcjh4cUxZzvt5K+5yMQG3K45jiLKihqKM6cawBqTQvgzcywKN5pk06AjS3tvq9GuiAvDAys6caVkA==',key_name='tempest-TestStampPattern-1928143162',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e142d118583b4f9ba3531bcf3838e256',ramdisk_id='',reservation_id='r-7ei3hy41',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestStampPattern-487600181',owner_user_name='tempest-TestStampPattern-487600181-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:05:21Z,user_data=None,user_id='bc554998e71a4322bdd27ac727a9044c',uuid=33ba7a73-3233-40a3-a49a-e5bbd604dc3c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.596 225859 DEBUG nova.network.os_vif_util [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Converting VIF {"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.597 225859 DEBUG nova.network.os_vif_util [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:e7:09,bridge_name='br-int',has_traffic_filtering=True,id=070862f1-1db2-45c2-9787-752e6d88449a,network=Network(8472bae1-476b-4100-b9fa-e8827bc4f7bf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070862f1-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.598 225859 DEBUG nova.objects.instance [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lazy-loading 'pci_devices' on Instance uuid 33ba7a73-3233-40a3-a49a-e5bbd604dc3c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.615 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <uuid>33ba7a73-3233-40a3-a49a-e5bbd604dc3c</uuid>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <name>instance-000000a1</name>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestStampPattern-server-2111868448</nova:name>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:05:26</nova:creationTime>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <nova:user uuid="bc554998e71a4322bdd27ac727a9044c">tempest-TestStampPattern-487600181-project-member</nova:user>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <nova:project uuid="e142d118583b4f9ba3531bcf3838e256">tempest-TestStampPattern-487600181</nova:project>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <nova:port uuid="070862f1-1db2-45c2-9787-752e6d88449a">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <entry name="serial">33ba7a73-3233-40a3-a49a-e5bbd604dc3c</entry>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <entry name="uuid">33ba7a73-3233-40a3-a49a-e5bbd604dc3c</entry>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk.config">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:e5:e7:09"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <target dev="tap070862f1-1d"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/33ba7a73-3233-40a3-a49a-e5bbd604dc3c/console.log" append="off"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:05:27 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:05:27 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:05:27 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:05:27 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.617 225859 DEBUG nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Preparing to wait for external event network-vif-plugged-070862f1-1db2-45c2-9787-752e6d88449a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.617 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.617 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.618 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.618 225859 DEBUG nova.virt.libvirt.vif [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:05:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestStampPattern-server-2111868448',display_name='tempest-TestStampPattern-server-2111868448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-teststamppattern-server-2111868448',id=161,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHW99EAKkcMHbb6foGeGxm9beD/C9AeSuQLW3fqIuoocya0hep1/utcjh4cUxZzvt5K+5yMQG3K45jiLKihqKM6cawBqTQvgzcywKN5pk06AjS3tvq9GuiAvDAys6caVkA==',key_name='tempest-TestStampPattern-1928143162',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e142d118583b4f9ba3531bcf3838e256',ramdisk_id='',reservation_id='r-7ei3hy41',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestStampPattern-487600181',owner_user_name='tempest-TestStampPattern-487600181-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:05:21Z,user_data=None,user_id='bc554998e71a4322bdd27ac727a9044c',uuid=33ba7a73-3233-40a3-a49a-e5bbd604dc3c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.619 225859 DEBUG nova.network.os_vif_util [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Converting VIF {"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.619 225859 DEBUG nova.network.os_vif_util [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:e7:09,bridge_name='br-int',has_traffic_filtering=True,id=070862f1-1db2-45c2-9787-752e6d88449a,network=Network(8472bae1-476b-4100-b9fa-e8827bc4f7bf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070862f1-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.620 225859 DEBUG os_vif [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:e7:09,bridge_name='br-int',has_traffic_filtering=True,id=070862f1-1db2-45c2-9787-752e6d88449a,network=Network(8472bae1-476b-4100-b9fa-e8827bc4f7bf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070862f1-1d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.620 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.620 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.621 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.624 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.625 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap070862f1-1d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.625 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap070862f1-1d, col_values=(('external_ids', {'iface-id': '070862f1-1db2-45c2-9787-752e6d88449a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:e7:09', 'vm-uuid': '33ba7a73-3233-40a3-a49a-e5bbd604dc3c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.639 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:27 np0005588919 NetworkManager[49104]: <info>  [1768921527.6400] manager: (tap070862f1-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.642 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.644 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.644 225859 INFO os_vif [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:e7:09,bridge_name='br-int',has_traffic_filtering=True,id=070862f1-1db2-45c2-9787-752e6d88449a,network=Network(8472bae1-476b-4100-b9fa-e8827bc4f7bf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070862f1-1d')#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.738 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.738 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.739 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] No VIF found with MAC fa:16:3e:e5:e7:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.739 225859 INFO nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Using config drive#033[00m
Jan 20 10:05:27 np0005588919 nova_compute[225855]: 2026-01-20 15:05:27.768 225859 DEBUG nova.storage.rbd_utils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] rbd image 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:28.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:28.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:28 np0005588919 nova_compute[225855]: 2026-01-20 15:05:28.977 225859 INFO nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Creating config drive at /var/lib/nova/instances/33ba7a73-3233-40a3-a49a-e5bbd604dc3c/disk.config#033[00m
Jan 20 10:05:28 np0005588919 nova_compute[225855]: 2026-01-20 15:05:28.983 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/33ba7a73-3233-40a3-a49a-e5bbd604dc3c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr_r57kcm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.115 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/33ba7a73-3233-40a3-a49a-e5bbd604dc3c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr_r57kcm" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.299 225859 DEBUG nova.storage.rbd_utils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] rbd image 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.304 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/33ba7a73-3233-40a3-a49a-e5bbd604dc3c/disk.config 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.574 225859 DEBUG oslo_concurrency.processutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/33ba7a73-3233-40a3-a49a-e5bbd604dc3c/disk.config 33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.270s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.575 225859 INFO nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Deleting local config drive /var/lib/nova/instances/33ba7a73-3233-40a3-a49a-e5bbd604dc3c/disk.config because it was imported into RBD.#033[00m
Jan 20 10:05:29 np0005588919 kernel: tap070862f1-1d: entered promiscuous mode
Jan 20 10:05:29 np0005588919 NetworkManager[49104]: <info>  [1768921529.6277] manager: (tap070862f1-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/286)
Jan 20 10:05:29 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:29Z|00682|binding|INFO|Claiming lport 070862f1-1db2-45c2-9787-752e6d88449a for this chassis.
Jan 20 10:05:29 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:29Z|00683|binding|INFO|070862f1-1db2-45c2-9787-752e6d88449a: Claiming fa:16:3e:e5:e7:09 10.100.0.7
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.628 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.641 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:e7:09 10.100.0.7'], port_security=['fa:16:3e:e5:e7:09 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '33ba7a73-3233-40a3-a49a-e5bbd604dc3c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8472bae1-476b-4100-b9fa-e8827bc4f7bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e142d118583b4f9ba3531bcf3838e256', 'neutron:revision_number': '2', 'neutron:security_group_ids': '37efc868-18af-48b7-8d56-e37fd1ec4df0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f9deb561-4473-4aa7-8b6f-d70e20e7cf6d, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=070862f1-1db2-45c2-9787-752e6d88449a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.642 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 070862f1-1db2-45c2-9787-752e6d88449a in datapath 8472bae1-476b-4100-b9fa-e8827bc4f7bf bound to our chassis#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.644 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8472bae1-476b-4100-b9fa-e8827bc4f7bf#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.657 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3fd01cc5-89dd-495d-b639-f5c8a8a23ee8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.658 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8472bae1-41 in ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.660 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8472bae1-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.660 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4c6243-e40e-4bd8-ae0a-30494f77c964]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.661 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce31a6d-f4f4-43d9-8067-498437602899]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 systemd-udevd[290818]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:05:29 np0005588919 systemd-machined[194361]: New machine qemu-80-instance-000000a1.
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.672 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[9df1230a-87bb-42f6-98ee-bcc629fc304a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 NetworkManager[49104]: <info>  [1768921529.6819] device (tap070862f1-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:05:29 np0005588919 NetworkManager[49104]: <info>  [1768921529.6826] device (tap070862f1-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.696 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:29 np0005588919 systemd[1]: Started Virtual Machine qemu-80-instance-000000a1.
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.698 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[72ca53cd-e252-4d1c-9811-82ce42b7b397]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.704 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:29 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:29Z|00684|binding|INFO|Setting lport 070862f1-1db2-45c2-9787-752e6d88449a ovn-installed in OVS
Jan 20 10:05:29 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:29Z|00685|binding|INFO|Setting lport 070862f1-1db2-45c2-9787-752e6d88449a up in Southbound
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.708 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.725 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e4923156-881d-4fba-89b5-827249749980]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.731 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5e8b41-690e-4b27-ab5b-b4b7b7f46900]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 NetworkManager[49104]: <info>  [1768921529.7326] manager: (tap8472bae1-40): new Veth device (/org/freedesktop/NetworkManager/Devices/287)
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.766 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[99166ce2-c3c3-4e43-a0da-f294c2c2dc02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.769 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[117e2c3b-2b93-4e7d-8d39-235081c6dc83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 NetworkManager[49104]: <info>  [1768921529.7919] device (tap8472bae1-40): carrier: link connected
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.797 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c704badc-1b74-421f-84f0-062b0a4fe7c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.814 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[30696ba6-248f-48c2-baa3-fc46c86abe0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8472bae1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7a:38:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 194], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656871, 'reachable_time': 27771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290850, 'error': None, 'target': 'ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.830 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9e5466-f467-49cd-b069-6d1b8a67ceaf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7a:38ca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 656871, 'tstamp': 656871}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290851, 'error': None, 'target': 'ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.849 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[47f9244e-f7a5-413a-9ffc-8c61c838f2d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8472bae1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7a:38:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 194], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656871, 'reachable_time': 27771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 290852, 'error': None, 'target': 'ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.879 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5d226740-ade9-491c-9157-96e8c47c8489]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.931 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[457857c7-aa0e-4e21-9e9c-df5bd3846bc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.932 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8472bae1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.933 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.933 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8472bae1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.935 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:29 np0005588919 NetworkManager[49104]: <info>  [1768921529.9359] manager: (tap8472bae1-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Jan 20 10:05:29 np0005588919 kernel: tap8472bae1-40: entered promiscuous mode
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.937 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8472bae1-40, col_values=(('external_ids', {'iface-id': 'a48fbce9-f06f-49f1-8e61-d1d46e8f5808'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.938 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:29 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:29Z|00686|binding|INFO|Releasing lport a48fbce9-f06f-49f1-8e61-d1d46e8f5808 from this chassis (sb_readonly=0)
Jan 20 10:05:29 np0005588919 nova_compute[225855]: 2026-01-20 15:05:29.952 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.953 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8472bae1-476b-4100-b9fa-e8827bc4f7bf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8472bae1-476b-4100-b9fa-e8827bc4f7bf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.954 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9e7658-aaa3-49b3-9708-e84448693e5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.954 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-8472bae1-476b-4100-b9fa-e8827bc4f7bf
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/8472bae1-476b-4100-b9fa-e8827bc4f7bf.pid.haproxy
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 8472bae1-476b-4100-b9fa-e8827bc4f7bf
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:05:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:29.955 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf', 'env', 'PROCESS_TAG=haproxy-8472bae1-476b-4100-b9fa-e8827bc4f7bf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8472bae1-476b-4100-b9fa-e8827bc4f7bf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.029 225859 DEBUG nova.network.neutron [req-1181a5ab-d1d8-40d2-adc7-b4d6ad28e428 req-723d4bed-c818-47d7-abad-c5f2d60f4592 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updated VIF entry in instance network info cache for port 070862f1-1db2-45c2-9787-752e6d88449a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.030 225859 DEBUG nova.network.neutron [req-1181a5ab-d1d8-40d2-adc7-b4d6ad28e428 req-723d4bed-c818-47d7-abad-c5f2d60f4592 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updating instance_info_cache with network_info: [{"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.052 225859 DEBUG oslo_concurrency.lockutils [req-1181a5ab-d1d8-40d2-adc7-b4d6ad28e428 req-723d4bed-c818-47d7-abad-c5f2d60f4592 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:05:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:30.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:30 np0005588919 podman[290884]: 2026-01-20 15:05:30.324193477 +0000 UTC m=+0.058278655 container create 49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 10:05:30 np0005588919 systemd[1]: Started libpod-conmon-49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd.scope.
Jan 20 10:05:30 np0005588919 podman[290884]: 2026-01-20 15:05:30.296852391 +0000 UTC m=+0.030937599 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:05:30 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.401 225859 DEBUG nova.compute.manager [req-437f8ecd-1461-496f-969a-8cdeed601b9f req-3ab938b4-c86a-4e3b-8e78-90d342aaf2de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received event network-vif-plugged-070862f1-1db2-45c2-9787-752e6d88449a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.402 225859 DEBUG oslo_concurrency.lockutils [req-437f8ecd-1461-496f-969a-8cdeed601b9f req-3ab938b4-c86a-4e3b-8e78-90d342aaf2de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.402 225859 DEBUG oslo_concurrency.lockutils [req-437f8ecd-1461-496f-969a-8cdeed601b9f req-3ab938b4-c86a-4e3b-8e78-90d342aaf2de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.403 225859 DEBUG oslo_concurrency.lockutils [req-437f8ecd-1461-496f-969a-8cdeed601b9f req-3ab938b4-c86a-4e3b-8e78-90d342aaf2de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.403 225859 DEBUG nova.compute.manager [req-437f8ecd-1461-496f-969a-8cdeed601b9f req-3ab938b4-c86a-4e3b-8e78-90d342aaf2de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Processing event network-vif-plugged-070862f1-1db2-45c2-9787-752e6d88449a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:05:30 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5103777890e53183e892eb29a7309526c22c2d478bcad4b3e03ce4ad7bb2fbd2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:05:30 np0005588919 podman[290884]: 2026-01-20 15:05:30.418257376 +0000 UTC m=+0.152342584 container init 49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:05:30 np0005588919 podman[290884]: 2026-01-20 15:05:30.425007617 +0000 UTC m=+0.159092815 container start 49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:05:30 np0005588919 neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf[290899]: [NOTICE]   (290921) : New worker (290938) forked
Jan 20 10:05:30 np0005588919 neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf[290899]: [NOTICE]   (290921) : Loading success.
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.573 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921530.572652, 33ba7a73-3233-40a3-a49a-e5bbd604dc3c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.574 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] VM Started (Lifecycle Event)#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.575 225859 DEBUG nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.578 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.581 225859 INFO nova.virt.libvirt.driver [-] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Instance spawned successfully.#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.581 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.608 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:05:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:30.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.614 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.618 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.618 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.619 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.619 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.620 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.620 225859 DEBUG nova.virt.libvirt.driver [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.649 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.652 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921530.5727563, 33ba7a73-3233-40a3-a49a-e5bbd604dc3c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.652 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.732 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.737 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921530.5777285, 33ba7a73-3233-40a3-a49a-e5bbd604dc3c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.737 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.787 225859 INFO nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Took 9.66 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.788 225859 DEBUG nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.788 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.795 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:05:30 np0005588919 nova_compute[225855]: 2026-01-20 15:05:30.940 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:05:31 np0005588919 nova_compute[225855]: 2026-01-20 15:05:31.008 225859 INFO nova.compute.manager [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Took 10.90 seconds to build instance.#033[00m
Jan 20 10:05:31 np0005588919 nova_compute[225855]: 2026-01-20 15:05:31.037 225859 DEBUG oslo_concurrency.lockutils [None req-81c2281d-dca4-4bae-856a-c49b3b605249 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:32.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:32 np0005588919 nova_compute[225855]: 2026-01-20 15:05:32.296 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:32.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:32 np0005588919 nova_compute[225855]: 2026-01-20 15:05:32.638 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:32 np0005588919 nova_compute[225855]: 2026-01-20 15:05:32.645 225859 DEBUG nova.compute.manager [req-5c7b46b8-20f6-47ed-8631-6812815e4af0 req-f55b047e-cac4-4d22-8b66-612bed7c89e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received event network-vif-plugged-070862f1-1db2-45c2-9787-752e6d88449a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:32 np0005588919 nova_compute[225855]: 2026-01-20 15:05:32.645 225859 DEBUG oslo_concurrency.lockutils [req-5c7b46b8-20f6-47ed-8631-6812815e4af0 req-f55b047e-cac4-4d22-8b66-612bed7c89e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:32 np0005588919 nova_compute[225855]: 2026-01-20 15:05:32.645 225859 DEBUG oslo_concurrency.lockutils [req-5c7b46b8-20f6-47ed-8631-6812815e4af0 req-f55b047e-cac4-4d22-8b66-612bed7c89e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:32 np0005588919 nova_compute[225855]: 2026-01-20 15:05:32.646 225859 DEBUG oslo_concurrency.lockutils [req-5c7b46b8-20f6-47ed-8631-6812815e4af0 req-f55b047e-cac4-4d22-8b66-612bed7c89e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:32 np0005588919 nova_compute[225855]: 2026-01-20 15:05:32.646 225859 DEBUG nova.compute.manager [req-5c7b46b8-20f6-47ed-8631-6812815e4af0 req-f55b047e-cac4-4d22-8b66-612bed7c89e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] No waiting events found dispatching network-vif-plugged-070862f1-1db2-45c2-9787-752e6d88449a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:05:32 np0005588919 nova_compute[225855]: 2026-01-20 15:05:32.646 225859 WARNING nova.compute.manager [req-5c7b46b8-20f6-47ed-8631-6812815e4af0 req-f55b047e-cac4-4d22-8b66-612bed7c89e5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received unexpected event network-vif-plugged-070862f1-1db2-45c2-9787-752e6d88449a for instance with vm_state active and task_state None.#033[00m
Jan 20 10:05:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:33 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:33Z|00687|binding|INFO|Releasing lport a48fbce9-f06f-49f1-8e61-d1d46e8f5808 from this chassis (sb_readonly=0)
Jan 20 10:05:33 np0005588919 nova_compute[225855]: 2026-01-20 15:05:33.216 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:34.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:34.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:35 np0005588919 nova_compute[225855]: 2026-01-20 15:05:35.539 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:35 np0005588919 NetworkManager[49104]: <info>  [1768921535.5400] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Jan 20 10:05:35 np0005588919 NetworkManager[49104]: <info>  [1768921535.5409] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/290)
Jan 20 10:05:35 np0005588919 nova_compute[225855]: 2026-01-20 15:05:35.541 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:35Z|00688|binding|INFO|Releasing lport a48fbce9-f06f-49f1-8e61-d1d46e8f5808 from this chassis (sb_readonly=0)
Jan 20 10:05:35 np0005588919 nova_compute[225855]: 2026-01-20 15:05:35.561 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:35 np0005588919 nova_compute[225855]: 2026-01-20 15:05:35.825 225859 DEBUG nova.compute.manager [req-6b95bc0c-501a-499f-abce-aca5bcfe3379 req-ba08f24e-4478-41d4-9d21-0c2338fe6db1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received event network-changed-070862f1-1db2-45c2-9787-752e6d88449a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:05:35 np0005588919 nova_compute[225855]: 2026-01-20 15:05:35.826 225859 DEBUG nova.compute.manager [req-6b95bc0c-501a-499f-abce-aca5bcfe3379 req-ba08f24e-4478-41d4-9d21-0c2338fe6db1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Refreshing instance network info cache due to event network-changed-070862f1-1db2-45c2-9787-752e6d88449a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:05:35 np0005588919 nova_compute[225855]: 2026-01-20 15:05:35.826 225859 DEBUG oslo_concurrency.lockutils [req-6b95bc0c-501a-499f-abce-aca5bcfe3379 req-ba08f24e-4478-41d4-9d21-0c2338fe6db1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:05:35 np0005588919 nova_compute[225855]: 2026-01-20 15:05:35.827 225859 DEBUG oslo_concurrency.lockutils [req-6b95bc0c-501a-499f-abce-aca5bcfe3379 req-ba08f24e-4478-41d4-9d21-0c2338fe6db1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:05:35 np0005588919 nova_compute[225855]: 2026-01-20 15:05:35.827 225859 DEBUG nova.network.neutron [req-6b95bc0c-501a-499f-abce-aca5bcfe3379 req-ba08f24e-4478-41d4-9d21-0c2338fe6db1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Refreshing network info cache for port 070862f1-1db2-45c2-9787-752e6d88449a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:05:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:36.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:36.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:37 np0005588919 nova_compute[225855]: 2026-01-20 15:05:37.298 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:37 np0005588919 nova_compute[225855]: 2026-01-20 15:05:37.678 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:38.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:38.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:05:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2436202043' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:05:39 np0005588919 nova_compute[225855]: 2026-01-20 15:05:39.750 225859 DEBUG nova.network.neutron [req-6b95bc0c-501a-499f-abce-aca5bcfe3379 req-ba08f24e-4478-41d4-9d21-0c2338fe6db1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updated VIF entry in instance network info cache for port 070862f1-1db2-45c2-9787-752e6d88449a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:05:39 np0005588919 nova_compute[225855]: 2026-01-20 15:05:39.751 225859 DEBUG nova.network.neutron [req-6b95bc0c-501a-499f-abce-aca5bcfe3379 req-ba08f24e-4478-41d4-9d21-0c2338fe6db1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updating instance_info_cache with network_info: [{"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:05:39 np0005588919 nova_compute[225855]: 2026-01-20 15:05:39.788 225859 DEBUG oslo_concurrency.lockutils [req-6b95bc0c-501a-499f-abce-aca5bcfe3379 req-ba08f24e-4478-41d4-9d21-0c2338fe6db1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:05:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:05:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:05:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:05:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:40.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:40 np0005588919 nova_compute[225855]: 2026-01-20 15:05:40.341 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:40.342 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:05:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:40.343 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:05:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:40.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:41 np0005588919 podman[291140]: 2026-01-20 15:05:41.020687643 +0000 UTC m=+0.063999727 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 20 10:05:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:42.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:42 np0005588919 nova_compute[225855]: 2026-01-20 15:05:42.301 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:05:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:42.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:05:42 np0005588919 nova_compute[225855]: 2026-01-20 15:05:42.682 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:43 np0005588919 nova_compute[225855]: 2026-01-20 15:05:43.097 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:43Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e5:e7:09 10.100.0.7
Jan 20 10:05:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:43Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:e7:09 10.100.0.7
Jan 20 10:05:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:44.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:44.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:05:45.345 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:05:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:46.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:46 np0005588919 nova_compute[225855]: 2026-01-20 15:05:46.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:46 np0005588919 nova_compute[225855]: 2026-01-20 15:05:46.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:05:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:05:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:05:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:46.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:47 np0005588919 nova_compute[225855]: 2026-01-20 15:05:47.304 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:47 np0005588919 nova_compute[225855]: 2026-01-20 15:05:47.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:47 np0005588919 nova_compute[225855]: 2026-01-20 15:05:47.684 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:48 np0005588919 nova_compute[225855]: 2026-01-20 15:05:48.033 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:48.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:48.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:49 np0005588919 nova_compute[225855]: 2026-01-20 15:05:49.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:49 np0005588919 nova_compute[225855]: 2026-01-20 15:05:49.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:05:49 np0005588919 nova_compute[225855]: 2026-01-20 15:05:49.372 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:05:49 np0005588919 nova_compute[225855]: 2026-01-20 15:05:49.372 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:50.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:50 np0005588919 nova_compute[225855]: 2026-01-20 15:05:50.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:50.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:51 np0005588919 nova_compute[225855]: 2026-01-20 15:05:51.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:51 np0005588919 nova_compute[225855]: 2026-01-20 15:05:51.832 225859 DEBUG oslo_concurrency.lockutils [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:51 np0005588919 nova_compute[225855]: 2026-01-20 15:05:51.832 225859 DEBUG oslo_concurrency.lockutils [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:51 np0005588919 nova_compute[225855]: 2026-01-20 15:05:51.855 225859 DEBUG nova.objects.instance [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lazy-loading 'flavor' on Instance uuid 33ba7a73-3233-40a3-a49a-e5bbd604dc3c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:51 np0005588919 nova_compute[225855]: 2026-01-20 15:05:51.897 225859 DEBUG oslo_concurrency.lockutils [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.115 225859 DEBUG oslo_concurrency.lockutils [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.115 225859 DEBUG oslo_concurrency.lockutils [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.116 225859 INFO nova.compute.manager [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Attaching volume 4a621494-2aaf-461c-b7c1-05665913aaf9 to /dev/vdb#033[00m
Jan 20 10:05:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:52.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.288 225859 DEBUG os_brick.utils [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.289 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.301 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.302 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[9aabd57a-3196-4d2a-946b-8c724c961506]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.304 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.334 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.335 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[8b253404-f9b0-4167-9514-cec9ad1c659d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.335 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.338 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.347 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.347 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[8380cd7b-ae81-4fea-a892-361384ce5d82]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.348 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[234b23b9-b0a0-4313-b91b-7dfbe6e4b9b0]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.349 225859 DEBUG oslo_concurrency.processutils [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.381 225859 DEBUG oslo_concurrency.processutils [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CMD "nvme version" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.383 225859 DEBUG os_brick.initiator.connectors.lightos [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.383 225859 DEBUG os_brick.initiator.connectors.lightos [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.383 225859 DEBUG os_brick.initiator.connectors.lightos [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.384 225859 DEBUG os_brick.utils [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] <== get_connector_properties: return (95ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.384 225859 DEBUG nova.virt.block_device [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updating existing volume attachment record: c023db9a-8197-4b91-bb70-8b3b1d00be58 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:05:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:52.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:52 np0005588919 nova_compute[225855]: 2026-01-20 15:05:52.685 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:53 np0005588919 nova_compute[225855]: 2026-01-20 15:05:53.325 225859 DEBUG nova.objects.instance [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lazy-loading 'flavor' on Instance uuid 33ba7a73-3233-40a3-a49a-e5bbd604dc3c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:53 np0005588919 nova_compute[225855]: 2026-01-20 15:05:53.381 225859 DEBUG nova.virt.libvirt.driver [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Attempting to attach volume 4a621494-2aaf-461c-b7c1-05665913aaf9 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:05:53 np0005588919 nova_compute[225855]: 2026-01-20 15:05:53.384 225859 DEBUG nova.virt.libvirt.guest [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:05:53 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:05:53 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-4a621494-2aaf-461c-b7c1-05665913aaf9">
Jan 20 10:05:53 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:53 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:53 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:53 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:05:53 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 10:05:53 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:05:53 np0005588919 nova_compute[225855]:  </auth>
Jan 20 10:05:53 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:05:53 np0005588919 nova_compute[225855]:  <serial>4a621494-2aaf-461c-b7c1-05665913aaf9</serial>
Jan 20 10:05:53 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:05:53 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:05:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:05:53Z|00689|binding|INFO|Releasing lport a48fbce9-f06f-49f1-8e61-d1d46e8f5808 from this chassis (sb_readonly=0)
Jan 20 10:05:53 np0005588919 nova_compute[225855]: 2026-01-20 15:05:53.542 225859 DEBUG nova.virt.libvirt.driver [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:05:53 np0005588919 nova_compute[225855]: 2026-01-20 15:05:53.543 225859 DEBUG nova.virt.libvirt.driver [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:05:53 np0005588919 nova_compute[225855]: 2026-01-20 15:05:53.544 225859 DEBUG nova.virt.libvirt.driver [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:05:53 np0005588919 nova_compute[225855]: 2026-01-20 15:05:53.544 225859 DEBUG nova.virt.libvirt.driver [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] No VIF found with MAC fa:16:3e:e5:e7:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:05:53 np0005588919 nova_compute[225855]: 2026-01-20 15:05:53.549 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:53 np0005588919 nova_compute[225855]: 2026-01-20 15:05:53.767 225859 DEBUG oslo_concurrency.lockutils [None req-bdcb178d-591d-435f-ba7a-6af358df18ba bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:54.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:54 np0005588919 nova_compute[225855]: 2026-01-20 15:05:54.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:54 np0005588919 nova_compute[225855]: 2026-01-20 15:05:54.388 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:54 np0005588919 nova_compute[225855]: 2026-01-20 15:05:54.389 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:54 np0005588919 nova_compute[225855]: 2026-01-20 15:05:54.389 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:54 np0005588919 nova_compute[225855]: 2026-01-20 15:05:54.390 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:05:54 np0005588919 nova_compute[225855]: 2026-01-20 15:05:54.390 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:05:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:54.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:05:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:05:54 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2670266435' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:05:54 np0005588919 nova_compute[225855]: 2026-01-20 15:05:54.833 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:54 np0005588919 nova_compute[225855]: 2026-01-20 15:05:54.912 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:05:54 np0005588919 nova_compute[225855]: 2026-01-20 15:05:54.912 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:05:54 np0005588919 nova_compute[225855]: 2026-01-20 15:05:54.912 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.054 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.055 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4132MB free_disk=20.942607879638672GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.055 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.056 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.128 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 33ba7a73-3233-40a3-a49a-e5bbd604dc3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.128 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.128 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.333 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:05:55 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:05:55 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3251268508' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.765 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.771 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.787 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.812 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:05:55 np0005588919 nova_compute[225855]: 2026-01-20 15:05:55.813 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:56.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:56 np0005588919 nova_compute[225855]: 2026-01-20 15:05:56.570 225859 DEBUG oslo_concurrency.lockutils [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:05:56 np0005588919 nova_compute[225855]: 2026-01-20 15:05:56.571 225859 DEBUG oslo_concurrency.lockutils [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:05:56 np0005588919 nova_compute[225855]: 2026-01-20 15:05:56.605 225859 INFO nova.compute.manager [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Detaching volume 4a621494-2aaf-461c-b7c1-05665913aaf9#033[00m
Jan 20 10:05:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:56.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:56 np0005588919 nova_compute[225855]: 2026-01-20 15:05:56.876 225859 INFO nova.virt.block_device [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Attempting to driver detach volume 4a621494-2aaf-461c-b7c1-05665913aaf9 from mountpoint /dev/vdb#033[00m
Jan 20 10:05:56 np0005588919 nova_compute[225855]: 2026-01-20 15:05:56.884 225859 DEBUG nova.virt.libvirt.driver [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Attempting to detach device vdb from instance 33ba7a73-3233-40a3-a49a-e5bbd604dc3c from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:05:56 np0005588919 nova_compute[225855]: 2026-01-20 15:05:56.885 225859 DEBUG nova.virt.libvirt.guest [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-4a621494-2aaf-461c-b7c1-05665913aaf9">
Jan 20 10:05:56 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  <serial>4a621494-2aaf-461c-b7c1-05665913aaf9</serial>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:05:56 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:05:56 np0005588919 nova_compute[225855]: 2026-01-20 15:05:56.932 225859 INFO nova.virt.libvirt.driver [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Successfully detached device vdb from instance 33ba7a73-3233-40a3-a49a-e5bbd604dc3c from the persistent domain config.#033[00m
Jan 20 10:05:56 np0005588919 nova_compute[225855]: 2026-01-20 15:05:56.933 225859 DEBUG nova.virt.libvirt.driver [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 33ba7a73-3233-40a3-a49a-e5bbd604dc3c from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:05:56 np0005588919 nova_compute[225855]: 2026-01-20 15:05:56.933 225859 DEBUG nova.virt.libvirt.guest [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-4a621494-2aaf-461c-b7c1-05665913aaf9">
Jan 20 10:05:56 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  <serial>4a621494-2aaf-461c-b7c1-05665913aaf9</serial>
Jan 20 10:05:56 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:05:56 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:05:56 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:05:57 np0005588919 nova_compute[225855]: 2026-01-20 15:05:57.031 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768921557.0311704, 33ba7a73-3233-40a3-a49a-e5bbd604dc3c => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:05:57 np0005588919 nova_compute[225855]: 2026-01-20 15:05:57.033 225859 DEBUG nova.virt.libvirt.driver [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 33ba7a73-3233-40a3-a49a-e5bbd604dc3c _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:05:57 np0005588919 nova_compute[225855]: 2026-01-20 15:05:57.036 225859 INFO nova.virt.libvirt.driver [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Successfully detached device vdb from instance 33ba7a73-3233-40a3-a49a-e5bbd604dc3c from the live domain config.#033[00m
Jan 20 10:05:57 np0005588919 podman[291317]: 2026-01-20 15:05:57.281414856 +0000 UTC m=+0.082890063 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 20 10:05:57 np0005588919 nova_compute[225855]: 2026-01-20 15:05:57.332 225859 DEBUG nova.objects.instance [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lazy-loading 'flavor' on Instance uuid 33ba7a73-3233-40a3-a49a-e5bbd604dc3c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:05:57 np0005588919 nova_compute[225855]: 2026-01-20 15:05:57.337 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:57 np0005588919 nova_compute[225855]: 2026-01-20 15:05:57.393 225859 DEBUG oslo_concurrency.lockutils [None req-1e86e0aa-03c9-4e89-9521-3f8705123b1e bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.822s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:05:57 np0005588919 nova_compute[225855]: 2026-01-20 15:05:57.687 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:05:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:05:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:05:58.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:05:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:05:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:05:58.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:05:58 np0005588919 nova_compute[225855]: 2026-01-20 15:05:58.808 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:59 np0005588919 nova_compute[225855]: 2026-01-20 15:05:59.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:05:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e360 e360: 3 total, 3 up, 3 in
Jan 20 10:06:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:00.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e361 e361: 3 total, 3 up, 3 in
Jan 20 10:06:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:00.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:01 np0005588919 nova_compute[225855]: 2026-01-20 15:06:01.431 225859 DEBUG nova.compute.manager [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:01 np0005588919 nova_compute[225855]: 2026-01-20 15:06:01.477 225859 INFO nova.compute.manager [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] instance snapshotting#033[00m
Jan 20 10:06:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e362 e362: 3 total, 3 up, 3 in
Jan 20 10:06:01 np0005588919 nova_compute[225855]: 2026-01-20 15:06:01.944 225859 INFO nova.virt.libvirt.driver [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Beginning live snapshot process#033[00m
Jan 20 10:06:02 np0005588919 nova_compute[225855]: 2026-01-20 15:06:02.105 225859 DEBUG nova.virt.libvirt.imagebackend [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] No parent info for a32b3e07-16d8-46fd-9a7b-c242c432fcf9; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 20 10:06:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:02.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:02 np0005588919 nova_compute[225855]: 2026-01-20 15:06:02.339 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:02 np0005588919 nova_compute[225855]: 2026-01-20 15:06:02.398 225859 DEBUG nova.storage.rbd_utils [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] creating snapshot(b1129a7f63564f43b029059be651efa0) on rbd image(33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:06:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e363 e363: 3 total, 3 up, 3 in
Jan 20 10:06:02 np0005588919 nova_compute[225855]: 2026-01-20 15:06:02.634 225859 DEBUG nova.storage.rbd_utils [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] cloning vms/33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk@b1129a7f63564f43b029059be651efa0 to images/8c970c65-2888-4da3-891e-c2b6eb3ea735 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 20 10:06:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:02.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:02 np0005588919 nova_compute[225855]: 2026-01-20 15:06:02.729 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:02 np0005588919 nova_compute[225855]: 2026-01-20 15:06:02.810 225859 DEBUG nova.storage.rbd_utils [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] flattening images/8c970c65-2888-4da3-891e-c2b6eb3ea735 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 20 10:06:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:03 np0005588919 nova_compute[225855]: 2026-01-20 15:06:03.221 225859 DEBUG nova.storage.rbd_utils [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] removing snapshot(b1129a7f63564f43b029059be651efa0) on rbd image(33ba7a73-3233-40a3-a49a-e5bbd604dc3c_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 10:06:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e364 e364: 3 total, 3 up, 3 in
Jan 20 10:06:03 np0005588919 nova_compute[225855]: 2026-01-20 15:06:03.648 225859 DEBUG nova.storage.rbd_utils [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] creating snapshot(snap) on rbd image(8c970c65-2888-4da3-891e-c2b6eb3ea735) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:06:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:04.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e365 e365: 3 total, 3 up, 3 in
Jan 20 10:06:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:04.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:06.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:06.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:06 np0005588919 nova_compute[225855]: 2026-01-20 15:06:06.864 225859 INFO nova.virt.libvirt.driver [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Snapshot image upload complete#033[00m
Jan 20 10:06:06 np0005588919 nova_compute[225855]: 2026-01-20 15:06:06.865 225859 INFO nova.compute.manager [None req-e51df534-7bef-4f9c-bfbf-ab53b38c3363 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Took 5.39 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 20 10:06:07 np0005588919 nova_compute[225855]: 2026-01-20 15:06:07.340 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:07 np0005588919 nova_compute[225855]: 2026-01-20 15:06:07.731 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e365 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:08.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:08.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:10.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:10.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 e366: 3 total, 3 up, 3 in
Jan 20 10:06:12 np0005588919 podman[291518]: 2026-01-20 15:06:12.002581667 +0000 UTC m=+0.049942858 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 20 10:06:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:12.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:12 np0005588919 nova_compute[225855]: 2026-01-20 15:06:12.376 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:12.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:12 np0005588919 nova_compute[225855]: 2026-01-20 15:06:12.733 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:14.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:14.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:16.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:16.425 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:16.426 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:16.426 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:16.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:17 np0005588919 nova_compute[225855]: 2026-01-20 15:06:17.379 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:17 np0005588919 nova_compute[225855]: 2026-01-20 15:06:17.735 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:17.930 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:06:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:17.931 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:06:17 np0005588919 nova_compute[225855]: 2026-01-20 15:06:17.933 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:18.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:18.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.142 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.156 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "5feeb9de-434b-4ec7-aa99-6da718514c6f" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.157 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.157 225859 INFO nova.compute.manager [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Unshelving#033[00m
Jan 20 10:06:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:20.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.254 225859 INFO nova.virt.block_device [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Booting with volume 94300d81-b4ca-4c0a-9283-83b76826d40f at /dev/vda#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.658 225859 DEBUG os_brick.utils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.659 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:20.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.670 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.670 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[5155bd37-dd5a-4f59-8e0d-5df2a95f4d0a]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.671 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.679 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.680 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab853a2-d59b-48b6-9d30-2dbea2c17bc2]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.681 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.690 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.691 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[49363ee3-5e53-44a6-a321-c3616c388a7e]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.692 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[a97b22ef-cb54-46c1-8e9b-f12df7e90550]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.693 225859 DEBUG oslo_concurrency.processutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.732 225859 DEBUG oslo_concurrency.processutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "nvme version" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.734 225859 DEBUG os_brick.initiator.connectors.lightos [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.735 225859 DEBUG os_brick.initiator.connectors.lightos [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.735 225859 DEBUG os_brick.initiator.connectors.lightos [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.735 225859 DEBUG os_brick.utils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] <== get_connector_properties: return (77ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:06:20 np0005588919 nova_compute[225855]: 2026-01-20 15:06:20.736 225859 DEBUG nova.virt.block_device [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Updating existing volume attachment record: 50d9296f-3ad8-43e9-a963-2b942f9bc3e3 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #115. Immutable memtables: 0.
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:20.869242) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 71] Flushing memtable with next log file: 115
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921580869314, "job": 71, "event": "flush_started", "num_memtables": 1, "num_entries": 1764, "num_deletes": 265, "total_data_size": 3693740, "memory_usage": 3754728, "flush_reason": "Manual Compaction"}
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 71] Level-0 flush table #116: started
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921580883990, "cf_name": "default", "job": 71, "event": "table_file_creation", "file_number": 116, "file_size": 2422490, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 58144, "largest_seqno": 59903, "table_properties": {"data_size": 2415041, "index_size": 4327, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16450, "raw_average_key_size": 20, "raw_value_size": 2399765, "raw_average_value_size": 2999, "num_data_blocks": 189, "num_entries": 800, "num_filter_entries": 800, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921455, "oldest_key_time": 1768921455, "file_creation_time": 1768921580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 116, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 71] Flush lasted 14804 microseconds, and 5955 cpu microseconds.
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:20.884041) [db/flush_job.cc:967] [default] [JOB 71] Level-0 flush table #116: 2422490 bytes OK
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:20.884065) [db/memtable_list.cc:519] [default] Level-0 commit table #116 started
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:20.891384) [db/memtable_list.cc:722] [default] Level-0 commit table #116: memtable #1 done
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:20.891409) EVENT_LOG_v1 {"time_micros": 1768921580891402, "job": 71, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:20.891430) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 71] Try to delete WAL files size 3685531, prev total WAL file size 3685531, number of live WAL files 2.
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000112.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:20.892454) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032303137' seq:72057594037927935, type:22 .. '6C6F676D0032323731' seq:0, type:0; will stop at (end)
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 72] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 71 Base level 0, inputs: [116(2365KB)], [114(10071KB)]
Jan 20 10:06:20 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921580892529, "job": 72, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [116], "files_L6": [114], "score": -1, "input_data_size": 12735937, "oldest_snapshot_seqno": -1}
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 72] Generated table #117: 8596 keys, 12588995 bytes, temperature: kUnknown
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921581038942, "cf_name": "default", "job": 72, "event": "table_file_creation", "file_number": 117, "file_size": 12588995, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12531223, "index_size": 35196, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21509, "raw_key_size": 222783, "raw_average_key_size": 25, "raw_value_size": 12377915, "raw_average_value_size": 1439, "num_data_blocks": 1380, "num_entries": 8596, "num_filter_entries": 8596, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 117, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:21.039230) [db/compaction/compaction_job.cc:1663] [default] [JOB 72] Compacted 1@0 + 1@6 files to L6 => 12588995 bytes
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:21.040552) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 86.9 rd, 85.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 9.8 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(10.5) write-amplify(5.2) OK, records in: 9140, records dropped: 544 output_compression: NoCompression
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:21.040568) EVENT_LOG_v1 {"time_micros": 1768921581040560, "job": 72, "event": "compaction_finished", "compaction_time_micros": 146475, "compaction_time_cpu_micros": 35056, "output_level": 6, "num_output_files": 1, "total_output_size": 12588995, "num_input_records": 9140, "num_output_records": 8596, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000116.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921581041029, "job": 72, "event": "table_file_deletion", "file_number": 116}
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000114.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921581043057, "job": 72, "event": "table_file_deletion", "file_number": 114}
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:20.892284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:21.043145) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:21.043151) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:21.043153) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:21.043155) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:06:21.043157) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:06:21 np0005588919 nova_compute[225855]: 2026-01-20 15:06:21.799 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:21 np0005588919 nova_compute[225855]: 2026-01-20 15:06:21.801 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:21 np0005588919 nova_compute[225855]: 2026-01-20 15:06:21.805 225859 DEBUG nova.objects.instance [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'pci_requests' on Instance uuid 5feeb9de-434b-4ec7-aa99-6da718514c6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:21 np0005588919 nova_compute[225855]: 2026-01-20 15:06:21.825 225859 DEBUG nova.objects.instance [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'numa_topology' on Instance uuid 5feeb9de-434b-4ec7-aa99-6da718514c6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:21 np0005588919 nova_compute[225855]: 2026-01-20 15:06:21.840 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:06:21 np0005588919 nova_compute[225855]: 2026-01-20 15:06:21.840 225859 INFO nova.compute.claims [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:06:21 np0005588919 nova_compute[225855]: 2026-01-20 15:06:21.967 225859 DEBUG oslo_concurrency.processutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:22.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:06:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3242676901' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:06:22 np0005588919 nova_compute[225855]: 2026-01-20 15:06:22.424 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:22 np0005588919 nova_compute[225855]: 2026-01-20 15:06:22.439 225859 DEBUG oslo_concurrency.processutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:22 np0005588919 nova_compute[225855]: 2026-01-20 15:06:22.447 225859 DEBUG nova.compute.provider_tree [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:06:22 np0005588919 nova_compute[225855]: 2026-01-20 15:06:22.462 225859 DEBUG nova.scheduler.client.report [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:06:22 np0005588919 nova_compute[225855]: 2026-01-20 15:06:22.486 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:22.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:22 np0005588919 nova_compute[225855]: 2026-01-20 15:06:22.737 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:22 np0005588919 nova_compute[225855]: 2026-01-20 15:06:22.808 225859 INFO nova.network.neutron [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Updating port 70668adb-f9ad-41cb-8eac-2e0aba32bf22 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 10:06:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:23.933 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:24.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:24 np0005588919 nova_compute[225855]: 2026-01-20 15:06:24.550 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:24 np0005588919 nova_compute[225855]: 2026-01-20 15:06:24.660 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "refresh_cache-5feeb9de-434b-4ec7-aa99-6da718514c6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:06:24 np0005588919 nova_compute[225855]: 2026-01-20 15:06:24.661 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquired lock "refresh_cache-5feeb9de-434b-4ec7-aa99-6da718514c6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:06:24 np0005588919 nova_compute[225855]: 2026-01-20 15:06:24.661 225859 DEBUG nova.network.neutron [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:06:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:24.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:24 np0005588919 nova_compute[225855]: 2026-01-20 15:06:24.975 225859 DEBUG nova.compute.manager [req-b0baf1dc-9f24-4299-8c90-d5a44fd6c42a req-49edfd08-24e1-411d-babc-f90e50b80d9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Received event network-changed-70668adb-f9ad-41cb-8eac-2e0aba32bf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:24 np0005588919 nova_compute[225855]: 2026-01-20 15:06:24.975 225859 DEBUG nova.compute.manager [req-b0baf1dc-9f24-4299-8c90-d5a44fd6c42a req-49edfd08-24e1-411d-babc-f90e50b80d9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Refreshing instance network info cache due to event network-changed-70668adb-f9ad-41cb-8eac-2e0aba32bf22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:06:24 np0005588919 nova_compute[225855]: 2026-01-20 15:06:24.975 225859 DEBUG oslo_concurrency.lockutils [req-b0baf1dc-9f24-4299-8c90-d5a44fd6c42a req-49edfd08-24e1-411d-babc-f90e50b80d9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-5feeb9de-434b-4ec7-aa99-6da718514c6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:06:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:26.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:26.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:27 np0005588919 nova_compute[225855]: 2026-01-20 15:06:27.425 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:27 np0005588919 nova_compute[225855]: 2026-01-20 15:06:27.739 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:28 np0005588919 podman[291626]: 2026-01-20 15:06:28.042970988 +0000 UTC m=+0.082602245 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 10:06:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:28.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.251 225859 DEBUG nova.network.neutron [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Updating instance_info_cache with network_info: [{"id": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "address": "fa:16:3e:6a:c0:d3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70668adb-f9", "ovs_interfaceid": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.272 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Releasing lock "refresh_cache-5feeb9de-434b-4ec7-aa99-6da718514c6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.274 225859 DEBUG nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.275 225859 INFO nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Creating image(s)#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.275 225859 DEBUG nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.275 225859 DEBUG nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Ensure instance console log exists: /var/lib/nova/instances/5feeb9de-434b-4ec7-aa99-6da718514c6f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.276 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.276 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.276 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.279 225859 DEBUG nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Start _get_guest_xml network_info=[{"id": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "address": "fa:16:3e:6a:c0:d3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70668adb-f9", "ovs_interfaceid": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': True, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-94300d81-b4ca-4c0a-9283-83b76826d40f', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '94300d81-b4ca-4c0a-9283-83b76826d40f', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '5feeb9de-434b-4ec7-aa99-6da718514c6f', 'attached_at': '', 'detached_at': '', 'volume_id': '94300d81-b4ca-4c0a-9283-83b76826d40f', 'serial': '94300d81-b4ca-4c0a-9283-83b76826d40f'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '50d9296f-3ad8-43e9-a963-2b942f9bc3e3', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.280 225859 DEBUG oslo_concurrency.lockutils [req-b0baf1dc-9f24-4299-8c90-d5a44fd6c42a req-49edfd08-24e1-411d-babc-f90e50b80d9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-5feeb9de-434b-4ec7-aa99-6da718514c6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.280 225859 DEBUG nova.network.neutron [req-b0baf1dc-9f24-4299-8c90-d5a44fd6c42a req-49edfd08-24e1-411d-babc-f90e50b80d9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Refreshing network info cache for port 70668adb-f9ad-41cb-8eac-2e0aba32bf22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.287 225859 WARNING nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.294 225859 DEBUG nova.virt.libvirt.host [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.295 225859 DEBUG nova.virt.libvirt.host [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.303 225859 DEBUG nova.virt.libvirt.host [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.303 225859 DEBUG nova.virt.libvirt.host [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.305 225859 DEBUG nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.305 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.305 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.306 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.306 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.306 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.306 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.306 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.306 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.307 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.307 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.307 225859 DEBUG nova.virt.hardware [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.307 225859 DEBUG nova.objects.instance [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5feeb9de-434b-4ec7-aa99-6da718514c6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.355 225859 DEBUG nova.storage.rbd_utils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image 5feeb9de-434b-4ec7-aa99-6da718514c6f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.362 225859 DEBUG oslo_concurrency.processutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:28.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:06:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2756618266' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.857 225859 DEBUG oslo_concurrency.processutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.886 225859 DEBUG nova.virt.libvirt.vif [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T15:05:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-670486896',display_name='tempest-TestShelveInstance-server-670486896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-670486896',id=162,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1862119958',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:05:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='0fc924d2df984301897e81920c5e192f',ramdisk_id='',reservation_id='r-18qxw31n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestShelveInstance-1425544575',owner_user_name='tempest-TestShelveInstance-1425544575-project-member'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:06:20Z,user_data=None,user_id='b02a8ef6cc3946ceb2c8846aae2eae68',uuid=5feeb9de-434b-4ec7-aa99-6da718514c6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "address": "fa:16:3e:6a:c0:d3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70668adb-f9", "ovs_interfaceid": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.887 225859 DEBUG nova.network.os_vif_util [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converting VIF {"id": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "address": "fa:16:3e:6a:c0:d3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70668adb-f9", "ovs_interfaceid": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.888 225859 DEBUG nova.network.os_vif_util [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=70668adb-f9ad-41cb-8eac-2e0aba32bf22,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70668adb-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.889 225859 DEBUG nova.objects.instance [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'pci_devices' on Instance uuid 5feeb9de-434b-4ec7-aa99-6da718514c6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.907 225859 DEBUG nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <uuid>5feeb9de-434b-4ec7-aa99-6da718514c6f</uuid>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <name>instance-000000a2</name>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestShelveInstance-server-670486896</nova:name>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:06:28</nova:creationTime>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <nova:user uuid="b02a8ef6cc3946ceb2c8846aae2eae68">tempest-TestShelveInstance-1425544575-project-member</nova:user>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <nova:project uuid="0fc924d2df984301897e81920c5e192f">tempest-TestShelveInstance-1425544575</nova:project>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <nova:port uuid="70668adb-f9ad-41cb-8eac-2e0aba32bf22">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <entry name="serial">5feeb9de-434b-4ec7-aa99-6da718514c6f</entry>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <entry name="uuid">5feeb9de-434b-4ec7-aa99-6da718514c6f</entry>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/5feeb9de-434b-4ec7-aa99-6da718514c6f_disk.config">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-94300d81-b4ca-4c0a-9283-83b76826d40f">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <serial>94300d81-b4ca-4c0a-9283-83b76826d40f</serial>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:6a:c0:d3"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <target dev="tap70668adb-f9"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/5feeb9de-434b-4ec7-aa99-6da718514c6f/console.log" append="off"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <input type="keyboard" bus="usb"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:06:28 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:06:28 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:06:28 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:06:28 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.908 225859 DEBUG nova.compute.manager [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Preparing to wait for external event network-vif-plugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.908 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.908 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.908 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.909 225859 DEBUG nova.virt.libvirt.vif [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T15:05:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-670486896',display_name='tempest-TestShelveInstance-server-670486896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-670486896',id=162,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1862119958',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:05:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='0fc924d2df984301897e81920c5e192f',ramdisk_id='',reservation_id='r-18qxw31n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestShelveInstance-1425544575',owner_user_name='tempest-TestShelveInstance-1425544575-project-member'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:06:20Z,user_data=None,user_id='b02a8ef6cc3946ceb2c8846aae2eae68',uuid=5feeb9de-434b-4ec7-aa99-6da718514c6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "address": "fa:16:3e:6a:c0:d3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70668adb-f9", "ovs_interfaceid": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.909 225859 DEBUG nova.network.os_vif_util [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converting VIF {"id": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "address": "fa:16:3e:6a:c0:d3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70668adb-f9", "ovs_interfaceid": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.910 225859 DEBUG nova.network.os_vif_util [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=70668adb-f9ad-41cb-8eac-2e0aba32bf22,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70668adb-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.910 225859 DEBUG os_vif [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=70668adb-f9ad-41cb-8eac-2e0aba32bf22,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70668adb-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.911 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.911 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.911 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.914 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.915 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70668adb-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.915 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap70668adb-f9, col_values=(('external_ids', {'iface-id': '70668adb-f9ad-41cb-8eac-2e0aba32bf22', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:c0:d3', 'vm-uuid': '5feeb9de-434b-4ec7-aa99-6da718514c6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.916 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:28 np0005588919 NetworkManager[49104]: <info>  [1768921588.9176] manager: (tap70668adb-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.918 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.923 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.925 225859 INFO os_vif [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=70668adb-f9ad-41cb-8eac-2e0aba32bf22,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70668adb-f9')#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.972 225859 DEBUG nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.972 225859 DEBUG nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.972 225859 DEBUG nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] No VIF found with MAC fa:16:3e:6a:c0:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.973 225859 INFO nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Using config drive#033[00m
Jan 20 10:06:28 np0005588919 nova_compute[225855]: 2026-01-20 15:06:28.996 225859 DEBUG nova.storage.rbd_utils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image 5feeb9de-434b-4ec7-aa99-6da718514c6f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:29 np0005588919 nova_compute[225855]: 2026-01-20 15:06:29.013 225859 DEBUG nova.objects.instance [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'ec2_ids' on Instance uuid 5feeb9de-434b-4ec7-aa99-6da718514c6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:29 np0005588919 nova_compute[225855]: 2026-01-20 15:06:29.050 225859 DEBUG nova.objects.instance [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'keypairs' on Instance uuid 5feeb9de-434b-4ec7-aa99-6da718514c6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.192 225859 INFO nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Creating config drive at /var/lib/nova/instances/5feeb9de-434b-4ec7-aa99-6da718514c6f/disk.config#033[00m
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.203 225859 DEBUG oslo_concurrency.processutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5feeb9de-434b-4ec7-aa99-6da718514c6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp63nupge2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:30.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.346 225859 DEBUG oslo_concurrency.processutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5feeb9de-434b-4ec7-aa99-6da718514c6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp63nupge2" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.379 225859 DEBUG nova.storage.rbd_utils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] rbd image 5feeb9de-434b-4ec7-aa99-6da718514c6f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.383 225859 DEBUG oslo_concurrency.processutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5feeb9de-434b-4ec7-aa99-6da718514c6f/disk.config 5feeb9de-434b-4ec7-aa99-6da718514c6f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.526 225859 DEBUG oslo_concurrency.processutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5feeb9de-434b-4ec7-aa99-6da718514c6f/disk.config 5feeb9de-434b-4ec7-aa99-6da718514c6f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.526 225859 INFO nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Deleting local config drive /var/lib/nova/instances/5feeb9de-434b-4ec7-aa99-6da718514c6f/disk.config because it was imported into RBD.#033[00m
Jan 20 10:06:30 np0005588919 kernel: tap70668adb-f9: entered promiscuous mode
Jan 20 10:06:30 np0005588919 NetworkManager[49104]: <info>  [1768921590.5857] manager: (tap70668adb-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/292)
Jan 20 10:06:30 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:30Z|00690|binding|INFO|Claiming lport 70668adb-f9ad-41cb-8eac-2e0aba32bf22 for this chassis.
Jan 20 10:06:30 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:30Z|00691|binding|INFO|70668adb-f9ad-41cb-8eac-2e0aba32bf22: Claiming fa:16:3e:6a:c0:d3 10.100.0.3
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.587 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.596 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c0:d3 10.100.0.3'], port_security=['fa:16:3e:6a:c0:d3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '5feeb9de-434b-4ec7-aa99-6da718514c6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f434e83-45c8-454d-820b-af39b696a1d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0fc924d2df984301897e81920c5e192f', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'd8d958e0-892e-4275-9633-96783d5a96b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.233'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf0669b1-9b02-4bfa-859e-dac906b93fdc, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=70668adb-f9ad-41cb-8eac-2e0aba32bf22) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.597 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 70668adb-f9ad-41cb-8eac-2e0aba32bf22 in datapath 0f434e83-45c8-454d-820b-af39b696a1d5 bound to our chassis#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.599 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f434e83-45c8-454d-820b-af39b696a1d5#033[00m
Jan 20 10:06:30 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:30Z|00692|binding|INFO|Setting lport 70668adb-f9ad-41cb-8eac-2e0aba32bf22 ovn-installed in OVS
Jan 20 10:06:30 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:30Z|00693|binding|INFO|Setting lport 70668adb-f9ad-41cb-8eac-2e0aba32bf22 up in Southbound
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.608 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.610 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.612 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b92cea63-8afa-420b-802c-158c259a139f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.614 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0f434e83-41 in ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.617 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0f434e83-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.617 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e54bf5-e1a3-4a6b-842a-f2d758d910ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 systemd-udevd[291768]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.618 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1250e6ed-b352-4d77-8735-22a05cd68966]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 systemd-machined[194361]: New machine qemu-81-instance-000000a2.
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.632 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[82ad6d2c-6b45-41ef-a0f4-b7db00217865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 NetworkManager[49104]: <info>  [1768921590.6345] device (tap70668adb-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:06:30 np0005588919 NetworkManager[49104]: <info>  [1768921590.6352] device (tap70668adb-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:06:30 np0005588919 systemd[1]: Started Virtual Machine qemu-81-instance-000000a2.
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.655 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[65c806d5-fa64-4ba0-b944-44c2db6208c3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:30.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.682 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a3cc42d5-1ac5-451d-a521-40e340b4bd52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.686 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[77bc25be-847d-4756-b206-a2b2bee1aa4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 NetworkManager[49104]: <info>  [1768921590.6881] manager: (tap0f434e83-40): new Veth device (/org/freedesktop/NetworkManager/Devices/293)
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.718 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d47d7515-3be9-4e87-abba-71ab833036b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.721 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[254e7f0d-9a14-4f2c-a52d-ebaa928e7b8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 NetworkManager[49104]: <info>  [1768921590.7440] device (tap0f434e83-40): carrier: link connected
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.748 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[50d0c082-05c2-42e6-8a2b-fcfa7cb015e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.765 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[336af903-3b21-49d1-bf73-ca2345c02dd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f434e83-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:12:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662967, 'reachable_time': 24643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291801, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.778 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f1dac8bb-b0d0-4cc5-928b-d50c2a675733]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:128d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662967, 'tstamp': 662967}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291802, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.791 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[42edd7c2-330b-4629-ba2e-fbc9966ab0f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f434e83-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:12:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662967, 'reachable_time': 24643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 291803, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.817 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c4399698-cea6-4eef-95e6-0e8ca37b1e2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.867 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[087e4bfe-e2b1-444b-ac91-97b540550198]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.869 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f434e83-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.869 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.870 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f434e83-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:30 np0005588919 NetworkManager[49104]: <info>  [1768921590.9165] manager: (tap0f434e83-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Jan 20 10:06:30 np0005588919 kernel: tap0f434e83-40: entered promiscuous mode
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.917 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.920 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f434e83-40, col_values=(('external_ids', {'iface-id': '6133323e-bf50-4bbd-bc0b-9ecf135d8cd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.921 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:30 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:30Z|00694|binding|INFO|Releasing lport 6133323e-bf50-4bbd-bc0b-9ecf135d8cd5 from this chassis (sb_readonly=0)
Jan 20 10:06:30 np0005588919 nova_compute[225855]: 2026-01-20 15:06:30.936 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.938 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0f434e83-45c8-454d-820b-af39b696a1d5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0f434e83-45c8-454d-820b-af39b696a1d5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.938 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3ba259-87ca-4fa9-82ba-ab49fd709f95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.939 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-0f434e83-45c8-454d-820b-af39b696a1d5
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/0f434e83-45c8-454d-820b-af39b696a1d5.pid.haproxy
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 0f434e83-45c8-454d-820b-af39b696a1d5
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:06:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:30.940 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'env', 'PROCESS_TAG=haproxy-0f434e83-45c8-454d-820b-af39b696a1d5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0f434e83-45c8-454d-820b-af39b696a1d5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.200 225859 DEBUG nova.compute.manager [req-42b443c1-3a98-4913-9854-23990427110c req-905f3f03-50ac-4e62-b92f-4be760e1549b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Received event network-vif-plugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.201 225859 DEBUG oslo_concurrency.lockutils [req-42b443c1-3a98-4913-9854-23990427110c req-905f3f03-50ac-4e62-b92f-4be760e1549b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.202 225859 DEBUG oslo_concurrency.lockutils [req-42b443c1-3a98-4913-9854-23990427110c req-905f3f03-50ac-4e62-b92f-4be760e1549b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.203 225859 DEBUG oslo_concurrency.lockutils [req-42b443c1-3a98-4913-9854-23990427110c req-905f3f03-50ac-4e62-b92f-4be760e1549b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.203 225859 DEBUG nova.compute.manager [req-42b443c1-3a98-4913-9854-23990427110c req-905f3f03-50ac-4e62-b92f-4be760e1549b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Processing event network-vif-plugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.204 225859 DEBUG nova.network.neutron [req-b0baf1dc-9f24-4299-8c90-d5a44fd6c42a req-49edfd08-24e1-411d-babc-f90e50b80d9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Updated VIF entry in instance network info cache for port 70668adb-f9ad-41cb-8eac-2e0aba32bf22. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.205 225859 DEBUG nova.network.neutron [req-b0baf1dc-9f24-4299-8c90-d5a44fd6c42a req-49edfd08-24e1-411d-babc-f90e50b80d9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Updating instance_info_cache with network_info: [{"id": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "address": "fa:16:3e:6a:c0:d3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70668adb-f9", "ovs_interfaceid": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.220 225859 DEBUG oslo_concurrency.lockutils [req-b0baf1dc-9f24-4299-8c90-d5a44fd6c42a req-49edfd08-24e1-411d-babc-f90e50b80d9e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-5feeb9de-434b-4ec7-aa99-6da718514c6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.223 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921591.2233129, 5feeb9de-434b-4ec7-aa99-6da718514c6f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.224 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] VM Started (Lifecycle Event)#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.226 225859 DEBUG nova.compute.manager [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.229 225859 DEBUG nova.virt.libvirt.driver [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.232 225859 INFO nova.virt.libvirt.driver [-] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Instance spawned successfully.#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.232 225859 DEBUG nova.compute.manager [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.279 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.282 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:06:31 np0005588919 podman[291877]: 2026-01-20 15:06:31.327006977 +0000 UTC m=+0.053841448 container create 7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.336 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.337 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921591.2241447, 5feeb9de-434b-4ec7-aa99-6da718514c6f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.337 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.354 225859 DEBUG oslo_concurrency.lockutils [None req-ae7ef1c3-3d4a-4f00-a894-01c19671aa6e b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 11.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:31 np0005588919 systemd[1]: Started libpod-conmon-7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa.scope.
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.373 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.377 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921591.2283845, 5feeb9de-434b-4ec7-aa99-6da718514c6f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.378 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:06:31 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:06:31 np0005588919 podman[291877]: 2026-01-20 15:06:31.299749234 +0000 UTC m=+0.026583745 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:06:31 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2da085baf4d307a9cd37776373e9378476598690798d692b40336094440304d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.399 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.402 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:06:31 np0005588919 podman[291877]: 2026-01-20 15:06:31.409943561 +0000 UTC m=+0.136778052 container init 7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 20 10:06:31 np0005588919 podman[291877]: 2026-01-20 15:06:31.414896171 +0000 UTC m=+0.141730642 container start 7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 20 10:06:31 np0005588919 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[291892]: [NOTICE]   (291896) : New worker (291898) forked
Jan 20 10:06:31 np0005588919 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[291892]: [NOTICE]   (291896) : Loading success.
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.461 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.461 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.477 225859 DEBUG nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.548 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.549 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.555 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.556 225859 INFO nova.compute.claims [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:06:31 np0005588919 nova_compute[225855]: 2026-01-20 15:06:31.711 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:06:32 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3538617676' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:06:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:32.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.229 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.235 225859 DEBUG nova.compute.provider_tree [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.254 225859 DEBUG nova.scheduler.client.report [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.278 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.279 225859 DEBUG nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.331 225859 DEBUG nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.332 225859 DEBUG nova.network.neutron [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.368 225859 INFO nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.384 225859 DEBUG nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.429 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.485 225859 DEBUG nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.486 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.487 225859 INFO nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Creating image(s)#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.509 225859 DEBUG nova.storage.rbd_utils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.535 225859 DEBUG nova.storage.rbd_utils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.560 225859 DEBUG nova.storage.rbd_utils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.563 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.598 225859 DEBUG nova.policy [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a2beb3d6247e457abd6e8d93cc602f02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5e161d5a47f845fd89eb3f10627a0830', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.643 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.644 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.645 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.645 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.672 225859 DEBUG nova.storage.rbd_utils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:32.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.677 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.955 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:32 np0005588919 nova_compute[225855]: 2026-01-20 15:06:32.972 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.295s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.051 225859 DEBUG nova.storage.rbd_utils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] resizing rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:06:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.148 225859 DEBUG nova.objects.instance [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'migration_context' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.165 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.166 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Ensure instance console log exists: /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.166 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.167 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.167 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.326 225859 DEBUG nova.compute.manager [req-dbb32488-35f0-46ea-9e59-ba9c5923d0c7 req-1a7a350b-dc89-4eea-b390-f3a59cd6e5ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Received event network-vif-plugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.327 225859 DEBUG oslo_concurrency.lockutils [req-dbb32488-35f0-46ea-9e59-ba9c5923d0c7 req-1a7a350b-dc89-4eea-b390-f3a59cd6e5ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.327 225859 DEBUG oslo_concurrency.lockutils [req-dbb32488-35f0-46ea-9e59-ba9c5923d0c7 req-1a7a350b-dc89-4eea-b390-f3a59cd6e5ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.327 225859 DEBUG oslo_concurrency.lockutils [req-dbb32488-35f0-46ea-9e59-ba9c5923d0c7 req-1a7a350b-dc89-4eea-b390-f3a59cd6e5ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.327 225859 DEBUG nova.compute.manager [req-dbb32488-35f0-46ea-9e59-ba9c5923d0c7 req-1a7a350b-dc89-4eea-b390-f3a59cd6e5ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] No waiting events found dispatching network-vif-plugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.327 225859 WARNING nova.compute.manager [req-dbb32488-35f0-46ea-9e59-ba9c5923d0c7 req-1a7a350b-dc89-4eea-b390-f3a59cd6e5ea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Received unexpected event network-vif-plugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.334 225859 DEBUG nova.network.neutron [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Successfully created port: 6e7af943-7ef0-441d-a402-bd595082f98e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:06:33 np0005588919 nova_compute[225855]: 2026-01-20 15:06:33.917 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:34.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:34.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:34 np0005588919 nova_compute[225855]: 2026-01-20 15:06:34.977 225859 DEBUG nova.network.neutron [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Successfully updated port: 6e7af943-7ef0-441d-a402-bd595082f98e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:06:34 np0005588919 nova_compute[225855]: 2026-01-20 15:06:34.997 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:06:34 np0005588919 nova_compute[225855]: 2026-01-20 15:06:34.997 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquired lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:06:34 np0005588919 nova_compute[225855]: 2026-01-20 15:06:34.997 225859 DEBUG nova.network.neutron [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:06:35 np0005588919 nova_compute[225855]: 2026-01-20 15:06:35.092 225859 DEBUG nova.compute.manager [req-5ece68d1-de02-4889-9091-0f350535054a req-cbcc52ea-193a-4403-9426-b15d13d5da97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-changed-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:35 np0005588919 nova_compute[225855]: 2026-01-20 15:06:35.093 225859 DEBUG nova.compute.manager [req-5ece68d1-de02-4889-9091-0f350535054a req-cbcc52ea-193a-4403-9426-b15d13d5da97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Refreshing instance network info cache due to event network-changed-6e7af943-7ef0-441d-a402-bd595082f98e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:06:35 np0005588919 nova_compute[225855]: 2026-01-20 15:06:35.093 225859 DEBUG oslo_concurrency.lockutils [req-5ece68d1-de02-4889-9091-0f350535054a req-cbcc52ea-193a-4403-9426-b15d13d5da97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:06:35 np0005588919 nova_compute[225855]: 2026-01-20 15:06:35.207 225859 DEBUG nova.network.neutron [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:06:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:36.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:36.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:36 np0005588919 nova_compute[225855]: 2026-01-20 15:06:36.974 225859 DEBUG nova.network.neutron [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Updating instance_info_cache with network_info: [{"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:06:36 np0005588919 nova_compute[225855]: 2026-01-20 15:06:36.994 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Releasing lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:06:36 np0005588919 nova_compute[225855]: 2026-01-20 15:06:36.995 225859 DEBUG nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance network_info: |[{"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:06:36 np0005588919 nova_compute[225855]: 2026-01-20 15:06:36.995 225859 DEBUG oslo_concurrency.lockutils [req-5ece68d1-de02-4889-9091-0f350535054a req-cbcc52ea-193a-4403-9426-b15d13d5da97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:06:36 np0005588919 nova_compute[225855]: 2026-01-20 15:06:36.995 225859 DEBUG nova.network.neutron [req-5ece68d1-de02-4889-9091-0f350535054a req-cbcc52ea-193a-4403-9426-b15d13d5da97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Refreshing network info cache for port 6e7af943-7ef0-441d-a402-bd595082f98e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:06:36 np0005588919 nova_compute[225855]: 2026-01-20 15:06:36.998 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Start _get_guest_xml network_info=[{"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.184 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.295 225859 WARNING nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.300 225859 DEBUG nova.virt.libvirt.host [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.301 225859 DEBUG nova.virt.libvirt.host [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.303 225859 DEBUG nova.virt.libvirt.host [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.304 225859 DEBUG nova.virt.libvirt.host [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.305 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.305 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.305 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.306 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.306 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.306 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.306 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.307 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.307 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.307 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.307 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.307 225859 DEBUG nova.virt.hardware [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.310 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.430 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:06:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3922319981' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.790 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.830 225859 DEBUG nova.storage.rbd_utils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:37 np0005588919 nova_compute[225855]: 2026-01-20 15:06:37.834 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.198 225859 DEBUG nova.network.neutron [req-5ece68d1-de02-4889-9091-0f350535054a req-cbcc52ea-193a-4403-9426-b15d13d5da97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Updated VIF entry in instance network info cache for port 6e7af943-7ef0-441d-a402-bd595082f98e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.199 225859 DEBUG nova.network.neutron [req-5ece68d1-de02-4889-9091-0f350535054a req-cbcc52ea-193a-4403-9426-b15d13d5da97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Updating instance_info_cache with network_info: [{"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.216 225859 DEBUG oslo_concurrency.lockutils [req-5ece68d1-de02-4889-9091-0f350535054a req-cbcc52ea-193a-4403-9426-b15d13d5da97 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:06:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:38.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:06:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4171499437' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.266 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.267 225859 DEBUG nova.virt.libvirt.vif [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:06:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1315322326',display_name='tempest-ServerRescueTestJSON-server-1315322326',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1315322326',id=165,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5e161d5a47f845fd89eb3f10627a0830',ramdisk_id='',reservation_id='r-g1s28307',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1151598672',owner_user_name='tempest-ServerRescueTestJSON-1151598672-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:06:32Z,user_data=None,user_id='a2beb3d6247e457abd6e8d93cc602f02',uuid=5f8a2718-2106-431c-82c1-2609a52e7fb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.268 225859 DEBUG nova.network.os_vif_util [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converting VIF {"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.269 225859 DEBUG nova.network.os_vif_util [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:8b:e2,bridge_name='br-int',has_traffic_filtering=True,id=6e7af943-7ef0-441d-a402-bd595082f98e,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e7af943-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.270 225859 DEBUG nova.objects.instance [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.285 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <uuid>5f8a2718-2106-431c-82c1-2609a52e7fb2</uuid>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <name>instance-000000a5</name>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerRescueTestJSON-server-1315322326</nova:name>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:06:37</nova:creationTime>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <nova:user uuid="a2beb3d6247e457abd6e8d93cc602f02">tempest-ServerRescueTestJSON-1151598672-project-member</nova:user>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <nova:project uuid="5e161d5a47f845fd89eb3f10627a0830">tempest-ServerRescueTestJSON-1151598672</nova:project>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <nova:port uuid="6e7af943-7ef0-441d-a402-bd595082f98e">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <entry name="serial">5f8a2718-2106-431c-82c1-2609a52e7fb2</entry>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <entry name="uuid">5f8a2718-2106-431c-82c1-2609a52e7fb2</entry>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/5f8a2718-2106-431c-82c1-2609a52e7fb2_disk">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:21:8b:e2"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <target dev="tap6e7af943-7e"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/console.log" append="off"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:06:38 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:06:38 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:06:38 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:06:38 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.291 225859 DEBUG nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Preparing to wait for external event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.292 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.292 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.292 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.293 225859 DEBUG nova.virt.libvirt.vif [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:06:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1315322326',display_name='tempest-ServerRescueTestJSON-server-1315322326',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1315322326',id=165,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5e161d5a47f845fd89eb3f10627a0830',ramdisk_id='',reservation_id='r-g1s28307',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1151598672',owner_user_name='tempest-ServerRescueTestJSON-1151598672-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:06:32Z,user_data=None,user_id='a2beb3d6247e457abd6e8d93cc602f02',uuid=5f8a2718-2106-431c-82c1-2609a52e7fb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.293 225859 DEBUG nova.network.os_vif_util [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converting VIF {"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.294 225859 DEBUG nova.network.os_vif_util [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:8b:e2,bridge_name='br-int',has_traffic_filtering=True,id=6e7af943-7ef0-441d-a402-bd595082f98e,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e7af943-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.295 225859 DEBUG os_vif [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:8b:e2,bridge_name='br-int',has_traffic_filtering=True,id=6e7af943-7ef0-441d-a402-bd595082f98e,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e7af943-7e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.296 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.296 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.297 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.299 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.299 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e7af943-7e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.300 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e7af943-7e, col_values=(('external_ids', {'iface-id': '6e7af943-7ef0-441d-a402-bd595082f98e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:8b:e2', 'vm-uuid': '5f8a2718-2106-431c-82c1-2609a52e7fb2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.301 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:38 np0005588919 NetworkManager[49104]: <info>  [1768921598.3021] manager: (tap6e7af943-7e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.305 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.308 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.309 225859 INFO os_vif [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:8b:e2,bridge_name='br-int',has_traffic_filtering=True,id=6e7af943-7ef0-441d-a402-bd595082f98e,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e7af943-7e')#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.390 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.390 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.391 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No VIF found with MAC fa:16:3e:21:8b:e2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.391 225859 INFO nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Using config drive#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.418 225859 DEBUG nova.storage.rbd_utils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:38.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.890 225859 INFO nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Creating config drive at /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config#033[00m
Jan 20 10:06:38 np0005588919 nova_compute[225855]: 2026-01-20 15:06:38.896 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6dfgh5ii execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.028 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6dfgh5ii" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.058 225859 DEBUG nova.storage.rbd_utils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.061 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.222 225859 DEBUG oslo_concurrency.processutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.223 225859 INFO nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Deleting local config drive /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config because it was imported into RBD.#033[00m
Jan 20 10:06:39 np0005588919 NetworkManager[49104]: <info>  [1768921599.2700] manager: (tap6e7af943-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Jan 20 10:06:39 np0005588919 kernel: tap6e7af943-7e: entered promiscuous mode
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.274 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:39 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:39Z|00695|binding|INFO|Claiming lport 6e7af943-7ef0-441d-a402-bd595082f98e for this chassis.
Jan 20 10:06:39 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:39Z|00696|binding|INFO|6e7af943-7ef0-441d-a402-bd595082f98e: Claiming fa:16:3e:21:8b:e2 10.100.0.13
Jan 20 10:06:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:39.291 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:8b:e2 10.100.0.13'], port_security=['fa:16:3e:21:8b:e2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5f8a2718-2106-431c-82c1-2609a52e7fb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bac39b9-563a-456f-9168-fd10b1b28c21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e161d5a47f845fd89eb3f10627a0830', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cd72b979-cfcf-4dbd-bbff-8e22cd1b4096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c853729a-de72-4ddb-be59-bc41e08984ce, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6e7af943-7ef0-441d-a402-bd595082f98e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:06:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:39.294 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6e7af943-7ef0-441d-a402-bd595082f98e in datapath 5bac39b9-563a-456f-9168-fd10b1b28c21 bound to our chassis#033[00m
Jan 20 10:06:39 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:39Z|00697|binding|INFO|Setting lport 6e7af943-7ef0-441d-a402-bd595082f98e ovn-installed in OVS
Jan 20 10:06:39 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:39Z|00698|binding|INFO|Setting lport 6e7af943-7ef0-441d-a402-bd595082f98e up in Southbound
Jan 20 10:06:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:39.297 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bac39b9-563a-456f-9168-fd10b1b28c21 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:06:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:06:39.298 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[68f66ed2-e82b-4f40-aa74-7058e851b5d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.299 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.304 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:39 np0005588919 systemd-udevd[292284]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:06:39 np0005588919 systemd-machined[194361]: New machine qemu-82-instance-000000a5.
Jan 20 10:06:39 np0005588919 NetworkManager[49104]: <info>  [1768921599.3228] device (tap6e7af943-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:06:39 np0005588919 NetworkManager[49104]: <info>  [1768921599.3237] device (tap6e7af943-7e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:06:39 np0005588919 systemd[1]: Started Virtual Machine qemu-82-instance-000000a5.
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.884 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921599.88368, 5f8a2718-2106-431c-82c1-2609a52e7fb2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.885 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] VM Started (Lifecycle Event)#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.906 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.909 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921599.8844101, 5f8a2718-2106-431c-82c1-2609a52e7fb2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.909 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.929 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.932 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:06:39 np0005588919 nova_compute[225855]: 2026-01-20 15:06:39.965 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:06:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:40.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:40.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.820 225859 DEBUG nova.compute.manager [req-0eb15549-94bb-4129-9232-93f13bc5e44f req-c256054e-d834-4bba-b003-3b8fa5cbe07f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.820 225859 DEBUG oslo_concurrency.lockutils [req-0eb15549-94bb-4129-9232-93f13bc5e44f req-c256054e-d834-4bba-b003-3b8fa5cbe07f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.821 225859 DEBUG oslo_concurrency.lockutils [req-0eb15549-94bb-4129-9232-93f13bc5e44f req-c256054e-d834-4bba-b003-3b8fa5cbe07f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.821 225859 DEBUG oslo_concurrency.lockutils [req-0eb15549-94bb-4129-9232-93f13bc5e44f req-c256054e-d834-4bba-b003-3b8fa5cbe07f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.821 225859 DEBUG nova.compute.manager [req-0eb15549-94bb-4129-9232-93f13bc5e44f req-c256054e-d834-4bba-b003-3b8fa5cbe07f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Processing event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.822 225859 DEBUG nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.825 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921600.8251643, 5f8a2718-2106-431c-82c1-2609a52e7fb2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.825 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.827 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.831 225859 INFO nova.virt.libvirt.driver [-] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance spawned successfully.#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.832 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.849 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.854 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.858 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.858 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.858 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.859 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.859 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.860 225859 DEBUG nova.virt.libvirt.driver [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.886 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.913 225859 INFO nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Took 8.43 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.913 225859 DEBUG nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.968 225859 INFO nova.compute.manager [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Took 9.44 seconds to build instance.#033[00m
Jan 20 10:06:40 np0005588919 nova_compute[225855]: 2026-01-20 15:06:40.981 225859 DEBUG oslo_concurrency.lockutils [None req-3a9b1e1c-62ab-4a29-8bb2-7f8e6748dcd8 a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:42.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.433 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.639 225859 INFO nova.compute.manager [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Rescuing#033[00m
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.640 225859 DEBUG oslo_concurrency.lockutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.641 225859 DEBUG oslo_concurrency.lockutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquired lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.641 225859 DEBUG nova.network.neutron [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:06:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:42.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.948 225859 DEBUG nova.compute.manager [req-7f9cb29a-f1bc-4b20-8202-2125b4da7b69 req-a6170d29-7c3f-419a-97a0-81cf695a9ee9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.949 225859 DEBUG oslo_concurrency.lockutils [req-7f9cb29a-f1bc-4b20-8202-2125b4da7b69 req-a6170d29-7c3f-419a-97a0-81cf695a9ee9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.949 225859 DEBUG oslo_concurrency.lockutils [req-7f9cb29a-f1bc-4b20-8202-2125b4da7b69 req-a6170d29-7c3f-419a-97a0-81cf695a9ee9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.950 225859 DEBUG oslo_concurrency.lockutils [req-7f9cb29a-f1bc-4b20-8202-2125b4da7b69 req-a6170d29-7c3f-419a-97a0-81cf695a9ee9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.950 225859 DEBUG nova.compute.manager [req-7f9cb29a-f1bc-4b20-8202-2125b4da7b69 req-a6170d29-7c3f-419a-97a0-81cf695a9ee9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:06:42 np0005588919 nova_compute[225855]: 2026-01-20 15:06:42.951 225859 WARNING nova.compute.manager [req-7f9cb29a-f1bc-4b20-8202-2125b4da7b69 req-a6170d29-7c3f-419a-97a0-81cf695a9ee9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:06:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:43 np0005588919 podman[292337]: 2026-01-20 15:06:43.098114085 +0000 UTC m=+0.123029642 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:06:43 np0005588919 nova_compute[225855]: 2026-01-20 15:06:43.302 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:43 np0005588919 nova_compute[225855]: 2026-01-20 15:06:43.896 225859 DEBUG nova.network.neutron [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Updating instance_info_cache with network_info: [{"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:06:43 np0005588919 nova_compute[225855]: 2026-01-20 15:06:43.924 225859 DEBUG oslo_concurrency.lockutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Releasing lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:06:44 np0005588919 nova_compute[225855]: 2026-01-20 15:06:44.211 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:06:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:44.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:44Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:c0:d3 10.100.0.3
Jan 20 10:06:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:06:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:44.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:06:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:46.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:46.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:47 np0005588919 nova_compute[225855]: 2026-01-20 15:06:47.437 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:06:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:06:47 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:06:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:48.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:48 np0005588919 nova_compute[225855]: 2026-01-20 15:06:48.304 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:48 np0005588919 nova_compute[225855]: 2026-01-20 15:06:48.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:48 np0005588919 nova_compute[225855]: 2026-01-20 15:06:48.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:48 np0005588919 nova_compute[225855]: 2026-01-20 15:06:48.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:06:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:48.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:49 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:49Z|00699|memory|INFO|peak resident set size grew 50% in last 3527.6 seconds, from 16256 kB to 24392 kB
Jan 20 10:06:49 np0005588919 ovn_controller[130490]: 2026-01-20T15:06:49Z|00700|memory|INFO|idl-cells-OVN_Southbound:11021 idl-cells-Open_vSwitch:1041 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:395 lflow-cache-entries-cache-matches:294 lflow-cache-size-KB:1610 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:678 ofctrl_installed_flow_usage-KB:496 ofctrl_sb_flow_ref_usage-KB:256
Jan 20 10:06:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:50.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:50 np0005588919 nova_compute[225855]: 2026-01-20 15:06:50.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:06:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:50.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:06:51 np0005588919 nova_compute[225855]: 2026-01-20 15:06:51.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:51 np0005588919 nova_compute[225855]: 2026-01-20 15:06:51.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:06:51 np0005588919 nova_compute[225855]: 2026-01-20 15:06:51.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:06:52 np0005588919 nova_compute[225855]: 2026-01-20 15:06:52.009 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:06:52 np0005588919 nova_compute[225855]: 2026-01-20 15:06:52.010 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:06:52 np0005588919 nova_compute[225855]: 2026-01-20 15:06:52.010 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:06:52 np0005588919 nova_compute[225855]: 2026-01-20 15:06:52.010 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 33ba7a73-3233-40a3-a49a-e5bbd604dc3c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:06:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:52.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:52 np0005588919 nova_compute[225855]: 2026-01-20 15:06:52.477 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:52.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:53 np0005588919 nova_compute[225855]: 2026-01-20 15:06:53.173 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updating instance_info_cache with network_info: [{"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:06:53 np0005588919 nova_compute[225855]: 2026-01-20 15:06:53.188 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:06:53 np0005588919 nova_compute[225855]: 2026-01-20 15:06:53.189 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:06:53 np0005588919 nova_compute[225855]: 2026-01-20 15:06:53.189 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:53 np0005588919 nova_compute[225855]: 2026-01-20 15:06:53.189 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:53 np0005588919 nova_compute[225855]: 2026-01-20 15:06:53.306 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:54 np0005588919 nova_compute[225855]: 2026-01-20 15:06:54.252 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:06:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:54.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:06:54 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:06:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:54.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:56.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.389 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.389 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.389 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.390 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.390 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:56.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:06:56 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1577499332' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.821 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.903 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.903 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.906 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.906 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.909 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:06:56 np0005588919 nova_compute[225855]: 2026-01-20 15:06:56.909 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.078 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.080 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3747MB free_disk=20.825679779052734GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.080 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.080 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.180 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 33ba7a73-3233-40a3-a49a-e5bbd604dc3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.181 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 5feeb9de-434b-4ec7-aa99-6da718514c6f actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.181 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 5f8a2718-2106-431c-82c1-2609a52e7fb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.181 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.181 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.281 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.480 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:06:57 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4168235077' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.716 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.722 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.740 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.761 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:06:57 np0005588919 nova_compute[225855]: 2026-01-20 15:06:57.761 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:06:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:06:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:06:58.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:58 np0005588919 nova_compute[225855]: 2026-01-20 15:06:58.308 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:06:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:06:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:06:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:06:58.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:06:59 np0005588919 podman[292642]: 2026-01-20 15:06:59.041017981 +0000 UTC m=+0.088924145 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 20 10:06:59 np0005588919 nova_compute[225855]: 2026-01-20 15:06:59.756 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:00.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:00 np0005588919 nova_compute[225855]: 2026-01-20 15:07:00.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:00.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:02.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:02 np0005588919 nova_compute[225855]: 2026-01-20 15:07:02.481 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:02.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:03 np0005588919 nova_compute[225855]: 2026-01-20 15:07:03.309 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:04.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:04.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:05 np0005588919 nova_compute[225855]: 2026-01-20 15:07:05.293 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:07:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:06.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:06.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:07 np0005588919 nova_compute[225855]: 2026-01-20 15:07:07.484 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:07 np0005588919 kernel: tap6e7af943-7e (unregistering): left promiscuous mode
Jan 20 10:07:07 np0005588919 NetworkManager[49104]: <info>  [1768921627.6022] device (tap6e7af943-7e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:07:07 np0005588919 nova_compute[225855]: 2026-01-20 15:07:07.616 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:07 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:07Z|00701|binding|INFO|Releasing lport 6e7af943-7ef0-441d-a402-bd595082f98e from this chassis (sb_readonly=0)
Jan 20 10:07:07 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:07Z|00702|binding|INFO|Setting lport 6e7af943-7ef0-441d-a402-bd595082f98e down in Southbound
Jan 20 10:07:07 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:07Z|00703|binding|INFO|Removing iface tap6e7af943-7e ovn-installed in OVS
Jan 20 10:07:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:07.625 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:8b:e2 10.100.0.13'], port_security=['fa:16:3e:21:8b:e2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5f8a2718-2106-431c-82c1-2609a52e7fb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bac39b9-563a-456f-9168-fd10b1b28c21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e161d5a47f845fd89eb3f10627a0830', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cd72b979-cfcf-4dbd-bbff-8e22cd1b4096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c853729a-de72-4ddb-be59-bc41e08984ce, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6e7af943-7ef0-441d-a402-bd595082f98e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:07:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:07.628 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6e7af943-7ef0-441d-a402-bd595082f98e in datapath 5bac39b9-563a-456f-9168-fd10b1b28c21 unbound from our chassis#033[00m
Jan 20 10:07:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:07.629 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bac39b9-563a-456f-9168-fd10b1b28c21 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:07:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:07.631 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5650f4d5-c3b8-408d-9fc1-6d44449c7093]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:07 np0005588919 nova_compute[225855]: 2026-01-20 15:07:07.652 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:07 np0005588919 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Jan 20 10:07:07 np0005588919 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a5.scope: Consumed 15.990s CPU time.
Jan 20 10:07:07 np0005588919 systemd-machined[194361]: Machine qemu-82-instance-000000a5 terminated.
Jan 20 10:07:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.220 225859 DEBUG nova.compute.manager [req-ecc9b2c8-a707-41b3-b495-d831e828ed4f req-d1e29016-1c31-4f11-bffd-385fbac493a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-unplugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.222 225859 DEBUG oslo_concurrency.lockutils [req-ecc9b2c8-a707-41b3-b495-d831e828ed4f req-d1e29016-1c31-4f11-bffd-385fbac493a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.222 225859 DEBUG oslo_concurrency.lockutils [req-ecc9b2c8-a707-41b3-b495-d831e828ed4f req-d1e29016-1c31-4f11-bffd-385fbac493a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.222 225859 DEBUG oslo_concurrency.lockutils [req-ecc9b2c8-a707-41b3-b495-d831e828ed4f req-d1e29016-1c31-4f11-bffd-385fbac493a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.223 225859 DEBUG nova.compute.manager [req-ecc9b2c8-a707-41b3-b495-d831e828ed4f req-d1e29016-1c31-4f11-bffd-385fbac493a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-unplugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.223 225859 WARNING nova.compute.manager [req-ecc9b2c8-a707-41b3-b495-d831e828ed4f req-d1e29016-1c31-4f11-bffd-385fbac493a4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-unplugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:07:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:08.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.305 225859 INFO nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance shutdown successfully after 24 seconds.#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.310 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.311 225859 INFO nova.virt.libvirt.driver [-] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance destroyed successfully.#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.311 225859 DEBUG nova.objects.instance [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.332 225859 INFO nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Attempting rescue#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.333 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.337 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.337 225859 INFO nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Creating image(s)#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.361 225859 DEBUG nova.storage.rbd_utils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.365 225859 DEBUG nova.objects.instance [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.401 225859 DEBUG nova.storage.rbd_utils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.425 225859 DEBUG nova.storage.rbd_utils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.429 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.491 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.492 225859 DEBUG oslo_concurrency.lockutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.493 225859 DEBUG oslo_concurrency.lockutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.493 225859 DEBUG oslo_concurrency.lockutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.517 225859 DEBUG nova.storage.rbd_utils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.521 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:08.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.853 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.332s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.854 225859 DEBUG nova.objects.instance [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'migration_context' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.875 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.876 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Start _get_guest_xml network_info=[{"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-3053955-network", "vif_mac": "fa:16:3e:21:8b:e2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.876 225859 DEBUG nova.objects.instance [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'resources' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.878 225859 DEBUG oslo_concurrency.lockutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "5feeb9de-434b-4ec7-aa99-6da718514c6f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.878 225859 DEBUG oslo_concurrency.lockutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.878 225859 DEBUG oslo_concurrency.lockutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.879 225859 DEBUG oslo_concurrency.lockutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.879 225859 DEBUG oslo_concurrency.lockutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.881 225859 INFO nova.compute.manager [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Terminating instance
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.882 225859 DEBUG nova.compute.manager [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.901 225859 WARNING nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.911 225859 DEBUG nova.virt.libvirt.host [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.912 225859 DEBUG nova.virt.libvirt.host [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.915 225859 DEBUG nova.virt.libvirt.host [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.916 225859 DEBUG nova.virt.libvirt.host [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.917 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.917 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.917 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.917 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.918 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.918 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.918 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.918 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.918 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.918 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.919 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.919 225859 DEBUG nova.virt.hardware [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.919 225859 DEBUG nova.objects.instance [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:07:08 np0005588919 kernel: tap70668adb-f9 (unregistering): left promiscuous mode
Jan 20 10:07:08 np0005588919 NetworkManager[49104]: <info>  [1768921628.9269] device (tap70668adb-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.932 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:07:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:08Z|00704|binding|INFO|Releasing lport 70668adb-f9ad-41cb-8eac-2e0aba32bf22 from this chassis (sb_readonly=0)
Jan 20 10:07:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:08Z|00705|binding|INFO|Setting lport 70668adb-f9ad-41cb-8eac-2e0aba32bf22 down in Southbound
Jan 20 10:07:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:08Z|00706|binding|INFO|Removing iface tap70668adb-f9 ovn-installed in OVS
Jan 20 10:07:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:08.950 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c0:d3 10.100.0.3'], port_security=['fa:16:3e:6a:c0:d3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '5feeb9de-434b-4ec7-aa99-6da718514c6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f434e83-45c8-454d-820b-af39b696a1d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0fc924d2df984301897e81920c5e192f', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'd8d958e0-892e-4275-9633-96783d5a96b0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf0669b1-9b02-4bfa-859e-dac906b93fdc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=70668adb-f9ad-41cb-8eac-2e0aba32bf22) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:07:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:08.951 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 70668adb-f9ad-41cb-8eac-2e0aba32bf22 in datapath 0f434e83-45c8-454d-820b-af39b696a1d5 unbound from our chassis
Jan 20 10:07:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:08.952 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f434e83-45c8-454d-820b-af39b696a1d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 20 10:07:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:08.953 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[33386d12-9fe3-4be1-af46-b564f7cb7c9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:07:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:08.954 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 namespace which is not needed anymore
Jan 20 10:07:08 np0005588919 nova_compute[225855]: 2026-01-20 15:07:08.960 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:07:08 np0005588919 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Jan 20 10:07:08 np0005588919 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a2.scope: Consumed 14.279s CPU time.
Jan 20 10:07:08 np0005588919 systemd-machined[194361]: Machine qemu-81-instance-000000a2 terminated.
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.055 225859 DEBUG nova.compute.manager [req-545408b4-58ab-4abc-b542-a6874fa00bf3 req-9d3fdaa7-2978-4034-b6b0-df47a2cb3cca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Received event network-changed-70668adb-f9ad-41cb-8eac-2e0aba32bf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.057 225859 DEBUG nova.compute.manager [req-545408b4-58ab-4abc-b542-a6874fa00bf3 req-9d3fdaa7-2978-4034-b6b0-df47a2cb3cca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Refreshing instance network info cache due to event network-changed-70668adb-f9ad-41cb-8eac-2e0aba32bf22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.057 225859 DEBUG oslo_concurrency.lockutils [req-545408b4-58ab-4abc-b542-a6874fa00bf3 req-9d3fdaa7-2978-4034-b6b0-df47a2cb3cca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-5feeb9de-434b-4ec7-aa99-6da718514c6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.058 225859 DEBUG oslo_concurrency.lockutils [req-545408b4-58ab-4abc-b542-a6874fa00bf3 req-9d3fdaa7-2978-4034-b6b0-df47a2cb3cca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-5feeb9de-434b-4ec7-aa99-6da718514c6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.058 225859 DEBUG nova.network.neutron [req-545408b4-58ab-4abc-b542-a6874fa00bf3 req-9d3fdaa7-2978-4034-b6b0-df47a2cb3cca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Refreshing network info cache for port 70668adb-f9ad-41cb-8eac-2e0aba32bf22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.114 225859 INFO nova.virt.libvirt.driver [-] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Instance destroyed successfully.
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.115 225859 DEBUG nova.objects.instance [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lazy-loading 'resources' on Instance uuid 5feeb9de-434b-4ec7-aa99-6da718514c6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.136 225859 DEBUG nova.virt.libvirt.vif [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-20T15:05:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-670486896',display_name='tempest-TestShelveInstance-server-670486896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-670486896',id=162,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGFkn7BLthBV1Y62q/iaiaFYVNSXov56cyC6gJDof3vS0dj6UwuVwvMnqOok2l8W+oqb55YucgjGf+63NOxxoSCxoRUO/Jcx5MarGHmdQPdT+6u18ixvV1ghiExv/Y0Nog==',key_name='tempest-TestShelveInstance-1862119958',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:06:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0fc924d2df984301897e81920c5e192f',ramdisk_id='',reservation_id='r-18qxw31n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestShelveInstance-1425544575',owner_user_name='tempest-TestShelveInstance-1425544575-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:06:31Z,user_data=None,user_id='b02a8ef6cc3946ceb2c8846aae2eae68',uuid=5feeb9de-434b-4ec7-aa99-6da718514c6f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "address": "fa:16:3e:6a:c0:d3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70668adb-f9", "ovs_interfaceid": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.137 225859 DEBUG nova.network.os_vif_util [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converting VIF {"id": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "address": "fa:16:3e:6a:c0:d3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70668adb-f9", "ovs_interfaceid": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.138 225859 DEBUG nova.network.os_vif_util [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=70668adb-f9ad-41cb-8eac-2e0aba32bf22,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70668adb-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.139 225859 DEBUG os_vif [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=70668adb-f9ad-41cb-8eac-2e0aba32bf22,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70668adb-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.142 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.142 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70668adb-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.145 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.147 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.150 225859 INFO os_vif [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c0:d3,bridge_name='br-int',has_traffic_filtering=True,id=70668adb-f9ad-41cb-8eac-2e0aba32bf22,network=Network(0f434e83-45c8-454d-820b-af39b696a1d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70668adb-f9')
Jan 20 10:07:09 np0005588919 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[291892]: [NOTICE]   (291896) : haproxy version is 2.8.14-c23fe91
Jan 20 10:07:09 np0005588919 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[291892]: [NOTICE]   (291896) : path to executable is /usr/sbin/haproxy
Jan 20 10:07:09 np0005588919 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[291892]: [WARNING]  (291896) : Exiting Master process...
Jan 20 10:07:09 np0005588919 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[291892]: [ALERT]    (291896) : Current worker (291898) exited with code 143 (Terminated)
Jan 20 10:07:09 np0005588919 neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5[291892]: [WARNING]  (291896) : All workers exited. Exiting... (0)
Jan 20 10:07:09 np0005588919 systemd[1]: libpod-7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa.scope: Deactivated successfully.
Jan 20 10:07:09 np0005588919 podman[292809]: 2026-01-20 15:07:09.231113249 +0000 UTC m=+0.181694596 container died 7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:07:09 np0005588919 systemd[1]: var-lib-containers-storage-overlay-a2da085baf4d307a9cd37776373e9378476598690798d692b40336094440304d-merged.mount: Deactivated successfully.
Jan 20 10:07:09 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa-userdata-shm.mount: Deactivated successfully.
Jan 20 10:07:09 np0005588919 podman[292809]: 2026-01-20 15:07:09.284298998 +0000 UTC m=+0.234880315 container cleanup 7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:07:09 np0005588919 systemd[1]: libpod-conmon-7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa.scope: Deactivated successfully.
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:07:09 np0005588919 podman[292888]: 2026-01-20 15:07:09.35272743 +0000 UTC m=+0.047513889 container remove 7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:07:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:09.358 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aae38b1b-dc45-44dc-901c-dd460bf80b61]: (4, ('Tue Jan 20 03:07:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 (7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa)\n7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa\nTue Jan 20 03:07:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 (7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa)\n7a60dbc62e73615be07404bd95899ae47d0c88a0458e87e1347e628a09141daa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:07:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:09.360 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[94f37c74-d184-46cb-a0d6-bccd2e801dde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:07:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:09.361 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f434e83-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.362 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:07:09 np0005588919 kernel: tap0f434e83-40: left promiscuous mode
Jan 20 10:07:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:07:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3574389037' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.380 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:09.383 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9e50404a-7be7-4c1d-ae37-a760f69de589]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:09.398 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[82066958-ef94-47ae-a70e-0f784e8e1acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.399 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:09.400 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9be82b8b-77ef-4950-bdb1-3b0841699140]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.400 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:09.415 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ca1ae3da-b748-4ff6-88ef-c51c3a738264]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662960, 'reachable_time': 28584, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292911, 'error': None, 'target': 'ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:09.417 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0f434e83-45c8-454d-820b-af39b696a1d5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:07:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:09.417 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[f92d946b-3a96-4562-9534-cf0ff2a9bc1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:09 np0005588919 systemd[1]: run-netns-ovnmeta\x2d0f434e83\x2d45c8\x2d454d\x2d820b\x2daf39b696a1d5.mount: Deactivated successfully.
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.433 225859 INFO nova.virt.libvirt.driver [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Deleting instance files /var/lib/nova/instances/5feeb9de-434b-4ec7-aa99-6da718514c6f_del#033[00m
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.435 225859 INFO nova.virt.libvirt.driver [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Deletion of /var/lib/nova/instances/5feeb9de-434b-4ec7-aa99-6da718514c6f_del complete#033[00m
Jan 20 10:07:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2868866756' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.482 225859 INFO nova.compute.manager [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Took 0.60 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.482 225859 DEBUG oslo.service.loopingcall [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.482 225859 DEBUG nova.compute.manager [-] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.483 225859 DEBUG nova.network.neutron [-] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:07:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:07:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3150868772' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.839 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:09 np0005588919 nova_compute[225855]: 2026-01-20 15:07:09.840 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:07:10 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/272620387' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.258 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.260 225859 DEBUG nova.virt.libvirt.vif [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:06:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1315322326',display_name='tempest-ServerRescueTestJSON-server-1315322326',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1315322326',id=165,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:06:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5e161d5a47f845fd89eb3f10627a0830',ramdisk_id='',reservation_id='r-g1s28307',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1151598672',owner_user_name='tempest-ServerRescueTestJSON-1151598672-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:06:40Z,user_data=None,user_id='a2beb3d6247e457abd6e8d93cc602f02',uuid=5f8a2718-2106-431c-82c1-2609a52e7fb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-3053955-network", "vif_mac": "fa:16:3e:21:8b:e2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.261 225859 DEBUG nova.network.os_vif_util [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converting VIF {"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-3053955-network", "vif_mac": "fa:16:3e:21:8b:e2"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.262 225859 DEBUG nova.network.os_vif_util [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:21:8b:e2,bridge_name='br-int',has_traffic_filtering=True,id=6e7af943-7ef0-441d-a402-bd595082f98e,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e7af943-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.265 225859 DEBUG nova.objects.instance [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.283 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <uuid>5f8a2718-2106-431c-82c1-2609a52e7fb2</uuid>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <name>instance-000000a5</name>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerRescueTestJSON-server-1315322326</nova:name>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:07:08</nova:creationTime>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <nova:user uuid="a2beb3d6247e457abd6e8d93cc602f02">tempest-ServerRescueTestJSON-1151598672-project-member</nova:user>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <nova:project uuid="5e161d5a47f845fd89eb3f10627a0830">tempest-ServerRescueTestJSON-1151598672</nova:project>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <nova:port uuid="6e7af943-7ef0-441d-a402-bd595082f98e">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <entry name="serial">5f8a2718-2106-431c-82c1-2609a52e7fb2</entry>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <entry name="uuid">5f8a2718-2106-431c-82c1-2609a52e7fb2</entry>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.rescue">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/5f8a2718-2106-431c-82c1-2609a52e7fb2_disk">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <target dev="vdb" bus="virtio"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config.rescue">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:21:8b:e2"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <target dev="tap6e7af943-7e"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/console.log" append="off"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:07:10 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:07:10 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:07:10 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:07:10 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:07:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:10.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.290 225859 INFO nova.virt.libvirt.driver [-] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance destroyed successfully.#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.345 225859 DEBUG nova.compute.manager [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.345 225859 DEBUG oslo_concurrency.lockutils [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.345 225859 DEBUG oslo_concurrency.lockutils [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.346 225859 DEBUG oslo_concurrency.lockutils [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.346 225859 DEBUG nova.compute.manager [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.346 225859 WARNING nova.compute.manager [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.346 225859 DEBUG nova.compute.manager [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Received event network-vif-unplugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.346 225859 DEBUG oslo_concurrency.lockutils [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.346 225859 DEBUG oslo_concurrency.lockutils [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.347 225859 DEBUG oslo_concurrency.lockutils [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.347 225859 DEBUG nova.compute.manager [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] No waiting events found dispatching network-vif-unplugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.347 225859 DEBUG nova.compute.manager [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Received event network-vif-unplugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.347 225859 DEBUG nova.compute.manager [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Received event network-vif-plugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.347 225859 DEBUG oslo_concurrency.lockutils [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.347 225859 DEBUG oslo_concurrency.lockutils [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.348 225859 DEBUG oslo_concurrency.lockutils [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.348 225859 DEBUG nova.compute.manager [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] No waiting events found dispatching network-vif-plugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.348 225859 WARNING nova.compute.manager [req-05a36f30-c49b-460b-a1c2-d4abe25b8acb req-95220ae8-b1d5-4b8d-a02d-d4cf6450ddf7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Received unexpected event network-vif-plugged-70668adb-f9ad-41cb-8eac-2e0aba32bf22 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.360 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.360 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.360 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.361 225859 DEBUG nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] No VIF found with MAC fa:16:3e:21:8b:e2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.361 225859 INFO nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Using config drive#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.385 225859 DEBUG nova.storage.rbd_utils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.406 225859 DEBUG nova.objects.instance [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:10 np0005588919 nova_compute[225855]: 2026-01-20 15:07:10.435 225859 DEBUG nova.objects.instance [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'keypairs' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:10.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.075 225859 DEBUG nova.network.neutron [-] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.093 225859 INFO nova.compute.manager [-] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Took 1.61 seconds to deallocate network for instance.#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.205 225859 DEBUG nova.compute.manager [req-1c84b992-2071-4344-8efa-5dd5f6f9f46d req-e91bb6b5-196a-4585-a0cd-a854da354926 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Received event network-vif-deleted-70668adb-f9ad-41cb-8eac-2e0aba32bf22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.383 225859 INFO nova.compute.manager [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Took 0.29 seconds to detach 1 volumes for instance.#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.385 225859 DEBUG nova.compute.manager [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Deleting volume: 94300d81-b4ca-4c0a-9283-83b76826d40f _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.422 225859 INFO nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Creating config drive at /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config.rescue#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.427 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl297ss98 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.561 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl297ss98" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.586 225859 DEBUG nova.storage.rbd_utils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] rbd image 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.590 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config.rescue 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.616 225859 DEBUG nova.network.neutron [req-545408b4-58ab-4abc-b542-a6874fa00bf3 req-9d3fdaa7-2978-4034-b6b0-df47a2cb3cca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Updated VIF entry in instance network info cache for port 70668adb-f9ad-41cb-8eac-2e0aba32bf22. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.617 225859 DEBUG nova.network.neutron [req-545408b4-58ab-4abc-b542-a6874fa00bf3 req-9d3fdaa7-2978-4034-b6b0-df47a2cb3cca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Updating instance_info_cache with network_info: [{"id": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "address": "fa:16:3e:6a:c0:d3", "network": {"id": "0f434e83-45c8-454d-820b-af39b696a1d5", "bridge": "br-int", "label": "tempest-TestShelveInstance-1862824485-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0fc924d2df984301897e81920c5e192f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70668adb-f9", "ovs_interfaceid": "70668adb-f9ad-41cb-8eac-2e0aba32bf22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.619 225859 DEBUG oslo_concurrency.lockutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.619 225859 DEBUG oslo_concurrency.lockutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.639 225859 DEBUG oslo_concurrency.lockutils [req-545408b4-58ab-4abc-b542-a6874fa00bf3 req-9d3fdaa7-2978-4034-b6b0-df47a2cb3cca 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-5feeb9de-434b-4ec7-aa99-6da718514c6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.729 225859 DEBUG oslo_concurrency.processutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:07:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1005697138' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:07:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:07:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1005697138' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.916 225859 DEBUG oslo_concurrency.processutils [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config.rescue 5f8a2718-2106-431c-82c1-2609a52e7fb2_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.327s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.917 225859 INFO nova.virt.libvirt.driver [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Deleting local config drive /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2/disk.config.rescue because it was imported into RBD.#033[00m
Jan 20 10:07:11 np0005588919 NetworkManager[49104]: <info>  [1768921631.9743] manager: (tap6e7af943-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/297)
Jan 20 10:07:11 np0005588919 kernel: tap6e7af943-7e: entered promiscuous mode
Jan 20 10:07:11 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:11Z|00707|binding|INFO|Claiming lport 6e7af943-7ef0-441d-a402-bd595082f98e for this chassis.
Jan 20 10:07:11 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:11Z|00708|binding|INFO|6e7af943-7ef0-441d-a402-bd595082f98e: Claiming fa:16:3e:21:8b:e2 10.100.0.13
Jan 20 10:07:11 np0005588919 nova_compute[225855]: 2026-01-20 15:07:11.984 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:11.990 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:8b:e2 10.100.0.13'], port_security=['fa:16:3e:21:8b:e2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5f8a2718-2106-431c-82c1-2609a52e7fb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bac39b9-563a-456f-9168-fd10b1b28c21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e161d5a47f845fd89eb3f10627a0830', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'cd72b979-cfcf-4dbd-bbff-8e22cd1b4096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c853729a-de72-4ddb-be59-bc41e08984ce, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6e7af943-7ef0-441d-a402-bd595082f98e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:07:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:11.992 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6e7af943-7ef0-441d-a402-bd595082f98e in datapath 5bac39b9-563a-456f-9168-fd10b1b28c21 bound to our chassis#033[00m
Jan 20 10:07:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:11.994 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bac39b9-563a-456f-9168-fd10b1b28c21 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:07:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:11.995 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[21d4b404-b079-4df1-8c3a-e1402f6f76b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:12 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:12Z|00709|binding|INFO|Setting lport 6e7af943-7ef0-441d-a402-bd595082f98e up in Southbound
Jan 20 10:07:12 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:12Z|00710|binding|INFO|Setting lport 6e7af943-7ef0-441d-a402-bd595082f98e ovn-installed in OVS
Jan 20 10:07:12 np0005588919 systemd-machined[194361]: New machine qemu-83-instance-000000a5.
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.014 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:12 np0005588919 systemd[1]: Started Virtual Machine qemu-83-instance-000000a5.
Jan 20 10:07:12 np0005588919 systemd-udevd[293051]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:07:12 np0005588919 NetworkManager[49104]: <info>  [1768921632.0639] device (tap6e7af943-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:07:12 np0005588919 NetworkManager[49104]: <info>  [1768921632.0651] device (tap6e7af943-7e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1766338785' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.179 225859 DEBUG oslo_concurrency.processutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.185 225859 DEBUG nova.compute.provider_tree [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.211 225859 DEBUG nova.scheduler.client.report [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.237 225859 DEBUG oslo_concurrency.lockutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:12.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.296 225859 INFO nova.scheduler.client.report [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Deleted allocations for instance 5feeb9de-434b-4ec7-aa99-6da718514c6f#033[00m
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3016473623' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3016473623' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.389 225859 DEBUG oslo_concurrency.lockutils [None req-6697113e-e9a3-43ee-bf8b-903b9d311a88 b02a8ef6cc3946ceb2c8846aae2eae68 0fc924d2df984301897e81920c5e192f - - default default] Lock "5feeb9de-434b-4ec7-aa99-6da718514c6f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.439 225859 DEBUG nova.compute.manager [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.439 225859 DEBUG oslo_concurrency.lockutils [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.439 225859 DEBUG oslo_concurrency.lockutils [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.441 225859 DEBUG oslo_concurrency.lockutils [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.441 225859 DEBUG nova.compute.manager [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.442 225859 WARNING nova.compute.manager [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.442 225859 DEBUG nova.compute.manager [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.442 225859 DEBUG oslo_concurrency.lockutils [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.442 225859 DEBUG oslo_concurrency.lockutils [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.443 225859 DEBUG oslo_concurrency.lockutils [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.443 225859 DEBUG nova.compute.manager [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.443 225859 WARNING nova.compute.manager [req-5330001c-adbf-4d70-881f-ddbd6f048873 req-d26baadf-b86a-49ec-bdbd-2b1c4395c2ec 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.486 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #118. Immutable memtables: 0.
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.488714) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 73] Flushing memtable with next log file: 118
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632488783, "job": 73, "event": "flush_started", "num_memtables": 1, "num_entries": 798, "num_deletes": 251, "total_data_size": 1410236, "memory_usage": 1432560, "flush_reason": "Manual Compaction"}
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 73] Level-0 flush table #119: started
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632503019, "cf_name": "default", "job": 73, "event": "table_file_creation", "file_number": 119, "file_size": 930086, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 59908, "largest_seqno": 60701, "table_properties": {"data_size": 926352, "index_size": 1514, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8947, "raw_average_key_size": 19, "raw_value_size": 918689, "raw_average_value_size": 2028, "num_data_blocks": 67, "num_entries": 453, "num_filter_entries": 453, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921581, "oldest_key_time": 1768921581, "file_creation_time": 1768921632, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 119, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 73] Flush lasted 14328 microseconds, and 2940 cpu microseconds.
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.503049) [db/flush_job.cc:967] [default] [JOB 73] Level-0 flush table #119: 930086 bytes OK
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.503115) [db/memtable_list.cc:519] [default] Level-0 commit table #119 started
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.505356) [db/memtable_list.cc:722] [default] Level-0 commit table #119: memtable #1 done
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.505368) EVENT_LOG_v1 {"time_micros": 1768921632505365, "job": 73, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.505386) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 73] Try to delete WAL files size 1406019, prev total WAL file size 1406019, number of live WAL files 2.
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000115.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.505897) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035303230' seq:72057594037927935, type:22 .. '7061786F730035323732' seq:0, type:0; will stop at (end)
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 74] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 73 Base level 0, inputs: [119(908KB)], [117(12MB)]
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632505929, "job": 74, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [119], "files_L6": [117], "score": -1, "input_data_size": 13519081, "oldest_snapshot_seqno": -1}
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 74] Generated table #120: 8534 keys, 11606337 bytes, temperature: kUnknown
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632654993, "cf_name": "default", "job": 74, "event": "table_file_creation", "file_number": 120, "file_size": 11606337, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11549916, "index_size": 33973, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21381, "raw_key_size": 222268, "raw_average_key_size": 26, "raw_value_size": 11398608, "raw_average_value_size": 1335, "num_data_blocks": 1323, "num_entries": 8534, "num_filter_entries": 8534, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921632, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 120, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.655232) [db/compaction/compaction_job.cc:1663] [default] [JOB 74] Compacted 1@0 + 1@6 files to L6 => 11606337 bytes
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.656850) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 90.6 rd, 77.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.0 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(27.0) write-amplify(12.5) OK, records in: 9049, records dropped: 515 output_compression: NoCompression
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.656881) EVENT_LOG_v1 {"time_micros": 1768921632656873, "job": 74, "event": "compaction_finished", "compaction_time_micros": 149145, "compaction_time_cpu_micros": 26851, "output_level": 6, "num_output_files": 1, "total_output_size": 11606337, "num_input_records": 9049, "num_output_records": 8534, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000119.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632657255, "job": 74, "event": "table_file_deletion", "file_number": 119}
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000117.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921632659741, "job": 74, "event": "table_file_deletion", "file_number": 117}
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.505820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.659793) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.659796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.659798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.659799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:07:12.659801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:07:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:12.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.850 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 5f8a2718-2106-431c-82c1-2609a52e7fb2 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.851 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921632.8498917, 5f8a2718-2106-431c-82c1-2609a52e7fb2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.851 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.856 225859 DEBUG nova.compute.manager [None req-66758f5d-5806-48e6-86ec-5a05118ea96c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.886 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.889 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.918 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.918 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921632.8517895, 5f8a2718-2106-431c-82c1-2609a52e7fb2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.918 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] VM Started (Lifecycle Event)#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.946 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:07:12 np0005588919 nova_compute[225855]: 2026-01-20 15:07:12.949 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:07:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e367 e367: 3 total, 3 up, 3 in
Jan 20 10:07:14 np0005588919 podman[293122]: 2026-01-20 15:07:14.015308064 +0000 UTC m=+0.052950754 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 20 10:07:14 np0005588919 nova_compute[225855]: 2026-01-20 15:07:14.146 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:14.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e368 e368: 3 total, 3 up, 3 in
Jan 20 10:07:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:14.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:14 np0005588919 nova_compute[225855]: 2026-01-20 15:07:14.924 225859 INFO nova.compute.manager [None req-e7e5be06-3ee0-449a-a5a4-42a0cfebbe4c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Unrescuing#033[00m
Jan 20 10:07:14 np0005588919 nova_compute[225855]: 2026-01-20 15:07:14.925 225859 DEBUG oslo_concurrency.lockutils [None req-e7e5be06-3ee0-449a-a5a4-42a0cfebbe4c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:07:14 np0005588919 nova_compute[225855]: 2026-01-20 15:07:14.925 225859 DEBUG oslo_concurrency.lockutils [None req-e7e5be06-3ee0-449a-a5a4-42a0cfebbe4c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquired lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:07:14 np0005588919 nova_compute[225855]: 2026-01-20 15:07:14.925 225859 DEBUG nova.network.neutron [None req-e7e5be06-3ee0-449a-a5a4-42a0cfebbe4c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:07:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:16.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:16.298 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:07:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:16.299 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:07:16 np0005588919 nova_compute[225855]: 2026-01-20 15:07:16.299 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:16.426 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:16.427 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:16.427 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:16.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.224 225859 DEBUG nova.compute.manager [req-217159b0-8069-4aa8-b849-e25238e89e71 req-5fe5a655-30ec-4e9d-a814-d83d9f3d08a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received event network-changed-070862f1-1db2-45c2-9787-752e6d88449a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.225 225859 DEBUG nova.compute.manager [req-217159b0-8069-4aa8-b849-e25238e89e71 req-5fe5a655-30ec-4e9d-a814-d83d9f3d08a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Refreshing instance network info cache due to event network-changed-070862f1-1db2-45c2-9787-752e6d88449a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.225 225859 DEBUG oslo_concurrency.lockutils [req-217159b0-8069-4aa8-b849-e25238e89e71 req-5fe5a655-30ec-4e9d-a814-d83d9f3d08a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.225 225859 DEBUG oslo_concurrency.lockutils [req-217159b0-8069-4aa8-b849-e25238e89e71 req-5fe5a655-30ec-4e9d-a814-d83d9f3d08a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.225 225859 DEBUG nova.network.neutron [req-217159b0-8069-4aa8-b849-e25238e89e71 req-5fe5a655-30ec-4e9d-a814-d83d9f3d08a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Refreshing network info cache for port 070862f1-1db2-45c2-9787-752e6d88449a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.301 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.328 225859 DEBUG oslo_concurrency.lockutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.328 225859 DEBUG oslo_concurrency.lockutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.329 225859 DEBUG oslo_concurrency.lockutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.329 225859 DEBUG oslo_concurrency.lockutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.329 225859 DEBUG oslo_concurrency.lockutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.330 225859 INFO nova.compute.manager [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Terminating instance#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.331 225859 DEBUG nova.compute.manager [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:07:17 np0005588919 kernel: tap070862f1-1d (unregistering): left promiscuous mode
Jan 20 10:07:17 np0005588919 NetworkManager[49104]: <info>  [1768921637.4007] device (tap070862f1-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:07:17 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:17Z|00711|binding|INFO|Releasing lport 070862f1-1db2-45c2-9787-752e6d88449a from this chassis (sb_readonly=0)
Jan 20 10:07:17 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:17Z|00712|binding|INFO|Setting lport 070862f1-1db2-45c2-9787-752e6d88449a down in Southbound
Jan 20 10:07:17 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:17Z|00713|binding|INFO|Removing iface tap070862f1-1d ovn-installed in OVS
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.413 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.419 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:e7:09 10.100.0.7'], port_security=['fa:16:3e:e5:e7:09 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '33ba7a73-3233-40a3-a49a-e5bbd604dc3c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8472bae1-476b-4100-b9fa-e8827bc4f7bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e142d118583b4f9ba3531bcf3838e256', 'neutron:revision_number': '4', 'neutron:security_group_ids': '37efc868-18af-48b7-8d56-e37fd1ec4df0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f9deb561-4473-4aa7-8b6f-d70e20e7cf6d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=070862f1-1db2-45c2-9787-752e6d88449a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.421 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 070862f1-1db2-45c2-9787-752e6d88449a in datapath 8472bae1-476b-4100-b9fa-e8827bc4f7bf unbound from our chassis#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.423 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8472bae1-476b-4100-b9fa-e8827bc4f7bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.424 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6807c5-8b88-4ea3-a10b-ed83a592a9f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.425 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf namespace which is not needed anymore#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.428 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:17 np0005588919 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Jan 20 10:07:17 np0005588919 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a1.scope: Consumed 17.971s CPU time.
Jan 20 10:07:17 np0005588919 systemd-machined[194361]: Machine qemu-80-instance-000000a1 terminated.
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.488 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.572 225859 INFO nova.virt.libvirt.driver [-] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Instance destroyed successfully.#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.573 225859 DEBUG nova.objects.instance [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lazy-loading 'resources' on Instance uuid 33ba7a73-3233-40a3-a49a-e5bbd604dc3c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:17 np0005588919 neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf[290899]: [NOTICE]   (290921) : haproxy version is 2.8.14-c23fe91
Jan 20 10:07:17 np0005588919 neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf[290899]: [NOTICE]   (290921) : path to executable is /usr/sbin/haproxy
Jan 20 10:07:17 np0005588919 neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf[290899]: [WARNING]  (290921) : Exiting Master process...
Jan 20 10:07:17 np0005588919 neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf[290899]: [WARNING]  (290921) : Exiting Master process...
Jan 20 10:07:17 np0005588919 neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf[290899]: [ALERT]    (290921) : Current worker (290938) exited with code 143 (Terminated)
Jan 20 10:07:17 np0005588919 neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf[290899]: [WARNING]  (290921) : All workers exited. Exiting... (0)
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.593 225859 DEBUG nova.virt.libvirt.vif [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:05:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestStampPattern-server-2111868448',display_name='tempest-TestStampPattern-server-2111868448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-teststamppattern-server-2111868448',id=161,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHW99EAKkcMHbb6foGeGxm9beD/C9AeSuQLW3fqIuoocya0hep1/utcjh4cUxZzvt5K+5yMQG3K45jiLKihqKM6cawBqTQvgzcywKN5pk06AjS3tvq9GuiAvDAys6caVkA==',key_name='tempest-TestStampPattern-1928143162',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:05:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e142d118583b4f9ba3531bcf3838e256',ramdisk_id='',reservation_id='r-7ei3hy41',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestStampPattern-487600181',owner_user_name='tempest-TestStampPattern-487600181-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:06:06Z,user_data=None,user_id='bc554998e71a4322bdd27ac727a9044c',uuid=33ba7a73-3233-40a3-a49a-e5bbd604dc3c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.594 225859 DEBUG nova.network.os_vif_util [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Converting VIF {"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:07:17 np0005588919 systemd[1]: libpod-49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd.scope: Deactivated successfully.
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.594 225859 DEBUG nova.network.os_vif_util [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e5:e7:09,bridge_name='br-int',has_traffic_filtering=True,id=070862f1-1db2-45c2-9787-752e6d88449a,network=Network(8472bae1-476b-4100-b9fa-e8827bc4f7bf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070862f1-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.595 225859 DEBUG os_vif [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:e7:09,bridge_name='br-int',has_traffic_filtering=True,id=070862f1-1db2-45c2-9787-752e6d88449a,network=Network(8472bae1-476b-4100-b9fa-e8827bc4f7bf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070862f1-1d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.597 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.597 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap070862f1-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.598 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.601 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:07:17 np0005588919 podman[293166]: 2026-01-20 15:07:17.601447975 +0000 UTC m=+0.071063427 container died 49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.603 225859 INFO os_vif [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:e7:09,bridge_name='br-int',has_traffic_filtering=True,id=070862f1-1db2-45c2-9787-752e6d88449a,network=Network(8472bae1-476b-4100-b9fa-e8827bc4f7bf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070862f1-1d')#033[00m
Jan 20 10:07:17 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd-userdata-shm.mount: Deactivated successfully.
Jan 20 10:07:17 np0005588919 systemd[1]: var-lib-containers-storage-overlay-5103777890e53183e892eb29a7309526c22c2d478bcad4b3e03ce4ad7bb2fbd2-merged.mount: Deactivated successfully.
Jan 20 10:07:17 np0005588919 podman[293166]: 2026-01-20 15:07:17.648642234 +0000 UTC m=+0.118257686 container cleanup 49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 10:07:17 np0005588919 systemd[1]: libpod-conmon-49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd.scope: Deactivated successfully.
Jan 20 10:07:17 np0005588919 podman[293225]: 2026-01-20 15:07:17.723169739 +0000 UTC m=+0.047405786 container remove 49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.730 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb4145f-118c-4765-8f7a-b21b1a205dfc]: (4, ('Tue Jan 20 03:07:17 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf (49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd)\n49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd\nTue Jan 20 03:07:17 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf (49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd)\n49cb1fc9eefac14d88a80a448d309dbf7c2e07925d5c789c364c5dddff3d7edd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.737 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[627e22cd-9394-42be-8175-c375b1c2bc14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.738 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8472bae1-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:07:17 np0005588919 kernel: tap8472bae1-40: left promiscuous mode
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.740 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.746 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad1f3b8-593c-4600-9376-4ceeabd5c84f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.759 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.771 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f95f5505-1d6f-4765-a39b-3aa549cb8912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.773 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6033c587-585b-45e8-8b4f-73ace45b67dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.787 225859 DEBUG nova.compute.manager [req-c24f78e0-362e-4656-bda9-7588239ba960 req-bbe01543-8357-45cf-927d-c42af224c790 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received event network-vif-unplugged-070862f1-1db2-45c2-9787-752e6d88449a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.788 225859 DEBUG oslo_concurrency.lockutils [req-c24f78e0-362e-4656-bda9-7588239ba960 req-bbe01543-8357-45cf-927d-c42af224c790 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.788 225859 DEBUG oslo_concurrency.lockutils [req-c24f78e0-362e-4656-bda9-7588239ba960 req-bbe01543-8357-45cf-927d-c42af224c790 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.788 225859 DEBUG oslo_concurrency.lockutils [req-c24f78e0-362e-4656-bda9-7588239ba960 req-bbe01543-8357-45cf-927d-c42af224c790 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.788 225859 DEBUG nova.compute.manager [req-c24f78e0-362e-4656-bda9-7588239ba960 req-bbe01543-8357-45cf-927d-c42af224c790 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] No waiting events found dispatching network-vif-unplugged-070862f1-1db2-45c2-9787-752e6d88449a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:17 np0005588919 nova_compute[225855]: 2026-01-20 15:07:17.788 225859 DEBUG nova.compute.manager [req-c24f78e0-362e-4656-bda9-7588239ba960 req-bbe01543-8357-45cf-927d-c42af224c790 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received event network-vif-unplugged-070862f1-1db2-45c2-9787-752e6d88449a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.790 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4e350b79-f7f8-45a5-b1a7-b006bc7de2c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656864, 'reachable_time': 32489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293253, 'error': None, 'target': 'ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.792 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8472bae1-476b-4100-b9fa-e8827bc4f7bf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:07:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:17.792 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[b43bbb6f-f442-4e89-a002-eb5774f4f62e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:17 np0005588919 systemd[1]: run-netns-ovnmeta\x2d8472bae1\x2d476b\x2d4100\x2db9fa\x2de8827bc4f7bf.mount: Deactivated successfully.
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.070 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e368 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.106 225859 INFO nova.virt.libvirt.driver [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Deleting instance files /var/lib/nova/instances/33ba7a73-3233-40a3-a49a-e5bbd604dc3c_del#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.107 225859 INFO nova.virt.libvirt.driver [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Deletion of /var/lib/nova/instances/33ba7a73-3233-40a3-a49a-e5bbd604dc3c_del complete#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.181 225859 INFO nova.compute.manager [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Took 0.85 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.182 225859 DEBUG oslo.service.loopingcall [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.182 225859 DEBUG nova.compute.manager [-] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.182 225859 DEBUG nova.network.neutron [-] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.293 225859 DEBUG nova.network.neutron [None req-e7e5be06-3ee0-449a-a5a4-42a0cfebbe4c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Updating instance_info_cache with network_info: [{"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:07:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:18.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.372 225859 DEBUG oslo_concurrency.lockutils [None req-e7e5be06-3ee0-449a-a5a4-42a0cfebbe4c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Releasing lock "refresh_cache-5f8a2718-2106-431c-82c1-2609a52e7fb2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.373 225859 DEBUG nova.objects.instance [None req-e7e5be06-3ee0-449a-a5a4-42a0cfebbe4c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'flavor' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.409 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:18 np0005588919 kernel: tap6e7af943-7e (unregistering): left promiscuous mode
Jan 20 10:07:18 np0005588919 NetworkManager[49104]: <info>  [1768921638.4630] device (tap6e7af943-7e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:07:18 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:18Z|00714|binding|INFO|Releasing lport 6e7af943-7ef0-441d-a402-bd595082f98e from this chassis (sb_readonly=0)
Jan 20 10:07:18 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:18Z|00715|binding|INFO|Setting lport 6e7af943-7ef0-441d-a402-bd595082f98e down in Southbound
Jan 20 10:07:18 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:18Z|00716|binding|INFO|Removing iface tap6e7af943-7e ovn-installed in OVS
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.468 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.469 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:18.476 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:8b:e2 10.100.0.13'], port_security=['fa:16:3e:21:8b:e2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5f8a2718-2106-431c-82c1-2609a52e7fb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bac39b9-563a-456f-9168-fd10b1b28c21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e161d5a47f845fd89eb3f10627a0830', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'cd72b979-cfcf-4dbd-bbff-8e22cd1b4096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c853729a-de72-4ddb-be59-bc41e08984ce, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6e7af943-7ef0-441d-a402-bd595082f98e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:07:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:18.477 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6e7af943-7ef0-441d-a402-bd595082f98e in datapath 5bac39b9-563a-456f-9168-fd10b1b28c21 unbound from our chassis#033[00m
Jan 20 10:07:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:18.478 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bac39b9-563a-456f-9168-fd10b1b28c21 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:07:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:18.479 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[947b88bf-015b-4bab-86cc-86b71118186b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.484 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:18 np0005588919 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Jan 20 10:07:18 np0005588919 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000a5.scope: Consumed 6.393s CPU time.
Jan 20 10:07:18 np0005588919 systemd-machined[194361]: Machine qemu-83-instance-000000a5 terminated.
Jan 20 10:07:18 np0005588919 NetworkManager[49104]: <info>  [1768921638.6374] manager: (tap6e7af943-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/298)
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.661 225859 INFO nova.virt.libvirt.driver [-] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance destroyed successfully.#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.662 225859 DEBUG nova.objects.instance [None req-e7e5be06-3ee0-449a-a5a4-42a0cfebbe4c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:18.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:18 np0005588919 kernel: tap6e7af943-7e: entered promiscuous mode
Jan 20 10:07:18 np0005588919 NetworkManager[49104]: <info>  [1768921638.8600] manager: (tap6e7af943-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/299)
Jan 20 10:07:18 np0005588919 systemd-udevd[293143]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.862 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:18 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:18Z|00717|binding|INFO|Claiming lport 6e7af943-7ef0-441d-a402-bd595082f98e for this chassis.
Jan 20 10:07:18 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:18Z|00718|binding|INFO|6e7af943-7ef0-441d-a402-bd595082f98e: Claiming fa:16:3e:21:8b:e2 10.100.0.13
Jan 20 10:07:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:18.868 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:8b:e2 10.100.0.13'], port_security=['fa:16:3e:21:8b:e2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5f8a2718-2106-431c-82c1-2609a52e7fb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bac39b9-563a-456f-9168-fd10b1b28c21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e161d5a47f845fd89eb3f10627a0830', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'cd72b979-cfcf-4dbd-bbff-8e22cd1b4096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c853729a-de72-4ddb-be59-bc41e08984ce, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6e7af943-7ef0-441d-a402-bd595082f98e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:07:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:18.870 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6e7af943-7ef0-441d-a402-bd595082f98e in datapath 5bac39b9-563a-456f-9168-fd10b1b28c21 bound to our chassis#033[00m
Jan 20 10:07:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:18.871 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bac39b9-563a-456f-9168-fd10b1b28c21 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:07:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:18.871 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4872e6a1-c331-449f-899f-55723bcb659e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:18 np0005588919 NetworkManager[49104]: <info>  [1768921638.8756] device (tap6e7af943-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:07:18 np0005588919 NetworkManager[49104]: <info>  [1768921638.8771] device (tap6e7af943-7e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:07:18 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:18Z|00719|binding|INFO|Setting lport 6e7af943-7ef0-441d-a402-bd595082f98e ovn-installed in OVS
Jan 20 10:07:18 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:18Z|00720|binding|INFO|Setting lport 6e7af943-7ef0-441d-a402-bd595082f98e up in Southbound
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.889 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.891 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:18 np0005588919 nova_compute[225855]: 2026-01-20 15:07:18.899 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:18 np0005588919 systemd-machined[194361]: New machine qemu-84-instance-000000a5.
Jan 20 10:07:18 np0005588919 systemd[1]: Started Virtual Machine qemu-84-instance-000000a5.
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.378 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 5f8a2718-2106-431c-82c1-2609a52e7fb2 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.379 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921639.3776076, 5f8a2718-2106-431c-82c1-2609a52e7fb2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.380 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.410 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.417 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.455 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.456 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921639.3822367, 5f8a2718-2106-431c-82c1-2609a52e7fb2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.456 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] VM Started (Lifecycle Event)#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.477 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.482 225859 DEBUG nova.compute.manager [req-7f2f672b-c6f4-445b-abf2-d10afb883840 req-016a1ffe-f196-42f6-84e7-acd4a623fb72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-unplugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.483 225859 DEBUG oslo_concurrency.lockutils [req-7f2f672b-c6f4-445b-abf2-d10afb883840 req-016a1ffe-f196-42f6-84e7-acd4a623fb72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.483 225859 DEBUG oslo_concurrency.lockutils [req-7f2f672b-c6f4-445b-abf2-d10afb883840 req-016a1ffe-f196-42f6-84e7-acd4a623fb72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.483 225859 DEBUG oslo_concurrency.lockutils [req-7f2f672b-c6f4-445b-abf2-d10afb883840 req-016a1ffe-f196-42f6-84e7-acd4a623fb72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.484 225859 DEBUG nova.compute.manager [req-7f2f672b-c6f4-445b-abf2-d10afb883840 req-016a1ffe-f196-42f6-84e7-acd4a623fb72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-unplugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.484 225859 WARNING nova.compute.manager [req-7f2f672b-c6f4-445b-abf2-d10afb883840 req-016a1ffe-f196-42f6-84e7-acd4a623fb72 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-unplugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.489 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.514 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.879 225859 DEBUG nova.compute.manager [req-8fa37a04-c694-40aa-8d25-7b9708fd6a59 req-e6e0042a-efd7-476f-bbe4-20cc0d13487a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received event network-vif-plugged-070862f1-1db2-45c2-9787-752e6d88449a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.880 225859 DEBUG oslo_concurrency.lockutils [req-8fa37a04-c694-40aa-8d25-7b9708fd6a59 req-e6e0042a-efd7-476f-bbe4-20cc0d13487a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.880 225859 DEBUG oslo_concurrency.lockutils [req-8fa37a04-c694-40aa-8d25-7b9708fd6a59 req-e6e0042a-efd7-476f-bbe4-20cc0d13487a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.880 225859 DEBUG oslo_concurrency.lockutils [req-8fa37a04-c694-40aa-8d25-7b9708fd6a59 req-e6e0042a-efd7-476f-bbe4-20cc0d13487a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.880 225859 DEBUG nova.compute.manager [req-8fa37a04-c694-40aa-8d25-7b9708fd6a59 req-e6e0042a-efd7-476f-bbe4-20cc0d13487a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] No waiting events found dispatching network-vif-plugged-070862f1-1db2-45c2-9787-752e6d88449a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.881 225859 WARNING nova.compute.manager [req-8fa37a04-c694-40aa-8d25-7b9708fd6a59 req-e6e0042a-efd7-476f-bbe4-20cc0d13487a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received unexpected event network-vif-plugged-070862f1-1db2-45c2-9787-752e6d88449a for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.973 225859 DEBUG nova.network.neutron [req-217159b0-8069-4aa8-b849-e25238e89e71 req-5fe5a655-30ec-4e9d-a814-d83d9f3d08a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updated VIF entry in instance network info cache for port 070862f1-1db2-45c2-9787-752e6d88449a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:07:19 np0005588919 nova_compute[225855]: 2026-01-20 15:07:19.974 225859 DEBUG nova.network.neutron [req-217159b0-8069-4aa8-b849-e25238e89e71 req-5fe5a655-30ec-4e9d-a814-d83d9f3d08a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updating instance_info_cache with network_info: [{"id": "070862f1-1db2-45c2-9787-752e6d88449a", "address": "fa:16:3e:e5:e7:09", "network": {"id": "8472bae1-476b-4100-b9fa-e8827bc4f7bf", "bridge": "br-int", "label": "tempest-TestStampPattern-1138931002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e142d118583b4f9ba3531bcf3838e256", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070862f1-1d", "ovs_interfaceid": "070862f1-1db2-45c2-9787-752e6d88449a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.009 225859 DEBUG oslo_concurrency.lockutils [req-217159b0-8069-4aa8-b849-e25238e89e71 req-5fe5a655-30ec-4e9d-a814-d83d9f3d08a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-33ba7a73-3233-40a3-a49a-e5bbd604dc3c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.059 225859 DEBUG nova.network.neutron [-] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.090 225859 INFO nova.compute.manager [-] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Took 1.91 seconds to deallocate network for instance.#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.144 225859 DEBUG oslo_concurrency.lockutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.144 225859 DEBUG oslo_concurrency.lockutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.189 225859 DEBUG nova.compute.manager [None req-e7e5be06-3ee0-449a-a5a4-42a0cfebbe4c a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.239 225859 DEBUG oslo_concurrency.processutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:20.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3481712973' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.698 225859 DEBUG oslo_concurrency.processutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.704 225859 DEBUG nova.compute.provider_tree [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:07:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:20.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.732 225859 DEBUG nova.scheduler.client.report [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.767 225859 DEBUG oslo_concurrency.lockutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.794 225859 INFO nova.scheduler.client.report [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Deleted allocations for instance 33ba7a73-3233-40a3-a49a-e5bbd604dc3c#033[00m
Jan 20 10:07:20 np0005588919 nova_compute[225855]: 2026-01-20 15:07:20.860 225859 DEBUG oslo_concurrency.lockutils [None req-b7e9202a-dde3-4024-80ac-056613d9de92 bc554998e71a4322bdd27ac727a9044c e142d118583b4f9ba3531bcf3838e256 - - default default] Lock "33ba7a73-3233-40a3-a49a-e5bbd604dc3c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.531s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e369 e369: 3 total, 3 up, 3 in
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.194 225859 DEBUG oslo_concurrency.lockutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.195 225859 DEBUG oslo_concurrency.lockutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.195 225859 DEBUG oslo_concurrency.lockutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.196 225859 DEBUG oslo_concurrency.lockutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.196 225859 DEBUG oslo_concurrency.lockutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.197 225859 INFO nova.compute.manager [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Terminating instance#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.198 225859 DEBUG nova.compute.manager [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:07:21 np0005588919 kernel: tap6e7af943-7e (unregistering): left promiscuous mode
Jan 20 10:07:21 np0005588919 NetworkManager[49104]: <info>  [1768921641.5096] device (tap6e7af943-7e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:07:21 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:21Z|00721|binding|INFO|Releasing lport 6e7af943-7ef0-441d-a402-bd595082f98e from this chassis (sb_readonly=0)
Jan 20 10:07:21 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:21Z|00722|binding|INFO|Setting lport 6e7af943-7ef0-441d-a402-bd595082f98e down in Southbound
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.540 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:21 np0005588919 ovn_controller[130490]: 2026-01-20T15:07:21Z|00723|binding|INFO|Removing iface tap6e7af943-7e ovn-installed in OVS
Jan 20 10:07:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:21.546 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:8b:e2 10.100.0.13'], port_security=['fa:16:3e:21:8b:e2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5f8a2718-2106-431c-82c1-2609a52e7fb2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bac39b9-563a-456f-9168-fd10b1b28c21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e161d5a47f845fd89eb3f10627a0830', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'cd72b979-cfcf-4dbd-bbff-8e22cd1b4096', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c853729a-de72-4ddb-be59-bc41e08984ce, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6e7af943-7ef0-441d-a402-bd595082f98e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:07:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:21.547 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6e7af943-7ef0-441d-a402-bd595082f98e in datapath 5bac39b9-563a-456f-9168-fd10b1b28c21 unbound from our chassis#033[00m
Jan 20 10:07:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:21.548 140354 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bac39b9-563a-456f-9168-fd10b1b28c21 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 20 10:07:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:07:21.549 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[19b39b15-fb6b-474a-af9f-8ba08f911743]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.557 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:21 np0005588919 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Jan 20 10:07:21 np0005588919 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000a5.scope: Consumed 2.415s CPU time.
Jan 20 10:07:21 np0005588919 systemd-machined[194361]: Machine qemu-84-instance-000000a5 terminated.
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.594 225859 DEBUG nova.compute.manager [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.594 225859 DEBUG oslo_concurrency.lockutils [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.595 225859 DEBUG oslo_concurrency.lockutils [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.595 225859 DEBUG oslo_concurrency.lockutils [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.595 225859 DEBUG nova.compute.manager [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.595 225859 WARNING nova.compute.manager [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.596 225859 DEBUG nova.compute.manager [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.596 225859 DEBUG oslo_concurrency.lockutils [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.596 225859 DEBUG oslo_concurrency.lockutils [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.596 225859 DEBUG oslo_concurrency.lockutils [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.596 225859 DEBUG nova.compute.manager [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.597 225859 WARNING nova.compute.manager [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.597 225859 DEBUG nova.compute.manager [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Received event network-vif-deleted-070862f1-1db2-45c2-9787-752e6d88449a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.597 225859 DEBUG nova.compute.manager [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.597 225859 DEBUG oslo_concurrency.lockutils [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.598 225859 DEBUG oslo_concurrency.lockutils [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.598 225859 DEBUG oslo_concurrency.lockutils [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.598 225859 DEBUG nova.compute.manager [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.598 225859 WARNING nova.compute.manager [req-5c678cc2-b95b-481e-8aa8-5eee4aae19cc req-f53a18ec-c2a8-4bc0-889a-c18424f3d0fd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.638 225859 INFO nova.virt.libvirt.driver [-] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Instance destroyed successfully.#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.638 225859 DEBUG nova.objects.instance [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lazy-loading 'resources' on Instance uuid 5f8a2718-2106-431c-82c1-2609a52e7fb2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.670 225859 DEBUG nova.virt.libvirt.vif [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:06:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1315322326',display_name='tempest-ServerRescueTestJSON-server-1315322326',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1315322326',id=165,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:07:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5e161d5a47f845fd89eb3f10627a0830',ramdisk_id='',reservation_id='r-g1s28307',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1151598672',owner_user_name='tempest-ServerRescueTestJSON-1151598672-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:07:20Z,user_data=None,user_id='a2beb3d6247e457abd6e8d93cc602f02',uuid=5f8a2718-2106-431c-82c1-2609a52e7fb2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.671 225859 DEBUG nova.network.os_vif_util [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converting VIF {"id": "6e7af943-7ef0-441d-a402-bd595082f98e", "address": "fa:16:3e:21:8b:e2", "network": {"id": "5bac39b9-563a-456f-9168-fd10b1b28c21", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-3053955-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "5e161d5a47f845fd89eb3f10627a0830", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e7af943-7e", "ovs_interfaceid": "6e7af943-7ef0-441d-a402-bd595082f98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.672 225859 DEBUG nova.network.os_vif_util [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:21:8b:e2,bridge_name='br-int',has_traffic_filtering=True,id=6e7af943-7ef0-441d-a402-bd595082f98e,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e7af943-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.672 225859 DEBUG os_vif [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:8b:e2,bridge_name='br-int',has_traffic_filtering=True,id=6e7af943-7ef0-441d-a402-bd595082f98e,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e7af943-7e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.674 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.674 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e7af943-7e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.676 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.677 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:21 np0005588919 nova_compute[225855]: 2026-01-20 15:07:21.680 225859 INFO os_vif [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:8b:e2,bridge_name='br-int',has_traffic_filtering=True,id=6e7af943-7ef0-441d-a402-bd595082f98e,network=Network(5bac39b9-563a-456f-9168-fd10b1b28c21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e7af943-7e')#033[00m
Jan 20 10:07:22 np0005588919 nova_compute[225855]: 2026-01-20 15:07:22.122 225859 INFO nova.virt.libvirt.driver [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Deleting instance files /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2_del#033[00m
Jan 20 10:07:22 np0005588919 nova_compute[225855]: 2026-01-20 15:07:22.123 225859 INFO nova.virt.libvirt.driver [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Deletion of /var/lib/nova/instances/5f8a2718-2106-431c-82c1-2609a52e7fb2_del complete#033[00m
Jan 20 10:07:22 np0005588919 nova_compute[225855]: 2026-01-20 15:07:22.194 225859 INFO nova.compute.manager [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Took 1.00 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:07:22 np0005588919 nova_compute[225855]: 2026-01-20 15:07:22.195 225859 DEBUG oslo.service.loopingcall [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:07:22 np0005588919 nova_compute[225855]: 2026-01-20 15:07:22.195 225859 DEBUG nova.compute.manager [-] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:07:22 np0005588919 nova_compute[225855]: 2026-01-20 15:07:22.196 225859 DEBUG nova.network.neutron [-] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:07:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:07:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 51K writes, 201K keys, 51K commit groups, 1.0 writes per commit group, ingest: 0.19 GB, 0.05 MB/s#012Cumulative WAL: 51K writes, 19K syncs, 2.68 writes per sync, written: 0.19 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8536 writes, 30K keys, 8536 commit groups, 1.0 writes per commit group, ingest: 29.31 MB, 0.05 MB/s#012Interval WAL: 8536 writes, 3570 syncs, 2.39 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:07:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:22.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:22 np0005588919 nova_compute[225855]: 2026-01-20 15:07:22.490 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:22.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:22 np0005588919 nova_compute[225855]: 2026-01-20 15:07:22.936 225859 DEBUG nova.network.neutron [-] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:07:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e369 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.146 225859 INFO nova.compute.manager [-] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Took 0.95 seconds to deallocate network for instance.#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.276 225859 DEBUG oslo_concurrency.lockutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.277 225859 DEBUG oslo_concurrency.lockutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.335 225859 DEBUG oslo_concurrency.processutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.715 225859 DEBUG nova.compute.manager [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-unplugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.715 225859 DEBUG oslo_concurrency.lockutils [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.716 225859 DEBUG oslo_concurrency.lockutils [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.716 225859 DEBUG oslo_concurrency.lockutils [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.716 225859 DEBUG nova.compute.manager [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-unplugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.717 225859 WARNING nova.compute.manager [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-unplugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.717 225859 DEBUG nova.compute.manager [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.717 225859 DEBUG oslo_concurrency.lockutils [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.717 225859 DEBUG oslo_concurrency.lockutils [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.718 225859 DEBUG oslo_concurrency.lockutils [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.718 225859 DEBUG nova.compute.manager [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] No waiting events found dispatching network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.718 225859 WARNING nova.compute.manager [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received unexpected event network-vif-plugged-6e7af943-7ef0-441d-a402-bd595082f98e for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.719 225859 DEBUG nova.compute.manager [req-c51d29d2-74cf-4444-b21e-83faf4721dfc req-96aa0d8a-fe9a-4a00-af4a-e644b5ea9f60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Received event network-vif-deleted-6e7af943-7ef0-441d-a402-bd595082f98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1342745173' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.786 225859 DEBUG oslo_concurrency.processutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.792 225859 DEBUG nova.compute.provider_tree [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.807 225859 DEBUG nova.scheduler.client.report [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.830 225859 DEBUG oslo_concurrency.lockutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.858 225859 INFO nova.scheduler.client.report [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Deleted allocations for instance 5f8a2718-2106-431c-82c1-2609a52e7fb2#033[00m
Jan 20 10:07:23 np0005588919 nova_compute[225855]: 2026-01-20 15:07:23.916 225859 DEBUG oslo_concurrency.lockutils [None req-8c89f417-ee5e-4e39-8750-d07a141e37ba a2beb3d6247e457abd6e8d93cc602f02 5e161d5a47f845fd89eb3f10627a0830 - - default default] Lock "5f8a2718-2106-431c-82c1-2609a52e7fb2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:24 np0005588919 nova_compute[225855]: 2026-01-20 15:07:24.113 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921629.1123605, 5feeb9de-434b-4ec7-aa99-6da718514c6f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:07:24 np0005588919 nova_compute[225855]: 2026-01-20 15:07:24.113 225859 INFO nova.compute.manager [-] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:07:24 np0005588919 nova_compute[225855]: 2026-01-20 15:07:24.141 225859 DEBUG nova.compute.manager [None req-6fd29b76-ca57-4c38-a508-7c49c42ac1ba - - - - - -] [instance: 5feeb9de-434b-4ec7-aa99-6da718514c6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:07:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:24.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:24.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 e370: 3 total, 3 up, 3 in
Jan 20 10:07:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:26.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:26 np0005588919 nova_compute[225855]: 2026-01-20 15:07:26.676 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:26.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:27 np0005588919 nova_compute[225855]: 2026-01-20 15:07:27.490 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:28.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:28.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:30 np0005588919 podman[293475]: 2026-01-20 15:07:30.045547087 +0000 UTC m=+0.084292503 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Jan 20 10:07:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:30.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:30.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:31 np0005588919 nova_compute[225855]: 2026-01-20 15:07:31.709 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:32.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:32 np0005588919 nova_compute[225855]: 2026-01-20 15:07:32.491 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:32 np0005588919 nova_compute[225855]: 2026-01-20 15:07:32.570 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921637.5692608, 33ba7a73-3233-40a3-a49a-e5bbd604dc3c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:07:32 np0005588919 nova_compute[225855]: 2026-01-20 15:07:32.570 225859 INFO nova.compute.manager [-] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:07:32 np0005588919 nova_compute[225855]: 2026-01-20 15:07:32.593 225859 DEBUG nova.compute.manager [None req-0a64a240-fdde-4ce5-8bcd-4a6765e48bb8 - - - - - -] [instance: 33ba7a73-3233-40a3-a49a-e5bbd604dc3c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:07:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:32.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:33 np0005588919 nova_compute[225855]: 2026-01-20 15:07:33.386 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:07:33 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1685285972' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:07:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:34.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:34.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:36.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:36 np0005588919 nova_compute[225855]: 2026-01-20 15:07:36.636 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921641.635253, 5f8a2718-2106-431c-82c1-2609a52e7fb2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:07:36 np0005588919 nova_compute[225855]: 2026-01-20 15:07:36.636 225859 INFO nova.compute.manager [-] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:07:36 np0005588919 nova_compute[225855]: 2026-01-20 15:07:36.660 225859 DEBUG nova.compute.manager [None req-d30b17a3-b490-4e23-a6ca-ea8250130e4d - - - - - -] [instance: 5f8a2718-2106-431c-82c1-2609a52e7fb2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:07:36 np0005588919 nova_compute[225855]: 2026-01-20 15:07:36.712 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:36.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:37 np0005588919 nova_compute[225855]: 2026-01-20 15:07:37.492 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:38.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:38.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:40.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:40.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:41 np0005588919 nova_compute[225855]: 2026-01-20 15:07:41.714 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:42.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:42 np0005588919 nova_compute[225855]: 2026-01-20 15:07:42.493 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:42.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:44.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:07:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:44.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:07:45 np0005588919 podman[293558]: 2026-01-20 15:07:45.015772863 +0000 UTC m=+0.057437231 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:07:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:46.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:46 np0005588919 nova_compute[225855]: 2026-01-20 15:07:46.716 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:46.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:47 np0005588919 nova_compute[225855]: 2026-01-20 15:07:47.495 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:07:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2739921256' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:07:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:07:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2739921256' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:07:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:48.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:48.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:50 np0005588919 nova_compute[225855]: 2026-01-20 15:07:50.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:50 np0005588919 nova_compute[225855]: 2026-01-20 15:07:50.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:50 np0005588919 nova_compute[225855]: 2026-01-20 15:07:50.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:50 np0005588919 nova_compute[225855]: 2026-01-20 15:07:50.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:07:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:50.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:50.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:51 np0005588919 nova_compute[225855]: 2026-01-20 15:07:51.719 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:52 np0005588919 nova_compute[225855]: 2026-01-20 15:07:52.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:52 np0005588919 nova_compute[225855]: 2026-01-20 15:07:52.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:07:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:52 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:07:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:52 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:07:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:52.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:52 np0005588919 nova_compute[225855]: 2026-01-20 15:07:52.403 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:07:52 np0005588919 nova_compute[225855]: 2026-01-20 15:07:52.404 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:52 np0005588919 nova_compute[225855]: 2026-01-20 15:07:52.497 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:52.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:53 np0005588919 nova_compute[225855]: 2026-01-20 15:07:53.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:07:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:54.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:07:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:07:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4200.0 total, 600.0 interval
Cumulative writes: 12K writes, 61K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.12 GB, 0.03 MB/s
Cumulative WAL: 12K writes, 12K syncs, 1.00 writes per sync, written: 0.12 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1705 writes, 8440 keys, 1705 commit groups, 1.0 writes per commit group, ingest: 16.78 MB, 0.03 MB/s
Interval WAL: 1705 writes, 1705 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     76.4      0.96              0.25        37    0.026       0      0       0.0       0.0
  L6      1/0   11.07 MB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   4.7    102.8     87.1      3.95              1.11        36    0.110    236K    19K       0.0       0.0
 Sum      1/0   11.07 MB   0.0      0.4     0.1      0.3       0.4      0.1       0.0   5.7     82.7     85.0      4.91              1.36        73    0.067    236K    19K       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.7     80.0     81.3      0.99              0.23        12    0.083     53K   3158       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   0.0    102.8     87.1      3.95              1.11        36    0.110    236K    19K       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     76.6      0.96              0.25        36    0.027       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 4200.0 total, 600.0 interval
Flush(GB): cumulative 0.072, interval 0.010
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.41 GB write, 0.10 MB/s write, 0.40 GB read, 0.10 MB/s read, 4.9 seconds
Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 1.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 304.00 MB usage: 46.39 MB table_size: 0 occupancy: 18446744073709551615 collections: 8 last_copies: 0 last_secs: 0.000325 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(2665,44.63 MB,14.6812%) FilterBlock(73,666.36 KB,0.21406%) IndexBlock(73,1.10 MB,0.363054%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 20 10:07:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:54.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:55 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:07:55 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:07:55 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:07:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:56.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.386 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.387 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.387 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.387 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.388 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.723 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:56.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.794 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.795 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.815 225859 DEBUG nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:07:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:56 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1416750325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.863 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.902 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.903 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.910 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:07:56 np0005588919 nova_compute[225855]: 2026-01-20 15:07:56.911 225859 INFO nova.compute.claims [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.032 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.118 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.119 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4309MB free_disk=20.986278533935547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.120 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:57 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3044378009' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.469 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.476 225859 DEBUG nova.compute.provider_tree [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.496 225859 DEBUG nova.scheduler.client.report [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.501 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.523 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.523 225859 DEBUG nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.526 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.406s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.597 225859 DEBUG nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.598 225859 DEBUG nova.network.neutron [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.616 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance a25af5a3-096f-4363-842e-d960c22eb16b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.616 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.616 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.621 225859 INFO nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.641 225859 DEBUG nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.671 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.738 225859 DEBUG nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.740 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.740 225859 INFO nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Creating image(s)#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.765 225859 DEBUG nova.storage.rbd_utils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.789 225859 DEBUG nova.storage.rbd_utils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.818 225859 DEBUG nova.storage.rbd_utils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.822 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.859 225859 DEBUG nova.policy [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27658864f96d453586dd0846a4c55b7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fc74c4a296554866969b05aef75252af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.886 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.887 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.887 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.888 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.911 225859 DEBUG nova.storage.rbd_utils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:07:57 np0005588919 nova_compute[225855]: 2026-01-20 15:07:57.915 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 a25af5a3-096f-4363-842e-d960c22eb16b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:07:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:07:58 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4139490107' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:07:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.110 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.115 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.148 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.202 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.202 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.272 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 a25af5a3-096f-4363-842e-d960c22eb16b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.357s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.343 225859 DEBUG nova.storage.rbd_utils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] resizing rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:07:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:07:58.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.490 225859 DEBUG nova.objects.instance [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'migration_context' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.535 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.535 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Ensure instance console log exists: /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.536 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.536 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.536 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:07:58 np0005588919 nova_compute[225855]: 2026-01-20 15:07:58.638 225859 DEBUG nova.network.neutron [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Successfully created port: 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:07:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:07:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:07:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:07:58.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:07:59 np0005588919 nova_compute[225855]: 2026-01-20 15:07:59.779 225859 DEBUG nova.network.neutron [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Successfully updated port: 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:07:59 np0005588919 nova_compute[225855]: 2026-01-20 15:07:59.797 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:07:59 np0005588919 nova_compute[225855]: 2026-01-20 15:07:59.797 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquired lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:07:59 np0005588919 nova_compute[225855]: 2026-01-20 15:07:59.797 225859 DEBUG nova.network.neutron [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:07:59 np0005588919 nova_compute[225855]: 2026-01-20 15:07:59.938 225859 DEBUG nova.compute.manager [req-2bd30d7c-ea96-47b7-a938-f9edecd4020b req-b9f3137b-5f1e-4198-9025-50f15059a5f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-changed-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:07:59 np0005588919 nova_compute[225855]: 2026-01-20 15:07:59.939 225859 DEBUG nova.compute.manager [req-2bd30d7c-ea96-47b7-a938-f9edecd4020b req-b9f3137b-5f1e-4198-9025-50f15059a5f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Refreshing instance network info cache due to event network-changed-6b7cb043-d1f4-4c2b-8173-1e3e2a664767. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:07:59 np0005588919 nova_compute[225855]: 2026-01-20 15:07:59.939 225859 DEBUG oslo_concurrency.lockutils [req-2bd30d7c-ea96-47b7-a938-f9edecd4020b req-b9f3137b-5f1e-4198-9025-50f15059a5f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:00 np0005588919 nova_compute[225855]: 2026-01-20 15:08:00.011 225859 DEBUG nova.network.neutron [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:08:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:00.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:00.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #121. Immutable memtables: 0.
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:00.905579) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 75] Flushing memtable with next log file: 121
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921680905646, "job": 75, "event": "flush_started", "num_memtables": 1, "num_entries": 806, "num_deletes": 253, "total_data_size": 1391325, "memory_usage": 1409968, "flush_reason": "Manual Compaction"}
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 75] Level-0 flush table #122: started
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921680913783, "cf_name": "default", "job": 75, "event": "table_file_creation", "file_number": 122, "file_size": 917779, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 60707, "largest_seqno": 61507, "table_properties": {"data_size": 913954, "index_size": 1605, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8198, "raw_average_key_size": 17, "raw_value_size": 906097, "raw_average_value_size": 1982, "num_data_blocks": 70, "num_entries": 457, "num_filter_entries": 457, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921632, "oldest_key_time": 1768921632, "file_creation_time": 1768921680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 122, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 75] Flush lasted 8236 microseconds, and 3329 cpu microseconds.
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:00.913826) [db/flush_job.cc:967] [default] [JOB 75] Level-0 flush table #122: 917779 bytes OK
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:00.913845) [db/memtable_list.cc:519] [default] Level-0 commit table #122 started
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:00.916406) [db/memtable_list.cc:722] [default] Level-0 commit table #122: memtable #1 done
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:00.916420) EVENT_LOG_v1 {"time_micros": 1768921680916416, "job": 75, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:00.916440) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 75] Try to delete WAL files size 1387086, prev total WAL file size 1387086, number of live WAL files 2.
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000118.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:00.916978) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323536' seq:72057594037927935, type:22 .. '6B7600353037' seq:0, type:0; will stop at (end)
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 76] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 75 Base level 0, inputs: [122(896KB)], [120(11MB)]
Jan 20 10:08:00 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921680917035, "job": 76, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [122], "files_L6": [120], "score": -1, "input_data_size": 12524116, "oldest_snapshot_seqno": -1}
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 76] Generated table #123: 8469 keys, 11460969 bytes, temperature: kUnknown
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921681048742, "cf_name": "default", "job": 76, "event": "table_file_creation", "file_number": 123, "file_size": 11460969, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11405026, "index_size": 33687, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21189, "raw_key_size": 222677, "raw_average_key_size": 26, "raw_value_size": 11254639, "raw_average_value_size": 1328, "num_data_blocks": 1293, "num_entries": 8469, "num_filter_entries": 8469, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 123, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:01.049066) [db/compaction/compaction_job.cc:1663] [default] [JOB 76] Compacted 1@0 + 1@6 files to L6 => 11460969 bytes
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:01.051729) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 95.0 rd, 87.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 11.1 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(26.1) write-amplify(12.5) OK, records in: 8991, records dropped: 522 output_compression: NoCompression
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:01.051754) EVENT_LOG_v1 {"time_micros": 1768921681051743, "job": 76, "event": "compaction_finished", "compaction_time_micros": 131809, "compaction_time_cpu_micros": 29134, "output_level": 6, "num_output_files": 1, "total_output_size": 11460969, "num_input_records": 8991, "num_output_records": 8469, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000122.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921681052613, "job": 76, "event": "table_file_deletion", "file_number": 122}
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000120.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921681055050, "job": 76, "event": "table_file_deletion", "file_number": 120}
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:00.916923) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:01.055155) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:01.055160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:01.055162) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:01.055163) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:08:01.055165) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:08:01 np0005588919 podman[294000]: 2026-01-20 15:08:01.115866369 +0000 UTC m=+0.160747762 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.197 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.474 225859 DEBUG nova.network.neutron [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Updating instance_info_cache with network_info: [{"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.496 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Releasing lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.497 225859 DEBUG nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Instance network_info: |[{"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.497 225859 DEBUG oslo_concurrency.lockutils [req-2bd30d7c-ea96-47b7-a938-f9edecd4020b req-b9f3137b-5f1e-4198-9025-50f15059a5f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.498 225859 DEBUG nova.network.neutron [req-2bd30d7c-ea96-47b7-a938-f9edecd4020b req-b9f3137b-5f1e-4198-9025-50f15059a5f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Refreshing network info cache for port 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.501 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Start _get_guest_xml network_info=[{"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.506 225859 WARNING nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.510 225859 DEBUG nova.virt.libvirt.host [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.511 225859 DEBUG nova.virt.libvirt.host [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.513 225859 DEBUG nova.virt.libvirt.host [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.513 225859 DEBUG nova.virt.libvirt.host [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.515 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.515 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.515 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.516 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.516 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.516 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.517 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.517 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.517 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.517 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.518 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.518 225859 DEBUG nova.virt.hardware [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.521 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.726 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2382603208' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:01 np0005588919 nova_compute[225855]: 2026-01-20 15:08:01.992 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.017 225859 DEBUG nova.storage.rbd_utils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.020 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:02.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1789117815' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.435 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.438 225859 DEBUG nova.virt.libvirt.vif [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:07:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-950743647',display_name='tempest-ServerRescueNegativeTestJSON-server-950743647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-950743647',id=168,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-gkbc59sh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name='t
empest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:07:57Z,user_data=None,user_id='27658864f96d453586dd0846a4c55b7d',uuid=a25af5a3-096f-4363-842e-d960c22eb16b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.439 225859 DEBUG nova.network.os_vif_util [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.440 225859 DEBUG nova.network.os_vif_util [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:9e:90,bridge_name='br-int',has_traffic_filtering=True,id=6b7cb043-d1f4-4c2b-8173-1e3e2a664767,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b7cb043-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.442 225859 DEBUG nova.objects.instance [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'pci_devices' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.460 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <uuid>a25af5a3-096f-4363-842e-d960c22eb16b</uuid>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <name>instance-000000a8</name>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-950743647</nova:name>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:08:01</nova:creationTime>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <nova:user uuid="27658864f96d453586dd0846a4c55b7d">tempest-ServerRescueNegativeTestJSON-1649662639-project-member</nova:user>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <nova:project uuid="fc74c4a296554866969b05aef75252af">tempest-ServerRescueNegativeTestJSON-1649662639</nova:project>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <nova:port uuid="6b7cb043-d1f4-4c2b-8173-1e3e2a664767">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <entry name="serial">a25af5a3-096f-4363-842e-d960c22eb16b</entry>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <entry name="uuid">a25af5a3-096f-4363-842e-d960c22eb16b</entry>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/a25af5a3-096f-4363-842e-d960c22eb16b_disk">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/a25af5a3-096f-4363-842e-d960c22eb16b_disk.config">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:bf:9e:90"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <target dev="tap6b7cb043-d1"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/console.log" append="off"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:08:02 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:08:02 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:08:02 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:08:02 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.462 225859 DEBUG nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Preparing to wait for external event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.462 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.462 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.462 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.463 225859 DEBUG nova.virt.libvirt.vif [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:07:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-950743647',display_name='tempest-ServerRescueNegativeTestJSON-server-950743647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-950743647',id=168,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-gkbc59sh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_us
er_name='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:07:57Z,user_data=None,user_id='27658864f96d453586dd0846a4c55b7d',uuid=a25af5a3-096f-4363-842e-d960c22eb16b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.463 225859 DEBUG nova.network.os_vif_util [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.464 225859 DEBUG nova.network.os_vif_util [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:9e:90,bridge_name='br-int',has_traffic_filtering=True,id=6b7cb043-d1f4-4c2b-8173-1e3e2a664767,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b7cb043-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.464 225859 DEBUG os_vif [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:9e:90,bridge_name='br-int',has_traffic_filtering=True,id=6b7cb043-d1f4-4c2b-8173-1e3e2a664767,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b7cb043-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.465 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.466 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.466 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.469 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.469 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b7cb043-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.469 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6b7cb043-d1, col_values=(('external_ids', {'iface-id': '6b7cb043-d1f4-4c2b-8173-1e3e2a664767', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:9e:90', 'vm-uuid': 'a25af5a3-096f-4363-842e-d960c22eb16b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.471 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:02 np0005588919 NetworkManager[49104]: <info>  [1768921682.4725] manager: (tap6b7cb043-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.475 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.476 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.477 225859 INFO os_vif [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:9e:90,bridge_name='br-int',has_traffic_filtering=True,id=6b7cb043-d1f4-4c2b-8173-1e3e2a664767,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b7cb043-d1')#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.502 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.684 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.684 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.685 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No VIF found with MAC fa:16:3e:bf:9e:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.685 225859 INFO nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Using config drive#033[00m
Jan 20 10:08:02 np0005588919 nova_compute[225855]: 2026-01-20 15:08:02.714 225859 DEBUG nova.storage.rbd_utils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:02.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.307 225859 DEBUG nova.network.neutron [req-2bd30d7c-ea96-47b7-a938-f9edecd4020b req-b9f3137b-5f1e-4198-9025-50f15059a5f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Updated VIF entry in instance network info cache for port 6b7cb043-d1f4-4c2b-8173-1e3e2a664767. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.308 225859 DEBUG nova.network.neutron [req-2bd30d7c-ea96-47b7-a938-f9edecd4020b req-b9f3137b-5f1e-4198-9025-50f15059a5f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Updating instance_info_cache with network_info: [{"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.323 225859 DEBUG oslo_concurrency.lockutils [req-2bd30d7c-ea96-47b7-a938-f9edecd4020b req-b9f3137b-5f1e-4198-9025-50f15059a5f5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.400 225859 INFO nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Creating config drive at /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.405 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb8fhnp_o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.539 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb8fhnp_o" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.568 225859 DEBUG nova.storage.rbd_utils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.572 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config a25af5a3-096f-4363-842e-d960c22eb16b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.737 225859 DEBUG oslo_concurrency.processutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config a25af5a3-096f-4363-842e-d960c22eb16b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.165s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.738 225859 INFO nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Deleting local config drive /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config because it was imported into RBD.#033[00m
Jan 20 10:08:03 np0005588919 kernel: tap6b7cb043-d1: entered promiscuous mode
Jan 20 10:08:03 np0005588919 NetworkManager[49104]: <info>  [1768921683.8021] manager: (tap6b7cb043-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/301)
Jan 20 10:08:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:03Z|00724|binding|INFO|Claiming lport 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 for this chassis.
Jan 20 10:08:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:03Z|00725|binding|INFO|6b7cb043-d1f4-4c2b-8173-1e3e2a664767: Claiming fa:16:3e:bf:9e:90 10.100.0.6
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.804 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.808 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.818 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:9e:90 10.100.0.6'], port_security=['fa:16:3e:bf:9e:90 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'a25af5a3-096f-4363-842e-d960c22eb16b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd56d9c6d-4bb5-4a73-ab07-9e0ee1fd3b93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6b7cb043-d1f4-4c2b-8173-1e3e2a664767) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.819 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 bound to our chassis#033[00m
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.821 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258#033[00m
Jan 20 10:08:03 np0005588919 systemd-udevd[294213]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.836 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d2503d68-3509-46ed-9fab-61f3c0bc1ac2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.837 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3967ae21-11 in ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.840 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3967ae21-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.840 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8f43d2-2f9e-444b-b028-a10022d38135]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.841 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[231d3e0d-93ec-4550-8f18-606de4ef5f8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588919 systemd-machined[194361]: New machine qemu-85-instance-000000a8.
Jan 20 10:08:03 np0005588919 NetworkManager[49104]: <info>  [1768921683.8562] device (tap6b7cb043-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:08:03 np0005588919 NetworkManager[49104]: <info>  [1768921683.8571] device (tap6b7cb043-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.859 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[200b1161-c057-4836-8a51-534a16b189c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.876 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:03Z|00726|binding|INFO|Setting lport 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 ovn-installed in OVS
Jan 20 10:08:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:03Z|00727|binding|INFO|Setting lport 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 up in Southbound
Jan 20 10:08:03 np0005588919 nova_compute[225855]: 2026-01-20 15:08:03.883 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.885 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a472e7b4-8ea1-434f-a24a-da77d8fa5bfd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588919 systemd[1]: Started Virtual Machine qemu-85-instance-000000a8.
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.916 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4cf7319b-a156-46ee-ab0e-b651da7fa01d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.922 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[88e61a42-0446-4096-8b2c-93ca07eccb22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588919 NetworkManager[49104]: <info>  [1768921683.9247] manager: (tap3967ae21-10): new Veth device (/org/freedesktop/NetworkManager/Devices/302)
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.963 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4c6e0ae2-da1b-4498-9675-1768dc55fe4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:03.968 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c4272311-5fbe-4b4b-8eb3-a2b2f98732dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:03 np0005588919 NetworkManager[49104]: <info>  [1768921683.9974] device (tap3967ae21-10): carrier: link connected
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.002 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8aaabd28-1334-4f21-be52-025a92ef38e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.021 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9ff60764-19a4-426b-9e85-fbd21e67f8b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672292, 'reachable_time': 37097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294246, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.038 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a99505-6f9c-43db-9b34-bbc91a9f03d2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:ce9d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 672292, 'tstamp': 672292}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294247, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.055 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[03b564e0-8709-4699-b430-717c655cce82]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672292, 'reachable_time': 37097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294248, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.086 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cfaf8a44-3908-414b-87b1-e527683fafcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.148 225859 DEBUG nova.compute.manager [req-e2f2f4d3-a6c4-4690-91b1-7945d9fc0659 req-4bbf4d54-2ace-4b98-a622-7b31886b23f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.149 225859 DEBUG oslo_concurrency.lockutils [req-e2f2f4d3-a6c4-4690-91b1-7945d9fc0659 req-4bbf4d54-2ace-4b98-a622-7b31886b23f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.150 225859 DEBUG oslo_concurrency.lockutils [req-e2f2f4d3-a6c4-4690-91b1-7945d9fc0659 req-4bbf4d54-2ace-4b98-a622-7b31886b23f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.150 225859 DEBUG oslo_concurrency.lockutils [req-e2f2f4d3-a6c4-4690-91b1-7945d9fc0659 req-4bbf4d54-2ace-4b98-a622-7b31886b23f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.150 225859 DEBUG nova.compute.manager [req-e2f2f4d3-a6c4-4690-91b1-7945d9fc0659 req-4bbf4d54-2ace-4b98-a622-7b31886b23f4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Processing event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.158 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c4efaf49-ad37-433b-bec7-845453232ba9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.160 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.161 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.162 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:04 np0005588919 kernel: tap3967ae21-10: entered promiscuous mode
Jan 20 10:08:04 np0005588919 NetworkManager[49104]: <info>  [1768921684.1659] manager: (tap3967ae21-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.167 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.168 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:04Z|00728|binding|INFO|Releasing lport b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3 from this chassis (sb_readonly=0)
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.184 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.184 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.185 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9e65d0e5-1769-4dc9-8d0b-17e1b0d62a0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.186 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:08:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:04.187 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'env', 'PROCESS_TAG=haproxy-3967ae21-1590-4685-8881-8bd1bcf25258', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3967ae21-1590-4685-8881-8bd1bcf25258.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.308 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921684.308199, a25af5a3-096f-4363-842e-d960c22eb16b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.309 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] VM Started (Lifecycle Event)#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.312 225859 DEBUG nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.315 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.319 225859 INFO nova.virt.libvirt.driver [-] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Instance spawned successfully.#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.320 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.337 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.344 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.345 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.346 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.346 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.347 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.348 225859 DEBUG nova.virt.libvirt.driver [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.351 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:04.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.398 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.399 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921684.309127, a25af5a3-096f-4363-842e-d960c22eb16b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.399 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.422 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.426 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921684.3151796, a25af5a3-096f-4363-842e-d960c22eb16b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.427 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.434 225859 INFO nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Took 6.70 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.435 225859 DEBUG nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.462 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.465 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.491 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:08:04 np0005588919 podman[294323]: 2026-01-20 15:08:04.604289848 +0000 UTC m=+0.058124671 container create 619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:08:04 np0005588919 systemd[1]: Started libpod-conmon-619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306.scope.
Jan 20 10:08:04 np0005588919 podman[294323]: 2026-01-20 15:08:04.571839467 +0000 UTC m=+0.025674320 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:08:04 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:08:04 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e15a8fbf0d3cd1123225aae1fd10c9e39fdb8283817ce9b1cbec15659d2486e8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:08:04 np0005588919 podman[294323]: 2026-01-20 15:08:04.708968988 +0000 UTC m=+0.162803831 container init 619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.710 225859 INFO nova.compute.manager [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Took 7.84 seconds to build instance.#033[00m
Jan 20 10:08:04 np0005588919 podman[294323]: 2026-01-20 15:08:04.714493515 +0000 UTC m=+0.168328338 container start 619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 10:08:04 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294339]: [NOTICE]   (294343) : New worker (294345) forked
Jan 20 10:08:04 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294339]: [NOTICE]   (294343) : Loading success.
Jan 20 10:08:04 np0005588919 nova_compute[225855]: 2026-01-20 15:08:04.758 225859 DEBUG oslo_concurrency.lockutils [None req-1b5afbb8-83be-4882-a4ba-3dcc1a91e616 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:04.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:06 np0005588919 nova_compute[225855]: 2026-01-20 15:08:06.271 225859 DEBUG nova.compute.manager [req-d7f389f7-bdda-488c-88a6-9c510ef40a2a req-8e289a31-1d18-4a10-ac43-aad26c86bae6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:06 np0005588919 nova_compute[225855]: 2026-01-20 15:08:06.272 225859 DEBUG oslo_concurrency.lockutils [req-d7f389f7-bdda-488c-88a6-9c510ef40a2a req-8e289a31-1d18-4a10-ac43-aad26c86bae6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:06 np0005588919 nova_compute[225855]: 2026-01-20 15:08:06.272 225859 DEBUG oslo_concurrency.lockutils [req-d7f389f7-bdda-488c-88a6-9c510ef40a2a req-8e289a31-1d18-4a10-ac43-aad26c86bae6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:06 np0005588919 nova_compute[225855]: 2026-01-20 15:08:06.273 225859 DEBUG oslo_concurrency.lockutils [req-d7f389f7-bdda-488c-88a6-9c510ef40a2a req-8e289a31-1d18-4a10-ac43-aad26c86bae6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:06 np0005588919 nova_compute[225855]: 2026-01-20 15:08:06.273 225859 DEBUG nova.compute.manager [req-d7f389f7-bdda-488c-88a6-9c510ef40a2a req-8e289a31-1d18-4a10-ac43-aad26c86bae6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] No waiting events found dispatching network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:06 np0005588919 nova_compute[225855]: 2026-01-20 15:08:06.273 225859 WARNING nova.compute.manager [req-d7f389f7-bdda-488c-88a6-9c510ef40a2a req-8e289a31-1d18-4a10-ac43-aad26c86bae6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received unexpected event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:08:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:06.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:06 np0005588919 nova_compute[225855]: 2026-01-20 15:08:06.755 225859 INFO nova.compute.manager [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Rescuing#033[00m
Jan 20 10:08:06 np0005588919 nova_compute[225855]: 2026-01-20 15:08:06.756 225859 DEBUG oslo_concurrency.lockutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:06 np0005588919 nova_compute[225855]: 2026-01-20 15:08:06.756 225859 DEBUG oslo_concurrency.lockutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquired lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:06 np0005588919 nova_compute[225855]: 2026-01-20 15:08:06.756 225859 DEBUG nova.network.neutron [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:08:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:06.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:07 np0005588919 nova_compute[225855]: 2026-01-20 15:08:07.473 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:07 np0005588919 nova_compute[225855]: 2026-01-20 15:08:07.505 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:08.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:08.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:10.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:10.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:10 np0005588919 nova_compute[225855]: 2026-01-20 15:08:10.962 225859 DEBUG nova.network.neutron [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Updating instance_info_cache with network_info: [{"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:10 np0005588919 nova_compute[225855]: 2026-01-20 15:08:10.983 225859 DEBUG oslo_concurrency.lockutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Releasing lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:11 np0005588919 nova_compute[225855]: 2026-01-20 15:08:11.343 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:08:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:12.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:12 np0005588919 nova_compute[225855]: 2026-01-20 15:08:12.476 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:12 np0005588919 nova_compute[225855]: 2026-01-20 15:08:12.506 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:12.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:08:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3538750701' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:08:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:08:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3538750701' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:08:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e371 e371: 3 total, 3 up, 3 in
Jan 20 10:08:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:14.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e372 e372: 3 total, 3 up, 3 in
Jan 20 10:08:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:14.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e373 e373: 3 total, 3 up, 3 in
Jan 20 10:08:16 np0005588919 podman[294360]: 2026-01-20 15:08:16.025347333 +0000 UTC m=+0.068432712 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 20 10:08:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:16.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:16.427 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:16.427 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:16.428 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e374 e374: 3 total, 3 up, 3 in
Jan 20 10:08:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:16.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:16 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:16Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:9e:90 10.100.0.6
Jan 20 10:08:16 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:16Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:9e:90 10.100.0.6
Jan 20 10:08:17 np0005588919 nova_compute[225855]: 2026-01-20 15:08:17.481 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:17 np0005588919 nova_compute[225855]: 2026-01-20 15:08:17.507 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e374 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:18.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:18.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:19 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2580296979' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:19.575 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:19 np0005588919 nova_compute[225855]: 2026-01-20 15:08:19.575 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:19.576 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:08:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:20.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e375 e375: 3 total, 3 up, 3 in
Jan 20 10:08:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:20.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e376 e376: 3 total, 3 up, 3 in
Jan 20 10:08:21 np0005588919 nova_compute[225855]: 2026-01-20 15:08:21.389 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:08:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:22.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:22 np0005588919 nova_compute[225855]: 2026-01-20 15:08:22.483 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:22 np0005588919 nova_compute[225855]: 2026-01-20 15:08:22.511 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:22.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e376 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:23.578 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:23 np0005588919 kernel: tap6b7cb043-d1 (unregistering): left promiscuous mode
Jan 20 10:08:23 np0005588919 NetworkManager[49104]: <info>  [1768921703.8524] device (tap6b7cb043-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:08:23 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:23Z|00729|binding|INFO|Releasing lport 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 from this chassis (sb_readonly=0)
Jan 20 10:08:23 np0005588919 nova_compute[225855]: 2026-01-20 15:08:23.857 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:23 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:23Z|00730|binding|INFO|Setting lport 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 down in Southbound
Jan 20 10:08:23 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:23Z|00731|binding|INFO|Removing iface tap6b7cb043-d1 ovn-installed in OVS
Jan 20 10:08:23 np0005588919 nova_compute[225855]: 2026-01-20 15:08:23.859 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:23.868 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:9e:90 10.100.0.6'], port_security=['fa:16:3e:bf:9e:90 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'a25af5a3-096f-4363-842e-d960c22eb16b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd56d9c6d-4bb5-4a73-ab07-9e0ee1fd3b93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6b7cb043-d1f4-4c2b-8173-1e3e2a664767) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:23.869 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 unbound from our chassis#033[00m
Jan 20 10:08:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:23.870 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3967ae21-1590-4685-8881-8bd1bcf25258, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:08:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:23.871 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f0748019-1163-4337-996a-1e5001bed076]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:23.872 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 namespace which is not needed anymore#033[00m
Jan 20 10:08:23 np0005588919 nova_compute[225855]: 2026-01-20 15:08:23.891 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:23 np0005588919 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000a8.scope: Deactivated successfully.
Jan 20 10:08:23 np0005588919 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000a8.scope: Consumed 13.379s CPU time.
Jan 20 10:08:23 np0005588919 systemd-machined[194361]: Machine qemu-85-instance-000000a8 terminated.
Jan 20 10:08:24 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294339]: [NOTICE]   (294343) : haproxy version is 2.8.14-c23fe91
Jan 20 10:08:24 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294339]: [NOTICE]   (294343) : path to executable is /usr/sbin/haproxy
Jan 20 10:08:24 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294339]: [WARNING]  (294343) : Exiting Master process...
Jan 20 10:08:24 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294339]: [ALERT]    (294343) : Current worker (294345) exited with code 143 (Terminated)
Jan 20 10:08:24 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294339]: [WARNING]  (294343) : All workers exited. Exiting... (0)
Jan 20 10:08:24 np0005588919 systemd[1]: libpod-619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306.scope: Deactivated successfully.
Jan 20 10:08:24 np0005588919 podman[294458]: 2026-01-20 15:08:24.036784945 +0000 UTC m=+0.064545122 container died 619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:08:24 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306-userdata-shm.mount: Deactivated successfully.
Jan 20 10:08:24 np0005588919 systemd[1]: var-lib-containers-storage-overlay-e15a8fbf0d3cd1123225aae1fd10c9e39fdb8283817ce9b1cbec15659d2486e8-merged.mount: Deactivated successfully.
Jan 20 10:08:24 np0005588919 podman[294458]: 2026-01-20 15:08:24.076014348 +0000 UTC m=+0.103774535 container cleanup 619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:08:24 np0005588919 systemd[1]: libpod-conmon-619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306.scope: Deactivated successfully.
Jan 20 10:08:24 np0005588919 podman[294491]: 2026-01-20 15:08:24.137638457 +0000 UTC m=+0.038323899 container remove 619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:08:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:24.143 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7db549f5-225d-4ead-97d6-7d3ddcf1da59]: (4, ('Tue Jan 20 03:08:23 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 (619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306)\n619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306\nTue Jan 20 03:08:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 (619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306)\n619281131e0bd7b526e0f1262ee854d58e71ee2663a61f0675cef0cbbbf34306\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:24.145 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa1970d-2fc8-4d83-b019-fb1e1b1eb125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:24.147 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.149 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:24 np0005588919 kernel: tap3967ae21-10: left promiscuous mode
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.171 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:24.175 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ac8e78-080f-4f28-b991-2708d08c3368]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:24.194 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[28d9d163-9355-4fed-8b54-083b7cd0afc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:24.195 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dab90d20-0ecb-4f0b-b944-815e17a89c35]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:24.210 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9960d6c2-a2d4-46e0-ba73-723f993ee9f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672283, 'reachable_time': 33815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294517, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:24.213 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:08:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:24.213 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ec6393-d94f-49de-ae27-d5016da24abe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:24 np0005588919 systemd[1]: run-netns-ovnmeta\x2d3967ae21\x2d1590\x2d4685\x2d8881\x2d8bd1bcf25258.mount: Deactivated successfully.
Jan 20 10:08:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:24.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.399 225859 DEBUG nova.compute.manager [req-9b156bec-7cc6-4bcc-a6c1-ba8ede0b3547 req-c0db19e3-0daf-485b-9c4b-8da0f3dd84de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-vif-unplugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.399 225859 DEBUG oslo_concurrency.lockutils [req-9b156bec-7cc6-4bcc-a6c1-ba8ede0b3547 req-c0db19e3-0daf-485b-9c4b-8da0f3dd84de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.400 225859 DEBUG oslo_concurrency.lockutils [req-9b156bec-7cc6-4bcc-a6c1-ba8ede0b3547 req-c0db19e3-0daf-485b-9c4b-8da0f3dd84de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.400 225859 DEBUG oslo_concurrency.lockutils [req-9b156bec-7cc6-4bcc-a6c1-ba8ede0b3547 req-c0db19e3-0daf-485b-9c4b-8da0f3dd84de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.400 225859 DEBUG nova.compute.manager [req-9b156bec-7cc6-4bcc-a6c1-ba8ede0b3547 req-c0db19e3-0daf-485b-9c4b-8da0f3dd84de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] No waiting events found dispatching network-vif-unplugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.400 225859 WARNING nova.compute.manager [req-9b156bec-7cc6-4bcc-a6c1-ba8ede0b3547 req-c0db19e3-0daf-485b-9c4b-8da0f3dd84de 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received unexpected event network-vif-unplugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.404 225859 INFO nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.409 225859 INFO nova.virt.libvirt.driver [-] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Instance destroyed successfully.#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.410 225859 DEBUG nova.objects.instance [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'numa_topology' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.428 225859 INFO nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Attempting rescue#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.429 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.432 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.433 225859 INFO nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Creating image(s)#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.456 225859 DEBUG nova.storage.rbd_utils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.459 225859 DEBUG nova.objects.instance [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'trusted_certs' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.491 225859 DEBUG nova.storage.rbd_utils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.516 225859 DEBUG nova.storage.rbd_utils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.518 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.580 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.580 225859 DEBUG oslo_concurrency.lockutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.581 225859 DEBUG oslo_concurrency.lockutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.581 225859 DEBUG oslo_concurrency.lockutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.606 225859 DEBUG nova.storage.rbd_utils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.609 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 a25af5a3-096f-4363-842e-d960c22eb16b_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:24.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.891 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 a25af5a3-096f-4363-842e-d960c22eb16b_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.282s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.893 225859 DEBUG nova.objects.instance [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'migration_context' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.912 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.913 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Start _get_guest_xml network_info=[{"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "vif_mac": "fa:16:3e:bf:9e:90"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.914 225859 DEBUG nova.objects.instance [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'resources' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.936 225859 WARNING nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.947 225859 DEBUG nova.virt.libvirt.host [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.948 225859 DEBUG nova.virt.libvirt.host [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.951 225859 DEBUG nova.virt.libvirt.host [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.952 225859 DEBUG nova.virt.libvirt.host [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.953 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.953 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.953 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.954 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.954 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.954 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.954 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.954 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.955 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.955 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.955 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.955 225859 DEBUG nova.virt.hardware [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.956 225859 DEBUG nova.objects.instance [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'vcpu_model' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:24 np0005588919 nova_compute[225855]: 2026-01-20 15:08:24.979 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2739269448' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:25 np0005588919 nova_compute[225855]: 2026-01-20 15:08:25.422 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:25 np0005588919 nova_compute[225855]: 2026-01-20 15:08:25.423 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2989555521' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:25 np0005588919 nova_compute[225855]: 2026-01-20 15:08:25.874 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:25 np0005588919 nova_compute[225855]: 2026-01-20 15:08:25.876 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2370305360' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.306 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.308 225859 DEBUG nova.virt.libvirt.vif [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:07:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-950743647',display_name='tempest-ServerRescueNegativeTestJSON-server-950743647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-950743647',id=168,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:08:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-gkbc59sh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:08:04Z,user_data=None,user_id='27658864f96d453586dd0846a4c55b7d',uuid=a25af5a3-096f-4363-842e-d960c22eb16b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "vif_mac": "fa:16:3e:bf:9e:90"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.308 225859 DEBUG nova.network.os_vif_util [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "vif_mac": "fa:16:3e:bf:9e:90"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.310 225859 DEBUG nova.network.os_vif_util [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:9e:90,bridge_name='br-int',has_traffic_filtering=True,id=6b7cb043-d1f4-4c2b-8173-1e3e2a664767,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b7cb043-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.311 225859 DEBUG nova.objects.instance [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'pci_devices' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.340 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <uuid>a25af5a3-096f-4363-842e-d960c22eb16b</uuid>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <name>instance-000000a8</name>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-950743647</nova:name>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:08:24</nova:creationTime>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <nova:user uuid="27658864f96d453586dd0846a4c55b7d">tempest-ServerRescueNegativeTestJSON-1649662639-project-member</nova:user>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <nova:project uuid="fc74c4a296554866969b05aef75252af">tempest-ServerRescueNegativeTestJSON-1649662639</nova:project>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <nova:port uuid="6b7cb043-d1f4-4c2b-8173-1e3e2a664767">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <entry name="serial">a25af5a3-096f-4363-842e-d960c22eb16b</entry>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <entry name="uuid">a25af5a3-096f-4363-842e-d960c22eb16b</entry>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/a25af5a3-096f-4363-842e-d960c22eb16b_disk.rescue">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/a25af5a3-096f-4363-842e-d960c22eb16b_disk">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <target dev="vdb" bus="virtio"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/a25af5a3-096f-4363-842e-d960c22eb16b_disk.config.rescue">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:bf:9e:90"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <target dev="tap6b7cb043-d1"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/console.log" append="off"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:08:26 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:08:26 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:08:26 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:08:26 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.348 225859 INFO nova.virt.libvirt.driver [-] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Instance destroyed successfully.#033[00m
Jan 20 10:08:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:26.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.406 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.406 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.407 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.407 225859 DEBUG nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] No VIF found with MAC fa:16:3e:bf:9e:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.407 225859 INFO nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Using config drive#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.430 225859 DEBUG nova.storage.rbd_utils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.459 225859 DEBUG nova.objects.instance [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'ec2_ids' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.498 225859 DEBUG nova.objects.instance [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'keypairs' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.524 225859 DEBUG nova.compute.manager [req-f64749e5-2344-4374-bf6a-e7951e9c5a25 req-cdf39710-058d-43c0-9806-447cc0e8a76d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.525 225859 DEBUG oslo_concurrency.lockutils [req-f64749e5-2344-4374-bf6a-e7951e9c5a25 req-cdf39710-058d-43c0-9806-447cc0e8a76d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.525 225859 DEBUG oslo_concurrency.lockutils [req-f64749e5-2344-4374-bf6a-e7951e9c5a25 req-cdf39710-058d-43c0-9806-447cc0e8a76d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.526 225859 DEBUG oslo_concurrency.lockutils [req-f64749e5-2344-4374-bf6a-e7951e9c5a25 req-cdf39710-058d-43c0-9806-447cc0e8a76d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.526 225859 DEBUG nova.compute.manager [req-f64749e5-2344-4374-bf6a-e7951e9c5a25 req-cdf39710-058d-43c0-9806-447cc0e8a76d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] No waiting events found dispatching network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.526 225859 WARNING nova.compute.manager [req-f64749e5-2344-4374-bf6a-e7951e9c5a25 req-cdf39710-058d-43c0-9806-447cc0e8a76d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received unexpected event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 for instance with vm_state active and task_state rescuing.#033[00m
Jan 20 10:08:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:26.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.884 225859 INFO nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Creating config drive at /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config.rescue#033[00m
Jan 20 10:08:26 np0005588919 nova_compute[225855]: 2026-01-20 15:08:26.889 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1bo1ol_4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.019 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1bo1ol_4" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.046 225859 DEBUG nova.storage.rbd_utils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] rbd image a25af5a3-096f-4363-842e-d960c22eb16b_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.050 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config.rescue a25af5a3-096f-4363-842e-d960c22eb16b_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.222 225859 DEBUG oslo_concurrency.processutils [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config.rescue a25af5a3-096f-4363-842e-d960c22eb16b_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.223 225859 INFO nova.virt.libvirt.driver [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Deleting local config drive /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b/disk.config.rescue because it was imported into RBD.
Jan 20 10:08:27 np0005588919 kernel: tap6b7cb043-d1: entered promiscuous mode
Jan 20 10:08:27 np0005588919 NetworkManager[49104]: <info>  [1768921707.2933] manager: (tap6b7cb043-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/304)
Jan 20 10:08:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:27Z|00732|binding|INFO|Claiming lport 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 for this chassis.
Jan 20 10:08:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:27Z|00733|binding|INFO|6b7cb043-d1f4-4c2b-8173-1e3e2a664767: Claiming fa:16:3e:bf:9e:90 10.100.0.6
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.294 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.306 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:9e:90 10.100.0.6'], port_security=['fa:16:3e:bf:9e:90 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'a25af5a3-096f-4363-842e-d960c22eb16b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'd56d9c6d-4bb5-4a73-ab07-9e0ee1fd3b93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6b7cb043-d1f4-4c2b-8173-1e3e2a664767) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.307 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 bound to our chassis
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.309 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.311 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:27Z|00734|binding|INFO|Setting lport 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 ovn-installed in OVS
Jan 20 10:08:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:27Z|00735|binding|INFO|Setting lport 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 up in Southbound
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.314 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.316 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.323 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[636a2d9a-3c64-4fd8-92fb-8fd3c00dabd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.323 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3967ae21-11 in ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.326 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3967ae21-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.326 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9eaeff23-291f-48e8-ad3f-5f6ef8b754e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.327 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bf328fc6-5972-49cd-a03f-09af5c767bab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 systemd-machined[194361]: New machine qemu-86-instance-000000a8.
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.338 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[5a9e8c40-72c4-4418-89ca-310a3599ec67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.350 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d3693682-0966-4530-8616-0fe674d698b8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 systemd[1]: Started Virtual Machine qemu-86-instance-000000a8.
Jan 20 10:08:27 np0005588919 systemd-udevd[294755]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:08:27 np0005588919 NetworkManager[49104]: <info>  [1768921707.3795] device (tap6b7cb043-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:08:27 np0005588919 NetworkManager[49104]: <info>  [1768921707.3801] device (tap6b7cb043-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.384 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6d01f1ce-100c-40e4-a212-e9c28dbbcb7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.388 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9688413f-fbdd-4a1a-8b35-0a98fa1f668b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 systemd-udevd[294758]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:08:27 np0005588919 NetworkManager[49104]: <info>  [1768921707.3913] manager: (tap3967ae21-10): new Veth device (/org/freedesktop/NetworkManager/Devices/305)
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.418 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef7522e-3659-4ff1-9777-c948e36e5f1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.421 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a7ea9cee-e120-4570-b03d-b801b91874d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 NetworkManager[49104]: <info>  [1768921707.4452] device (tap3967ae21-10): carrier: link connected
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.450 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3adbe640-5899-4572-9ca9-833af711cd57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.469 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7024665a-e84e-42b2-8456-bc911d2039d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674637, 'reachable_time': 39267, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294784, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.484 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.486 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[de93a492-c398-4b60-a884-d5a6dae16397]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:ce9d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 674637, 'tstamp': 674637}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294785, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.505 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0b019050-3ddf-4bf8-b7f0-bfd4b905d671]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3967ae21-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:ce:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 209], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674637, 'reachable_time': 39267, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294786, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.514 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.532 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f90ecff0-75a9-46c9-b65b-c0863b4ba005]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.596 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4a602c3b-a586-4f3b-bd65-46f4c3c12011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.597 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.597 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.598 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3967ae21-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.599 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 kernel: tap3967ae21-10: entered promiscuous mode
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.601 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 NetworkManager[49104]: <info>  [1768921707.6018] manager: (tap3967ae21-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/306)
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.602 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3967ae21-10, col_values=(('external_ids', {'iface-id': 'b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.603 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.605 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.605 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 20 10:08:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:27Z|00736|binding|INFO|Releasing lport b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3 from this chassis (sb_readonly=0)
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.606 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0d9c1ed5-1466-4397-bdaa-935e623a21e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.607 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/3967ae21-1590-4685-8881-8bd1bcf25258.pid.haproxy
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 3967ae21-1590-4685-8881-8bd1bcf25258
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 20 10:08:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:27.607 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'env', 'PROCESS_TAG=haproxy-3967ae21-1590-4685-8881-8bd1bcf25258', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3967ae21-1590-4685-8881-8bd1bcf25258.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 20 10:08:27 np0005588919 nova_compute[225855]: 2026-01-20 15:08:27.621 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:08:27 np0005588919 podman[294819]: 2026-01-20 15:08:27.94658885 +0000 UTC m=+0.042929939 container create d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:08:27 np0005588919 systemd[1]: Started libpod-conmon-d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e.scope.
Jan 20 10:08:28 np0005588919 podman[294819]: 2026-01-20 15:08:27.925596504 +0000 UTC m=+0.021937613 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:08:28 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:08:28 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a0a29c0cfe0897e5a9faa037b73161c66d2ed59c5ab8259820c45f49ae9bf20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:08:28 np0005588919 podman[294819]: 2026-01-20 15:08:28.041195974 +0000 UTC m=+0.137537083 container init d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:08:28 np0005588919 podman[294819]: 2026-01-20 15:08:28.046287259 +0000 UTC m=+0.142628348 container start d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:08:28 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294842]: [NOTICE]   (294872) : New worker (294884) forked
Jan 20 10:08:28 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294842]: [NOTICE]   (294872) : Loading success.
Jan 20 10:08:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e376 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.220 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for a25af5a3-096f-4363-842e-d960c22eb16b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.221 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921708.220054, a25af5a3-096f-4363-842e-d960c22eb16b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.222 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] VM Resumed (Lifecycle Event)
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.228 225859 DEBUG nova.compute.manager [None req-cbffac24-6e30-414d-95cc-0bbdcd9d1b37 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.277 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.281 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.314 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921708.2202642, a25af5a3-096f-4363-842e-d960c22eb16b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.314 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] VM Started (Lifecycle Event)
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.335 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.338 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:08:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:28.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.683 225859 DEBUG nova.compute.manager [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.683 225859 DEBUG oslo_concurrency.lockutils [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.684 225859 DEBUG oslo_concurrency.lockutils [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.684 225859 DEBUG oslo_concurrency.lockutils [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.684 225859 DEBUG nova.compute.manager [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] No waiting events found dispatching network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.684 225859 WARNING nova.compute.manager [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received unexpected event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 for instance with vm_state rescued and task_state None.#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.685 225859 DEBUG nova.compute.manager [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.685 225859 DEBUG oslo_concurrency.lockutils [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.685 225859 DEBUG oslo_concurrency.lockutils [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.685 225859 DEBUG oslo_concurrency.lockutils [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.685 225859 DEBUG nova.compute.manager [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] No waiting events found dispatching network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:28 np0005588919 nova_compute[225855]: 2026-01-20 15:08:28.686 225859 WARNING nova.compute.manager [req-af1bd6c5-6428-42f6-bdeb-491e8ece872c req-8c49eeeb-a2cb-4a88-95f3-640fb24c01cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received unexpected event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 for instance with vm_state rescued and task_state None.#033[00m
Jan 20 10:08:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:28.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:30.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e377 e377: 3 total, 3 up, 3 in
Jan 20 10:08:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:30.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:32 np0005588919 podman[294912]: 2026-01-20 15:08:32.084927509 +0000 UTC m=+0.117567667 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:08:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:32.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:32 np0005588919 nova_compute[225855]: 2026-01-20 15:08:32.487 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:32 np0005588919 nova_compute[225855]: 2026-01-20 15:08:32.517 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:32.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:33 np0005588919 nova_compute[225855]: 2026-01-20 15:08:33.618 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:33 np0005588919 nova_compute[225855]: 2026-01-20 15:08:33.619 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:33 np0005588919 nova_compute[225855]: 2026-01-20 15:08:33.637 225859 DEBUG nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:08:33 np0005588919 nova_compute[225855]: 2026-01-20 15:08:33.726 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:33 np0005588919 nova_compute[225855]: 2026-01-20 15:08:33.727 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:33 np0005588919 nova_compute[225855]: 2026-01-20 15:08:33.734 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:08:33 np0005588919 nova_compute[225855]: 2026-01-20 15:08:33.735 225859 INFO nova.compute.claims [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:08:33 np0005588919 nova_compute[225855]: 2026-01-20 15:08:33.886 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:08:34 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/982314999' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:08:34 np0005588919 nova_compute[225855]: 2026-01-20 15:08:34.315 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:34 np0005588919 nova_compute[225855]: 2026-01-20 15:08:34.321 225859 DEBUG nova.compute.provider_tree [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:08:34 np0005588919 nova_compute[225855]: 2026-01-20 15:08:34.343 225859 DEBUG nova.scheduler.client.report [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:08:34 np0005588919 nova_compute[225855]: 2026-01-20 15:08:34.379 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:34 np0005588919 nova_compute[225855]: 2026-01-20 15:08:34.380 225859 DEBUG nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:08:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:34.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:34.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:34 np0005588919 nova_compute[225855]: 2026-01-20 15:08:34.943 225859 DEBUG nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:08:34 np0005588919 nova_compute[225855]: 2026-01-20 15:08:34.944 225859 DEBUG nova.network.neutron [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:08:34 np0005588919 nova_compute[225855]: 2026-01-20 15:08:34.972 225859 INFO nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:34.999 225859 DEBUG nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:35.092 225859 DEBUG nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:35.095 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:35.095 225859 INFO nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Creating image(s)#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:35.125 225859 DEBUG nova.storage.rbd_utils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] rbd image b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:35.159 225859 DEBUG nova.storage.rbd_utils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] rbd image b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:35.195 225859 DEBUG nova.storage.rbd_utils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] rbd image b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:35.202 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "a522636f3423dd1eea3b834dfd08917146e09c47" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:35.203 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "a522636f3423dd1eea3b834dfd08917146e09c47" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:35.219 225859 DEBUG nova.policy [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c98bd3f0904e48efa524d598bcad85e9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5b43342be22543f79d4a56e26c6d0c96', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:08:35 np0005588919 nova_compute[225855]: 2026-01-20 15:08:35.514 225859 DEBUG nova.virt.libvirt.imagebackend [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Image locations are: [{'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/5b64c953-6df3-45a3-ae28-e419ba117bb2/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e399cf45-e6b6-5393-99f1-75c601d3f188/images/5b64c953-6df3-45a3-ae28-e419ba117bb2/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 20 10:08:36 np0005588919 nova_compute[225855]: 2026-01-20 15:08:36.109 225859 DEBUG nova.network.neutron [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Successfully created port: c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:08:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:36.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:36 np0005588919 nova_compute[225855]: 2026-01-20 15:08:36.623 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:36 np0005588919 nova_compute[225855]: 2026-01-20 15:08:36.705 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47.part --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:36 np0005588919 nova_compute[225855]: 2026-01-20 15:08:36.707 225859 DEBUG nova.virt.images [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] 5b64c953-6df3-45a3-ae28-e419ba117bb2 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 20 10:08:36 np0005588919 nova_compute[225855]: 2026-01-20 15:08:36.708 225859 DEBUG nova.privsep.utils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 20 10:08:36 np0005588919 nova_compute[225855]: 2026-01-20 15:08:36.709 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47.part /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:36.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:36 np0005588919 nova_compute[225855]: 2026-01-20 15:08:36.923 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47.part /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47.converted" returned: 0 in 0.214s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:36 np0005588919 nova_compute[225855]: 2026-01-20 15:08:36.927 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:36 np0005588919 nova_compute[225855]: 2026-01-20 15:08:36.993 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47.converted --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:36 np0005588919 nova_compute[225855]: 2026-01-20 15:08:36.997 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "a522636f3423dd1eea3b834dfd08917146e09c47" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.030 225859 DEBUG nova.storage.rbd_utils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] rbd image b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.034 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47 b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.116 225859 DEBUG nova.network.neutron [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Successfully updated port: c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.133 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.134 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquired lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.134 225859 DEBUG nova.network.neutron [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.259 225859 DEBUG nova.compute.manager [req-c895dafb-aad6-4954-aec1-9a04e26b7a68 req-639877df-4f50-42c7-8a54-1b53b4583caf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.262 225859 DEBUG nova.compute.manager [req-c895dafb-aad6-4954-aec1-9a04e26b7a68 req-639877df-4f50-42c7-8a54-1b53b4583caf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing instance network info cache due to event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.262 225859 DEBUG oslo_concurrency.lockutils [req-c895dafb-aad6-4954-aec1-9a04e26b7a68 req-639877df-4f50-42c7-8a54-1b53b4583caf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.316 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47 b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.281s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.372 225859 DEBUG nova.storage.rbd_utils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] resizing rbd image b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.407 225859 DEBUG nova.network.neutron [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.470 225859 DEBUG nova.objects.instance [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lazy-loading 'migration_context' on Instance uuid b4c55640-85f9-4d75-a4df-6ee77b21ca73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.490 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.492 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.492 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Ensure instance console log exists: /var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.493 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.493 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.494 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:37 np0005588919 nova_compute[225855]: 2026-01-20 15:08:37.518 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.394 225859 DEBUG nova.network.neutron [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating instance_info_cache with network_info: [{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:38.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.419 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Releasing lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.420 225859 DEBUG nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Instance network_info: |[{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.421 225859 DEBUG oslo_concurrency.lockutils [req-c895dafb-aad6-4954-aec1-9a04e26b7a68 req-639877df-4f50-42c7-8a54-1b53b4583caf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.421 225859 DEBUG nova.network.neutron [req-c895dafb-aad6-4954-aec1-9a04e26b7a68 req-639877df-4f50-42c7-8a54-1b53b4583caf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.424 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Start _get_guest_xml network_info=[{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T15:08:29Z,direct_url=<?>,disk_format='qcow2',id=5b64c953-6df3-45a3-ae28-e419ba117bb2,min_disk=0,min_ram=0,name='tempest-scenario-img--310583103',owner='5b43342be22543f79d4a56e26c6d0c96',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T15:08:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '5b64c953-6df3-45a3-ae28-e419ba117bb2'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.429 225859 WARNING nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.434 225859 DEBUG nova.virt.libvirt.host [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.434 225859 DEBUG nova.virt.libvirt.host [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.437 225859 DEBUG nova.virt.libvirt.host [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.437 225859 DEBUG nova.virt.libvirt.host [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.438 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.439 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T15:08:29Z,direct_url=<?>,disk_format='qcow2',id=5b64c953-6df3-45a3-ae28-e419ba117bb2,min_disk=0,min_ram=0,name='tempest-scenario-img--310583103',owner='5b43342be22543f79d4a56e26c6d0c96',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T15:08:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.439 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.439 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.440 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.440 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.440 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.440 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.441 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.441 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.441 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.442 225859 DEBUG nova.virt.hardware [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.445 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:38.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/571252648' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.888 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.919 225859 DEBUG nova.storage.rbd_utils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] rbd image b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:38 np0005588919 nova_compute[225855]: 2026-01-20 15:08:38.925 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:08:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/496706624' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.400 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.403 225859 DEBUG nova.virt.libvirt.vif [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:08:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2033880413',display_name='tempest-TestMinimumBasicScenario-server-2033880413',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2033880413',id=170,image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAWye8KRiKbxhnt7Xo8hUChshePiRtRRGKKbmjGtRpAbQNhUcsWAOe/4okup4yaafm+06AxmRjgJ9R8sVLFUEsSHiOZRgv3dFKZL11GpIpeu6UGBzzNxvi+GaA/Guzx6LQ==',key_name='tempest-TestMinimumBasicScenario-1345705235',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b43342be22543f79d4a56e26c6d0c96',ramdisk_id='',reservation_id='r-srebood0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1665080150',owner_user_name='tempest-TestMinimumBasicScenario-1665080150-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:08:35Z,user_data=None,user_id='c98bd3f0904e48efa524d598bcad85e9',uuid=b4c55640-85f9-4d75-a4df-6ee77b21ca73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.403 225859 DEBUG nova.network.os_vif_util [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converting VIF {"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.404 225859 DEBUG nova.network.os_vif_util [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.406 225859 DEBUG nova.objects.instance [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lazy-loading 'pci_devices' on Instance uuid b4c55640-85f9-4d75-a4df-6ee77b21ca73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.422 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <uuid>b4c55640-85f9-4d75-a4df-6ee77b21ca73</uuid>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <name>instance-000000aa</name>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestMinimumBasicScenario-server-2033880413</nova:name>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:08:38</nova:creationTime>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <nova:user uuid="c98bd3f0904e48efa524d598bcad85e9">tempest-TestMinimumBasicScenario-1665080150-project-member</nova:user>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <nova:project uuid="5b43342be22543f79d4a56e26c6d0c96">tempest-TestMinimumBasicScenario-1665080150</nova:project>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="5b64c953-6df3-45a3-ae28-e419ba117bb2"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <nova:port uuid="c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <entry name="serial">b4c55640-85f9-4d75-a4df-6ee77b21ca73</entry>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <entry name="uuid">b4c55640-85f9-4d75-a4df-6ee77b21ca73</entry>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk.config">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:7f:85:09"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <target dev="tapc6bf5189-ce"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73/console.log" append="off"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:08:39 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:08:39 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:08:39 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:08:39 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.423 225859 DEBUG nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Preparing to wait for external event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.424 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.424 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.424 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.425 225859 DEBUG nova.virt.libvirt.vif [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:08:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2033880413',display_name='tempest-TestMinimumBasicScenario-server-2033880413',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2033880413',id=170,image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAWye8KRiKbxhnt7Xo8hUChshePiRtRRGKKbmjGtRpAbQNhUcsWAOe/4okup4yaafm+06AxmRjgJ9R8sVLFUEsSHiOZRgv3dFKZL11GpIpeu6UGBzzNxvi+GaA/Guzx6LQ==',key_name='tempest-TestMinimumBasicScenario-1345705235',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b43342be22543f79d4a56e26c6d0c96',ramdisk_id='',reservation_id='r-srebood0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1665080150',owner_user_name='tempest-TestMinimumBasicScenario-1665080150-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:08:35Z,user_data=None,user_id='c98bd3f0904e48efa524d598bcad85e9',uuid=b4c55640-85f9-4d75-a4df-6ee77b21ca73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.425 225859 DEBUG nova.network.os_vif_util [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converting VIF {"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.426 225859 DEBUG nova.network.os_vif_util [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.426 225859 DEBUG os_vif [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.427 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.427 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.428 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.431 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.432 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6bf5189-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.434 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6bf5189-ce, col_values=(('external_ids', {'iface-id': 'c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:85:09', 'vm-uuid': 'b4c55640-85f9-4d75-a4df-6ee77b21ca73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.436 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:39 np0005588919 NetworkManager[49104]: <info>  [1768921719.4375] manager: (tapc6bf5189-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.438 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.445 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.446 225859 INFO os_vif [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce')#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.505 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.505 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.506 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] No VIF found with MAC fa:16:3e:7f:85:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.506 225859 INFO nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Using config drive#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.537 225859 DEBUG nova.storage.rbd_utils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] rbd image b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.682 225859 DEBUG nova.network.neutron [req-c895dafb-aad6-4954-aec1-9a04e26b7a68 req-639877df-4f50-42c7-8a54-1b53b4583caf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updated VIF entry in instance network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.683 225859 DEBUG nova.network.neutron [req-c895dafb-aad6-4954-aec1-9a04e26b7a68 req-639877df-4f50-42c7-8a54-1b53b4583caf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating instance_info_cache with network_info: [{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.698 225859 DEBUG oslo_concurrency.lockutils [req-c895dafb-aad6-4954-aec1-9a04e26b7a68 req-639877df-4f50-42c7-8a54-1b53b4583caf 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.827 225859 INFO nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Creating config drive at /var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73/disk.config#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.832 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp03eyzntx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:39 np0005588919 nova_compute[225855]: 2026-01-20 15:08:39.965 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp03eyzntx" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.066 225859 DEBUG nova.storage.rbd_utils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] rbd image b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.073 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73/disk.config b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:40.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.445 225859 DEBUG oslo_concurrency.processutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73/disk.config b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.373s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.446 225859 INFO nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Deleting local config drive /var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73/disk.config because it was imported into RBD.#033[00m
Jan 20 10:08:40 np0005588919 kernel: tapc6bf5189-ce: entered promiscuous mode
Jan 20 10:08:40 np0005588919 NetworkManager[49104]: <info>  [1768921720.4941] manager: (tapc6bf5189-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/308)
Jan 20 10:08:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:40Z|00737|binding|INFO|Claiming lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for this chassis.
Jan 20 10:08:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:40Z|00738|binding|INFO|c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf: Claiming fa:16:3e:7f:85:09 10.100.0.3
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.496 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.507 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:85:09 10.100.0.3'], port_security=['fa:16:3e:7f:85:09 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b4c55640-85f9-4d75-a4df-6ee77b21ca73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e22d6ddc-0339-4395-bc21-95081825f05b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b43342be22543f79d4a56e26c6d0c96', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8514424f-703f-4374-a78e-584f6e7c233b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c19619-1e7e-40ee-be83-c9dbc347543e, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.509 140354 INFO neutron.agent.ovn.metadata.agent [-] Port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf in datapath e22d6ddc-0339-4395-bc21-95081825f05b bound to our chassis#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.512 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e22d6ddc-0339-4395-bc21-95081825f05b#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.525 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d1531949-8cd0-4651-9524-af1a36fc2d55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.526 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape22d6ddc-01 in ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.529 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape22d6ddc-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.529 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1b4b02-df86-406d-805f-35a694138dbc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 systemd-udevd[295331]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.530 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c034e42c-34bd-4ba4-9e1c-5cf908db42ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 systemd-machined[194361]: New machine qemu-87-instance-000000aa.
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.544 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[86abc2db-cb49-49f5-acb8-9a1e7fee4815]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 NetworkManager[49104]: <info>  [1768921720.5476] device (tapc6bf5189-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:08:40 np0005588919 NetworkManager[49104]: <info>  [1768921720.5485] device (tapc6bf5189-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.567 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:40 np0005588919 systemd[1]: Started Virtual Machine qemu-87-instance-000000aa.
Jan 20 10:08:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:40Z|00739|binding|INFO|Setting lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf ovn-installed in OVS
Jan 20 10:08:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:40Z|00740|binding|INFO|Setting lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf up in Southbound
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.575 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[00883eb0-1154-4453-97e8-6bbe64fc72a4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.579 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.603 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[99e03192-9728-47e7-ba68-131c81f1d285]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.612 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb305f1-1b83-49d6-97ce-420821571bba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 NetworkManager[49104]: <info>  [1768921720.6135] manager: (tape22d6ddc-00): new Veth device (/org/freedesktop/NetworkManager/Devices/309)
Jan 20 10:08:40 np0005588919 systemd-udevd[295334]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.650 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[53953e07-37b8-472f-ad17-0e2c239f95c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.654 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[26695018-1ff0-44d3-88e0-c35f1ee09af4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 NetworkManager[49104]: <info>  [1768921720.6821] device (tape22d6ddc-00): carrier: link connected
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.688 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[47573274-576e-44ea-b9b8-6edb69d2c21e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.704 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8c3c838d-9683-475f-9b43-3c0523ae2a84]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape22d6ddc-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:3f:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675960, 'reachable_time': 35650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295363, 'error': None, 'target': 'ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.720 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c3fb7b-fb18-4834-a1ad-8e298f68d8e0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9e:3f5c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675960, 'tstamp': 675960}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295364, 'error': None, 'target': 'ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.740 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b29715c6-3bfe-4c74-8d09-781a8a097fd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape22d6ddc-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:3f:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675960, 'reachable_time': 35650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 295365, 'error': None, 'target': 'ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.771 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[899bd977-684b-4cc0-874e-32e4328a12a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:40.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:40Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:9e:90 10.100.0.6
Jan 20 10:08:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:40Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:9e:90 10.100.0.6
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.824 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2942a4d6-803d-4ba9-973c-4a016389092d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.826 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape22d6ddc-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.826 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.827 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape22d6ddc-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.828 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:40 np0005588919 NetworkManager[49104]: <info>  [1768921720.8292] manager: (tape22d6ddc-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Jan 20 10:08:40 np0005588919 kernel: tape22d6ddc-00: entered promiscuous mode
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.832 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape22d6ddc-00, col_values=(('external_ids', {'iface-id': '940a1442-b0ab-49a2-87e8-750659cdda8d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:08:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:40Z|00741|binding|INFO|Releasing lport 940a1442-b0ab-49a2-87e8-750659cdda8d from this chassis (sb_readonly=0)
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.833 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.850 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.851 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.852 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e22d6ddc-0339-4395-bc21-95081825f05b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e22d6ddc-0339-4395-bc21-95081825f05b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.853 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa47c91-3565-4bf8-ba65-2e554c803952]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.854 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-e22d6ddc-0339-4395-bc21-95081825f05b
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/e22d6ddc-0339-4395-bc21-95081825f05b.pid.haproxy
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID e22d6ddc-0339-4395-bc21-95081825f05b
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:08:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:08:40.855 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b', 'env', 'PROCESS_TAG=haproxy-e22d6ddc-0339-4395-bc21-95081825f05b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e22d6ddc-0339-4395-bc21-95081825f05b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.864 225859 DEBUG nova.compute.manager [req-98393fd6-b7f3-40f5-8cde-f4754969fef9 req-314d61ff-1164-49d2-b5d3-2fc2ecf99a2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.864 225859 DEBUG oslo_concurrency.lockutils [req-98393fd6-b7f3-40f5-8cde-f4754969fef9 req-314d61ff-1164-49d2-b5d3-2fc2ecf99a2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.865 225859 DEBUG oslo_concurrency.lockutils [req-98393fd6-b7f3-40f5-8cde-f4754969fef9 req-314d61ff-1164-49d2-b5d3-2fc2ecf99a2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.865 225859 DEBUG oslo_concurrency.lockutils [req-98393fd6-b7f3-40f5-8cde-f4754969fef9 req-314d61ff-1164-49d2-b5d3-2fc2ecf99a2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:40 np0005588919 nova_compute[225855]: 2026-01-20 15:08:40.865 225859 DEBUG nova.compute.manager [req-98393fd6-b7f3-40f5-8cde-f4754969fef9 req-314d61ff-1164-49d2-b5d3-2fc2ecf99a2a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Processing event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.115 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921721.1150274, b4c55640-85f9-4d75-a4df-6ee77b21ca73 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.116 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] VM Started (Lifecycle Event)#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.119 225859 DEBUG nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.123 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.127 225859 INFO nova.virt.libvirt.driver [-] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Instance spawned successfully.#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.128 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.156 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.162 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.166 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.166 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.167 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.167 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.168 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.168 225859 DEBUG nova.virt.libvirt.driver [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.197 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.199 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921721.1183376, b4c55640-85f9-4d75-a4df-6ee77b21ca73 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.199 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.222 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.226 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921721.1220424, b4c55640-85f9-4d75-a4df-6ee77b21ca73 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.227 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.231 225859 INFO nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Took 6.14 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.232 225859 DEBUG nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.289 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.292 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.333 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.334 225859 INFO nova.compute.manager [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Took 7.64 seconds to build instance.#033[00m
Jan 20 10:08:41 np0005588919 podman[295439]: 2026-01-20 15:08:41.239833326 +0000 UTC m=+0.022334925 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:08:41 np0005588919 nova_compute[225855]: 2026-01-20 15:08:41.369 225859 DEBUG oslo_concurrency.lockutils [None req-e21a5016-04f7-439f-bc34-054eb2c5a89e c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:41 np0005588919 podman[295439]: 2026-01-20 15:08:41.759232313 +0000 UTC m=+0.541733892 container create e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:08:41 np0005588919 systemd[1]: Started libpod-conmon-e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a.scope.
Jan 20 10:08:41 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:08:41 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2117d9a12e609e61fa38da0417b2e4ee38bd7ca4efd01c6c08034b29e0c123da/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:08:41 np0005588919 podman[295439]: 2026-01-20 15:08:41.865078946 +0000 UTC m=+0.647580545 container init e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:08:41 np0005588919 podman[295439]: 2026-01-20 15:08:41.872334242 +0000 UTC m=+0.654835821 container start e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:08:41 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[295454]: [NOTICE]   (295458) : New worker (295460) forked
Jan 20 10:08:41 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[295454]: [NOTICE]   (295458) : Loading success.
Jan 20 10:08:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:42.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:42 np0005588919 nova_compute[225855]: 2026-01-20 15:08:42.521 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:08:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:42.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:08:43 np0005588919 nova_compute[225855]: 2026-01-20 15:08:43.015 225859 DEBUG nova.compute.manager [req-06a3546f-1a29-404b-8f56-3c72d2b2d3e3 req-1d2d0d77-c80c-494a-96eb-2c23702f2ab1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:43 np0005588919 nova_compute[225855]: 2026-01-20 15:08:43.016 225859 DEBUG oslo_concurrency.lockutils [req-06a3546f-1a29-404b-8f56-3c72d2b2d3e3 req-1d2d0d77-c80c-494a-96eb-2c23702f2ab1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:43 np0005588919 nova_compute[225855]: 2026-01-20 15:08:43.016 225859 DEBUG oslo_concurrency.lockutils [req-06a3546f-1a29-404b-8f56-3c72d2b2d3e3 req-1d2d0d77-c80c-494a-96eb-2c23702f2ab1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:43 np0005588919 nova_compute[225855]: 2026-01-20 15:08:43.016 225859 DEBUG oslo_concurrency.lockutils [req-06a3546f-1a29-404b-8f56-3c72d2b2d3e3 req-1d2d0d77-c80c-494a-96eb-2c23702f2ab1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:43 np0005588919 nova_compute[225855]: 2026-01-20 15:08:43.017 225859 DEBUG nova.compute.manager [req-06a3546f-1a29-404b-8f56-3c72d2b2d3e3 req-1d2d0d77-c80c-494a-96eb-2c23702f2ab1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:08:43 np0005588919 nova_compute[225855]: 2026-01-20 15:08:43.017 225859 WARNING nova.compute.manager [req-06a3546f-1a29-404b-8f56-3c72d2b2d3e3 req-1d2d0d77-c80c-494a-96eb-2c23702f2ab1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received unexpected event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with vm_state active and task_state None.#033[00m
Jan 20 10:08:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:44.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:44 np0005588919 nova_compute[225855]: 2026-01-20 15:08:44.483 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:44.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:46.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:46.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:47 np0005588919 podman[295471]: 2026-01-20 15:08:47.014639327 +0000 UTC m=+0.054595190 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:08:47 np0005588919 nova_compute[225855]: 2026-01-20 15:08:47.523 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:48 np0005588919 nova_compute[225855]: 2026-01-20 15:08:48.263 225859 DEBUG oslo_concurrency.lockutils [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:48 np0005588919 nova_compute[225855]: 2026-01-20 15:08:48.264 225859 DEBUG oslo_concurrency.lockutils [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:48 np0005588919 nova_compute[225855]: 2026-01-20 15:08:48.311 225859 DEBUG nova.objects.instance [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lazy-loading 'flavor' on Instance uuid b4c55640-85f9-4d75-a4df-6ee77b21ca73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:48 np0005588919 nova_compute[225855]: 2026-01-20 15:08:48.423 225859 DEBUG oslo_concurrency.lockutils [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:48.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:48.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:48 np0005588919 nova_compute[225855]: 2026-01-20 15:08:48.865 225859 DEBUG oslo_concurrency.lockutils [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:48 np0005588919 nova_compute[225855]: 2026-01-20 15:08:48.866 225859 DEBUG oslo_concurrency.lockutils [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:48 np0005588919 nova_compute[225855]: 2026-01-20 15:08:48.866 225859 INFO nova.compute.manager [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Attaching volume 1ae2be75-c922-4458-bd11-a97b4f6fdd2b to /dev/vdb#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.038 225859 DEBUG os_brick.utils [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.040 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.051 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.051 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[30906319-46d1-4a52-b966-f6c52bc6837e]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.052 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.062 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.063 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[10182486-989b-4ca5-9c4e-e718d0207583]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.064 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.073 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.073 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[1c858e3a-b220-428e-8825-faa332466374]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.075 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[6ecc4bae-375d-41a8-bacb-45776d7802bb]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.076 225859 DEBUG oslo_concurrency.processutils [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.108 225859 DEBUG oslo_concurrency.processutils [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "nvme version" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.111 225859 DEBUG os_brick.initiator.connectors.lightos [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.112 225859 DEBUG os_brick.initiator.connectors.lightos [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.112 225859 DEBUG os_brick.initiator.connectors.lightos [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.112 225859 DEBUG os_brick.utils [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] <== get_connector_properties: return (73ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.113 225859 DEBUG nova.virt.block_device [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating existing volume attachment record: 381cb790-ccf8-436f-94a0-eae0e8e507cc _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.486 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:49 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 20 10:08:49 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 20 10:08:49 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 20 10:08:49 np0005588919 nova_compute[225855]: 2026-01-20 15:08:49.984 225859 DEBUG nova.objects.instance [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lazy-loading 'flavor' on Instance uuid b4c55640-85f9-4d75-a4df-6ee77b21ca73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:50 np0005588919 nova_compute[225855]: 2026-01-20 15:08:50.006 225859 DEBUG nova.virt.libvirt.driver [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Attempting to attach volume 1ae2be75-c922-4458-bd11-a97b4f6fdd2b with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:08:50 np0005588919 nova_compute[225855]: 2026-01-20 15:08:50.008 225859 DEBUG nova.virt.libvirt.guest [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:08:50 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:08:50 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-1ae2be75-c922-4458-bd11-a97b4f6fdd2b">
Jan 20 10:08:50 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:08:50 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:08:50 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:08:50 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:08:50 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 10:08:50 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:08:50 np0005588919 nova_compute[225855]:  </auth>
Jan 20 10:08:50 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:08:50 np0005588919 nova_compute[225855]:  <serial>1ae2be75-c922-4458-bd11-a97b4f6fdd2b</serial>
Jan 20 10:08:50 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:08:50 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:08:50 np0005588919 nova_compute[225855]: 2026-01-20 15:08:50.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:50 np0005588919 nova_compute[225855]: 2026-01-20 15:08:50.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:08:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:50.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:50 np0005588919 nova_compute[225855]: 2026-01-20 15:08:50.444 225859 DEBUG nova.virt.libvirt.driver [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:50 np0005588919 nova_compute[225855]: 2026-01-20 15:08:50.444 225859 DEBUG nova.virt.libvirt.driver [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:50 np0005588919 nova_compute[225855]: 2026-01-20 15:08:50.445 225859 DEBUG nova.virt.libvirt.driver [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:08:50 np0005588919 nova_compute[225855]: 2026-01-20 15:08:50.445 225859 DEBUG nova.virt.libvirt.driver [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] No VIF found with MAC fa:16:3e:7f:85:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:08:50 np0005588919 nova_compute[225855]: 2026-01-20 15:08:50.643 225859 DEBUG oslo_concurrency.lockutils [None req-7ca13fec-c26a-4224-9306-ab6ddca4a995 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:50.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:51 np0005588919 nova_compute[225855]: 2026-01-20 15:08:51.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e378 e378: 3 total, 3 up, 3 in
Jan 20 10:08:52 np0005588919 nova_compute[225855]: 2026-01-20 15:08:52.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:52 np0005588919 nova_compute[225855]: 2026-01-20 15:08:52.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:08:52 np0005588919 nova_compute[225855]: 2026-01-20 15:08:52.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:08:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:52.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:52 np0005588919 nova_compute[225855]: 2026-01-20 15:08:52.494 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:52 np0005588919 nova_compute[225855]: 2026-01-20 15:08:52.495 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:52 np0005588919 nova_compute[225855]: 2026-01-20 15:08:52.495 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:08:52 np0005588919 nova_compute[225855]: 2026-01-20 15:08:52.495 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:08:52 np0005588919 nova_compute[225855]: 2026-01-20 15:08:52.526 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:52.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.001 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Updating instance_info_cache with network_info: [{"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.017 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-a25af5a3-096f-4363-842e-d960c22eb16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.018 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.018 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:54 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:54Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:85:09 10.100.0.3
Jan 20 10:08:54 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:54Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:85:09 10.100.0.3
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:08:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:54.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.489 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:54 np0005588919 NetworkManager[49104]: <info>  [1768921734.5348] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Jan 20 10:08:54 np0005588919 NetworkManager[49104]: <info>  [1768921734.5355] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.535 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.718 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:54 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:54Z|00742|binding|INFO|Releasing lport b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3 from this chassis (sb_readonly=0)
Jan 20 10:08:54 np0005588919 ovn_controller[130490]: 2026-01-20T15:08:54Z|00743|binding|INFO|Releasing lport 940a1442-b0ab-49a2-87e8-750659cdda8d from this chassis (sb_readonly=0)
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.741 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:54.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.839 225859 DEBUG nova.compute.manager [req-8dfc7e61-9b50-4a25-8dfc-e9b38231d0cb req-d1768dd6-bd3e-46e5-88e5-7a0e67aa7feb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.840 225859 DEBUG nova.compute.manager [req-8dfc7e61-9b50-4a25-8dfc-e9b38231d0cb req-d1768dd6-bd3e-46e5-88e5-7a0e67aa7feb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing instance network info cache due to event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.840 225859 DEBUG oslo_concurrency.lockutils [req-8dfc7e61-9b50-4a25-8dfc-e9b38231d0cb req-d1768dd6-bd3e-46e5-88e5-7a0e67aa7feb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.841 225859 DEBUG oslo_concurrency.lockutils [req-8dfc7e61-9b50-4a25-8dfc-e9b38231d0cb req-d1768dd6-bd3e-46e5-88e5-7a0e67aa7feb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:54 np0005588919 nova_compute[225855]: 2026-01-20 15:08:54.841 225859 DEBUG nova.network.neutron [req-8dfc7e61-9b50-4a25-8dfc-e9b38231d0cb req-d1768dd6-bd3e-46e5-88e5-7a0e67aa7feb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:08:56 np0005588919 nova_compute[225855]: 2026-01-20 15:08:56.254 225859 DEBUG nova.network.neutron [req-8dfc7e61-9b50-4a25-8dfc-e9b38231d0cb req-d1768dd6-bd3e-46e5-88e5-7a0e67aa7feb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updated VIF entry in instance network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:08:56 np0005588919 nova_compute[225855]: 2026-01-20 15:08:56.254 225859 DEBUG nova.network.neutron [req-8dfc7e61-9b50-4a25-8dfc-e9b38231d0cb req-d1768dd6-bd3e-46e5-88e5-7a0e67aa7feb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating instance_info_cache with network_info: [{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:56 np0005588919 nova_compute[225855]: 2026-01-20 15:08:56.279 225859 DEBUG oslo_concurrency.lockutils [req-8dfc7e61-9b50-4a25-8dfc-e9b38231d0cb req-d1768dd6-bd3e-46e5-88e5-7a0e67aa7feb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:56 np0005588919 nova_compute[225855]: 2026-01-20 15:08:56.389 225859 DEBUG nova.compute.manager [req-475bc903-102a-46f8-b5cf-ffaa4fb4b5f9 req-c39949d2-7250-4d7b-a7ef-4b464f17cde7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:56 np0005588919 nova_compute[225855]: 2026-01-20 15:08:56.390 225859 DEBUG nova.compute.manager [req-475bc903-102a-46f8-b5cf-ffaa4fb4b5f9 req-c39949d2-7250-4d7b-a7ef-4b464f17cde7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing instance network info cache due to event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:08:56 np0005588919 nova_compute[225855]: 2026-01-20 15:08:56.390 225859 DEBUG oslo_concurrency.lockutils [req-475bc903-102a-46f8-b5cf-ffaa4fb4b5f9 req-c39949d2-7250-4d7b-a7ef-4b464f17cde7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:56 np0005588919 nova_compute[225855]: 2026-01-20 15:08:56.390 225859 DEBUG oslo_concurrency.lockutils [req-475bc903-102a-46f8-b5cf-ffaa4fb4b5f9 req-c39949d2-7250-4d7b-a7ef-4b464f17cde7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:56 np0005588919 nova_compute[225855]: 2026-01-20 15:08:56.390 225859 DEBUG nova.network.neutron [req-475bc903-102a-46f8-b5cf-ffaa4fb4b5f9 req-c39949d2-7250-4d7b-a7ef-4b464f17cde7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:08:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:56.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:56.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.355 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.382 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.383 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.383 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.383 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.383 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.526 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.722 225859 DEBUG nova.network.neutron [req-475bc903-102a-46f8-b5cf-ffaa4fb4b5f9 req-c39949d2-7250-4d7b-a7ef-4b464f17cde7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updated VIF entry in instance network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.723 225859 DEBUG nova.network.neutron [req-475bc903-102a-46f8-b5cf-ffaa4fb4b5f9 req-c39949d2-7250-4d7b-a7ef-4b464f17cde7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating instance_info_cache with network_info: [{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.744 225859 DEBUG oslo_concurrency.lockutils [req-475bc903-102a-46f8-b5cf-ffaa4fb4b5f9 req-c39949d2-7250-4d7b-a7ef-4b464f17cde7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:08:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:08:57 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3934644950' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.856 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.941 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.941 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.942 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.945 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000aa as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.946 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000aa as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:08:57 np0005588919 nova_compute[225855]: 2026-01-20 15:08:57.946 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000aa as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.101 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.102 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3937MB free_disk=20.831497192382812GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.102 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.102 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:08:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.388 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance a25af5a3-096f-4363-842e-d960c22eb16b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.389 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance b4c55640-85f9-4d75-a4df-6ee77b21ca73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.389 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.390 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:08:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:08:58.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.483 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.511 225859 DEBUG nova.compute.manager [req-ec3380e5-2c88-4197-b167-66b752108d0c req-1beb8c9b-b757-414e-9793-26b0fd2d09cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.512 225859 DEBUG nova.compute.manager [req-ec3380e5-2c88-4197-b167-66b752108d0c req-1beb8c9b-b757-414e-9793-26b0fd2d09cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing instance network info cache due to event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.512 225859 DEBUG oslo_concurrency.lockutils [req-ec3380e5-2c88-4197-b167-66b752108d0c req-1beb8c9b-b757-414e-9793-26b0fd2d09cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.512 225859 DEBUG oslo_concurrency.lockutils [req-ec3380e5-2c88-4197-b167-66b752108d0c req-1beb8c9b-b757-414e-9793-26b0fd2d09cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.512 225859 DEBUG nova.network.neutron [req-ec3380e5-2c88-4197-b167-66b752108d0c req-1beb8c9b-b757-414e-9793-26b0fd2d09cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.515 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.515 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.534 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.574 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:08:58 np0005588919 nova_compute[225855]: 2026-01-20 15:08:58.641 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:08:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:08:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:08:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:08:58.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:08:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:08:59 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/698210959' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:08:59 np0005588919 nova_compute[225855]: 2026-01-20 15:08:59.071 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:08:59 np0005588919 nova_compute[225855]: 2026-01-20 15:08:59.076 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:08:59 np0005588919 nova_compute[225855]: 2026-01-20 15:08:59.092 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:08:59 np0005588919 nova_compute[225855]: 2026-01-20 15:08:59.113 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:08:59 np0005588919 nova_compute[225855]: 2026-01-20 15:08:59.114 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:08:59 np0005588919 nova_compute[225855]: 2026-01-20 15:08:59.491 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:00 np0005588919 nova_compute[225855]: 2026-01-20 15:09:00.380 225859 DEBUG nova.network.neutron [req-ec3380e5-2c88-4197-b167-66b752108d0c req-1beb8c9b-b757-414e-9793-26b0fd2d09cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updated VIF entry in instance network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:09:00 np0005588919 nova_compute[225855]: 2026-01-20 15:09:00.381 225859 DEBUG nova.network.neutron [req-ec3380e5-2c88-4197-b167-66b752108d0c req-1beb8c9b-b757-414e-9793-26b0fd2d09cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating instance_info_cache with network_info: [{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:00 np0005588919 nova_compute[225855]: 2026-01-20 15:09:00.398 225859 DEBUG oslo_concurrency.lockutils [req-ec3380e5-2c88-4197-b167-66b752108d0c req-1beb8c9b-b757-414e-9793-26b0fd2d09cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:00.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:00.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:00 np0005588919 nova_compute[225855]: 2026-01-20 15:09:00.938 225859 DEBUG nova.compute.manager [req-434d3210-55cb-47db-8f9e-b2a322b7290f req-f877106e-1550-45aa-a162-785d1d8de159 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:00 np0005588919 nova_compute[225855]: 2026-01-20 15:09:00.939 225859 DEBUG nova.compute.manager [req-434d3210-55cb-47db-8f9e-b2a322b7290f req-f877106e-1550-45aa-a162-785d1d8de159 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing instance network info cache due to event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:09:00 np0005588919 nova_compute[225855]: 2026-01-20 15:09:00.939 225859 DEBUG oslo_concurrency.lockutils [req-434d3210-55cb-47db-8f9e-b2a322b7290f req-f877106e-1550-45aa-a162-785d1d8de159 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:09:00 np0005588919 nova_compute[225855]: 2026-01-20 15:09:00.939 225859 DEBUG oslo_concurrency.lockutils [req-434d3210-55cb-47db-8f9e-b2a322b7290f req-f877106e-1550-45aa-a162-785d1d8de159 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:00 np0005588919 nova_compute[225855]: 2026-01-20 15:09:00.939 225859 DEBUG nova.network.neutron [req-434d3210-55cb-47db-8f9e-b2a322b7290f req-f877106e-1550-45aa-a162-785d1d8de159 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:09:01 np0005588919 nova_compute[225855]: 2026-01-20 15:09:01.095 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:02 np0005588919 nova_compute[225855]: 2026-01-20 15:09:02.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:02.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:02 np0005588919 nova_compute[225855]: 2026-01-20 15:09:02.528 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:02.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:09:02 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:09:03 np0005588919 podman[295753]: 2026-01-20 15:09:03.042930806 +0000 UTC m=+0.087023910 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 10:09:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:03 np0005588919 nova_compute[225855]: 2026-01-20 15:09:03.732 225859 DEBUG nova.network.neutron [req-434d3210-55cb-47db-8f9e-b2a322b7290f req-f877106e-1550-45aa-a162-785d1d8de159 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updated VIF entry in instance network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:09:03 np0005588919 nova_compute[225855]: 2026-01-20 15:09:03.733 225859 DEBUG nova.network.neutron [req-434d3210-55cb-47db-8f9e-b2a322b7290f req-f877106e-1550-45aa-a162-785d1d8de159 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating instance_info_cache with network_info: [{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:03 np0005588919 nova_compute[225855]: 2026-01-20 15:09:03.776 225859 DEBUG oslo_concurrency.lockutils [req-434d3210-55cb-47db-8f9e-b2a322b7290f req-f877106e-1550-45aa-a162-785d1d8de159 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:09:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:09:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:09:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:04.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:04 np0005588919 nova_compute[225855]: 2026-01-20 15:09:04.495 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e379 e379: 3 total, 3 up, 3 in
Jan 20 10:09:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:04.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:05 np0005588919 nova_compute[225855]: 2026-01-20 15:09:05.149 225859 DEBUG oslo_concurrency.lockutils [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:05 np0005588919 nova_compute[225855]: 2026-01-20 15:09:05.149 225859 DEBUG oslo_concurrency.lockutils [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:05 np0005588919 nova_compute[225855]: 2026-01-20 15:09:05.150 225859 INFO nova.compute.manager [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Rebooting instance#033[00m
Jan 20 10:09:05 np0005588919 nova_compute[225855]: 2026-01-20 15:09:05.166 225859 DEBUG oslo_concurrency.lockutils [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:09:05 np0005588919 nova_compute[225855]: 2026-01-20 15:09:05.167 225859 DEBUG oslo_concurrency.lockutils [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquired lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:05 np0005588919 nova_compute[225855]: 2026-01-20 15:09:05.167 225859 DEBUG nova.network.neutron [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:09:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e380 e380: 3 total, 3 up, 3 in
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.305 225859 DEBUG nova.network.neutron [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating instance_info_cache with network_info: [{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.377 225859 DEBUG oslo_concurrency.lockutils [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Releasing lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.379 225859 DEBUG nova.compute.manager [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:06.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:06 np0005588919 kernel: tapc6bf5189-ce (unregistering): left promiscuous mode
Jan 20 10:09:06 np0005588919 NetworkManager[49104]: <info>  [1768921746.5747] device (tapc6bf5189-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:09:06 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:06Z|00744|binding|INFO|Releasing lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf from this chassis (sb_readonly=0)
Jan 20 10:09:06 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:06Z|00745|binding|INFO|Setting lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf down in Southbound
Jan 20 10:09:06 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:06Z|00746|binding|INFO|Removing iface tapc6bf5189-ce ovn-installed in OVS
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.592 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.597 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:85:09 10.100.0.3'], port_security=['fa:16:3e:7f:85:09 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b4c55640-85f9-4d75-a4df-6ee77b21ca73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e22d6ddc-0339-4395-bc21-95081825f05b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b43342be22543f79d4a56e26c6d0c96', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8514424f-703f-4374-a78e-584f6e7c233b e405f81b-5d97-4611-81c1-7315a012415b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c19619-1e7e-40ee-be83-c9dbc347543e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.599 140354 INFO neutron.agent.ovn.metadata.agent [-] Port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf in datapath e22d6ddc-0339-4395-bc21-95081825f05b unbound from our chassis#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.601 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e22d6ddc-0339-4395-bc21-95081825f05b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.603 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc27be8-f572-4373-ace5-6ea57c2a8f12]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.603 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b namespace which is not needed anymore#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.606 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588919 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000aa.scope: Deactivated successfully.
Jan 20 10:09:06 np0005588919 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000aa.scope: Consumed 14.398s CPU time.
Jan 20 10:09:06 np0005588919 systemd-machined[194361]: Machine qemu-87-instance-000000aa terminated.
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.726 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[295454]: [NOTICE]   (295458) : haproxy version is 2.8.14-c23fe91
Jan 20 10:09:06 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[295454]: [NOTICE]   (295458) : path to executable is /usr/sbin/haproxy
Jan 20 10:09:06 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[295454]: [WARNING]  (295458) : Exiting Master process...
Jan 20 10:09:06 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[295454]: [WARNING]  (295458) : Exiting Master process...
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.733 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[295454]: [ALERT]    (295458) : Current worker (295460) exited with code 143 (Terminated)
Jan 20 10:09:06 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[295454]: [WARNING]  (295458) : All workers exited. Exiting... (0)
Jan 20 10:09:06 np0005588919 systemd[1]: libpod-e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a.scope: Deactivated successfully.
Jan 20 10:09:06 np0005588919 podman[295804]: 2026-01-20 15:09:06.744094481 +0000 UTC m=+0.052975084 container died e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.744 225859 INFO nova.virt.libvirt.driver [-] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Instance destroyed successfully.#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.744 225859 DEBUG nova.objects.instance [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lazy-loading 'resources' on Instance uuid b4c55640-85f9-4d75-a4df-6ee77b21ca73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.764 225859 DEBUG nova.virt.libvirt.vif [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:08:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2033880413',display_name='tempest-TestMinimumBasicScenario-server-2033880413',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2033880413',id=170,image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAWye8KRiKbxhnt7Xo8hUChshePiRtRRGKKbmjGtRpAbQNhUcsWAOe/4okup4yaafm+06AxmRjgJ9R8sVLFUEsSHiOZRgv3dFKZL11GpIpeu6UGBzzNxvi+GaA/Guzx6LQ==',key_name='tempest-TestMinimumBasicScenario-1345705235',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:08:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b43342be22543f79d4a56e26c6d0c96',ramdisk_id='',reservation_id='r-srebood0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1665080150',owner_user_name='tempest-TestMinimumBasicScenario-1665080150-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:09:06Z,user_data=None,user_id='c98bd3f0904e48efa524d598bcad85e9',uuid=b4c55640-85f9-4d75-a4df-6ee77b21ca73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.764 225859 DEBUG nova.network.os_vif_util [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converting VIF {"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.765 225859 DEBUG nova.network.os_vif_util [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.765 225859 DEBUG os_vif [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.767 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.768 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6bf5189-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.769 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.770 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a-userdata-shm.mount: Deactivated successfully.
Jan 20 10:09:06 np0005588919 systemd[1]: var-lib-containers-storage-overlay-2117d9a12e609e61fa38da0417b2e4ee38bd7ca4efd01c6c08034b29e0c123da-merged.mount: Deactivated successfully.
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.779 225859 INFO os_vif [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce')#033[00m
Jan 20 10:09:06 np0005588919 podman[295804]: 2026-01-20 15:09:06.790014164 +0000 UTC m=+0.098894747 container cleanup e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.790 225859 DEBUG nova.virt.libvirt.driver [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Start _get_guest_xml network_info=[{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5b64c953-6df3-45a3-ae28-e419ba117bb2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '5b64c953-6df3-45a3-ae28-e419ba117bb2'}], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-1ae2be75-c922-4458-bd11-a97b4f6fdd2b', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '1ae2be75-c922-4458-bd11-a97b4f6fdd2b', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'b4c55640-85f9-4d75-a4df-6ee77b21ca73', 'attached_at': '', 'detached_at': '', 'volume_id': '1ae2be75-c922-4458-bd11-a97b4f6fdd2b', 'serial': '1ae2be75-c922-4458-bd11-a97b4f6fdd2b'}, 'guest_format': None, 'boot_index': None, 'mount_device': '/dev/vdb', 'attachment_id': '381cb790-ccf8-436f-94a0-eae0e8e507cc', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.793 225859 WARNING nova.virt.libvirt.driver [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.798 225859 DEBUG nova.virt.libvirt.host [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.799 225859 DEBUG nova.virt.libvirt.host [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:09:06 np0005588919 systemd[1]: libpod-conmon-e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a.scope: Deactivated successfully.
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.806 225859 DEBUG nova.virt.libvirt.host [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.807 225859 DEBUG nova.virt.libvirt.host [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.808 225859 DEBUG nova.virt.libvirt.driver [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.808 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5b64c953-6df3-45a3-ae28-e419ba117bb2,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.809 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.809 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.809 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.809 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.809 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.810 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.810 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.810 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.810 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.810 225859 DEBUG nova.virt.hardware [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.811 225859 DEBUG nova.objects.instance [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b4c55640-85f9-4d75-a4df-6ee77b21ca73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.844 225859 DEBUG oslo_concurrency.processutils [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:06 np0005588919 podman[295843]: 2026-01-20 15:09:06.85510454 +0000 UTC m=+0.044801582 container remove e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.861 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7a1f5161-1c8f-46c7-85be-aa260ebac476]: (4, ('Tue Jan 20 03:09:06 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b (e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a)\ne1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a\nTue Jan 20 03:09:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b (e1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a)\ne1b6e797dfad2f34daa91ed383fcf864e44f993bbf42f43b0de989cb9646429a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.862 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2e34013f-ed05-41cd-8e00-23162bc92362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.863 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape22d6ddc-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:06 np0005588919 kernel: tape22d6ddc-00: left promiscuous mode
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.872 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:06.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.882 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.885 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1ba16376-ca95-44df-bd02-07395494aff3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.902 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ce012454-3e6a-415c-a515-abe5f08b2b2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.904 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[774c73bb-b892-4cc8-920d-769a01aeccce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.922 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[776bbe1a-185d-45d2-956b-a97585e4ad16]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675952, 'reachable_time': 41606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295859, 'error': None, 'target': 'ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:06 np0005588919 systemd[1]: run-netns-ovnmeta\x2de22d6ddc\x2d0339\x2d4395\x2dbc21\x2d95081825f05b.mount: Deactivated successfully.
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.925 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:09:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:06.926 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[4a81559c-b374-425d-80bb-d00fd2780cf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.990 225859 DEBUG nova.compute.manager [req-8f9ba387-f77e-4319-ba31-0407456e926b req-e124c674-28ca-4c70-a260-092c3f6aa93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-unplugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.991 225859 DEBUG oslo_concurrency.lockutils [req-8f9ba387-f77e-4319-ba31-0407456e926b req-e124c674-28ca-4c70-a260-092c3f6aa93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.991 225859 DEBUG oslo_concurrency.lockutils [req-8f9ba387-f77e-4319-ba31-0407456e926b req-e124c674-28ca-4c70-a260-092c3f6aa93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.992 225859 DEBUG oslo_concurrency.lockutils [req-8f9ba387-f77e-4319-ba31-0407456e926b req-e124c674-28ca-4c70-a260-092c3f6aa93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.992 225859 DEBUG nova.compute.manager [req-8f9ba387-f77e-4319-ba31-0407456e926b req-e124c674-28ca-4c70-a260-092c3f6aa93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-unplugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:06 np0005588919 nova_compute[225855]: 2026-01-20 15:09:06.992 225859 WARNING nova.compute.manager [req-8f9ba387-f77e-4319-ba31-0407456e926b req-e124c674-28ca-4c70-a260-092c3f6aa93d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received unexpected event network-vif-unplugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 20 10:09:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:09:07 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1328764846' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.305 225859 DEBUG oslo_concurrency.processutils [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.344 225859 DEBUG oslo_concurrency.processutils [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.529 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:09:07 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4222916982' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.815 225859 DEBUG oslo_concurrency.processutils [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.844 225859 DEBUG nova.virt.libvirt.vif [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:08:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2033880413',display_name='tempest-TestMinimumBasicScenario-server-2033880413',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2033880413',id=170,image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAWye8KRiKbxhnt7Xo8hUChshePiRtRRGKKbmjGtRpAbQNhUcsWAOe/4okup4yaafm+06AxmRjgJ9R8sVLFUEsSHiOZRgv3dFKZL11GpIpeu6UGBzzNxvi+GaA/Guzx6LQ==',key_name='tempest-TestMinimumBasicScenario-1345705235',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:08:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b43342be22543f79d4a56e26c6d0c96',ramdisk_id='',reservation_id='r-srebood0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1665080150',owner_user_name='tempest-TestMinimumBasicScenario-1665080150-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:09:06Z,user_data=None,user_id='c98bd3f0904e48efa524d598bcad85e9',uuid=b4c55640-85f9-4d75-a4df-6ee77b21ca73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.845 225859 DEBUG nova.network.os_vif_util [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converting VIF {"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.845 225859 DEBUG nova.network.os_vif_util [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.846 225859 DEBUG nova.objects.instance [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lazy-loading 'pci_devices' on Instance uuid b4c55640-85f9-4d75-a4df-6ee77b21ca73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.859 225859 DEBUG nova.virt.libvirt.driver [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <uuid>b4c55640-85f9-4d75-a4df-6ee77b21ca73</uuid>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <name>instance-000000aa</name>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestMinimumBasicScenario-server-2033880413</nova:name>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:09:06</nova:creationTime>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <nova:user uuid="c98bd3f0904e48efa524d598bcad85e9">tempest-TestMinimumBasicScenario-1665080150-project-member</nova:user>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <nova:project uuid="5b43342be22543f79d4a56e26c6d0c96">tempest-TestMinimumBasicScenario-1665080150</nova:project>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="5b64c953-6df3-45a3-ae28-e419ba117bb2"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <nova:port uuid="c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <entry name="serial">b4c55640-85f9-4d75-a4df-6ee77b21ca73</entry>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <entry name="uuid">b4c55640-85f9-4d75-a4df-6ee77b21ca73</entry>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/b4c55640-85f9-4d75-a4df-6ee77b21ca73_disk.config">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-1ae2be75-c922-4458-bd11-a97b4f6fdd2b">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <target dev="vdb" bus="virtio"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <serial>1ae2be75-c922-4458-bd11-a97b4f6fdd2b</serial>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:7f:85:09"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <target dev="tapc6bf5189-ce"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73/console.log" append="off"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <input type="keyboard" bus="usb"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:09:07 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:09:07 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:09:07 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:09:07 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.860 225859 DEBUG nova.virt.libvirt.driver [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] skipping disk for instance-000000aa as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.861 225859 DEBUG nova.virt.libvirt.driver [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] skipping disk for instance-000000aa as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.861 225859 DEBUG nova.virt.libvirt.driver [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] skipping disk for instance-000000aa as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.862 225859 DEBUG nova.virt.libvirt.vif [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:08:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2033880413',display_name='tempest-TestMinimumBasicScenario-server-2033880413',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2033880413',id=170,image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAWye8KRiKbxhnt7Xo8hUChshePiRtRRGKKbmjGtRpAbQNhUcsWAOe/4okup4yaafm+06AxmRjgJ9R8sVLFUEsSHiOZRgv3dFKZL11GpIpeu6UGBzzNxvi+GaA/Guzx6LQ==',key_name='tempest-TestMinimumBasicScenario-1345705235',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:08:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='5b43342be22543f79d4a56e26c6d0c96',ramdisk_id='',reservation_id='r-srebood0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1665080150',owner_user_name='tempest-TestMinimumBasicScenario-1665080150-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:09:06Z,user_data=None,user_id='c98bd3f0904e48efa524d598bcad85e9',uuid=b4c55640-85f9-4d75-a4df-6ee77b21ca73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.862 225859 DEBUG nova.network.os_vif_util [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converting VIF {"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.863 225859 DEBUG nova.network.os_vif_util [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.863 225859 DEBUG os_vif [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.864 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.864 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.865 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.868 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.868 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6bf5189-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.869 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6bf5189-ce, col_values=(('external_ids', {'iface-id': 'c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:85:09', 'vm-uuid': 'b4c55640-85f9-4d75-a4df-6ee77b21ca73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.870 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588919 NetworkManager[49104]: <info>  [1768921747.8712] manager: (tapc6bf5189-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.872 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.874 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.875 225859 INFO os_vif [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce')#033[00m
Jan 20 10:09:07 np0005588919 kernel: tapc6bf5189-ce: entered promiscuous mode
Jan 20 10:09:07 np0005588919 NetworkManager[49104]: <info>  [1768921747.9380] manager: (tapc6bf5189-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/314)
Jan 20 10:09:07 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:07Z|00747|binding|INFO|Claiming lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for this chassis.
Jan 20 10:09:07 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:07Z|00748|binding|INFO|c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf: Claiming fa:16:3e:7f:85:09 10.100.0.3
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.938 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588919 systemd-udevd[295783]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:09:07 np0005588919 NetworkManager[49104]: <info>  [1768921747.9498] device (tapc6bf5189-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:09:07 np0005588919 NetworkManager[49104]: <info>  [1768921747.9511] device (tapc6bf5189-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:09:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:07.952 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:85:09 10.100.0.3'], port_security=['fa:16:3e:7f:85:09 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b4c55640-85f9-4d75-a4df-6ee77b21ca73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e22d6ddc-0339-4395-bc21-95081825f05b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b43342be22543f79d4a56e26c6d0c96', 'neutron:revision_number': '6', 'neutron:security_group_ids': '8514424f-703f-4374-a78e-584f6e7c233b e405f81b-5d97-4611-81c1-7315a012415b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c19619-1e7e-40ee-be83-c9dbc347543e, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:07.954 140354 INFO neutron.agent.ovn.metadata.agent [-] Port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf in datapath e22d6ddc-0339-4395-bc21-95081825f05b bound to our chassis#033[00m
Jan 20 10:09:07 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:07Z|00749|binding|INFO|Setting lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf ovn-installed in OVS
Jan 20 10:09:07 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:07Z|00750|binding|INFO|Setting lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf up in Southbound
Jan 20 10:09:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:07.956 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e22d6ddc-0339-4395-bc21-95081825f05b#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.957 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588919 nova_compute[225855]: 2026-01-20 15:09:07.960 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:07.967 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[49f3d1fe-9508-475c-b9b7-00ffbae96b29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:07.968 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape22d6ddc-01 in ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:09:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:07.969 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape22d6ddc-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:09:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:07.969 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e2607557-fb72-4d71-900e-8bc9f9018ca4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:07 np0005588919 systemd-machined[194361]: New machine qemu-88-instance-000000aa.
Jan 20 10:09:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:07.970 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[54d93927-f86a-43c1-829c-2cfd1db85208]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:07 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:07.980 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[05978d9c-98c3-4f89-96c1-697230198f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:07 np0005588919 systemd[1]: Started Virtual Machine qemu-88-instance-000000aa.
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.001 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7dcc843b-afde-4e0f-a73f-5c860feefaf0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.030 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8d0fac90-dfdf-4f42-b40c-1899ece237de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 NetworkManager[49104]: <info>  [1768921748.0400] manager: (tape22d6ddc-00): new Veth device (/org/freedesktop/NetworkManager/Devices/315)
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.039 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7e2adc23-08ee-4865-9c55-1198c146ad78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.076 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[53c8e9d1-1682-4e32-815f-6aa7585fd50c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.079 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ffdac37e-c7c2-44eb-9000-b005fa83c85f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 NetworkManager[49104]: <info>  [1768921748.1007] device (tape22d6ddc-00): carrier: link connected
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.106 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[0b1a122c-c306-41d9-90ac-ab4f87acbaa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.122 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9ba8eb-9d11-43c3-9f68-fdbf82b12cea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape22d6ddc-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:3f:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678702, 'reachable_time': 15877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295968, 'error': None, 'target': 'ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.138 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e637428b-c5f6-4b5c-9298-58de1b6073ff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9e:3f5c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 678702, 'tstamp': 678702}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295969, 'error': None, 'target': 'ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.158 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[348ec13a-958f-4ea5-abb4-f41d630e573b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape22d6ddc-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:3f:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678702, 'reachable_time': 15877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 295970, 'error': None, 'target': 'ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.186 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[857e585e-0f9f-4854-9e15-3694e3c6d0c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.244 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6a65875f-20cd-49a0-8c3b-315dc0ecc5a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.245 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape22d6ddc-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.245 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.245 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape22d6ddc-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:08 np0005588919 NetworkManager[49104]: <info>  [1768921748.2480] manager: (tape22d6ddc-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 20 10:09:08 np0005588919 kernel: tape22d6ddc-00: entered promiscuous mode
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.249 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.251 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape22d6ddc-00, col_values=(('external_ids', {'iface-id': '940a1442-b0ab-49a2-87e8-750659cdda8d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.252 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:08Z|00751|binding|INFO|Releasing lport 940a1442-b0ab-49a2-87e8-750659cdda8d from this chassis (sb_readonly=0)
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.267 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.269 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e22d6ddc-0339-4395-bc21-95081825f05b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e22d6ddc-0339-4395-bc21-95081825f05b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.270 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[03c002bd-4a81-4a2e-a577-f7eff7ac4346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.271 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-e22d6ddc-0339-4395-bc21-95081825f05b
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/e22d6ddc-0339-4395-bc21-95081825f05b.pid.haproxy
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID e22d6ddc-0339-4395-bc21-95081825f05b
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:09:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:08.271 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b', 'env', 'PROCESS_TAG=haproxy-e22d6ddc-0339-4395-bc21-95081825f05b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e22d6ddc-0339-4395-bc21-95081825f05b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:09:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:08.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.559 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for b4c55640-85f9-4d75-a4df-6ee77b21ca73 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.561 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921748.5593107, b4c55640-85f9-4d75-a4df-6ee77b21ca73 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.561 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.563 225859 DEBUG nova.compute.manager [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.566 225859 INFO nova.virt.libvirt.driver [-] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Instance rebooted successfully.#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.566 225859 DEBUG nova.compute.manager [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.585 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.588 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.606 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.607 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921748.5600595, b4c55640-85f9-4d75-a4df-6ee77b21ca73 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.607 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] VM Started (Lifecycle Event)#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.612 225859 DEBUG oslo_concurrency.lockutils [None req-67061584-2fa7-4fe7-a0eb-ebb74d618309 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 3.463s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.624 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:08 np0005588919 nova_compute[225855]: 2026-01-20 15:09:08.627 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:09:08 np0005588919 podman[296062]: 2026-01-20 15:09:08.640546559 +0000 UTC m=+0.066831067 container create 5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:09:08 np0005588919 podman[296062]: 2026-01-20 15:09:08.596266783 +0000 UTC m=+0.022551311 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:09:08 np0005588919 systemd[1]: Started libpod-conmon-5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe.scope.
Jan 20 10:09:08 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:09:08 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54aa6a90ad0d5f14e5fd3fa3051693ab8a0283f2ca8d04c2beba4ab7db8a43eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:09:08 np0005588919 podman[296062]: 2026-01-20 15:09:08.75475293 +0000 UTC m=+0.181037468 container init 5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 10:09:08 np0005588919 podman[296062]: 2026-01-20 15:09:08.760772931 +0000 UTC m=+0.187057439 container start 5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 20 10:09:08 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[296077]: [NOTICE]   (296081) : New worker (296083) forked
Jan 20 10:09:08 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[296077]: [NOTICE]   (296081) : Loading success.
Jan 20 10:09:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:08.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.081 225859 DEBUG nova.compute.manager [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.081 225859 DEBUG oslo_concurrency.lockutils [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.083 225859 DEBUG oslo_concurrency.lockutils [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.084 225859 DEBUG oslo_concurrency.lockutils [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.084 225859 DEBUG nova.compute.manager [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.084 225859 WARNING nova.compute.manager [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received unexpected event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with vm_state active and task_state None.#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.085 225859 DEBUG nova.compute.manager [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.085 225859 DEBUG oslo_concurrency.lockutils [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.085 225859 DEBUG oslo_concurrency.lockutils [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.086 225859 DEBUG oslo_concurrency.lockutils [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.086 225859 DEBUG nova.compute.manager [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:09 np0005588919 nova_compute[225855]: 2026-01-20 15:09:09.086 225859 WARNING nova.compute.manager [req-3ed92bf2-7d06-4c1e-9650-c6f92520b8cd req-43575bf7-281c-42d5-98d9-d51d5656f8c7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received unexpected event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with vm_state active and task_state None.#033[00m
Jan 20 10:09:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:10.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:10.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:11 np0005588919 nova_compute[225855]: 2026-01-20 15:09:11.196 225859 DEBUG nova.compute.manager [req-b11fa76b-94ff-4357-8541-b1e664e64da0 req-f129f790-59ca-4771-9c48-edb270971680 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:11 np0005588919 nova_compute[225855]: 2026-01-20 15:09:11.197 225859 DEBUG oslo_concurrency.lockutils [req-b11fa76b-94ff-4357-8541-b1e664e64da0 req-f129f790-59ca-4771-9c48-edb270971680 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:11 np0005588919 nova_compute[225855]: 2026-01-20 15:09:11.197 225859 DEBUG oslo_concurrency.lockutils [req-b11fa76b-94ff-4357-8541-b1e664e64da0 req-f129f790-59ca-4771-9c48-edb270971680 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:11 np0005588919 nova_compute[225855]: 2026-01-20 15:09:11.197 225859 DEBUG oslo_concurrency.lockutils [req-b11fa76b-94ff-4357-8541-b1e664e64da0 req-f129f790-59ca-4771-9c48-edb270971680 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:11 np0005588919 nova_compute[225855]: 2026-01-20 15:09:11.197 225859 DEBUG nova.compute.manager [req-b11fa76b-94ff-4357-8541-b1e664e64da0 req-f129f790-59ca-4771-9c48-edb270971680 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:11 np0005588919 nova_compute[225855]: 2026-01-20 15:09:11.197 225859 WARNING nova.compute.manager [req-b11fa76b-94ff-4357-8541-b1e664e64da0 req-f129f790-59ca-4771-9c48-edb270971680 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received unexpected event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with vm_state active and task_state None.#033[00m
Jan 20 10:09:11 np0005588919 nova_compute[225855]: 2026-01-20 15:09:11.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:11 np0005588919 nova_compute[225855]: 2026-01-20 15:09:11.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:09:11 np0005588919 nova_compute[225855]: 2026-01-20 15:09:11.380 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:09:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e381 e381: 3 total, 3 up, 3 in
Jan 20 10:09:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:09:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:09:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:12.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:12 np0005588919 nova_compute[225855]: 2026-01-20 15:09:12.532 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:12 np0005588919 nova_compute[225855]: 2026-01-20 15:09:12.870 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:12.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:13Z|00752|binding|INFO|Releasing lport b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3 from this chassis (sb_readonly=0)
Jan 20 10:09:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:13Z|00753|binding|INFO|Releasing lport 940a1442-b0ab-49a2-87e8-750659cdda8d from this chassis (sb_readonly=0)
Jan 20 10:09:13 np0005588919 nova_compute[225855]: 2026-01-20 15:09:13.114 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:13 np0005588919 nova_compute[225855]: 2026-01-20 15:09:13.376 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:09:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/668410972' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:09:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:09:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/668410972' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:09:14 np0005588919 nova_compute[225855]: 2026-01-20 15:09:14.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:14 np0005588919 nova_compute[225855]: 2026-01-20 15:09:14.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:09:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:14.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:14.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:16.428 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:16.429 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:16.429 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:16.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e382 e382: 3 total, 3 up, 3 in
Jan 20 10:09:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:16.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:17 np0005588919 nova_compute[225855]: 2026-01-20 15:09:17.534 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:17 np0005588919 nova_compute[225855]: 2026-01-20 15:09:17.873 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:18 np0005588919 podman[296148]: 2026-01-20 15:09:18.041673642 +0000 UTC m=+0.076599945 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:09:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:18.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:18.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:19Z|00754|binding|INFO|Releasing lport b19d6956-fc8d-42c8-af98-d0f2fe9fe3a3 from this chassis (sb_readonly=0)
Jan 20 10:09:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:19Z|00755|binding|INFO|Releasing lport 940a1442-b0ab-49a2-87e8-750659cdda8d from this chassis (sb_readonly=0)
Jan 20 10:09:19 np0005588919 nova_compute[225855]: 2026-01-20 15:09:19.385 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:20.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:20.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:21.385 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:21 np0005588919 nova_compute[225855]: 2026-01-20 15:09:21.385 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:21.387 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:09:21 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:21Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:85:09 10.100.0.3
Jan 20 10:09:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:22.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:22 np0005588919 nova_compute[225855]: 2026-01-20 15:09:22.536 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:22 np0005588919 nova_compute[225855]: 2026-01-20 15:09:22.875 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:22.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:24.389 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:24.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:24.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:26.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:26.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.538 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.878 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.918 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.951 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Triggering sync for uuid a25af5a3-096f-4363-842e-d960c22eb16b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.951 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Triggering sync for uuid b4c55640-85f9-4d75-a4df-6ee77b21ca73 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.951 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.952 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "a25af5a3-096f-4363-842e-d960c22eb16b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.952 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.952 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.986 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:27 np0005588919 nova_compute[225855]: 2026-01-20 15:09:27.987 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "a25af5a3-096f-4363-842e-d960c22eb16b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:28.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:28 np0005588919 nova_compute[225855]: 2026-01-20 15:09:28.622 225859 DEBUG nova.compute.manager [req-df18a53a-0389-4506-b7d6-2153b1ed8c8e req-0d9f85fd-1a20-4876-a6f6-6366d9e9dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:28 np0005588919 nova_compute[225855]: 2026-01-20 15:09:28.623 225859 DEBUG nova.compute.manager [req-df18a53a-0389-4506-b7d6-2153b1ed8c8e req-0d9f85fd-1a20-4876-a6f6-6366d9e9dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing instance network info cache due to event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:09:28 np0005588919 nova_compute[225855]: 2026-01-20 15:09:28.624 225859 DEBUG oslo_concurrency.lockutils [req-df18a53a-0389-4506-b7d6-2153b1ed8c8e req-0d9f85fd-1a20-4876-a6f6-6366d9e9dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:09:28 np0005588919 nova_compute[225855]: 2026-01-20 15:09:28.624 225859 DEBUG oslo_concurrency.lockutils [req-df18a53a-0389-4506-b7d6-2153b1ed8c8e req-0d9f85fd-1a20-4876-a6f6-6366d9e9dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:28 np0005588919 nova_compute[225855]: 2026-01-20 15:09:28.624 225859 DEBUG nova.network.neutron [req-df18a53a-0389-4506-b7d6-2153b1ed8c8e req-0d9f85fd-1a20-4876-a6f6-6366d9e9dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:09:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:28.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:30 np0005588919 nova_compute[225855]: 2026-01-20 15:09:30.368 225859 DEBUG nova.network.neutron [req-df18a53a-0389-4506-b7d6-2153b1ed8c8e req-0d9f85fd-1a20-4876-a6f6-6366d9e9dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updated VIF entry in instance network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:09:30 np0005588919 nova_compute[225855]: 2026-01-20 15:09:30.368 225859 DEBUG nova.network.neutron [req-df18a53a-0389-4506-b7d6-2153b1ed8c8e req-0d9f85fd-1a20-4876-a6f6-6366d9e9dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating instance_info_cache with network_info: [{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:30 np0005588919 nova_compute[225855]: 2026-01-20 15:09:30.384 225859 DEBUG oslo_concurrency.lockutils [req-df18a53a-0389-4506-b7d6-2153b1ed8c8e req-0d9f85fd-1a20-4876-a6f6-6366d9e9dc27 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:30.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:09:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:30.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:09:31 np0005588919 nova_compute[225855]: 2026-01-20 15:09:31.561 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:32 np0005588919 nova_compute[225855]: 2026-01-20 15:09:32.072 225859 DEBUG nova.compute.manager [req-574ed8e3-7abb-4edd-be36-a9488ad56efa req-b02e5cbb-f78b-448c-9ac0-657b6fe98137 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:32 np0005588919 nova_compute[225855]: 2026-01-20 15:09:32.073 225859 DEBUG nova.compute.manager [req-574ed8e3-7abb-4edd-be36-a9488ad56efa req-b02e5cbb-f78b-448c-9ac0-657b6fe98137 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing instance network info cache due to event network-changed-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:09:32 np0005588919 nova_compute[225855]: 2026-01-20 15:09:32.073 225859 DEBUG oslo_concurrency.lockutils [req-574ed8e3-7abb-4edd-be36-a9488ad56efa req-b02e5cbb-f78b-448c-9ac0-657b6fe98137 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:09:32 np0005588919 nova_compute[225855]: 2026-01-20 15:09:32.073 225859 DEBUG oslo_concurrency.lockutils [req-574ed8e3-7abb-4edd-be36-a9488ad56efa req-b02e5cbb-f78b-448c-9ac0-657b6fe98137 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:09:32 np0005588919 nova_compute[225855]: 2026-01-20 15:09:32.073 225859 DEBUG nova.network.neutron [req-574ed8e3-7abb-4edd-be36-a9488ad56efa req-b02e5cbb-f78b-448c-9ac0-657b6fe98137 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Refreshing network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:09:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:32.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:32 np0005588919 nova_compute[225855]: 2026-01-20 15:09:32.542 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:32 np0005588919 nova_compute[225855]: 2026-01-20 15:09:32.880 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:32.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:33 np0005588919 nova_compute[225855]: 2026-01-20 15:09:33.312 225859 DEBUG nova.network.neutron [req-574ed8e3-7abb-4edd-be36-a9488ad56efa req-b02e5cbb-f78b-448c-9ac0-657b6fe98137 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updated VIF entry in instance network info cache for port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:09:33 np0005588919 nova_compute[225855]: 2026-01-20 15:09:33.312 225859 DEBUG nova.network.neutron [req-574ed8e3-7abb-4edd-be36-a9488ad56efa req-b02e5cbb-f78b-448c-9ac0-657b6fe98137 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating instance_info_cache with network_info: [{"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:33 np0005588919 nova_compute[225855]: 2026-01-20 15:09:33.331 225859 DEBUG oslo_concurrency.lockutils [req-574ed8e3-7abb-4edd-be36-a9488ad56efa req-b02e5cbb-f78b-448c-9ac0-657b6fe98137 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b4c55640-85f9-4d75-a4df-6ee77b21ca73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:09:33 np0005588919 nova_compute[225855]: 2026-01-20 15:09:33.822 225859 DEBUG oslo_concurrency.lockutils [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:33 np0005588919 nova_compute[225855]: 2026-01-20 15:09:33.823 225859 DEBUG oslo_concurrency.lockutils [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:33 np0005588919 nova_compute[225855]: 2026-01-20 15:09:33.837 225859 INFO nova.compute.manager [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Detaching volume 1ae2be75-c922-4458-bd11-a97b4f6fdd2b#033[00m
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.063 225859 INFO nova.virt.block_device [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Attempting to driver detach volume 1ae2be75-c922-4458-bd11-a97b4f6fdd2b from mountpoint /dev/vdb#033[00m
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.074 225859 DEBUG nova.virt.libvirt.driver [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Attempting to detach device vdb from instance b4c55640-85f9-4d75-a4df-6ee77b21ca73 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.075 225859 DEBUG nova.virt.libvirt.guest [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-1ae2be75-c922-4458-bd11-a97b4f6fdd2b">
Jan 20 10:09:34 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  <serial>1ae2be75-c922-4458-bd11-a97b4f6fdd2b</serial>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:09:34 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.084 225859 INFO nova.virt.libvirt.driver [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Successfully detached device vdb from instance b4c55640-85f9-4d75-a4df-6ee77b21ca73 from the persistent domain config.#033[00m
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.084 225859 DEBUG nova.virt.libvirt.driver [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance b4c55640-85f9-4d75-a4df-6ee77b21ca73 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.085 225859 DEBUG nova.virt.libvirt.guest [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-1ae2be75-c922-4458-bd11-a97b4f6fdd2b">
Jan 20 10:09:34 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  <serial>1ae2be75-c922-4458-bd11-a97b4f6fdd2b</serial>
Jan 20 10:09:34 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 20 10:09:34 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:09:34 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.145 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768921774.1445324, b4c55640-85f9-4d75-a4df-6ee77b21ca73 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.147 225859 DEBUG nova.virt.libvirt.driver [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance b4c55640-85f9-4d75-a4df-6ee77b21ca73 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.149 225859 INFO nova.virt.libvirt.driver [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Successfully detached device vdb from instance b4c55640-85f9-4d75-a4df-6ee77b21ca73 from the live domain config.#033[00m
Jan 20 10:09:34 np0005588919 podman[296226]: 2026-01-20 15:09:34.222065387 +0000 UTC m=+0.132439069 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.396 225859 DEBUG nova.objects.instance [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lazy-loading 'flavor' on Instance uuid b4c55640-85f9-4d75-a4df-6ee77b21ca73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:34 np0005588919 nova_compute[225855]: 2026-01-20 15:09:34.432 225859 DEBUG oslo_concurrency.lockutils [None req-80abe54b-873c-4099-951b-e62569f7f748 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:34.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:34.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:09:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2893274847' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:09:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:09:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2893274847' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:09:35 np0005588919 nova_compute[225855]: 2026-01-20 15:09:35.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:36.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:36.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.103 225859 DEBUG oslo_concurrency.lockutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.104 225859 DEBUG oslo_concurrency.lockutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.104 225859 DEBUG oslo_concurrency.lockutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.104 225859 DEBUG oslo_concurrency.lockutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.104 225859 DEBUG oslo_concurrency.lockutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.105 225859 INFO nova.compute.manager [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Terminating instance#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.106 225859 DEBUG nova.compute.manager [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:09:37 np0005588919 kernel: tapc6bf5189-ce (unregistering): left promiscuous mode
Jan 20 10:09:37 np0005588919 NetworkManager[49104]: <info>  [1768921777.1550] device (tapc6bf5189-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.164 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00756|binding|INFO|Releasing lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf from this chassis (sb_readonly=0)
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00757|binding|INFO|Setting lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf down in Southbound
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00758|binding|INFO|Removing iface tapc6bf5189-ce ovn-installed in OVS
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.169 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.173 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:85:09 10.100.0.3'], port_security=['fa:16:3e:7f:85:09 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b4c55640-85f9-4d75-a4df-6ee77b21ca73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e22d6ddc-0339-4395-bc21-95081825f05b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b43342be22543f79d4a56e26c6d0c96', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8514424f-703f-4374-a78e-584f6e7c233b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c19619-1e7e-40ee-be83-c9dbc347543e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.174 140354 INFO neutron.agent.ovn.metadata.agent [-] Port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf in datapath e22d6ddc-0339-4395-bc21-95081825f05b unbound from our chassis#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.175 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e22d6ddc-0339-4395-bc21-95081825f05b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.177 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[37b6548c-324d-458e-842c-17315a49ebbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.177 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b namespace which is not needed anymore#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.182 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000aa.scope: Deactivated successfully.
Jan 20 10:09:37 np0005588919 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000aa.scope: Consumed 14.398s CPU time.
Jan 20 10:09:37 np0005588919 systemd-machined[194361]: Machine qemu-88-instance-000000aa terminated.
Jan 20 10:09:37 np0005588919 kernel: tapc6bf5189-ce: entered promiscuous mode
Jan 20 10:09:37 np0005588919 kernel: tapc6bf5189-ce (unregistering): left promiscuous mode
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.330 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00759|binding|INFO|Claiming lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for this chassis.
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00760|binding|INFO|c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf: Claiming fa:16:3e:7f:85:09 10.100.0.3
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.337 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:85:09 10.100.0.3'], port_security=['fa:16:3e:7f:85:09 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b4c55640-85f9-4d75-a4df-6ee77b21ca73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e22d6ddc-0339-4395-bc21-95081825f05b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b43342be22543f79d4a56e26c6d0c96', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8514424f-703f-4374-a78e-584f6e7c233b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c19619-1e7e-40ee-be83-c9dbc347543e, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.346 225859 INFO nova.virt.libvirt.driver [-] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Instance destroyed successfully.#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.347 225859 DEBUG nova.objects.instance [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lazy-loading 'resources' on Instance uuid b4c55640-85f9-4d75-a4df-6ee77b21ca73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:09:37 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[296077]: [NOTICE]   (296081) : haproxy version is 2.8.14-c23fe91
Jan 20 10:09:37 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[296077]: [NOTICE]   (296081) : path to executable is /usr/sbin/haproxy
Jan 20 10:09:37 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[296077]: [WARNING]  (296081) : Exiting Master process...
Jan 20 10:09:37 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[296077]: [WARNING]  (296081) : Exiting Master process...
Jan 20 10:09:37 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[296077]: [ALERT]    (296081) : Current worker (296083) exited with code 143 (Terminated)
Jan 20 10:09:37 np0005588919 neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b[296077]: [WARNING]  (296081) : All workers exited. Exiting... (0)
Jan 20 10:09:37 np0005588919 systemd[1]: libpod-5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe.scope: Deactivated successfully.
Jan 20 10:09:37 np0005588919 conmon[296077]: conmon 5934c01e984de0e040eb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe.scope/container/memory.events
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00761|binding|INFO|Setting lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf ovn-installed in OVS
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00762|binding|INFO|Setting lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf up in Southbound
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.352 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00763|binding|INFO|Releasing lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf from this chassis (sb_readonly=1)
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00764|if_status|INFO|Dropped 2 log messages in last 939 seconds (most recently, 939 seconds ago) due to excessive rate
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00765|if_status|INFO|Not setting lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf down as sb is readonly
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00766|binding|INFO|Removing iface tapc6bf5189-ce ovn-installed in OVS
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.355 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00767|binding|INFO|Releasing lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf from this chassis (sb_readonly=0)
Jan 20 10:09:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:09:37Z|00768|binding|INFO|Setting lport c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf down in Southbound
Jan 20 10:09:37 np0005588919 podman[296278]: 2026-01-20 15:09:37.360759223 +0000 UTC m=+0.084196330 container died 5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.364 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:85:09 10.100.0.3'], port_security=['fa:16:3e:7f:85:09 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b4c55640-85f9-4d75-a4df-6ee77b21ca73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e22d6ddc-0339-4395-bc21-95081825f05b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b43342be22543f79d4a56e26c6d0c96', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8514424f-703f-4374-a78e-584f6e7c233b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c19619-1e7e-40ee-be83-c9dbc347543e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.364 225859 DEBUG nova.virt.libvirt.vif [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:08:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2033880413',display_name='tempest-TestMinimumBasicScenario-server-2033880413',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-2033880413',id=170,image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAWye8KRiKbxhnt7Xo8hUChshePiRtRRGKKbmjGtRpAbQNhUcsWAOe/4okup4yaafm+06AxmRjgJ9R8sVLFUEsSHiOZRgv3dFKZL11GpIpeu6UGBzzNxvi+GaA/Guzx6LQ==',key_name='tempest-TestMinimumBasicScenario-1345705235',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:08:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b43342be22543f79d4a56e26c6d0c96',ramdisk_id='',reservation_id='r-srebood0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5b64c953-6df3-45a3-ae28-e419ba117bb2',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1665080150',owner_user_name='tempest-TestMinimumBasicScenario-1665080150-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:09:08Z,user_data=None,user_id='c98bd3f0904e48efa524d598bcad85e9',uuid=b4c55640-85f9-4d75-a4df-6ee77b21ca73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.365 225859 DEBUG nova.network.os_vif_util [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converting VIF {"id": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "address": "fa:16:3e:7f:85:09", "network": {"id": "e22d6ddc-0339-4395-bc21-95081825f05b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1496899124-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b43342be22543f79d4a56e26c6d0c96", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6bf5189-ce", "ovs_interfaceid": "c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.365 225859 DEBUG nova.network.os_vif_util [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.366 225859 DEBUG os_vif [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.368 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.369 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6bf5189-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.370 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.370 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.371 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.374 225859 INFO os_vif [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:85:09,bridge_name='br-int',has_traffic_filtering=True,id=c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf,network=Network(e22d6ddc-0339-4395-bc21-95081825f05b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6bf5189-ce')#033[00m
Jan 20 10:09:37 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe-userdata-shm.mount: Deactivated successfully.
Jan 20 10:09:37 np0005588919 systemd[1]: var-lib-containers-storage-overlay-54aa6a90ad0d5f14e5fd3fa3051693ab8a0283f2ca8d04c2beba4ab7db8a43eb-merged.mount: Deactivated successfully.
Jan 20 10:09:37 np0005588919 podman[296278]: 2026-01-20 15:09:37.40118176 +0000 UTC m=+0.124618857 container cleanup 5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:09:37 np0005588919 systemd[1]: libpod-conmon-5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe.scope: Deactivated successfully.
Jan 20 10:09:37 np0005588919 podman[296329]: 2026-01-20 15:09:37.468476209 +0000 UTC m=+0.045210594 container remove 5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.474 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2b253d-0bf1-4546-9fa6-37144c251062]: (4, ('Tue Jan 20 03:09:37 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b (5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe)\n5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe\nTue Jan 20 03:09:37 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b (5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe)\n5934c01e984de0e040eb3c06a49fe9ae7a05253acf5b998f13a7c3ee6661ebfe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.476 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[730e7517-c85d-48a4-90fe-ba6c112a1d71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.477 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape22d6ddc-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.479 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 kernel: tape22d6ddc-00: left promiscuous mode
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.492 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.495 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[27cd1494-5c25-4285-a86c-de57cedc592d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.516 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[42f1f476-d145-4e7c-bf1d-9e331725d93f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.517 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[90c156a7-1b2b-4ade-9dc5-01ca1af1b967]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.532 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b25b9493-4d34-47a1-ad3e-b564bb65cfd1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678695, 'reachable_time': 23313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296347, 'error': None, 'target': 'ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:37 np0005588919 systemd[1]: run-netns-ovnmeta\x2de22d6ddc\x2d0339\x2d4395\x2dbc21\x2d95081825f05b.mount: Deactivated successfully.
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.537 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e22d6ddc-0339-4395-bc21-95081825f05b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.537 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[6d47169f-f777-4c85-b0e8-7f1719e3935d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.538 140354 INFO neutron.agent.ovn.metadata.agent [-] Port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf in datapath e22d6ddc-0339-4395-bc21-95081825f05b unbound from our chassis#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.539 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e22d6ddc-0339-4395-bc21-95081825f05b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.540 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6037a122-024f-407b-b112-c9121db53780]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.540 140354 INFO neutron.agent.ovn.metadata.agent [-] Port c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf in datapath e22d6ddc-0339-4395-bc21-95081825f05b unbound from our chassis#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.541 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e22d6ddc-0339-4395-bc21-95081825f05b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:09:37 np0005588919 nova_compute[225855]: 2026-01-20 15:09:37.542 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:09:37.542 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[91c722a9-b93e-4453-b255-8d2a5a62d547]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:09:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:38 np0005588919 nova_compute[225855]: 2026-01-20 15:09:38.490 225859 DEBUG nova.compute.manager [req-e72be83e-f1cf-4b25-8e0c-cc16bab90eb5 req-b631f6fa-04b0-48fe-8899-f009e5c6a38a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-unplugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:38 np0005588919 nova_compute[225855]: 2026-01-20 15:09:38.491 225859 DEBUG oslo_concurrency.lockutils [req-e72be83e-f1cf-4b25-8e0c-cc16bab90eb5 req-b631f6fa-04b0-48fe-8899-f009e5c6a38a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:38 np0005588919 nova_compute[225855]: 2026-01-20 15:09:38.491 225859 DEBUG oslo_concurrency.lockutils [req-e72be83e-f1cf-4b25-8e0c-cc16bab90eb5 req-b631f6fa-04b0-48fe-8899-f009e5c6a38a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:38 np0005588919 nova_compute[225855]: 2026-01-20 15:09:38.491 225859 DEBUG oslo_concurrency.lockutils [req-e72be83e-f1cf-4b25-8e0c-cc16bab90eb5 req-b631f6fa-04b0-48fe-8899-f009e5c6a38a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:38 np0005588919 nova_compute[225855]: 2026-01-20 15:09:38.491 225859 DEBUG nova.compute.manager [req-e72be83e-f1cf-4b25-8e0c-cc16bab90eb5 req-b631f6fa-04b0-48fe-8899-f009e5c6a38a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-unplugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:38 np0005588919 nova_compute[225855]: 2026-01-20 15:09:38.492 225859 DEBUG nova.compute.manager [req-e72be83e-f1cf-4b25-8e0c-cc16bab90eb5 req-b631f6fa-04b0-48fe-8899-f009e5c6a38a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-unplugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:09:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:38.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:38.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:39 np0005588919 nova_compute[225855]: 2026-01-20 15:09:39.989 225859 INFO nova.virt.libvirt.driver [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Deleting instance files /var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73_del#033[00m
Jan 20 10:09:39 np0005588919 nova_compute[225855]: 2026-01-20 15:09:39.990 225859 INFO nova.virt.libvirt.driver [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Deletion of /var/lib/nova/instances/b4c55640-85f9-4d75-a4df-6ee77b21ca73_del complete#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.066 225859 INFO nova.compute.manager [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Took 2.96 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.067 225859 DEBUG oslo.service.loopingcall [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.067 225859 DEBUG nova.compute.manager [-] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.067 225859 DEBUG nova.network.neutron [-] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:09:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:40.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.679 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.680 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.680 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.680 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.680 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.681 225859 WARNING nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received unexpected event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.681 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.681 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.681 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.681 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.682 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.682 225859 WARNING nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received unexpected event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.682 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.682 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.682 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.683 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.683 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.683 225859 WARNING nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received unexpected event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.683 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-unplugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.684 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.684 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.684 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.684 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-unplugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.684 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-unplugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.685 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.685 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.685 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.685 225859 DEBUG oslo_concurrency.lockutils [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.686 225859 DEBUG nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] No waiting events found dispatching network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:09:40 np0005588919 nova_compute[225855]: 2026-01-20 15:09:40.686 225859 WARNING nova.compute.manager [req-80e5f13a-04a8-4165-b9af-bcab7b57f19e req-6f7f5122-7ebd-44ae-afa4-a6fa28d58ae1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received unexpected event network-vif-plugged-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:09:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:40.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:41 np0005588919 nova_compute[225855]: 2026-01-20 15:09:41.229 225859 DEBUG nova.network.neutron [-] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:09:41 np0005588919 nova_compute[225855]: 2026-01-20 15:09:41.250 225859 INFO nova.compute.manager [-] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Took 1.18 seconds to deallocate network for instance.#033[00m
Jan 20 10:09:41 np0005588919 nova_compute[225855]: 2026-01-20 15:09:41.309 225859 DEBUG oslo_concurrency.lockutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:41 np0005588919 nova_compute[225855]: 2026-01-20 15:09:41.310 225859 DEBUG oslo_concurrency.lockutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:41 np0005588919 nova_compute[225855]: 2026-01-20 15:09:41.342 225859 DEBUG nova.compute.manager [req-a578728c-527b-4bbd-a0e9-f2d202c54ad7 req-76634f58-7992-4a16-a902-63531ffc805b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Received event network-vif-deleted-c6bf5189-ce8b-4ff1-9eba-e3d4195a6cbf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:09:41 np0005588919 nova_compute[225855]: 2026-01-20 15:09:41.407 225859 DEBUG oslo_concurrency.processutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:09:41 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/230950978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:09:41 np0005588919 nova_compute[225855]: 2026-01-20 15:09:41.843 225859 DEBUG oslo_concurrency.processutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:41 np0005588919 nova_compute[225855]: 2026-01-20 15:09:41.850 225859 DEBUG nova.compute.provider_tree [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:09:42 np0005588919 nova_compute[225855]: 2026-01-20 15:09:42.019 225859 DEBUG nova.scheduler.client.report [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:09:42 np0005588919 nova_compute[225855]: 2026-01-20 15:09:42.050 225859 DEBUG oslo_concurrency.lockutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.740s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:42 np0005588919 nova_compute[225855]: 2026-01-20 15:09:42.072 225859 INFO nova.scheduler.client.report [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Deleted allocations for instance b4c55640-85f9-4d75-a4df-6ee77b21ca73#033[00m
Jan 20 10:09:42 np0005588919 nova_compute[225855]: 2026-01-20 15:09:42.117 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:42 np0005588919 nova_compute[225855]: 2026-01-20 15:09:42.129 225859 DEBUG oslo_concurrency.lockutils [None req-7e715cb4-5d2d-4385-ad06-420ccbc20186 c98bd3f0904e48efa524d598bcad85e9 5b43342be22543f79d4a56e26c6d0c96 - - default default] Lock "b4c55640-85f9-4d75-a4df-6ee77b21ca73" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.025s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:42 np0005588919 nova_compute[225855]: 2026-01-20 15:09:42.394 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:42.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:42 np0005588919 nova_compute[225855]: 2026-01-20 15:09:42.544 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:42.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e383 e383: 3 total, 3 up, 3 in
Jan 20 10:09:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:44.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:44.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e384 e384: 3 total, 3 up, 3 in
Jan 20 10:09:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:46.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:46.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:47 np0005588919 nova_compute[225855]: 2026-01-20 15:09:47.420 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:47 np0005588919 nova_compute[225855]: 2026-01-20 15:09:47.546 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:48 np0005588919 nova_compute[225855]: 2026-01-20 15:09:48.230 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:48.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:48.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:49 np0005588919 podman[296427]: 2026-01-20 15:09:49.013817131 +0000 UTC m=+0.057518813 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:09:50 np0005588919 nova_compute[225855]: 2026-01-20 15:09:50.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:50 np0005588919 nova_compute[225855]: 2026-01-20 15:09:50.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:09:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:50.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:50.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:51 np0005588919 nova_compute[225855]: 2026-01-20 15:09:51.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 e385: 3 total, 3 up, 3 in
Jan 20 10:09:52 np0005588919 nova_compute[225855]: 2026-01-20 15:09:52.344 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921777.3435755, b4c55640-85f9-4d75-a4df-6ee77b21ca73 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:09:52 np0005588919 nova_compute[225855]: 2026-01-20 15:09:52.345 225859 INFO nova.compute.manager [-] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:09:52 np0005588919 nova_compute[225855]: 2026-01-20 15:09:52.379 225859 DEBUG nova.compute.manager [None req-7749b0a5-add8-4d7b-a121-ccd929715148 - - - - - -] [instance: b4c55640-85f9-4d75-a4df-6ee77b21ca73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:09:52 np0005588919 nova_compute[225855]: 2026-01-20 15:09:52.456 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:52.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:52 np0005588919 nova_compute[225855]: 2026-01-20 15:09:52.549 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:52.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:54 np0005588919 nova_compute[225855]: 2026-01-20 15:09:54.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:54 np0005588919 nova_compute[225855]: 2026-01-20 15:09:54.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:09:54 np0005588919 nova_compute[225855]: 2026-01-20 15:09:54.369 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:09:54 np0005588919 nova_compute[225855]: 2026-01-20 15:09:54.370 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:54 np0005588919 nova_compute[225855]: 2026-01-20 15:09:54.371 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:54 np0005588919 nova_compute[225855]: 2026-01-20 15:09:54.371 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:09:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:54.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:09:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:54.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:56.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:56.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:57 np0005588919 nova_compute[225855]: 2026-01-20 15:09:57.458 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:57 np0005588919 nova_compute[225855]: 2026-01-20 15:09:57.551 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:09:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:09:58 np0005588919 nova_compute[225855]: 2026-01-20 15:09:58.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:09:58 np0005588919 nova_compute[225855]: 2026-01-20 15:09:58.373 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:58 np0005588919 nova_compute[225855]: 2026-01-20 15:09:58.373 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:58 np0005588919 nova_compute[225855]: 2026-01-20 15:09:58.374 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:09:58 np0005588919 nova_compute[225855]: 2026-01-20 15:09:58.374 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:09:58 np0005588919 nova_compute[225855]: 2026-01-20 15:09:58.374 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:09:58.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:09:58 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/984211214' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:09:58 np0005588919 nova_compute[225855]: 2026-01-20 15:09:58.822 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:58 np0005588919 nova_compute[225855]: 2026-01-20 15:09:58.917 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:58 np0005588919 nova_compute[225855]: 2026-01-20 15:09:58.918 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:58 np0005588919 nova_compute[225855]: 2026-01-20 15:09:58.918 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000a8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:09:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:09:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:09:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:09:58.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.073 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.074 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4119MB free_disk=20.80986785888672GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.075 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.075 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.165 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance a25af5a3-096f-4363-842e-d960c22eb16b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.165 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.165 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.211 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:09:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:09:59 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3843724838' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.630 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.637 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.660 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.686 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:09:59 np0005588919 nova_compute[225855]: 2026-01-20 15:09:59.686 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:00.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 10:10:00 np0005588919 nova_compute[225855]: 2026-01-20 15:10:00.682 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:00.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:02 np0005588919 nova_compute[225855]: 2026-01-20 15:10:02.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:02 np0005588919 nova_compute[225855]: 2026-01-20 15:10:02.461 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:02.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:02 np0005588919 nova_compute[225855]: 2026-01-20 15:10:02.552 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:02.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:03 np0005588919 nova_compute[225855]: 2026-01-20 15:10:03.376 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "9e852872-788c-4dac-b7fb-d76d67e7a84f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:03 np0005588919 nova_compute[225855]: 2026-01-20 15:10:03.377 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:03 np0005588919 nova_compute[225855]: 2026-01-20 15:10:03.411 225859 DEBUG nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:10:03 np0005588919 nova_compute[225855]: 2026-01-20 15:10:03.480 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:03 np0005588919 nova_compute[225855]: 2026-01-20 15:10:03.481 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:03 np0005588919 nova_compute[225855]: 2026-01-20 15:10:03.487 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:10:03 np0005588919 nova_compute[225855]: 2026-01-20 15:10:03.488 225859 INFO nova.compute.claims [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:10:03 np0005588919 nova_compute[225855]: 2026-01-20 15:10:03.606 225859 DEBUG oslo_concurrency.processutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1931767600' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.052 225859 DEBUG oslo_concurrency.processutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.057 225859 DEBUG nova.compute.provider_tree [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.092 225859 DEBUG nova.scheduler.client.report [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.119 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.120 225859 DEBUG nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.165 225859 DEBUG nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.165 225859 DEBUG nova.network.neutron [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.185 225859 INFO nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.202 225859 DEBUG nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.250 225859 INFO nova.virt.block_device [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Booting with volume ccb7c984-4606-40ef-8fcd-a902f5382dee at /dev/vda#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.412 225859 DEBUG os_brick.utils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.414 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.425 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.425 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[97258e65-4038-4c1f-9d26-b80d1c2d80ef]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.427 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.437 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.437 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e00cf0-0f75-4654-992b-1edaa2e1bdf8]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.439 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.447 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.448 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc4c4aa-e00c-46af-bd34-9f310a8b507f]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.449 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[52288b68-d62a-4b67-8b5a-be86a9f1b957]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.449 225859 DEBUG oslo_concurrency.processutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.481 225859 DEBUG oslo_concurrency.processutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "nvme version" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.483 225859 DEBUG os_brick.initiator.connectors.lightos [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.484 225859 DEBUG os_brick.initiator.connectors.lightos [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.484 225859 DEBUG os_brick.initiator.connectors.lightos [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.484 225859 DEBUG os_brick.utils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] <== get_connector_properties: return (71ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:10:04 np0005588919 nova_compute[225855]: 2026-01-20 15:10:04.485 225859 DEBUG nova.virt.block_device [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Updating existing volume attachment record: f4780d3e-d038-4466-9222-3d7730703f45 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:10:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:04.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:04.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:05 np0005588919 podman[296578]: 2026-01-20 15:10:05.040205215 +0000 UTC m=+0.088782570 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.154 225859 DEBUG nova.policy [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf422e55e158420cbdae75f07a3bb97a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a49638950e1543fa8e0d251af5479623', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.722 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:05.722 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:10:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:05.723 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.727 225859 DEBUG nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.728 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.728 225859 INFO nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Creating image(s)#033[00m
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.729 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.729 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Ensure instance console log exists: /var/lib/nova/instances/9e852872-788c-4dac-b7fb-d76d67e7a84f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.729 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.730 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.730 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:05 np0005588919 nova_compute[225855]: 2026-01-20 15:10:05.842 225859 DEBUG nova.network.neutron [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Successfully created port: f9f19cf7-87f5-4dd4-a7be-78086c84e176 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:10:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:06.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:06.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:07 np0005588919 nova_compute[225855]: 2026-01-20 15:10:07.112 225859 DEBUG nova.network.neutron [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Successfully updated port: f9f19cf7-87f5-4dd4-a7be-78086c84e176 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:10:07 np0005588919 nova_compute[225855]: 2026-01-20 15:10:07.124 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "refresh_cache-9e852872-788c-4dac-b7fb-d76d67e7a84f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:10:07 np0005588919 nova_compute[225855]: 2026-01-20 15:10:07.125 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquired lock "refresh_cache-9e852872-788c-4dac-b7fb-d76d67e7a84f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:10:07 np0005588919 nova_compute[225855]: 2026-01-20 15:10:07.125 225859 DEBUG nova.network.neutron [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:10:07 np0005588919 nova_compute[225855]: 2026-01-20 15:10:07.219 225859 DEBUG nova.compute.manager [req-9aea7350-b3f2-4184-b469-1f5945fa5981 req-aebb9bb6-8011-4901-b9c6-b66e7b6fb655 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Received event network-changed-f9f19cf7-87f5-4dd4-a7be-78086c84e176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:07 np0005588919 nova_compute[225855]: 2026-01-20 15:10:07.219 225859 DEBUG nova.compute.manager [req-9aea7350-b3f2-4184-b469-1f5945fa5981 req-aebb9bb6-8011-4901-b9c6-b66e7b6fb655 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Refreshing instance network info cache due to event network-changed-f9f19cf7-87f5-4dd4-a7be-78086c84e176. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:10:07 np0005588919 nova_compute[225855]: 2026-01-20 15:10:07.219 225859 DEBUG oslo_concurrency.lockutils [req-9aea7350-b3f2-4184-b469-1f5945fa5981 req-aebb9bb6-8011-4901-b9c6-b66e7b6fb655 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-9e852872-788c-4dac-b7fb-d76d67e7a84f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:10:07 np0005588919 nova_compute[225855]: 2026-01-20 15:10:07.464 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:07 np0005588919 nova_compute[225855]: 2026-01-20 15:10:07.554 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:08 np0005588919 nova_compute[225855]: 2026-01-20 15:10:08.116 225859 DEBUG nova.network.neutron [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:10:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:08.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:08.725 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:08.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:09 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #53. Immutable memtables: 9.
Jan 20 10:10:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:10.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:10.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.379 225859 DEBUG nova.network.neutron [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Updating instance_info_cache with network_info: [{"id": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "address": "fa:16:3e:19:e2:73", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9f19cf7-87", "ovs_interfaceid": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.467 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Releasing lock "refresh_cache-9e852872-788c-4dac-b7fb-d76d67e7a84f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.468 225859 DEBUG nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Instance network_info: |[{"id": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "address": "fa:16:3e:19:e2:73", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9f19cf7-87", "ovs_interfaceid": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.468 225859 DEBUG oslo_concurrency.lockutils [req-9aea7350-b3f2-4184-b469-1f5945fa5981 req-aebb9bb6-8011-4901-b9c6-b66e7b6fb655 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-9e852872-788c-4dac-b7fb-d76d67e7a84f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.469 225859 DEBUG nova.network.neutron [req-9aea7350-b3f2-4184-b469-1f5945fa5981 req-aebb9bb6-8011-4901-b9c6-b66e7b6fb655 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Refreshing network info cache for port f9f19cf7-87f5-4dd4-a7be-78086c84e176 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.473 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Start _get_guest_xml network_info=[{"id": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "address": "fa:16:3e:19:e2:73", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9f19cf7-87", "ovs_interfaceid": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-ccb7c984-4606-40ef-8fcd-a902f5382dee', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'ccb7c984-4606-40ef-8fcd-a902f5382dee', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': '9e852872-788c-4dac-b7fb-d76d67e7a84f', 'attached_at': '', 'detached_at': '', 'volume_id': 'ccb7c984-4606-40ef-8fcd-a902f5382dee', 'serial': 'ccb7c984-4606-40ef-8fcd-a902f5382dee'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': 'f4780d3e-d038-4466-9222-3d7730703f45', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.480 225859 WARNING nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.486 225859 DEBUG nova.virt.libvirt.host [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.487 225859 DEBUG nova.virt.libvirt.host [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.496 225859 DEBUG nova.virt.libvirt.host [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.497 225859 DEBUG nova.virt.libvirt.host [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.498 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.498 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.499 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.499 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.499 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.499 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.499 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.500 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.500 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.500 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.500 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.501 225859 DEBUG nova.virt.hardware [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.532 225859 DEBUG nova.storage.rbd_utils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 9e852872-788c-4dac-b7fb-d76d67e7a84f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:10:11 np0005588919 nova_compute[225855]: 2026-01-20 15:10:11.536 225859 DEBUG oslo_concurrency.processutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:10:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1681359816' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.012 225859 DEBUG oslo_concurrency.processutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.163 225859 DEBUG os_brick.encryptors [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Using volume encryption metadata '{'encryption_key_id': '349091f0-57cd-4e7a-b935-410f986b1500', 'control_location': 'front-end', 'cipher': 'aes-xts-plain64', 'key_size': 256, 'provider': 'luks'}' for connection: {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-ccb7c984-4606-40ef-8fcd-a902f5382dee', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'ccb7c984-4606-40ef-8fcd-a902f5382dee', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': '9e852872-788c-4dac-b7fb-d76d67e7a84f', 'attached_at': '', 'detached_at': '', 'volume_id': 'ccb7c984-4606-40ef-8fcd-a902f5382dee', 'serial': '} get_encryption_metadata /usr/lib/python3.9/site-packages/os_brick/encryptors/__init__.py:135#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.166 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Creating Client object Client /usr/lib/python3.9/site-packages/barbicanclient/client.py:163#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.189 225859 DEBUG barbicanclient.v1.secrets [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Getting secret - Secret href: https://barbican-internal.openstack.svc:9311/secrets/349091f0-57cd-4e7a-b935-410f986b1500 get /usr/lib/python3.9/site-packages/barbicanclient/v1/secrets.py:514#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.190 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.228 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.228 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.255 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.255 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.280 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.280 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.300 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.301 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.325 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.325 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.379 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.379 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.423 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.423 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.443 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.444 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.467 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.468 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.494 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.497 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.497 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.530 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.530 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:12.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.552 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.552 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.558 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.572 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.572 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 10:10:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 10:10:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:10:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.595 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.596 225859 INFO barbicanclient.base [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Calculated Secrets uuid ref: secrets/349091f0-57cd-4e7a-b935-410f986b1500#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.615 225859 DEBUG barbicanclient.client [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.616 225859 DEBUG nova.virt.libvirt.host [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Secret XML: <secret ephemeral="no" private="no">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <usage type="volume">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <volume>ccb7c984-4606-40ef-8fcd-a902f5382dee</volume>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  </usage>
Jan 20 10:10:12 np0005588919 nova_compute[225855]: </secret>
Jan 20 10:10:12 np0005588919 nova_compute[225855]: create_secret /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1131#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.644 225859 DEBUG nova.virt.libvirt.vif [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:10:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1030995833',display_name='tempest-TestVolumeBootPattern-server-1030995833',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1030995833',id=173,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-wer4a7li',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:10:04Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=9e852872-788c-4dac-b7fb-d76d67e7a84f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "address": "fa:16:3e:19:e2:73", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9f19cf7-87", "ovs_interfaceid": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.645 225859 DEBUG nova.network.os_vif_util [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "address": "fa:16:3e:19:e2:73", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9f19cf7-87", "ovs_interfaceid": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.646 225859 DEBUG nova.network.os_vif_util [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:e2:73,bridge_name='br-int',has_traffic_filtering=True,id=f9f19cf7-87f5-4dd4-a7be-78086c84e176,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9f19cf7-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.648 225859 DEBUG nova.objects.instance [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9e852872-788c-4dac-b7fb-d76d67e7a84f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.664 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <uuid>9e852872-788c-4dac-b7fb-d76d67e7a84f</uuid>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <name>instance-000000ad</name>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestVolumeBootPattern-server-1030995833</nova:name>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:10:11</nova:creationTime>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <nova:user uuid="bf422e55e158420cbdae75f07a3bb97a">tempest-TestVolumeBootPattern-194644003-project-member</nova:user>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <nova:project uuid="a49638950e1543fa8e0d251af5479623">tempest-TestVolumeBootPattern-194644003</nova:project>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <nova:port uuid="f9f19cf7-87f5-4dd4-a7be-78086c84e176">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <entry name="serial">9e852872-788c-4dac-b7fb-d76d67e7a84f</entry>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <entry name="uuid">9e852872-788c-4dac-b7fb-d76d67e7a84f</entry>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/9e852872-788c-4dac-b7fb-d76d67e7a84f_disk.config">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-ccb7c984-4606-40ef-8fcd-a902f5382dee">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <serial>ccb7c984-4606-40ef-8fcd-a902f5382dee</serial>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <encryption format="luks">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:        <secret type="passphrase" uuid="4b9c7f6b-c79e-4c34-82d7-409b96a12d36"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      </encryption>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:19:e2:73"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <target dev="tapf9f19cf7-87"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/9e852872-788c-4dac-b7fb-d76d67e7a84f/console.log" append="off"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:10:12 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:10:12 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:10:12 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:10:12 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.665 225859 DEBUG nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Preparing to wait for external event network-vif-plugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.666 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.666 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.667 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.668 225859 DEBUG nova.virt.libvirt.vif [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:10:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1030995833',display_name='tempest-TestVolumeBootPattern-server-1030995833',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1030995833',id=173,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-wer4a7li',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:10:04Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=9e852872-788c-4dac-b7fb-d76d67e7a84f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "address": "fa:16:3e:19:e2:73", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9f19cf7-87", "ovs_interfaceid": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.668 225859 DEBUG nova.network.os_vif_util [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "address": "fa:16:3e:19:e2:73", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9f19cf7-87", "ovs_interfaceid": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.669 225859 DEBUG nova.network.os_vif_util [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:e2:73,bridge_name='br-int',has_traffic_filtering=True,id=f9f19cf7-87f5-4dd4-a7be-78086c84e176,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9f19cf7-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.670 225859 DEBUG os_vif [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e2:73,bridge_name='br-int',has_traffic_filtering=True,id=f9f19cf7-87f5-4dd4-a7be-78086c84e176,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9f19cf7-87') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.671 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.672 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.673 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.677 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.677 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9f19cf7-87, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.678 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf9f19cf7-87, col_values=(('external_ids', {'iface-id': 'f9f19cf7-87f5-4dd4-a7be-78086c84e176', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:e2:73', 'vm-uuid': '9e852872-788c-4dac-b7fb-d76d67e7a84f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.679 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:12 np0005588919 NetworkManager[49104]: <info>  [1768921812.6807] manager: (tapf9f19cf7-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.682 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.687 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.689 225859 INFO os_vif [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e2:73,bridge_name='br-int',has_traffic_filtering=True,id=f9f19cf7-87f5-4dd4-a7be-78086c84e176,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9f19cf7-87')#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.748 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.749 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.749 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No VIF found with MAC fa:16:3e:19:e2:73, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.750 225859 INFO nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Using config drive#033[00m
Jan 20 10:10:12 np0005588919 nova_compute[225855]: 2026-01-20 15:10:12.774 225859 DEBUG nova.storage.rbd_utils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 9e852872-788c-4dac-b7fb-d76d67e7a84f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:10:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:12.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:13 np0005588919 nova_compute[225855]: 2026-01-20 15:10:13.305 225859 INFO nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Creating config drive at /var/lib/nova/instances/9e852872-788c-4dac-b7fb-d76d67e7a84f/disk.config#033[00m
Jan 20 10:10:13 np0005588919 nova_compute[225855]: 2026-01-20 15:10:13.310 225859 DEBUG oslo_concurrency.processutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9e852872-788c-4dac-b7fb-d76d67e7a84f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd1zh_4aw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:13 np0005588919 nova_compute[225855]: 2026-01-20 15:10:13.442 225859 DEBUG oslo_concurrency.processutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9e852872-788c-4dac-b7fb-d76d67e7a84f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd1zh_4aw" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:13 np0005588919 nova_compute[225855]: 2026-01-20 15:10:13.468 225859 DEBUG nova.storage.rbd_utils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 9e852872-788c-4dac-b7fb-d76d67e7a84f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:10:13 np0005588919 nova_compute[225855]: 2026-01-20 15:10:13.472 225859 DEBUG oslo_concurrency.processutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9e852872-788c-4dac-b7fb-d76d67e7a84f/disk.config 9e852872-788c-4dac-b7fb-d76d67e7a84f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:13 np0005588919 nova_compute[225855]: 2026-01-20 15:10:13.613 225859 DEBUG oslo_concurrency.processutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9e852872-788c-4dac-b7fb-d76d67e7a84f/disk.config 9e852872-788c-4dac-b7fb-d76d67e7a84f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:13 np0005588919 nova_compute[225855]: 2026-01-20 15:10:13.614 225859 INFO nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Deleting local config drive /var/lib/nova/instances/9e852872-788c-4dac-b7fb-d76d67e7a84f/disk.config because it was imported into RBD.#033[00m
Jan 20 10:10:13 np0005588919 kernel: tapf9f19cf7-87: entered promiscuous mode
Jan 20 10:10:13 np0005588919 NetworkManager[49104]: <info>  [1768921813.6601] manager: (tapf9f19cf7-87): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Jan 20 10:10:13 np0005588919 systemd-udevd[296968]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:10:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:13Z|00769|binding|INFO|Claiming lport f9f19cf7-87f5-4dd4-a7be-78086c84e176 for this chassis.
Jan 20 10:10:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:13Z|00770|binding|INFO|f9f19cf7-87f5-4dd4-a7be-78086c84e176: Claiming fa:16:3e:19:e2:73 10.100.0.12
Jan 20 10:10:13 np0005588919 nova_compute[225855]: 2026-01-20 15:10:13.699 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:13 np0005588919 NetworkManager[49104]: <info>  [1768921813.7115] device (tapf9f19cf7-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:10:13 np0005588919 NetworkManager[49104]: <info>  [1768921813.7137] device (tapf9f19cf7-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:10:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:13Z|00771|binding|INFO|Setting lport f9f19cf7-87f5-4dd4-a7be-78086c84e176 ovn-installed in OVS
Jan 20 10:10:13 np0005588919 nova_compute[225855]: 2026-01-20 15:10:13.715 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:13 np0005588919 systemd-machined[194361]: New machine qemu-89-instance-000000ad.
Jan 20 10:10:13 np0005588919 systemd[1]: Started Virtual Machine qemu-89-instance-000000ad.
Jan 20 10:10:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:13Z|00772|binding|INFO|Setting lport f9f19cf7-87f5-4dd4-a7be-78086c84e176 up in Southbound
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.749 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:e2:73 10.100.0.12'], port_security=['fa:16:3e:19:e2:73 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9e852872-788c-4dac-b7fb-d76d67e7a84f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a49638950e1543fa8e0d251af5479623', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64f52c7d-befd-4095-889b-e7a5da6821d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ec1139-009f-49fe-bfde-07c0ef9e8b12, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f9f19cf7-87f5-4dd4-a7be-78086c84e176) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.750 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f9f19cf7-87f5-4dd4-a7be-78086c84e176 in datapath b677f1a9-dbaa-4373-8466-bd9ccf067b91 bound to our chassis#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.751 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b677f1a9-dbaa-4373-8466-bd9ccf067b91#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.764 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb3b639-50d8-4ee4-aac7-e74e76438a9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.764 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb677f1a9-d1 in ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.766 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb677f1a9-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.766 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fd589ad8-cee1-49a0-baea-0348e6efe766]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.767 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea31feb-6c2f-4a8f-b73f-4d54411572ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.783 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[648403fd-7ac4-4ebc-9eb4-90c1b81b25e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.794 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2b75b9b5-fff9-4dd2-aa00-c396a8e4738e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.822 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[a67a248e-9d3a-49f1-ab46-53e62a9655f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.828 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c00eeb1f-77d2-4fc4-abb8-0d9996e52977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 NetworkManager[49104]: <info>  [1768921813.8292] manager: (tapb677f1a9-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/319)
Jan 20 10:10:13 np0005588919 systemd-udevd[296972]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.864 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[1cac5e26-a394-4a77-8793-96a3cd91a4f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.867 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a84ada-a5ca-4b32-be0c-92f6a00201d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 NetworkManager[49104]: <info>  [1768921813.8981] device (tapb677f1a9-d0): carrier: link connected
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.905 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f6062976-0f69-4e7f-a00f-d848803994de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.923 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b60ecf1b-dd25-455e-aaf5-e5507d4d1c76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb677f1a9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:c8:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685282, 'reachable_time': 24586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297004, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.947 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0513ea1e-5a64-494e-b028-44432b484b4f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:c834'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 685282, 'tstamp': 685282}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297005, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.964 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[71cf3823-7806-46bb-8f4e-1ca9f3ef3f73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb677f1a9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:c8:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685282, 'reachable_time': 24586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 297006, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:13.998 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7fe80284-8492-43e6-b90f-bb6468743db9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:14.054 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c59d8ce4-9a0c-4d2f-92f4-e787c3673c07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:14.055 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb677f1a9-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:14.056 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:14.056 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb677f1a9-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.058 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:14 np0005588919 NetworkManager[49104]: <info>  [1768921814.0589] manager: (tapb677f1a9-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Jan 20 10:10:14 np0005588919 kernel: tapb677f1a9-d0: entered promiscuous mode
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.061 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:14.062 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb677f1a9-d0, col_values=(('external_ids', {'iface-id': '1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:14 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:14Z|00773|binding|INFO|Releasing lport 1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65 from this chassis (sb_readonly=0)
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:14.092 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:14.092 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b9ad93-b59f-456d-b3f4-229244a24080]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:14.093 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-b677f1a9-dbaa-4373-8466-bd9ccf067b91
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID b677f1a9-dbaa-4373-8466-bd9ccf067b91
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:10:14 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:14.094 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'env', 'PROCESS_TAG=haproxy-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b677f1a9-dbaa-4373-8466-bd9ccf067b91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.427 225859 DEBUG nova.network.neutron [req-9aea7350-b3f2-4184-b469-1f5945fa5981 req-aebb9bb6-8011-4901-b9c6-b66e7b6fb655 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Updated VIF entry in instance network info cache for port f9f19cf7-87f5-4dd4-a7be-78086c84e176. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.428 225859 DEBUG nova.network.neutron [req-9aea7350-b3f2-4184-b469-1f5945fa5981 req-aebb9bb6-8011-4901-b9c6-b66e7b6fb655 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Updating instance_info_cache with network_info: [{"id": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "address": "fa:16:3e:19:e2:73", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9f19cf7-87", "ovs_interfaceid": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:10:14 np0005588919 podman[297038]: 2026-01-20 15:10:14.456306393 +0000 UTC m=+0.048616560 container create e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.463 225859 DEBUG oslo_concurrency.lockutils [req-9aea7350-b3f2-4184-b469-1f5945fa5981 req-aebb9bb6-8011-4901-b9c6-b66e7b6fb655 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-9e852872-788c-4dac-b7fb-d76d67e7a84f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:10:14 np0005588919 systemd[1]: Started libpod-conmon-e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44.scope.
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.523 225859 DEBUG nova.compute.manager [req-e86389a5-6022-4d4d-a058-d05b998b41e1 req-a43351fe-e447-49a3-9f7c-37752c1dd16e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Received event network-vif-plugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.523 225859 DEBUG oslo_concurrency.lockutils [req-e86389a5-6022-4d4d-a058-d05b998b41e1 req-a43351fe-e447-49a3-9f7c-37752c1dd16e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.523 225859 DEBUG oslo_concurrency.lockutils [req-e86389a5-6022-4d4d-a058-d05b998b41e1 req-a43351fe-e447-49a3-9f7c-37752c1dd16e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.524 225859 DEBUG oslo_concurrency.lockutils [req-e86389a5-6022-4d4d-a058-d05b998b41e1 req-a43351fe-e447-49a3-9f7c-37752c1dd16e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:14 np0005588919 nova_compute[225855]: 2026-01-20 15:10:14.524 225859 DEBUG nova.compute.manager [req-e86389a5-6022-4d4d-a058-d05b998b41e1 req-a43351fe-e447-49a3-9f7c-37752c1dd16e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Processing event network-vif-plugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:10:14 np0005588919 podman[297038]: 2026-01-20 15:10:14.42977661 +0000 UTC m=+0.022086797 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:10:14 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:10:14 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5075690fd2baf5ce67a4e697a92358c36b0bceb39f08aa2bba2b013ccbee9916/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:10:14 np0005588919 podman[297038]: 2026-01-20 15:10:14.544356341 +0000 UTC m=+0.136666528 container init e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 10:10:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:14.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:14 np0005588919 podman[297038]: 2026-01-20 15:10:14.551485904 +0000 UTC m=+0.143796071 container start e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:10:14 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297086]: [NOTICE]   (297093) : New worker (297095) forked
Jan 20 10:10:14 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297086]: [NOTICE]   (297093) : Loading success.
Jan 20 10:10:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:14.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:16.429 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:16.430 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:16.431 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:16.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.616 225859 DEBUG nova.compute.manager [req-b4be0a5d-d2d4-463a-882b-65187d28574a req-35613700-c0f8-4356-903c-5d673b4011a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Received event network-vif-plugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.616 225859 DEBUG oslo_concurrency.lockutils [req-b4be0a5d-d2d4-463a-882b-65187d28574a req-35613700-c0f8-4356-903c-5d673b4011a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.617 225859 DEBUG oslo_concurrency.lockutils [req-b4be0a5d-d2d4-463a-882b-65187d28574a req-35613700-c0f8-4356-903c-5d673b4011a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.617 225859 DEBUG oslo_concurrency.lockutils [req-b4be0a5d-d2d4-463a-882b-65187d28574a req-35613700-c0f8-4356-903c-5d673b4011a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.617 225859 DEBUG nova.compute.manager [req-b4be0a5d-d2d4-463a-882b-65187d28574a req-35613700-c0f8-4356-903c-5d673b4011a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] No waiting events found dispatching network-vif-plugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.617 225859 WARNING nova.compute.manager [req-b4be0a5d-d2d4-463a-882b-65187d28574a req-35613700-c0f8-4356-903c-5d673b4011a1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Received unexpected event network-vif-plugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 for instance with vm_state building and task_state spawning.#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.839 225859 DEBUG nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.840 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921816.8385212, 9e852872-788c-4dac-b7fb-d76d67e7a84f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.840 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] VM Started (Lifecycle Event)#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.844 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.848 225859 INFO nova.virt.libvirt.driver [-] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Instance spawned successfully.#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.848 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.864 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.870 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.873 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.874 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.874 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.874 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.875 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.875 225859 DEBUG nova.virt.libvirt.driver [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.898 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.898 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921816.8387506, 9e852872-788c-4dac-b7fb-d76d67e7a84f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.898 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.930 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.933 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921816.8429008, 9e852872-788c-4dac-b7fb-d76d67e7a84f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.933 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:10:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:16.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.954 225859 INFO nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Took 11.23 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.955 225859 DEBUG nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.962 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:16 np0005588919 nova_compute[225855]: 2026-01-20 15:10:16.965 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:10:17 np0005588919 nova_compute[225855]: 2026-01-20 15:10:17.015 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:10:17 np0005588919 nova_compute[225855]: 2026-01-20 15:10:17.041 225859 INFO nova.compute.manager [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Took 13.59 seconds to build instance.#033[00m
Jan 20 10:10:17 np0005588919 nova_compute[225855]: 2026-01-20 15:10:17.057 225859 DEBUG oslo_concurrency.lockutils [None req-05c30a27-224b-486c-a6a3-1a1f679cb609 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:17 np0005588919 nova_compute[225855]: 2026-01-20 15:10:17.560 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:17 np0005588919 nova_compute[225855]: 2026-01-20 15:10:17.680 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #124. Immutable memtables: 0.
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.699501) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 77] Flushing memtable with next log file: 124
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817699596, "job": 77, "event": "flush_started", "num_memtables": 1, "num_entries": 1924, "num_deletes": 255, "total_data_size": 4164916, "memory_usage": 4222112, "flush_reason": "Manual Compaction"}
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 77] Level-0 flush table #125: started
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817729569, "cf_name": "default", "job": 77, "event": "table_file_creation", "file_number": 125, "file_size": 2722827, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 61512, "largest_seqno": 63431, "table_properties": {"data_size": 2714791, "index_size": 4786, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17713, "raw_average_key_size": 20, "raw_value_size": 2698424, "raw_average_value_size": 3182, "num_data_blocks": 207, "num_entries": 848, "num_filter_entries": 848, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921681, "oldest_key_time": 1768921681, "file_creation_time": 1768921817, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 125, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 77] Flush lasted 30232 microseconds, and 9516 cpu microseconds.
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.729731) [db/flush_job.cc:967] [default] [JOB 77] Level-0 flush table #125: 2722827 bytes OK
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.729790) [db/memtable_list.cc:519] [default] Level-0 commit table #125 started
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.735201) [db/memtable_list.cc:722] [default] Level-0 commit table #125: memtable #1 done
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.735222) EVENT_LOG_v1 {"time_micros": 1768921817735215, "job": 77, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.735248) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 77] Try to delete WAL files size 4156125, prev total WAL file size 4156125, number of live WAL files 2.
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000121.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.737126) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035323731' seq:72057594037927935, type:22 .. '7061786F730035353233' seq:0, type:0; will stop at (end)
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 78] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 77 Base level 0, inputs: [125(2659KB)], [123(10MB)]
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817737231, "job": 78, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [125], "files_L6": [123], "score": -1, "input_data_size": 14183796, "oldest_snapshot_seqno": -1}
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 78] Generated table #126: 8788 keys, 12255515 bytes, temperature: kUnknown
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817867344, "cf_name": "default", "job": 78, "event": "table_file_creation", "file_number": 126, "file_size": 12255515, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12196741, "index_size": 35686, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22021, "raw_key_size": 230322, "raw_average_key_size": 26, "raw_value_size": 12040134, "raw_average_value_size": 1370, "num_data_blocks": 1372, "num_entries": 8788, "num_filter_entries": 8788, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921817, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 126, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.867565) [db/compaction/compaction_job.cc:1663] [default] [JOB 78] Compacted 1@0 + 1@6 files to L6 => 12255515 bytes
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.869141) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.0 rd, 94.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 10.9 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(9.7) write-amplify(4.5) OK, records in: 9317, records dropped: 529 output_compression: NoCompression
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.869160) EVENT_LOG_v1 {"time_micros": 1768921817869151, "job": 78, "event": "compaction_finished", "compaction_time_micros": 130160, "compaction_time_cpu_micros": 56595, "output_level": 6, "num_output_files": 1, "total_output_size": 12255515, "num_input_records": 9317, "num_output_records": 8788, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000125.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817869669, "job": 78, "event": "table_file_deletion", "file_number": 125}
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000123.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921817871661, "job": 78, "event": "table_file_deletion", "file_number": 123}
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.736994) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.871718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.871724) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.871728) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.871729) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:17 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:10:17.871730) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:10:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:18.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:10:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:18.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:19 np0005588919 podman[297188]: 2026-01-20 15:10:19.171809808 +0000 UTC m=+0.060780876 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.301 225859 DEBUG oslo_concurrency.lockutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "9e852872-788c-4dac-b7fb-d76d67e7a84f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.302 225859 DEBUG oslo_concurrency.lockutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.303 225859 DEBUG oslo_concurrency.lockutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.303 225859 DEBUG oslo_concurrency.lockutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.304 225859 DEBUG oslo_concurrency.lockutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.307 225859 INFO nova.compute.manager [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Terminating instance#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.309 225859 DEBUG nova.compute.manager [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:10:20 np0005588919 kernel: tapf9f19cf7-87 (unregistering): left promiscuous mode
Jan 20 10:10:20 np0005588919 NetworkManager[49104]: <info>  [1768921820.3572] device (tapf9f19cf7-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.377 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:20Z|00774|binding|INFO|Releasing lport f9f19cf7-87f5-4dd4-a7be-78086c84e176 from this chassis (sb_readonly=0)
Jan 20 10:10:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:20Z|00775|binding|INFO|Setting lport f9f19cf7-87f5-4dd4-a7be-78086c84e176 down in Southbound
Jan 20 10:10:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:20Z|00776|binding|INFO|Removing iface tapf9f19cf7-87 ovn-installed in OVS
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.380 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.395 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:20 np0005588919 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000ad.scope: Deactivated successfully.
Jan 20 10:10:20 np0005588919 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000ad.scope: Consumed 3.515s CPU time.
Jan 20 10:10:20 np0005588919 systemd-machined[194361]: Machine qemu-89-instance-000000ad terminated.
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.550 225859 INFO nova.virt.libvirt.driver [-] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Instance destroyed successfully.#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.551 225859 DEBUG nova.objects.instance [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lazy-loading 'resources' on Instance uuid 9e852872-788c-4dac-b7fb-d76d67e7a84f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:10:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.557 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:e2:73 10.100.0.12'], port_security=['fa:16:3e:19:e2:73 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9e852872-788c-4dac-b7fb-d76d67e7a84f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a49638950e1543fa8e0d251af5479623', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64f52c7d-befd-4095-889b-e7a5da6821d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ec1139-009f-49fe-bfde-07c0ef9e8b12, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=f9f19cf7-87f5-4dd4-a7be-78086c84e176) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:10:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:20.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.560 140354 INFO neutron.agent.ovn.metadata.agent [-] Port f9f19cf7-87f5-4dd4-a7be-78086c84e176 in datapath b677f1a9-dbaa-4373-8466-bd9ccf067b91 unbound from our chassis#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.563 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b677f1a9-dbaa-4373-8466-bd9ccf067b91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.564 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ef2aa7-206f-43c1-a0c5-12f10b6053e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.565 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 namespace which is not needed anymore#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.573 225859 DEBUG nova.virt.libvirt.vif [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:10:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1030995833',display_name='tempest-TestVolumeBootPattern-server-1030995833',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1030995833',id=173,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:10:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-wer4a7li',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-projec
t-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:10:17Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=9e852872-788c-4dac-b7fb-d76d67e7a84f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "address": "fa:16:3e:19:e2:73", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9f19cf7-87", "ovs_interfaceid": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.573 225859 DEBUG nova.network.os_vif_util [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "address": "fa:16:3e:19:e2:73", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9f19cf7-87", "ovs_interfaceid": "f9f19cf7-87f5-4dd4-a7be-78086c84e176", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.574 225859 DEBUG nova.network.os_vif_util [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:e2:73,bridge_name='br-int',has_traffic_filtering=True,id=f9f19cf7-87f5-4dd4-a7be-78086c84e176,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9f19cf7-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.574 225859 DEBUG os_vif [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e2:73,bridge_name='br-int',has_traffic_filtering=True,id=f9f19cf7-87f5-4dd4-a7be-78086c84e176,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9f19cf7-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.575 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.576 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9f19cf7-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.577 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.579 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.582 225859 INFO os_vif [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e2:73,bridge_name='br-int',has_traffic_filtering=True,id=f9f19cf7-87f5-4dd4-a7be-78086c84e176,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9f19cf7-87')#033[00m
Jan 20 10:10:20 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297086]: [NOTICE]   (297093) : haproxy version is 2.8.14-c23fe91
Jan 20 10:10:20 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297086]: [NOTICE]   (297093) : path to executable is /usr/sbin/haproxy
Jan 20 10:10:20 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297086]: [WARNING]  (297093) : Exiting Master process...
Jan 20 10:10:20 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297086]: [ALERT]    (297093) : Current worker (297095) exited with code 143 (Terminated)
Jan 20 10:10:20 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297086]: [WARNING]  (297093) : All workers exited. Exiting... (0)
Jan 20 10:10:20 np0005588919 systemd[1]: libpod-e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44.scope: Deactivated successfully.
Jan 20 10:10:20 np0005588919 podman[297286]: 2026-01-20 15:10:20.702345515 +0000 UTC m=+0.046690206 container died e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:10:20 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44-userdata-shm.mount: Deactivated successfully.
Jan 20 10:10:20 np0005588919 systemd[1]: var-lib-containers-storage-overlay-5075690fd2baf5ce67a4e697a92358c36b0bceb39f08aa2bba2b013ccbee9916-merged.mount: Deactivated successfully.
Jan 20 10:10:20 np0005588919 podman[297286]: 2026-01-20 15:10:20.746783285 +0000 UTC m=+0.091127986 container cleanup e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:10:20 np0005588919 systemd[1]: libpod-conmon-e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44.scope: Deactivated successfully.
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.784 225859 INFO nova.virt.libvirt.driver [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Deleting instance files /var/lib/nova/instances/9e852872-788c-4dac-b7fb-d76d67e7a84f_del#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.784 225859 INFO nova.virt.libvirt.driver [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Deletion of /var/lib/nova/instances/9e852872-788c-4dac-b7fb-d76d67e7a84f_del complete#033[00m
Jan 20 10:10:20 np0005588919 podman[297316]: 2026-01-20 15:10:20.824643075 +0000 UTC m=+0.052077069 container remove e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.831 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cf5f51e4-7959-4749-82ac-2d53fe53fded]: (4, ('Tue Jan 20 03:10:20 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 (e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44)\ne8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44\nTue Jan 20 03:10:20 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 (e8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44)\ne8bf3f1f69d8a9dca045ab143829c5417120c28e2a4cd71c921ad01e76c46c44\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.834 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[29326437-2cfe-4572-8710-05fcd73372de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.836 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb677f1a9-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.839 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:20 np0005588919 kernel: tapb677f1a9-d0: left promiscuous mode
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.845 225859 INFO nova.compute.manager [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Took 0.54 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.846 225859 DEBUG oslo.service.loopingcall [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.846 225859 DEBUG nova.compute.manager [-] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.847 225859 DEBUG nova.network.neutron [-] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:10:20 np0005588919 nova_compute[225855]: 2026-01-20 15:10:20.873 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.877 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0018992a-3a19-4d93-9488-c6a4c21f2fb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.892 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5f276d48-d873-40a2-9426-f08db75106d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.894 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[08daf50e-0818-4053-a455-19cd617b7ecf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.911 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[939fc0e3-2764-4915-b5f5-5c16e3e7bcc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685274, 'reachable_time': 16108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297332, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.915 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:10:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:20.915 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[13dd8f21-7c30-4330-87e5-b3b37bc5192a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:20 np0005588919 systemd[1]: run-netns-ovnmeta\x2db677f1a9\x2ddbaa\x2d4373\x2d8466\x2dbd9ccf067b91.mount: Deactivated successfully.
Jan 20 10:10:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:20.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:21 np0005588919 nova_compute[225855]: 2026-01-20 15:10:21.422 225859 DEBUG nova.compute.manager [req-2644b226-28b8-490d-8f46-1cf0348f71b0 req-38e0afe8-395e-494f-a9d0-c9823ea60c96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Received event network-vif-unplugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:21 np0005588919 nova_compute[225855]: 2026-01-20 15:10:21.423 225859 DEBUG oslo_concurrency.lockutils [req-2644b226-28b8-490d-8f46-1cf0348f71b0 req-38e0afe8-395e-494f-a9d0-c9823ea60c96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:21 np0005588919 nova_compute[225855]: 2026-01-20 15:10:21.423 225859 DEBUG oslo_concurrency.lockutils [req-2644b226-28b8-490d-8f46-1cf0348f71b0 req-38e0afe8-395e-494f-a9d0-c9823ea60c96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:21 np0005588919 nova_compute[225855]: 2026-01-20 15:10:21.423 225859 DEBUG oslo_concurrency.lockutils [req-2644b226-28b8-490d-8f46-1cf0348f71b0 req-38e0afe8-395e-494f-a9d0-c9823ea60c96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:21 np0005588919 nova_compute[225855]: 2026-01-20 15:10:21.423 225859 DEBUG nova.compute.manager [req-2644b226-28b8-490d-8f46-1cf0348f71b0 req-38e0afe8-395e-494f-a9d0-c9823ea60c96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] No waiting events found dispatching network-vif-unplugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:21 np0005588919 nova_compute[225855]: 2026-01-20 15:10:21.424 225859 DEBUG nova.compute.manager [req-2644b226-28b8-490d-8f46-1cf0348f71b0 req-38e0afe8-395e-494f-a9d0-c9823ea60c96 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Received event network-vif-unplugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:10:22 np0005588919 nova_compute[225855]: 2026-01-20 15:10:22.194 225859 DEBUG nova.network.neutron [-] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:10:22 np0005588919 nova_compute[225855]: 2026-01-20 15:10:22.214 225859 INFO nova.compute.manager [-] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Took 1.37 seconds to deallocate network for instance.#033[00m
Jan 20 10:10:22 np0005588919 nova_compute[225855]: 2026-01-20 15:10:22.271 225859 DEBUG nova.compute.manager [req-0f262437-f374-4145-91c4-2127196770d5 req-1eff2cd7-d76c-4f77-9e7c-a5ea724225e7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Received event network-vif-deleted-f9f19cf7-87f5-4dd4-a7be-78086c84e176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:22 np0005588919 nova_compute[225855]: 2026-01-20 15:10:22.434 225859 INFO nova.compute.manager [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Took 0.22 seconds to detach 1 volumes for instance.#033[00m
Jan 20 10:10:22 np0005588919 nova_compute[225855]: 2026-01-20 15:10:22.500 225859 DEBUG oslo_concurrency.lockutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:22 np0005588919 nova_compute[225855]: 2026-01-20 15:10:22.501 225859 DEBUG oslo_concurrency.lockutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:22.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:22 np0005588919 nova_compute[225855]: 2026-01-20 15:10:22.562 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:22 np0005588919 nova_compute[225855]: 2026-01-20 15:10:22.596 225859 DEBUG oslo_concurrency.processutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:22.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3238774180' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.032 225859 DEBUG oslo_concurrency.processutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.039 225859 DEBUG nova.compute.provider_tree [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.052 225859 DEBUG nova.scheduler.client.report [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.072 225859 DEBUG oslo_concurrency.lockutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.571s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.105 225859 INFO nova.scheduler.client.report [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Deleted allocations for instance 9e852872-788c-4dac-b7fb-d76d67e7a84f#033[00m
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.204 225859 DEBUG oslo_concurrency.lockutils [None req-14350567-d534-4580-8413-135ad632139b bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.520 225859 DEBUG nova.compute.manager [req-79cfc599-0d4a-4499-87a2-babd7747216a req-6ddfa326-023e-4447-8a77-114f8d6f6873 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Received event network-vif-plugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.521 225859 DEBUG oslo_concurrency.lockutils [req-79cfc599-0d4a-4499-87a2-babd7747216a req-6ddfa326-023e-4447-8a77-114f8d6f6873 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.522 225859 DEBUG oslo_concurrency.lockutils [req-79cfc599-0d4a-4499-87a2-babd7747216a req-6ddfa326-023e-4447-8a77-114f8d6f6873 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.522 225859 DEBUG oslo_concurrency.lockutils [req-79cfc599-0d4a-4499-87a2-babd7747216a req-6ddfa326-023e-4447-8a77-114f8d6f6873 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "9e852872-788c-4dac-b7fb-d76d67e7a84f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.522 225859 DEBUG nova.compute.manager [req-79cfc599-0d4a-4499-87a2-babd7747216a req-6ddfa326-023e-4447-8a77-114f8d6f6873 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] No waiting events found dispatching network-vif-plugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:23 np0005588919 nova_compute[225855]: 2026-01-20 15:10:23.523 225859 WARNING nova.compute.manager [req-79cfc599-0d4a-4499-87a2-babd7747216a req-6ddfa326-023e-4447-8a77-114f8d6f6873 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Received unexpected event network-vif-plugged-f9f19cf7-87f5-4dd4-a7be-78086c84e176 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:10:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:10:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4078764243' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:10:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:10:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4078764243' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:10:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:24.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:24.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:25 np0005588919 nova_compute[225855]: 2026-01-20 15:10:25.578 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:26.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:26.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:27 np0005588919 nova_compute[225855]: 2026-01-20 15:10:27.564 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:28.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:28.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:30.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:30 np0005588919 nova_compute[225855]: 2026-01-20 15:10:30.581 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:30.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:32 np0005588919 nova_compute[225855]: 2026-01-20 15:10:32.566 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:32.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:32.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:34.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:34.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:35 np0005588919 nova_compute[225855]: 2026-01-20 15:10:35.548 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921820.5471656, 9e852872-788c-4dac-b7fb-d76d67e7a84f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:10:35 np0005588919 nova_compute[225855]: 2026-01-20 15:10:35.548 225859 INFO nova.compute.manager [-] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:10:35 np0005588919 nova_compute[225855]: 2026-01-20 15:10:35.585 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:35 np0005588919 nova_compute[225855]: 2026-01-20 15:10:35.998 225859 DEBUG nova.compute.manager [None req-cf1c000e-3324-4782-b19d-01b8759efea7 - - - - - -] [instance: 9e852872-788c-4dac-b7fb-d76d67e7a84f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:36 np0005588919 podman[297363]: 2026-01-20 15:10:36.055123017 +0000 UTC m=+0.093128834 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:10:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:36.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:10:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1348110601' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:10:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:10:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1348110601' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:10:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:36.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.569 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.696 225859 DEBUG oslo_concurrency.lockutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.696 225859 DEBUG oslo_concurrency.lockutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.696 225859 DEBUG oslo_concurrency.lockutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.697 225859 DEBUG oslo_concurrency.lockutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.697 225859 DEBUG oslo_concurrency.lockutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.698 225859 INFO nova.compute.manager [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Terminating instance#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.699 225859 DEBUG nova.compute.manager [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:10:37 np0005588919 kernel: tap6b7cb043-d1 (unregistering): left promiscuous mode
Jan 20 10:10:37 np0005588919 NetworkManager[49104]: <info>  [1768921837.7646] device (tap6b7cb043-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.777 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:37Z|00777|binding|INFO|Releasing lport 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 from this chassis (sb_readonly=0)
Jan 20 10:10:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:37Z|00778|binding|INFO|Setting lport 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 down in Southbound
Jan 20 10:10:37 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:37Z|00779|binding|INFO|Removing iface tap6b7cb043-d1 ovn-installed in OVS
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.778 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.812 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:37 np0005588919 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000a8.scope: Deactivated successfully.
Jan 20 10:10:37 np0005588919 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000a8.scope: Consumed 19.034s CPU time.
Jan 20 10:10:37 np0005588919 systemd-machined[194361]: Machine qemu-86-instance-000000a8 terminated.
Jan 20 10:10:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:37.833 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:9e:90 10.100.0.6'], port_security=['fa:16:3e:bf:9e:90 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'a25af5a3-096f-4363-842e-d960c22eb16b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3967ae21-1590-4685-8881-8bd1bcf25258', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc74c4a296554866969b05aef75252af', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'd56d9c6d-4bb5-4a73-ab07-9e0ee1fd3b93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89ced88f-b1ed-4329-8a53-1931e6b0e3e9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=6b7cb043-d1f4-4c2b-8173-1e3e2a664767) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:10:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:37.834 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 6b7cb043-d1f4-4c2b-8173-1e3e2a664767 in datapath 3967ae21-1590-4685-8881-8bd1bcf25258 unbound from our chassis#033[00m
Jan 20 10:10:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:37.836 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3967ae21-1590-4685-8881-8bd1bcf25258, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:10:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:37.837 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6e792391-8bf1-484a-8dc7-6375c7345ea7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:37.838 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 namespace which is not needed anymore#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.929 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.935 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.944 225859 INFO nova.virt.libvirt.driver [-] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Instance destroyed successfully.#033[00m
Jan 20 10:10:37 np0005588919 nova_compute[225855]: 2026-01-20 15:10:37.944 225859 DEBUG nova.objects.instance [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lazy-loading 'resources' on Instance uuid a25af5a3-096f-4363-842e-d960c22eb16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:10:37 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294842]: [NOTICE]   (294872) : haproxy version is 2.8.14-c23fe91
Jan 20 10:10:37 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294842]: [NOTICE]   (294872) : path to executable is /usr/sbin/haproxy
Jan 20 10:10:37 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294842]: [WARNING]  (294872) : Exiting Master process...
Jan 20 10:10:37 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294842]: [WARNING]  (294872) : Exiting Master process...
Jan 20 10:10:37 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294842]: [ALERT]    (294872) : Current worker (294884) exited with code 143 (Terminated)
Jan 20 10:10:37 np0005588919 neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258[294842]: [WARNING]  (294872) : All workers exited. Exiting... (0)
Jan 20 10:10:38 np0005588919 systemd[1]: libpod-d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e.scope: Deactivated successfully.
Jan 20 10:10:38 np0005588919 podman[297424]: 2026-01-20 15:10:38.00708267 +0000 UTC m=+0.046196982 container died d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.023 225859 DEBUG nova.virt.libvirt.vif [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:07:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-950743647',display_name='tempest-ServerRescueNegativeTestJSON-server-950743647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-950743647',id=168,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:08:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fc74c4a296554866969b05aef75252af',ramdisk_id='',reservation_id='r-gkbc59sh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1649662639',owner_user_name='tempest-ServerRescueNegativeTestJSON-1649662639-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:08:28Z,user_data=None,user_id='27658864f96d453586dd0846a4c55b7d',uuid=a25af5a3-096f-4363-842e-d960c22eb16b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.024 225859 DEBUG nova.network.os_vif_util [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converting VIF {"id": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "address": "fa:16:3e:bf:9e:90", "network": {"id": "3967ae21-1590-4685-8881-8bd1bcf25258", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-285441107-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fc74c4a296554866969b05aef75252af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b7cb043-d1", "ovs_interfaceid": "6b7cb043-d1f4-4c2b-8173-1e3e2a664767", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.025 225859 DEBUG nova.network.os_vif_util [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:9e:90,bridge_name='br-int',has_traffic_filtering=True,id=6b7cb043-d1f4-4c2b-8173-1e3e2a664767,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b7cb043-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.026 225859 DEBUG os_vif [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:9e:90,bridge_name='br-int',has_traffic_filtering=True,id=6b7cb043-d1f4-4c2b-8173-1e3e2a664767,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b7cb043-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.028 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.028 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b7cb043-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.030 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.031 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.033 225859 INFO os_vif [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:9e:90,bridge_name='br-int',has_traffic_filtering=True,id=6b7cb043-d1f4-4c2b-8173-1e3e2a664767,network=Network(3967ae21-1590-4685-8881-8bd1bcf25258),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b7cb043-d1')#033[00m
Jan 20 10:10:38 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e-userdata-shm.mount: Deactivated successfully.
Jan 20 10:10:38 np0005588919 systemd[1]: var-lib-containers-storage-overlay-6a0a29c0cfe0897e5a9faa037b73161c66d2ed59c5ab8259820c45f49ae9bf20-merged.mount: Deactivated successfully.
Jan 20 10:10:38 np0005588919 podman[297424]: 2026-01-20 15:10:38.046644053 +0000 UTC m=+0.085758365 container cleanup d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.062 225859 DEBUG nova.compute.manager [req-73cfee4f-3676-421f-bf11-c472da662cb3 req-6e07b650-6541-4f53-9bbb-c4bc3915bf08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-vif-unplugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.062 225859 DEBUG oslo_concurrency.lockutils [req-73cfee4f-3676-421f-bf11-c472da662cb3 req-6e07b650-6541-4f53-9bbb-c4bc3915bf08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.062 225859 DEBUG oslo_concurrency.lockutils [req-73cfee4f-3676-421f-bf11-c472da662cb3 req-6e07b650-6541-4f53-9bbb-c4bc3915bf08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.062 225859 DEBUG oslo_concurrency.lockutils [req-73cfee4f-3676-421f-bf11-c472da662cb3 req-6e07b650-6541-4f53-9bbb-c4bc3915bf08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.063 225859 DEBUG nova.compute.manager [req-73cfee4f-3676-421f-bf11-c472da662cb3 req-6e07b650-6541-4f53-9bbb-c4bc3915bf08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] No waiting events found dispatching network-vif-unplugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.063 225859 DEBUG nova.compute.manager [req-73cfee4f-3676-421f-bf11-c472da662cb3 req-6e07b650-6541-4f53-9bbb-c4bc3915bf08 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-vif-unplugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:10:38 np0005588919 systemd[1]: libpod-conmon-d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e.scope: Deactivated successfully.
Jan 20 10:10:38 np0005588919 podman[297465]: 2026-01-20 15:10:38.115562088 +0000 UTC m=+0.046864641 container remove d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:10:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:38.121 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4d48a0-6f0d-4572-a7f3-5619be614d23]: (4, ('Tue Jan 20 03:10:37 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 (d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e)\nd749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e\nTue Jan 20 03:10:38 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 (d749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e)\nd749b2135e8b112af5d3a20cbeb705a74120f88ef0dbe175ef347b92349abb8e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:38.122 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[caf51314-1d46-40e9-943a-a4518560755a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:38.123 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3967ae21-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.125 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:38 np0005588919 kernel: tap3967ae21-10: left promiscuous mode
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.139 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:38.142 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7915caf9-26f1-4841-ad08-39071b452225]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:38.162 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b0efc4-609b-48d4-99e2-8b31656e2ddb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:38.163 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[74065a5e-d960-4237-b91c-28bb970ada94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:38.182 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8904880d-ee4e-4985-9980-aa002876114f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674630, 'reachable_time': 40173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297489, 'error': None, 'target': 'ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:38.184 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3967ae21-1590-4685-8881-8bd1bcf25258 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:10:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:38.184 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[d65f5794-a4b9-4244-ab48-fc1fb8cea321]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:38 np0005588919 systemd[1]: run-netns-ovnmeta\x2d3967ae21\x2d1590\x2d4685\x2d8881\x2d8bd1bcf25258.mount: Deactivated successfully.
Jan 20 10:10:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:38.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e386 e386: 3 total, 3 up, 3 in
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.694 225859 INFO nova.virt.libvirt.driver [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Deleting instance files /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b_del#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.695 225859 INFO nova.virt.libvirt.driver [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Deletion of /var/lib/nova/instances/a25af5a3-096f-4363-842e-d960c22eb16b_del complete#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.921 225859 INFO nova.compute.manager [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Took 1.22 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.922 225859 DEBUG oslo.service.loopingcall [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.923 225859 DEBUG nova.compute.manager [-] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:10:38 np0005588919 nova_compute[225855]: 2026-01-20 15:10:38.923 225859 DEBUG nova.network.neutron [-] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:10:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:38.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.143 225859 DEBUG nova.network.neutron [-] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.314 225859 DEBUG nova.compute.manager [req-c0103c11-0e8c-4b62-b8f6-4bd604ccce3a req-2c673116-4bc2-4995-ab14-d2613f9686bc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-vif-deleted-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.314 225859 INFO nova.compute.manager [req-c0103c11-0e8c-4b62-b8f6-4bd604ccce3a req-2c673116-4bc2-4995-ab14-d2613f9686bc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Neutron deleted interface 6b7cb043-d1f4-4c2b-8173-1e3e2a664767; detaching it from the instance and deleting it from the info cache
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.315 225859 DEBUG nova.network.neutron [req-c0103c11-0e8c-4b62-b8f6-4bd604ccce3a req-2c673116-4bc2-4995-ab14-d2613f9686bc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.316 225859 DEBUG nova.compute.manager [req-eae4afd1-6747-4efe-9781-16f726358106 req-7c214a29-be3a-482a-afbd-db300295c926 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.317 225859 DEBUG oslo_concurrency.lockutils [req-eae4afd1-6747-4efe-9781-16f726358106 req-7c214a29-be3a-482a-afbd-db300295c926 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.317 225859 DEBUG oslo_concurrency.lockutils [req-eae4afd1-6747-4efe-9781-16f726358106 req-7c214a29-be3a-482a-afbd-db300295c926 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.317 225859 DEBUG oslo_concurrency.lockutils [req-eae4afd1-6747-4efe-9781-16f726358106 req-7c214a29-be3a-482a-afbd-db300295c926 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.317 225859 DEBUG nova.compute.manager [req-eae4afd1-6747-4efe-9781-16f726358106 req-7c214a29-be3a-482a-afbd-db300295c926 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] No waiting events found dispatching network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.317 225859 WARNING nova.compute.manager [req-eae4afd1-6747-4efe-9781-16f726358106 req-7c214a29-be3a-482a-afbd-db300295c926 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Received unexpected event network-vif-plugged-6b7cb043-d1f4-4c2b-8173-1e3e2a664767 for instance with vm_state rescued and task_state deleting.
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.321 225859 INFO nova.compute.manager [-] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Took 1.40 seconds to deallocate network for instance.
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.359 225859 DEBUG nova.compute.manager [req-c0103c11-0e8c-4b62-b8f6-4bd604ccce3a req-2c673116-4bc2-4995-ab14-d2613f9686bc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Detach interface failed, port_id=6b7cb043-d1f4-4c2b-8173-1e3e2a664767, reason: Instance a25af5a3-096f-4363-842e-d960c22eb16b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 20 10:10:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:40.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.659 225859 DEBUG oslo_concurrency.lockutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.659 225859 DEBUG oslo_concurrency.lockutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:10:40 np0005588919 nova_compute[225855]: 2026-01-20 15:10:40.701 225859 DEBUG oslo_concurrency.processutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:10:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:40.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:41 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1780300540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:41 np0005588919 nova_compute[225855]: 2026-01-20 15:10:41.163 225859 DEBUG oslo_concurrency.processutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:10:41 np0005588919 nova_compute[225855]: 2026-01-20 15:10:41.169 225859 DEBUG nova.compute.provider_tree [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:10:41 np0005588919 nova_compute[225855]: 2026-01-20 15:10:41.302 225859 DEBUG nova.scheduler.client.report [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:10:41 np0005588919 nova_compute[225855]: 2026-01-20 15:10:41.357 225859 DEBUG oslo_concurrency.lockutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:10:41 np0005588919 nova_compute[225855]: 2026-01-20 15:10:41.434 225859 INFO nova.scheduler.client.report [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Deleted allocations for instance a25af5a3-096f-4363-842e-d960c22eb16b
Jan 20 10:10:41 np0005588919 nova_compute[225855]: 2026-01-20 15:10:41.544 225859 DEBUG oslo_concurrency.lockutils [None req-671ee076-b046-4e19-829f-98b4be8b1947 27658864f96d453586dd0846a4c55b7d fc74c4a296554866969b05aef75252af - - default default] Lock "a25af5a3-096f-4363-842e-d960c22eb16b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.848s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:10:42 np0005588919 nova_compute[225855]: 2026-01-20 15:10:42.572 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:10:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:42.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:42.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:43 np0005588919 nova_compute[225855]: 2026-01-20 15:10:43.031 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:10:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:43 np0005588919 nova_compute[225855]: 2026-01-20 15:10:43.626 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:10:43 np0005588919 nova_compute[225855]: 2026-01-20 15:10:43.627 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:10:43 np0005588919 nova_compute[225855]: 2026-01-20 15:10:43.644 225859 DEBUG nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 10:10:43 np0005588919 nova_compute[225855]: 2026-01-20 15:10:43.711 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:10:43 np0005588919 nova_compute[225855]: 2026-01-20 15:10:43.712 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:10:43 np0005588919 nova_compute[225855]: 2026-01-20 15:10:43.719 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:10:43 np0005588919 nova_compute[225855]: 2026-01-20 15:10:43.719 225859 INFO nova.compute.claims [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Claim successful on node compute-1.ctlplane.example.com
Jan 20 10:10:43 np0005588919 nova_compute[225855]: 2026-01-20 15:10:43.807 225859 DEBUG oslo_concurrency.processutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:10:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:44 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1303016254' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.269 225859 DEBUG oslo_concurrency.processutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.277 225859 DEBUG nova.compute.provider_tree [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.300 225859 DEBUG nova.scheduler.client.report [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.335 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.336 225859 DEBUG nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.404 225859 DEBUG nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.405 225859 DEBUG nova.network.neutron [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.425 225859 INFO nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.447 225859 DEBUG nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.496 225859 INFO nova.virt.block_device [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Booting with volume snapshot 33f09854-2fbb-41ab-84ee-a1c4a1274b2b at /dev/vda
Jan 20 10:10:44 np0005588919 nova_compute[225855]: 2026-01-20 15:10:44.580 225859 DEBUG nova.policy [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf422e55e158420cbdae75f07a3bb97a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a49638950e1543fa8e0d251af5479623', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 10:10:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:44.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:44.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:45 np0005588919 nova_compute[225855]: 2026-01-20 15:10:45.825 225859 DEBUG nova.network.neutron [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Successfully created port: 1c883167-abba-4a7e-af5c-33a54aba9ce0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 10:10:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:46.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:46.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:47 np0005588919 nova_compute[225855]: 2026-01-20 15:10:47.199 225859 DEBUG nova.network.neutron [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Successfully updated port: 1c883167-abba-4a7e-af5c-33a54aba9ce0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 10:10:47 np0005588919 nova_compute[225855]: 2026-01-20 15:10:47.222 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "refresh_cache-35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:10:47 np0005588919 nova_compute[225855]: 2026-01-20 15:10:47.222 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquired lock "refresh_cache-35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:10:47 np0005588919 nova_compute[225855]: 2026-01-20 15:10:47.222 225859 DEBUG nova.network.neutron [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 10:10:47 np0005588919 nova_compute[225855]: 2026-01-20 15:10:47.298 225859 DEBUG nova.compute.manager [req-b31ee416-8317-4348-93ef-231b53011c14 req-74b23f27-0e5f-4934-81c3-0cc7fa2c9700 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Received event network-changed-1c883167-abba-4a7e-af5c-33a54aba9ce0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:10:47 np0005588919 nova_compute[225855]: 2026-01-20 15:10:47.298 225859 DEBUG nova.compute.manager [req-b31ee416-8317-4348-93ef-231b53011c14 req-74b23f27-0e5f-4934-81c3-0cc7fa2c9700 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Refreshing instance network info cache due to event network-changed-1c883167-abba-4a7e-af5c-33a54aba9ce0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 10:10:47 np0005588919 nova_compute[225855]: 2026-01-20 15:10:47.299 225859 DEBUG oslo_concurrency.lockutils [req-b31ee416-8317-4348-93ef-231b53011c14 req-74b23f27-0e5f-4934-81c3-0cc7fa2c9700 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:10:47 np0005588919 nova_compute[225855]: 2026-01-20 15:10:47.442 225859 DEBUG nova.network.neutron [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 10:10:47 np0005588919 nova_compute[225855]: 2026-01-20 15:10:47.574 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:10:48 np0005588919 nova_compute[225855]: 2026-01-20 15:10:48.033 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:10:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:48.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:48.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.013 225859 DEBUG os_brick.utils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.014 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.036 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.036 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7ca7b8-1040-46a2-8259-dff386738023]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.038 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.047 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.048 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb3c032-1387-42ce-91b6-a4172ebbde65]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.049 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.063 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.064 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[84dfb702-a52d-4db6-8035-d6efc95196ac]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.066 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[4a9ed2f5-c176-419e-9aa1-758d10fb1bb1]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.066 225859 DEBUG oslo_concurrency.processutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.107 225859 DEBUG oslo_concurrency.processutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "nvme version" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.110 225859 DEBUG os_brick.initiator.connectors.lightos [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.110 225859 DEBUG os_brick.initiator.connectors.lightos [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.110 225859 DEBUG os_brick.initiator.connectors.lightos [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.111 225859 DEBUG os_brick.utils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] <== get_connector_properties: return (97ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.111 225859 DEBUG nova.virt.block_device [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Updating existing volume attachment record: 9aa44915-ba45-4054-be03-16dd5a3b8ca4 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.406 225859 DEBUG nova.network.neutron [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Updating instance_info_cache with network_info: [{"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.437 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Releasing lock "refresh_cache-35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.438 225859 DEBUG nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Instance network_info: |[{"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.439 225859 DEBUG oslo_concurrency.lockutils [req-b31ee416-8317-4348-93ef-231b53011c14 req-74b23f27-0e5f-4934-81c3-0cc7fa2c9700 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:10:49 np0005588919 nova_compute[225855]: 2026-01-20 15:10:49.440 225859 DEBUG nova.network.neutron [req-b31ee416-8317-4348-93ef-231b53011c14 req-74b23f27-0e5f-4934-81c3-0cc7fa2c9700 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Refreshing network info cache for port 1c883167-abba-4a7e-af5c-33a54aba9ce0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:10:50 np0005588919 podman[297598]: 2026-01-20 15:10:50.020192734 +0000 UTC m=+0.064187762 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.112 225859 DEBUG nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.116 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.116 225859 INFO nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Creating image(s)#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.117 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.118 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Ensure instance console log exists: /var/lib/nova/instances/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.118 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.119 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.120 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.125 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Start _get_guest_xml network_info=[{"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': True, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-f396a213-a7f4-434e-a290-c5d9278be4af', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'f396a213-a7f4-434e-a290-c5d9278be4af', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8', 'attached_at': '', 'detached_at': '', 'volume_id': 'f396a213-a7f4-434e-a290-c5d9278be4af', 'serial': 'f396a213-a7f4-434e-a290-c5d9278be4af'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': '9aa44915-ba45-4054-be03-16dd5a3b8ca4', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.133 225859 WARNING nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.139 225859 DEBUG nova.virt.libvirt.host [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.140 225859 DEBUG nova.virt.libvirt.host [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.144 225859 DEBUG nova.virt.libvirt.host [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.145 225859 DEBUG nova.virt.libvirt.host [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.147 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.148 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.148 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.149 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.149 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.150 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.150 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.150 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.151 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.151 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.152 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.152 225859 DEBUG nova.virt.hardware [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.190 225859 DEBUG nova.storage.rbd_utils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.195 225859 DEBUG oslo_concurrency.processutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.271 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.560 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:50.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.641 225859 DEBUG oslo_concurrency.processutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.667 225859 DEBUG nova.virt.libvirt.vif [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:10:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-589796538',display_name='tempest-TestVolumeBootPattern-server-589796538',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-589796538',id=174,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-sgiqpjjh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:10:44Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.668 225859 DEBUG nova.network.os_vif_util [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.669 225859 DEBUG nova.network.os_vif_util [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=1c883167-abba-4a7e-af5c-33a54aba9ce0,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c883167-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.670 225859 DEBUG nova.objects.instance [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lazy-loading 'pci_devices' on Instance uuid 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.686 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <uuid>35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8</uuid>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <name>instance-000000ae</name>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestVolumeBootPattern-server-589796538</nova:name>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:10:50</nova:creationTime>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <nova:user uuid="bf422e55e158420cbdae75f07a3bb97a">tempest-TestVolumeBootPattern-194644003-project-member</nova:user>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <nova:project uuid="a49638950e1543fa8e0d251af5479623">tempest-TestVolumeBootPattern-194644003</nova:project>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <nova:port uuid="1c883167-abba-4a7e-af5c-33a54aba9ce0">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <entry name="serial">35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8</entry>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <entry name="uuid">35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8</entry>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8_disk.config">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-f396a213-a7f4-434e-a290-c5d9278be4af">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <serial>f396a213-a7f4-434e-a290-c5d9278be4af</serial>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:51:9b:64"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <target dev="tap1c883167-ab"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8/console.log" append="off"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:10:50 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:10:50 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:10:50 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:10:50 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.687 225859 DEBUG nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Preparing to wait for external event network-vif-plugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.687 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.688 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.688 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.689 225859 DEBUG nova.virt.libvirt.vif [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:10:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-589796538',display_name='tempest-TestVolumeBootPattern-server-589796538',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-589796538',id=174,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-sgiqpjjh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:10:44Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.689 225859 DEBUG nova.network.os_vif_util [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.689 225859 DEBUG nova.network.os_vif_util [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=1c883167-abba-4a7e-af5c-33a54aba9ce0,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c883167-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.690 225859 DEBUG os_vif [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=1c883167-abba-4a7e-af5c-33a54aba9ce0,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c883167-ab') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.690 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.691 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.691 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.693 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.693 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c883167-ab, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.694 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1c883167-ab, col_values=(('external_ids', {'iface-id': '1c883167-abba-4a7e-af5c-33a54aba9ce0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:51:9b:64', 'vm-uuid': '35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.738 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:50 np0005588919 NetworkManager[49104]: <info>  [1768921850.7394] manager: (tap1c883167-ab): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.742 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.744 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.745 225859 INFO os_vif [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=1c883167-abba-4a7e-af5c-33a54aba9ce0,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c883167-ab')#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.801 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.802 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.802 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No VIF found with MAC fa:16:3e:51:9b:64, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.802 225859 INFO nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Using config drive#033[00m
Jan 20 10:10:50 np0005588919 nova_compute[225855]: 2026-01-20 15:10:50.831 225859 DEBUG nova.storage.rbd_utils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:10:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:10:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:50.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.394 225859 INFO nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Creating config drive at /var/lib/nova/instances/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8/disk.config#033[00m
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.399 225859 DEBUG oslo_concurrency.processutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplo4v1tx5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.435 225859 DEBUG nova.network.neutron [req-b31ee416-8317-4348-93ef-231b53011c14 req-74b23f27-0e5f-4934-81c3-0cc7fa2c9700 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Updated VIF entry in instance network info cache for port 1c883167-abba-4a7e-af5c-33a54aba9ce0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.436 225859 DEBUG nova.network.neutron [req-b31ee416-8317-4348-93ef-231b53011c14 req-74b23f27-0e5f-4934-81c3-0cc7fa2c9700 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Updating instance_info_cache with network_info: [{"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.510 225859 DEBUG oslo_concurrency.lockutils [req-b31ee416-8317-4348-93ef-231b53011c14 req-74b23f27-0e5f-4934-81c3-0cc7fa2c9700 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.532 225859 DEBUG oslo_concurrency.processutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplo4v1tx5" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.575 225859 DEBUG nova.storage.rbd_utils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.579 225859 DEBUG oslo_concurrency.processutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8/disk.config 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.719 225859 DEBUG oslo_concurrency.processutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8/disk.config 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.720 225859 INFO nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Deleting local config drive /var/lib/nova/instances/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8/disk.config because it was imported into RBD.#033[00m
Jan 20 10:10:51 np0005588919 kernel: tap1c883167-ab: entered promiscuous mode
Jan 20 10:10:51 np0005588919 NetworkManager[49104]: <info>  [1768921851.7763] manager: (tap1c883167-ab): new Tun device (/org/freedesktop/NetworkManager/Devices/322)
Jan 20 10:10:51 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:51Z|00780|binding|INFO|Claiming lport 1c883167-abba-4a7e-af5c-33a54aba9ce0 for this chassis.
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.823 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:51 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:51Z|00781|binding|INFO|1c883167-abba-4a7e-af5c-33a54aba9ce0: Claiming fa:16:3e:51:9b:64 10.100.0.10
Jan 20 10:10:51 np0005588919 systemd-udevd[297732]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:10:51 np0005588919 systemd-machined[194361]: New machine qemu-90-instance-000000ae.
Jan 20 10:10:51 np0005588919 NetworkManager[49104]: <info>  [1768921851.8703] device (tap1c883167-ab): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:10:51 np0005588919 NetworkManager[49104]: <info>  [1768921851.8713] device (tap1c883167-ab): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.901 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:9b:64 10.100.0.10'], port_security=['fa:16:3e:51:9b:64 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a49638950e1543fa8e0d251af5479623', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64f52c7d-befd-4095-889b-e7a5da6821d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ec1139-009f-49fe-bfde-07c0ef9e8b12, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=1c883167-abba-4a7e-af5c-33a54aba9ce0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.902 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 1c883167-abba-4a7e-af5c-33a54aba9ce0 in datapath b677f1a9-dbaa-4373-8466-bd9ccf067b91 bound to our chassis#033[00m
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.904 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b677f1a9-dbaa-4373-8466-bd9ccf067b91#033[00m
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.918 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[25c0feb9-8909-46f3-b08a-900eb33bf4f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.919 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb677f1a9-d1 in ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.922 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb677f1a9-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.922 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f36d4e22-bea2-4869-995e-4bc4ee357e9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.923 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[99af607a-1450-494e-82a3-ff1c1dd90f8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:51 np0005588919 systemd[1]: Started Virtual Machine qemu-90-instance-000000ae.
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.944 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[fbddcc52-f651-4847-ba54-d739b9ccfce2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.944 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:51 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:51Z|00782|binding|INFO|Setting lport 1c883167-abba-4a7e-af5c-33a54aba9ce0 ovn-installed in OVS
Jan 20 10:10:51 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:51Z|00783|binding|INFO|Setting lport 1c883167-abba-4a7e-af5c-33a54aba9ce0 up in Southbound
Jan 20 10:10:51 np0005588919 nova_compute[225855]: 2026-01-20 15:10:51.951 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.957 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6b350d30-6569-4996-9bbf-d6bc15817f47]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.990 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d384bd71-827d-4fc9-a55e-2b1f1d8477a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:51.995 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f1dee5c0-cf4e-4dc4-b975-8a317508bb9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:51 np0005588919 NetworkManager[49104]: <info>  [1768921851.9970] manager: (tapb677f1a9-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/323)
Jan 20 10:10:51 np0005588919 systemd-udevd[297734]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.030 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[dd64a5f9-b2bb-41c3-a197-3d8360e8e4d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.033 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[49eb7d28-289c-4c3c-b101-3537b828e1a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:52 np0005588919 NetworkManager[49104]: <info>  [1768921852.0614] device (tapb677f1a9-d0): carrier: link connected
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.068 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[31035fae-715a-4802-9406-7c01712d2da8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.086 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[05808dca-a29b-4891-be57-286813d928db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb677f1a9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:c8:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689098, 'reachable_time': 17931, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297765, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.103 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a88bcc7d-4eb5-4f85-af74-b05a3bc64f08]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:c834'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 689098, 'tstamp': 689098}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297766, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.120 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b1af4f82-da03-4613-8ff3-530caaf1322a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb677f1a9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:c8:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689098, 'reachable_time': 17931, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 297767, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.152 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[86852fcf-a0bc-4652-8d57-a85a716887d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.207 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[67b8dfa6-896a-4dfe-adf4-e07d57c1d44e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.208 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb677f1a9-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.208 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.208 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb677f1a9-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.210 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:52 np0005588919 NetworkManager[49104]: <info>  [1768921852.2110] manager: (tapb677f1a9-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Jan 20 10:10:52 np0005588919 kernel: tapb677f1a9-d0: entered promiscuous mode
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.212 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb677f1a9-d0, col_values=(('external_ids', {'iface-id': '1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.213 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:52 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:52Z|00784|binding|INFO|Releasing lport 1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65 from this chassis (sb_readonly=0)
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.214 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.215 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.215 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[56ea8e9b-8845-42da-b0cd-d8af9c6e2834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.216 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-b677f1a9-dbaa-4373-8466-bd9ccf067b91
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID b677f1a9-dbaa-4373-8466-bd9ccf067b91
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:10:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:52.217 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'env', 'PROCESS_TAG=haproxy-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b677f1a9-dbaa-4373-8466-bd9ccf067b91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.227 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.349 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921852.3489478, 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.349 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] VM Started (Lifecycle Event)#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.396 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.400 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921852.3491285, 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.400 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.430 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.432 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.571 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.576 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:52 np0005588919 podman[297841]: 2026-01-20 15:10:52.587078655 +0000 UTC m=+0.047815537 container create ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:10:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:52.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:52 np0005588919 systemd[1]: Started libpod-conmon-ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2.scope.
Jan 20 10:10:52 np0005588919 podman[297841]: 2026-01-20 15:10:52.561759937 +0000 UTC m=+0.022496849 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:10:52 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:10:52 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19daef63f9380ca40ee278346b76985622bb726e2f8a3577bdcd88908f79189d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:10:52 np0005588919 podman[297841]: 2026-01-20 15:10:52.6781677 +0000 UTC m=+0.138904582 container init ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:10:52 np0005588919 podman[297841]: 2026-01-20 15:10:52.683001007 +0000 UTC m=+0.143737889 container start ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:10:52 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297856]: [NOTICE]   (297860) : New worker (297862) forked
Jan 20 10:10:52 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297856]: [NOTICE]   (297860) : Loading success.
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.943 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921837.9421668, a25af5a3-096f-4363-842e-d960c22eb16b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:10:52 np0005588919 nova_compute[225855]: 2026-01-20 15:10:52.943 225859 INFO nova.compute.manager [-] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:10:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:52.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.009 225859 DEBUG nova.compute.manager [None req-40c3535a-6194-443f-922c-56d9d908b990 - - - - - -] [instance: a25af5a3-096f-4363-842e-d960c22eb16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.098 225859 DEBUG nova.compute.manager [req-d80ac688-df37-4bf3-995e-3bafd623c984 req-f4758ff9-d384-4d37-bbf1-c3f26c972322 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Received event network-vif-plugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.099 225859 DEBUG oslo_concurrency.lockutils [req-d80ac688-df37-4bf3-995e-3bafd623c984 req-f4758ff9-d384-4d37-bbf1-c3f26c972322 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.099 225859 DEBUG oslo_concurrency.lockutils [req-d80ac688-df37-4bf3-995e-3bafd623c984 req-f4758ff9-d384-4d37-bbf1-c3f26c972322 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.100 225859 DEBUG oslo_concurrency.lockutils [req-d80ac688-df37-4bf3-995e-3bafd623c984 req-f4758ff9-d384-4d37-bbf1-c3f26c972322 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.100 225859 DEBUG nova.compute.manager [req-d80ac688-df37-4bf3-995e-3bafd623c984 req-f4758ff9-d384-4d37-bbf1-c3f26c972322 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Processing event network-vif-plugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.100 225859 DEBUG nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.104 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921853.1046283, 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.105 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.106 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.110 225859 INFO nova.virt.libvirt.driver [-] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Instance spawned successfully.#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.110 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.122 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.130 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.135 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.135 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.136 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.136 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.137 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.137 225859 DEBUG nova.virt.libvirt.driver [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.162 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.194 225859 INFO nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Took 3.08 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.194 225859 DEBUG nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.260 225859 INFO nova.compute.manager [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Took 9.57 seconds to build instance.#033[00m
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.277 225859 DEBUG oslo_concurrency.lockutils [None req-bd00b6c0-2377-4e1e-be36-f79351d98531 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:53 np0005588919 nova_compute[225855]: 2026-01-20 15:10:53.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:54 np0005588919 nova_compute[225855]: 2026-01-20 15:10:54.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:54 np0005588919 nova_compute[225855]: 2026-01-20 15:10:54.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:10:54 np0005588919 nova_compute[225855]: 2026-01-20 15:10:54.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:10:54 np0005588919 nova_compute[225855]: 2026-01-20 15:10:54.575 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:10:54 np0005588919 nova_compute[225855]: 2026-01-20 15:10:54.575 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:10:54 np0005588919 nova_compute[225855]: 2026-01-20 15:10:54.575 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:10:54 np0005588919 nova_compute[225855]: 2026-01-20 15:10:54.575 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:10:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:54.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:54.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.280 225859 DEBUG nova.compute.manager [req-c7bc8d05-784c-403b-b96e-2f34b2e0aeed req-8dcf7686-c2b8-4529-8a43-f470d751b122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Received event network-vif-plugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.281 225859 DEBUG oslo_concurrency.lockutils [req-c7bc8d05-784c-403b-b96e-2f34b2e0aeed req-8dcf7686-c2b8-4529-8a43-f470d751b122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.282 225859 DEBUG oslo_concurrency.lockutils [req-c7bc8d05-784c-403b-b96e-2f34b2e0aeed req-8dcf7686-c2b8-4529-8a43-f470d751b122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.282 225859 DEBUG oslo_concurrency.lockutils [req-c7bc8d05-784c-403b-b96e-2f34b2e0aeed req-8dcf7686-c2b8-4529-8a43-f470d751b122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.283 225859 DEBUG nova.compute.manager [req-c7bc8d05-784c-403b-b96e-2f34b2e0aeed req-8dcf7686-c2b8-4529-8a43-f470d751b122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] No waiting events found dispatching network-vif-plugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.283 225859 WARNING nova.compute.manager [req-c7bc8d05-784c-403b-b96e-2f34b2e0aeed req-8dcf7686-c2b8-4529-8a43-f470d751b122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Received unexpected event network-vif-plugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.741 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.852 225859 DEBUG oslo_concurrency.lockutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.853 225859 DEBUG oslo_concurrency.lockutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.853 225859 DEBUG oslo_concurrency.lockutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.854 225859 DEBUG oslo_concurrency.lockutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.855 225859 DEBUG oslo_concurrency.lockutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.857 225859 INFO nova.compute.manager [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Terminating instance#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.860 225859 DEBUG nova.compute.manager [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:10:55 np0005588919 kernel: tap1c883167-ab (unregistering): left promiscuous mode
Jan 20 10:10:55 np0005588919 NetworkManager[49104]: <info>  [1768921855.9079] device (tap1c883167-ab): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:10:55 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:55Z|00785|binding|INFO|Releasing lport 1c883167-abba-4a7e-af5c-33a54aba9ce0 from this chassis (sb_readonly=0)
Jan 20 10:10:55 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:55Z|00786|binding|INFO|Setting lport 1c883167-abba-4a7e-af5c-33a54aba9ce0 down in Southbound
Jan 20 10:10:55 np0005588919 ovn_controller[130490]: 2026-01-20T15:10:55Z|00787|binding|INFO|Removing iface tap1c883167-ab ovn-installed in OVS
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.917 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.920 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:55.934 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:9b:64 10.100.0.10'], port_security=['fa:16:3e:51:9b:64 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a49638950e1543fa8e0d251af5479623', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64f52c7d-befd-4095-889b-e7a5da6821d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ec1139-009f-49fe-bfde-07c0ef9e8b12, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=1c883167-abba-4a7e-af5c-33a54aba9ce0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:10:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:55.937 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 1c883167-abba-4a7e-af5c-33a54aba9ce0 in datapath b677f1a9-dbaa-4373-8466-bd9ccf067b91 unbound from our chassis#033[00m
Jan 20 10:10:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:55.940 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b677f1a9-dbaa-4373-8466-bd9ccf067b91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:10:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:55.941 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cad7d707-aa60-4d5a-a21c-93f62be8ae1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.942 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Updating instance_info_cache with network_info: [{"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:10:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:55.942 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 namespace which is not needed anymore#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.946 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.961 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.962 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.962 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:55 np0005588919 nova_compute[225855]: 2026-01-20 15:10:55.962 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:55 np0005588919 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Jan 20 10:10:55 np0005588919 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000ae.scope: Consumed 3.386s CPU time.
Jan 20 10:10:55 np0005588919 systemd-machined[194361]: Machine qemu-90-instance-000000ae terminated.
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.100 225859 INFO nova.virt.libvirt.driver [-] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Instance destroyed successfully.#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.100 225859 DEBUG nova.objects.instance [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lazy-loading 'resources' on Instance uuid 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:10:56 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297856]: [NOTICE]   (297860) : haproxy version is 2.8.14-c23fe91
Jan 20 10:10:56 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297856]: [NOTICE]   (297860) : path to executable is /usr/sbin/haproxy
Jan 20 10:10:56 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297856]: [WARNING]  (297860) : Exiting Master process...
Jan 20 10:10:56 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297856]: [ALERT]    (297860) : Current worker (297862) exited with code 143 (Terminated)
Jan 20 10:10:56 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[297856]: [WARNING]  (297860) : All workers exited. Exiting... (0)
Jan 20 10:10:56 np0005588919 systemd[1]: libpod-ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2.scope: Deactivated successfully.
Jan 20 10:10:56 np0005588919 podman[297897]: 2026-01-20 15:10:56.116208369 +0000 UTC m=+0.042514797 container died ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.120 225859 DEBUG nova.virt.libvirt.vif [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:10:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-589796538',display_name='tempest-TestVolumeBootPattern-server-589796538',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-589796538',id=174,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:10:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-sgiqpjjh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:10:53Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.121 225859 DEBUG nova.network.os_vif_util [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "address": "fa:16:3e:51:9b:64", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c883167-ab", "ovs_interfaceid": "1c883167-abba-4a7e-af5c-33a54aba9ce0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.122 225859 DEBUG nova.network.os_vif_util [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=1c883167-abba-4a7e-af5c-33a54aba9ce0,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c883167-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.122 225859 DEBUG os_vif [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=1c883167-abba-4a7e-af5c-33a54aba9ce0,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c883167-ab') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.124 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.124 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c883167-ab, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.125 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.128 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.130 225859 INFO os_vif [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=1c883167-abba-4a7e-af5c-33a54aba9ce0,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c883167-ab')#033[00m
Jan 20 10:10:56 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2-userdata-shm.mount: Deactivated successfully.
Jan 20 10:10:56 np0005588919 systemd[1]: var-lib-containers-storage-overlay-19daef63f9380ca40ee278346b76985622bb726e2f8a3577bdcd88908f79189d-merged.mount: Deactivated successfully.
Jan 20 10:10:56 np0005588919 podman[297897]: 2026-01-20 15:10:56.151989654 +0000 UTC m=+0.078296082 container cleanup ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:10:56 np0005588919 systemd[1]: libpod-conmon-ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2.scope: Deactivated successfully.
Jan 20 10:10:56 np0005588919 podman[297952]: 2026-01-20 15:10:56.215168997 +0000 UTC m=+0.043120855 container remove ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:10:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:56.220 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4f95a62b-7df5-434c-94ce-5bdc6e72cbe2]: (4, ('Tue Jan 20 03:10:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 (ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2)\nce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2\nTue Jan 20 03:10:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 (ce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2)\nce93107e97f2afc68739b0e4240ca20f65ba11abe81e5b38fd289de5c6365fb2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:56.222 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[063b198f-b987-4884-8ea1-46e9962a9b5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:56.223 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb677f1a9-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:10:56 np0005588919 kernel: tapb677f1a9-d0: left promiscuous mode
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.225 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.237 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:56.240 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[73236760-c55e-4ec4-876e-b62927ff67bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:56.252 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c6db4696-6ec4-4d2c-b651-4ee4ae60cb2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:56.253 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8b47aff7-215a-4310-a75c-faf24828f642]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:56.266 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2029f044-45fe-40c3-b52d-d9fdcb2cfba0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 689091, 'reachable_time': 23553, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297972, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:56.268 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:10:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:10:56.268 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[822a9a71-63ff-44e1-9f39-96ba1571aeb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:10:56 np0005588919 systemd[1]: run-netns-ovnmeta\x2db677f1a9\x2ddbaa\x2d4373\x2d8466\x2dbd9ccf067b91.mount: Deactivated successfully.
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.316 225859 INFO nova.virt.libvirt.driver [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Deleting instance files /var/lib/nova/instances/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8_del#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.317 225859 INFO nova.virt.libvirt.driver [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Deletion of /var/lib/nova/instances/35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8_del complete#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.519 225859 INFO nova.compute.manager [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Took 0.66 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.520 225859 DEBUG oslo.service.loopingcall [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.520 225859 DEBUG nova.compute.manager [-] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:10:56 np0005588919 nova_compute[225855]: 2026-01-20 15:10:56.520 225859 DEBUG nova.network.neutron [-] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:10:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:10:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:56.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:10:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:56.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.068 225859 DEBUG nova.network.neutron [-] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.093 225859 INFO nova.compute.manager [-] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Took 0.57 seconds to deallocate network for instance.#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.158 225859 DEBUG nova.compute.manager [req-349a0ed1-33b9-4394-8000-4456a480c885 req-b729d9d1-9999-46a2-bf95-1a87e72129b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Received event network-vif-deleted-1c883167-abba-4a7e-af5c-33a54aba9ce0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.307 225859 INFO nova.compute.manager [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Took 0.21 seconds to detach 1 volumes for instance.#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.309 225859 DEBUG nova.compute.manager [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Deleting volume: f396a213-a7f4-434e-a290-c5d9278be4af _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.408 225859 DEBUG nova.compute.manager [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Received event network-vif-unplugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.408 225859 DEBUG oslo_concurrency.lockutils [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.409 225859 DEBUG oslo_concurrency.lockutils [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.409 225859 DEBUG oslo_concurrency.lockutils [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.409 225859 DEBUG nova.compute.manager [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] No waiting events found dispatching network-vif-unplugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.409 225859 DEBUG nova.compute.manager [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Received event network-vif-unplugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.410 225859 DEBUG nova.compute.manager [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Received event network-vif-plugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.410 225859 DEBUG oslo_concurrency.lockutils [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.410 225859 DEBUG oslo_concurrency.lockutils [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.410 225859 DEBUG oslo_concurrency.lockutils [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.410 225859 DEBUG nova.compute.manager [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] No waiting events found dispatching network-vif-plugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.410 225859 WARNING nova.compute.manager [req-ea74f5c4-40f5-41fe-b400-24613d2dcf43 req-ecc41560-41fc-47b7-8696-e6865b24c08e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Received unexpected event network-vif-plugged-1c883167-abba-4a7e-af5c-33a54aba9ce0 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.578 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.723 225859 DEBUG oslo_concurrency.lockutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.724 225859 DEBUG oslo_concurrency.lockutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:10:57 np0005588919 nova_compute[225855]: 2026-01-20 15:10:57.913 225859 DEBUG oslo_concurrency.processutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:10:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:10:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:58 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2257383779' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.364 225859 DEBUG oslo_concurrency.processutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.370 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.373 225859 DEBUG nova.compute.provider_tree [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.388 225859 DEBUG nova.scheduler.client.report [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.411 225859 DEBUG oslo_concurrency.lockutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.414 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.414 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.414 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.415 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.468 225859 INFO nova.scheduler.client.report [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Deleted allocations for instance 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.556 225859 DEBUG oslo_concurrency.lockutils [None req-b2f8c823-cde5-4dd1-bd8a-b06b9cd3b121 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:10:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:10:58.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:58 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4107587614' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.841 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.990 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 10:10:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:10:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:10:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:10:58.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.993 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4299MB free_disk=20.94251251220703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.993 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:10:58 np0005588919 nova_compute[225855]: 2026-01-20 15:10:58.994 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:10:59 np0005588919 nova_compute[225855]: 2026-01-20 15:10:59.117 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 20 10:10:59 np0005588919 nova_compute[225855]: 2026-01-20 15:10:59.117 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 20 10:10:59 np0005588919 nova_compute[225855]: 2026-01-20 15:10:59.141 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:10:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:10:59 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1208327764' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:10:59 np0005588919 nova_compute[225855]: 2026-01-20 15:10:59.590 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:10:59 np0005588919 nova_compute[225855]: 2026-01-20 15:10:59.599 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:10:59 np0005588919 nova_compute[225855]: 2026-01-20 15:10:59.616 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:10:59 np0005588919 nova_compute[225855]: 2026-01-20 15:10:59.642 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 10:10:59 np0005588919 nova_compute[225855]: 2026-01-20 15:10:59.642 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e387 e387: 3 total, 3 up, 3 in
Jan 20 10:11:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:00.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:00.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:01 np0005588919 nova_compute[225855]: 2026-01-20 15:11:01.126 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:11:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1603287857' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:11:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:11:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1603287857' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:11:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:02.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:02 np0005588919 nova_compute[225855]: 2026-01-20 15:11:02.618 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:02 np0005588919 nova_compute[225855]: 2026-01-20 15:11:02.637 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:11:02 np0005588919 nova_compute[225855]: 2026-01-20 15:11:02.638 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:11:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:02.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e387 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:04.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:04.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:06 np0005588919 nova_compute[225855]: 2026-01-20 15:11:06.127 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e388 e388: 3 total, 3 up, 3 in
Jan 20 10:11:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:06.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:06.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:07 np0005588919 podman[298096]: 2026-01-20 15:11:07.067914048 +0000 UTC m=+0.113544753 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:11:07 np0005588919 nova_compute[225855]: 2026-01-20 15:11:07.355 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:07 np0005588919 nova_compute[225855]: 2026-01-20 15:11:07.355 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:07 np0005588919 nova_compute[225855]: 2026-01-20 15:11:07.380 225859 DEBUG nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 10:11:07 np0005588919 nova_compute[225855]: 2026-01-20 15:11:07.471 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:07 np0005588919 nova_compute[225855]: 2026-01-20 15:11:07.471 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:07 np0005588919 nova_compute[225855]: 2026-01-20 15:11:07.482 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:11:07 np0005588919 nova_compute[225855]: 2026-01-20 15:11:07.483 225859 INFO nova.compute.claims [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Claim successful on node compute-1.ctlplane.example.com
Jan 20 10:11:07 np0005588919 nova_compute[225855]: 2026-01-20 15:11:07.607 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:11:07 np0005588919 nova_compute[225855]: 2026-01-20 15:11:07.643 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:11:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/932732216' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.070 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.076 225859 DEBUG nova.compute.provider_tree [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.105 225859 DEBUG nova.scheduler.client.report [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.123 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.124 225859 DEBUG nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.177 225859 DEBUG nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.178 225859 DEBUG nova.network.neutron [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.198 225859 INFO nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.220 225859 DEBUG nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:11:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.341 225859 DEBUG nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.343 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.344 225859 INFO nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Creating image(s)
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.384 225859 DEBUG nova.storage.rbd_utils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] rbd image b661b8a2-8bea-46be-afe4-537fd2523387_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.426 225859 DEBUG nova.storage.rbd_utils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] rbd image b661b8a2-8bea-46be-afe4-537fd2523387_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.454 225859 DEBUG nova.storage.rbd_utils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] rbd image b661b8a2-8bea-46be-afe4-537fd2523387_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.457 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.493 225859 DEBUG nova.policy [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17140eb73c0b4236807367396cc4959b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eac67fc3f12d4e9f9e47de6b79eea88f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.550 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.551 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.551 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.552 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.576 225859 DEBUG nova.storage.rbd_utils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] rbd image b661b8a2-8bea-46be-afe4-537fd2523387_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.580 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 b661b8a2-8bea-46be-afe4-537fd2523387_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:11:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:08.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.843 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 b661b8a2-8bea-46be-afe4-537fd2523387_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.263s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:08 np0005588919 nova_compute[225855]: 2026-01-20 15:11:08.927 225859 DEBUG nova.storage.rbd_utils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] resizing rbd image b661b8a2-8bea-46be-afe4-537fd2523387_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:11:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:09.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:09 np0005588919 nova_compute[225855]: 2026-01-20 15:11:09.037 225859 DEBUG nova.objects.instance [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lazy-loading 'migration_context' on Instance uuid b661b8a2-8bea-46be-afe4-537fd2523387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:09 np0005588919 nova_compute[225855]: 2026-01-20 15:11:09.098 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:11:09 np0005588919 nova_compute[225855]: 2026-01-20 15:11:09.098 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Ensure instance console log exists: /var/lib/nova/instances/b661b8a2-8bea-46be-afe4-537fd2523387/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:11:09 np0005588919 nova_compute[225855]: 2026-01-20 15:11:09.099 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:09 np0005588919 nova_compute[225855]: 2026-01-20 15:11:09.100 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:09 np0005588919 nova_compute[225855]: 2026-01-20 15:11:09.100 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:09 np0005588919 nova_compute[225855]: 2026-01-20 15:11:09.221 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:09.221 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:11:09 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:09.224 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:11:09 np0005588919 nova_compute[225855]: 2026-01-20 15:11:09.407 225859 DEBUG nova.network.neutron [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Successfully created port: 7af44da8-4a04-4981-9a59-e94e346d071e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:11:10 np0005588919 nova_compute[225855]: 2026-01-20 15:11:10.301 225859 DEBUG nova.network.neutron [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Successfully updated port: 7af44da8-4a04-4981-9a59-e94e346d071e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:11:10 np0005588919 nova_compute[225855]: 2026-01-20 15:11:10.316 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Acquiring lock "refresh_cache-b661b8a2-8bea-46be-afe4-537fd2523387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:11:10 np0005588919 nova_compute[225855]: 2026-01-20 15:11:10.317 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Acquired lock "refresh_cache-b661b8a2-8bea-46be-afe4-537fd2523387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:11:10 np0005588919 nova_compute[225855]: 2026-01-20 15:11:10.317 225859 DEBUG nova.network.neutron [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:11:10 np0005588919 nova_compute[225855]: 2026-01-20 15:11:10.421 225859 DEBUG nova.compute.manager [req-caae48a2-4ec2-4722-b445-33f7aa0d19fc req-80ca1a28-7212-4b44-8074-016c8b5079dd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-changed-7af44da8-4a04-4981-9a59-e94e346d071e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:11:10 np0005588919 nova_compute[225855]: 2026-01-20 15:11:10.421 225859 DEBUG nova.compute.manager [req-caae48a2-4ec2-4722-b445-33f7aa0d19fc req-80ca1a28-7212-4b44-8074-016c8b5079dd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Refreshing instance network info cache due to event network-changed-7af44da8-4a04-4981-9a59-e94e346d071e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:11:10 np0005588919 nova_compute[225855]: 2026-01-20 15:11:10.421 225859 DEBUG oslo_concurrency.lockutils [req-caae48a2-4ec2-4722-b445-33f7aa0d19fc req-80ca1a28-7212-4b44-8074-016c8b5079dd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b661b8a2-8bea-46be-afe4-537fd2523387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:11:10 np0005588919 nova_compute[225855]: 2026-01-20 15:11:10.534 225859 DEBUG nova.network.neutron [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:11:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:10.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:11.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.098 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921856.0979578, 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.099 225859 INFO nova.compute.manager [-] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.119 225859 DEBUG nova.compute.manager [None req-9f2fa34b-086d-491d-af98-71f582a9441f - - - - - -] [instance: 35ba4fcf-baa1-45a5-bf4c-4f5cb96653a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.129 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:11.227 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.485 225859 DEBUG nova.network.neutron [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Updating instance_info_cache with network_info: [{"id": "7af44da8-4a04-4981-9a59-e94e346d071e", "address": "fa:16:3e:a5:4b:49", "network": {"id": "901fb6c4-d3d9-4ccd-b087-e1a254c7cc98", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1118919703-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eac67fc3f12d4e9f9e47de6b79eea88f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7af44da8-4a", "ovs_interfaceid": "7af44da8-4a04-4981-9a59-e94e346d071e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.505 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Releasing lock "refresh_cache-b661b8a2-8bea-46be-afe4-537fd2523387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.505 225859 DEBUG nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Instance network_info: |[{"id": "7af44da8-4a04-4981-9a59-e94e346d071e", "address": "fa:16:3e:a5:4b:49", "network": {"id": "901fb6c4-d3d9-4ccd-b087-e1a254c7cc98", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1118919703-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eac67fc3f12d4e9f9e47de6b79eea88f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7af44da8-4a", "ovs_interfaceid": "7af44da8-4a04-4981-9a59-e94e346d071e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.505 225859 DEBUG oslo_concurrency.lockutils [req-caae48a2-4ec2-4722-b445-33f7aa0d19fc req-80ca1a28-7212-4b44-8074-016c8b5079dd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b661b8a2-8bea-46be-afe4-537fd2523387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.506 225859 DEBUG nova.network.neutron [req-caae48a2-4ec2-4722-b445-33f7aa0d19fc req-80ca1a28-7212-4b44-8074-016c8b5079dd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Refreshing network info cache for port 7af44da8-4a04-4981-9a59-e94e346d071e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.508 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Start _get_guest_xml network_info=[{"id": "7af44da8-4a04-4981-9a59-e94e346d071e", "address": "fa:16:3e:a5:4b:49", "network": {"id": "901fb6c4-d3d9-4ccd-b087-e1a254c7cc98", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1118919703-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eac67fc3f12d4e9f9e47de6b79eea88f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7af44da8-4a", "ovs_interfaceid": "7af44da8-4a04-4981-9a59-e94e346d071e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.513 225859 WARNING nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.517 225859 DEBUG nova.virt.libvirt.host [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.517 225859 DEBUG nova.virt.libvirt.host [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.522 225859 DEBUG nova.virt.libvirt.host [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.522 225859 DEBUG nova.virt.libvirt.host [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.523 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.524 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.524 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.524 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.525 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.525 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.525 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.525 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.526 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.526 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.526 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.526 225859 DEBUG nova.virt.hardware [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:11:11 np0005588919 nova_compute[225855]: 2026-01-20 15:11:11.529 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:11:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1314462438' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.000 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.024 225859 DEBUG nova.storage.rbd_utils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] rbd image b661b8a2-8bea-46be-afe4-537fd2523387_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.028 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:11:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2034628683' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.500 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.502 225859 DEBUG nova.virt.libvirt.vif [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:11:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-1893768580',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servertagstestjson-server-1893768580',id=175,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eac67fc3f12d4e9f9e47de6b79eea88f',ramdisk_id='',reservation_id='r-wkdfou8l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-232429935',owner_user_name='tempest-ServerTagsTestJSON-232429935-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:11:08Z,user_data=None,user_id='17140eb73c0b4236807367396cc4959b',uuid=b661b8a2-8bea-46be-afe4-537fd2523387,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7af44da8-4a04-4981-9a59-e94e346d071e", "address": "fa:16:3e:a5:4b:49", "network": {"id": "901fb6c4-d3d9-4ccd-b087-e1a254c7cc98", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1118919703-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eac67fc3f12d4e9f9e47de6b79eea88f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7af44da8-4a", "ovs_interfaceid": "7af44da8-4a04-4981-9a59-e94e346d071e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.502 225859 DEBUG nova.network.os_vif_util [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Converting VIF {"id": "7af44da8-4a04-4981-9a59-e94e346d071e", "address": "fa:16:3e:a5:4b:49", "network": {"id": "901fb6c4-d3d9-4ccd-b087-e1a254c7cc98", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1118919703-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eac67fc3f12d4e9f9e47de6b79eea88f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7af44da8-4a", "ovs_interfaceid": "7af44da8-4a04-4981-9a59-e94e346d071e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.503 225859 DEBUG nova.network.os_vif_util [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:4b:49,bridge_name='br-int',has_traffic_filtering=True,id=7af44da8-4a04-4981-9a59-e94e346d071e,network=Network(901fb6c4-d3d9-4ccd-b087-e1a254c7cc98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7af44da8-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.504 225859 DEBUG nova.objects.instance [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lazy-loading 'pci_devices' on Instance uuid b661b8a2-8bea-46be-afe4-537fd2523387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.525 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <uuid>b661b8a2-8bea-46be-afe4-537fd2523387</uuid>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <name>instance-000000af</name>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerTagsTestJSON-server-1893768580</nova:name>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:11:11</nova:creationTime>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <nova:user uuid="17140eb73c0b4236807367396cc4959b">tempest-ServerTagsTestJSON-232429935-project-member</nova:user>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <nova:project uuid="eac67fc3f12d4e9f9e47de6b79eea88f">tempest-ServerTagsTestJSON-232429935</nova:project>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <nova:port uuid="7af44da8-4a04-4981-9a59-e94e346d071e">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <entry name="serial">b661b8a2-8bea-46be-afe4-537fd2523387</entry>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <entry name="uuid">b661b8a2-8bea-46be-afe4-537fd2523387</entry>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/b661b8a2-8bea-46be-afe4-537fd2523387_disk">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/b661b8a2-8bea-46be-afe4-537fd2523387_disk.config">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:a5:4b:49"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <target dev="tap7af44da8-4a"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/b661b8a2-8bea-46be-afe4-537fd2523387/console.log" append="off"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:11:12 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:11:12 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:11:12 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:11:12 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.526 225859 DEBUG nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Preparing to wait for external event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.526 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.526 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.527 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.527 225859 DEBUG nova.virt.libvirt.vif [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:11:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-1893768580',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servertagstestjson-server-1893768580',id=175,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eac67fc3f12d4e9f9e47de6b79eea88f',ramdisk_id='',reservation_id='r-wkdfou8l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-232429935',owner_user_name='tempest-ServerTagsTestJSON-232429935-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:11:08Z,user_data=None,user_id='17140eb73c0b4236807367396cc4959b',uuid=b661b8a2-8bea-46be-afe4-537fd2523387,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7af44da8-4a04-4981-9a59-e94e346d071e", "address": "fa:16:3e:a5:4b:49", "network": {"id": "901fb6c4-d3d9-4ccd-b087-e1a254c7cc98", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1118919703-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eac67fc3f12d4e9f9e47de6b79eea88f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7af44da8-4a", "ovs_interfaceid": "7af44da8-4a04-4981-9a59-e94e346d071e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.527 225859 DEBUG nova.network.os_vif_util [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Converting VIF {"id": "7af44da8-4a04-4981-9a59-e94e346d071e", "address": "fa:16:3e:a5:4b:49", "network": {"id": "901fb6c4-d3d9-4ccd-b087-e1a254c7cc98", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1118919703-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eac67fc3f12d4e9f9e47de6b79eea88f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7af44da8-4a", "ovs_interfaceid": "7af44da8-4a04-4981-9a59-e94e346d071e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.528 225859 DEBUG nova.network.os_vif_util [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:4b:49,bridge_name='br-int',has_traffic_filtering=True,id=7af44da8-4a04-4981-9a59-e94e346d071e,network=Network(901fb6c4-d3d9-4ccd-b087-e1a254c7cc98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7af44da8-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.528 225859 DEBUG os_vif [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:4b:49,bridge_name='br-int',has_traffic_filtering=True,id=7af44da8-4a04-4981-9a59-e94e346d071e,network=Network(901fb6c4-d3d9-4ccd-b087-e1a254c7cc98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7af44da8-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.529 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.529 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.530 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.532 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.533 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7af44da8-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.533 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7af44da8-4a, col_values=(('external_ids', {'iface-id': '7af44da8-4a04-4981-9a59-e94e346d071e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:4b:49', 'vm-uuid': 'b661b8a2-8bea-46be-afe4-537fd2523387'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.535 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:12 np0005588919 NetworkManager[49104]: <info>  [1768921872.5362] manager: (tap7af44da8-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.538 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.544 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.545 225859 INFO os_vif [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:4b:49,bridge_name='br-int',has_traffic_filtering=True,id=7af44da8-4a04-4981-9a59-e94e346d071e,network=Network(901fb6c4-d3d9-4ccd-b087-e1a254c7cc98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7af44da8-4a')#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.617 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.618 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.618 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] No VIF found with MAC fa:16:3e:a5:4b:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.619 225859 INFO nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Using config drive#033[00m
Jan 20 10:11:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:12.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.654 225859 DEBUG nova.storage.rbd_utils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] rbd image b661b8a2-8bea-46be-afe4-537fd2523387_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:12 np0005588919 nova_compute[225855]: 2026-01-20 15:11:12.658 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:13.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.116 225859 INFO nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Creating config drive at /var/lib/nova/instances/b661b8a2-8bea-46be-afe4-537fd2523387/disk.config#033[00m
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.126 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b661b8a2-8bea-46be-afe4-537fd2523387/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp62q643zy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.159 225859 DEBUG nova.network.neutron [req-caae48a2-4ec2-4722-b445-33f7aa0d19fc req-80ca1a28-7212-4b44-8074-016c8b5079dd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Updated VIF entry in instance network info cache for port 7af44da8-4a04-4981-9a59-e94e346d071e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.161 225859 DEBUG nova.network.neutron [req-caae48a2-4ec2-4722-b445-33f7aa0d19fc req-80ca1a28-7212-4b44-8074-016c8b5079dd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Updating instance_info_cache with network_info: [{"id": "7af44da8-4a04-4981-9a59-e94e346d071e", "address": "fa:16:3e:a5:4b:49", "network": {"id": "901fb6c4-d3d9-4ccd-b087-e1a254c7cc98", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1118919703-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eac67fc3f12d4e9f9e47de6b79eea88f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7af44da8-4a", "ovs_interfaceid": "7af44da8-4a04-4981-9a59-e94e346d071e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.183 225859 DEBUG oslo_concurrency.lockutils [req-caae48a2-4ec2-4722-b445-33f7aa0d19fc req-80ca1a28-7212-4b44-8074-016c8b5079dd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b661b8a2-8bea-46be-afe4-537fd2523387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.263 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b661b8a2-8bea-46be-afe4-537fd2523387/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp62q643zy" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.300 225859 DEBUG nova.storage.rbd_utils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] rbd image b661b8a2-8bea-46be-afe4-537fd2523387_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.306 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b661b8a2-8bea-46be-afe4-537fd2523387/disk.config b661b8a2-8bea-46be-afe4-537fd2523387_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:11:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.485 225859 DEBUG oslo_concurrency.processutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b661b8a2-8bea-46be-afe4-537fd2523387/disk.config b661b8a2-8bea-46be-afe4-537fd2523387_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.178s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.486 225859 INFO nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Deleting local config drive /var/lib/nova/instances/b661b8a2-8bea-46be-afe4-537fd2523387/disk.config because it was imported into RBD.
Jan 20 10:11:13 np0005588919 kernel: tap7af44da8-4a: entered promiscuous mode
Jan 20 10:11:13 np0005588919 NetworkManager[49104]: <info>  [1768921873.5351] manager: (tap7af44da8-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/326)
Jan 20 10:11:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:13Z|00788|binding|INFO|Claiming lport 7af44da8-4a04-4981-9a59-e94e346d071e for this chassis.
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.536 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:13Z|00789|binding|INFO|7af44da8-4a04-4981-9a59-e94e346d071e: Claiming fa:16:3e:a5:4b:49 10.100.0.8
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.541 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.552 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:4b:49 10.100.0.8'], port_security=['fa:16:3e:a5:4b:49 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b661b8a2-8bea-46be-afe4-537fd2523387', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eac67fc3f12d4e9f9e47de6b79eea88f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd744446f-90e4-4fa0-a2ba-33abbbe03344', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff7bbda1-70af-44d2-a4e3-9c7dbd69983d, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=7af44da8-4a04-4981-9a59-e94e346d071e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.554 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 7af44da8-4a04-4981-9a59-e94e346d071e in datapath 901fb6c4-d3d9-4ccd-b087-e1a254c7cc98 bound to our chassis
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.556 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 901fb6c4-d3d9-4ccd-b087-e1a254c7cc98
Jan 20 10:11:13 np0005588919 systemd-udevd[298445]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.566 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3aa4b6-fbd9-4680-a550-57700a490455]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.568 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap901fb6c4-d1 in ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.569 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap901fb6c4-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.569 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0167dad4-04db-4436-9963-56b87f42d097]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.570 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eb309a56-916e-4426-a6ff-8d7a87be834c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 NetworkManager[49104]: <info>  [1768921873.5762] device (tap7af44da8-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:11:13 np0005588919 NetworkManager[49104]: <info>  [1768921873.5781] device (tap7af44da8-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:11:13 np0005588919 systemd-machined[194361]: New machine qemu-91-instance-000000af.
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.583 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[029d9c6f-6cb6-440f-9b4f-02bd84051132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.604 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.605 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dbbd4643-f09a-4b14-aa27-9aad453a35da]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 systemd[1]: Started Virtual Machine qemu-91-instance-000000af.
Jan 20 10:11:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:13Z|00790|binding|INFO|Setting lport 7af44da8-4a04-4981-9a59-e94e346d071e ovn-installed in OVS
Jan 20 10:11:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:13Z|00791|binding|INFO|Setting lport 7af44da8-4a04-4981-9a59-e94e346d071e up in Southbound
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.612 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.634 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[93de9b1e-b811-4474-8aeb-bf2f02020aa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 systemd-udevd[298450]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:11:13 np0005588919 NetworkManager[49104]: <info>  [1768921873.6408] manager: (tap901fb6c4-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/327)
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.640 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[373fcd8f-be68-485a-860a-20295029bf82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.668 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4ecec0-fcae-4c39-881f-1d317e73090a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.672 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[54846de8-3321-44e8-a8e3-5cfab4b626db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 NetworkManager[49104]: <info>  [1768921873.6945] device (tap901fb6c4-d0): carrier: link connected
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.701 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c07daa86-6bca-4c08-b901-22949ae35337]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.719 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[db9b66ae-4a38-481a-86fb-c8b5f4a45a40]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap901fb6c4-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:d6:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 691262, 'reachable_time': 16516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298481, 'error': None, 'target': 'ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.735 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b4c82197-b2fe-42b4-b1f0-482670580025]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:d637'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 691262, 'tstamp': 691262}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 298482, 'error': None, 'target': 'ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.754 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[842062b4-bd57-4081-9cef-fe1db793ece8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap901fb6c4-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:d6:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 691262, 'reachable_time': 16516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 298483, 'error': None, 'target': 'ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.782 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[842062b4-bd57-4081-9cef-fe1db793ece8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.822 225859 DEBUG nova.compute.manager [req-2613c3b2-6f92-470e-a509-0bd5aa207be6 req-5c751510-433f-445b-ba03-96837db9e7a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.822 225859 DEBUG oslo_concurrency.lockutils [req-2613c3b2-6f92-470e-a509-0bd5aa207be6 req-5c751510-433f-445b-ba03-96837db9e7a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.823 225859 DEBUG oslo_concurrency.lockutils [req-2613c3b2-6f92-470e-a509-0bd5aa207be6 req-5c751510-433f-445b-ba03-96837db9e7a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.823 225859 DEBUG oslo_concurrency.lockutils [req-2613c3b2-6f92-470e-a509-0bd5aa207be6 req-5c751510-433f-445b-ba03-96837db9e7a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.823 225859 DEBUG nova.compute.manager [req-2613c3b2-6f92-470e-a509-0bd5aa207be6 req-5c751510-433f-445b-ba03-96837db9e7a5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Processing event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.841 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fb8e1d83-6dc5-4520-b8a2-7220af6611ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.842 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap901fb6c4-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.843 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.843 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap901fb6c4-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.844 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:13 np0005588919 NetworkManager[49104]: <info>  [1768921873.8454] manager: (tap901fb6c4-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Jan 20 10:11:13 np0005588919 kernel: tap901fb6c4-d0: entered promiscuous mode
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.846 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.850 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap901fb6c4-d0, col_values=(('external_ids', {'iface-id': '46940fc6-bbdc-4a44-b120-2c8feb7a2a8b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.851 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:13Z|00792|binding|INFO|Releasing lport 46940fc6-bbdc-4a44-b120-2c8feb7a2a8b from this chassis (sb_readonly=0)
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.852 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.854 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/901fb6c4-d3d9-4ccd-b087-e1a254c7cc98.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/901fb6c4-d3d9-4ccd-b087-e1a254c7cc98.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.855 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[72cbbecc-6b10-4ee8-a181-41f145e99a46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.857 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/901fb6c4-d3d9-4ccd-b087-e1a254c7cc98.pid.haproxy
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 901fb6c4-d3d9-4ccd-b087-e1a254c7cc98
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 20 10:11:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:13.858 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98', 'env', 'PROCESS_TAG=haproxy-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/901fb6c4-d3d9-4ccd-b087-e1a254c7cc98.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.865 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.974 225859 DEBUG nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.975 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921873.974276, b661b8a2-8bea-46be-afe4-537fd2523387 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.975 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] VM Started (Lifecycle Event)
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.980 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.984 225859 INFO nova.virt.libvirt.driver [-] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Instance spawned successfully.
Jan 20 10:11:13 np0005588919 nova_compute[225855]: 2026-01-20 15:11:13.984 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.016 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.021 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.022 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.023 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.023 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.024 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.024 225859 DEBUG nova.virt.libvirt.driver [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.031 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.062 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.063 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921873.97832, b661b8a2-8bea-46be-afe4-537fd2523387 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.063 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.263 225859 INFO nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Took 5.92 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.264 225859 DEBUG nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.266 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.278 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921873.9803827, b661b8a2-8bea-46be-afe4-537fd2523387 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.278 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:11:14 np0005588919 podman[298555]: 2026-01-20 15:11:14.285129205 +0000 UTC m=+0.062528505 container create 39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:11:14 np0005588919 podman[298555]: 2026-01-20 15:11:14.246633163 +0000 UTC m=+0.024032483 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:11:14 np0005588919 systemd[1]: Started libpod-conmon-39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b.scope.
Jan 20 10:11:14 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.388 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:14 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64736f063f2c1d26d577945ba7fc6e4bee2fe41a524dd9cef1de3e35037e7f23/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.398 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:11:14 np0005588919 podman[298555]: 2026-01-20 15:11:14.403251777 +0000 UTC m=+0.180651107 container init 39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:11:14 np0005588919 podman[298555]: 2026-01-20 15:11:14.410434441 +0000 UTC m=+0.187833741 container start 39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.426 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.428 225859 INFO nova.compute.manager [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Took 7.00 seconds to build instance.#033[00m
Jan 20 10:11:14 np0005588919 neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98[298570]: [NOTICE]   (298574) : New worker (298576) forked
Jan 20 10:11:14 np0005588919 neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98[298570]: [NOTICE]   (298574) : Loading success.
Jan 20 10:11:14 np0005588919 nova_compute[225855]: 2026-01-20 15:11:14.451 225859 DEBUG oslo_concurrency.lockutils [None req-e06e1530-9323-4750-87da-ce51787654ba 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:14.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:15.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:11:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/292567124' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:11:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:11:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/292567124' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:11:15 np0005588919 nova_compute[225855]: 2026-01-20 15:11:15.910 225859 DEBUG nova.compute.manager [req-c9d531f8-7d03-45b4-af5a-f18c988143bf req-77d546d8-d984-40a1-98ff-8b4fd81587e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:11:15 np0005588919 nova_compute[225855]: 2026-01-20 15:11:15.910 225859 DEBUG oslo_concurrency.lockutils [req-c9d531f8-7d03-45b4-af5a-f18c988143bf req-77d546d8-d984-40a1-98ff-8b4fd81587e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:15 np0005588919 nova_compute[225855]: 2026-01-20 15:11:15.910 225859 DEBUG oslo_concurrency.lockutils [req-c9d531f8-7d03-45b4-af5a-f18c988143bf req-77d546d8-d984-40a1-98ff-8b4fd81587e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:15 np0005588919 nova_compute[225855]: 2026-01-20 15:11:15.911 225859 DEBUG oslo_concurrency.lockutils [req-c9d531f8-7d03-45b4-af5a-f18c988143bf req-77d546d8-d984-40a1-98ff-8b4fd81587e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:15 np0005588919 nova_compute[225855]: 2026-01-20 15:11:15.911 225859 DEBUG nova.compute.manager [req-c9d531f8-7d03-45b4-af5a-f18c988143bf req-77d546d8-d984-40a1-98ff-8b4fd81587e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] No waiting events found dispatching network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:11:15 np0005588919 nova_compute[225855]: 2026-01-20 15:11:15.911 225859 WARNING nova.compute.manager [req-c9d531f8-7d03-45b4-af5a-f18c988143bf req-77d546d8-d984-40a1-98ff-8b4fd81587e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received unexpected event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e for instance with vm_state active and task_state None.#033[00m
Jan 20 10:11:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:16.430 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:16.431 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:16.432 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:16.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:17.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:17 np0005588919 nova_compute[225855]: 2026-01-20 15:11:17.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:17 np0005588919 nova_compute[225855]: 2026-01-20 15:11:17.536 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:17 np0005588919 nova_compute[225855]: 2026-01-20 15:11:17.623 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:18.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:19.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:11:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:11:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:11:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.781 225859 DEBUG oslo_concurrency.lockutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.781 225859 DEBUG oslo_concurrency.lockutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.781 225859 DEBUG oslo_concurrency.lockutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.782 225859 DEBUG oslo_concurrency.lockutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.782 225859 DEBUG oslo_concurrency.lockutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.783 225859 INFO nova.compute.manager [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Terminating instance#033[00m
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.783 225859 DEBUG nova.compute.manager [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:11:19 np0005588919 kernel: tap7af44da8-4a (unregistering): left promiscuous mode
Jan 20 10:11:19 np0005588919 NetworkManager[49104]: <info>  [1768921879.8321] device (tap7af44da8-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:11:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:19Z|00793|binding|INFO|Releasing lport 7af44da8-4a04-4981-9a59-e94e346d071e from this chassis (sb_readonly=0)
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.847 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:19Z|00794|binding|INFO|Setting lport 7af44da8-4a04-4981-9a59-e94e346d071e down in Southbound
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.849 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:19Z|00795|binding|INFO|Removing iface tap7af44da8-4a ovn-installed in OVS
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.850 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:19.855 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:4b:49 10.100.0.8'], port_security=['fa:16:3e:a5:4b:49 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b661b8a2-8bea-46be-afe4-537fd2523387', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eac67fc3f12d4e9f9e47de6b79eea88f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd744446f-90e4-4fa0-a2ba-33abbbe03344', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff7bbda1-70af-44d2-a4e3-9c7dbd69983d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=7af44da8-4a04-4981-9a59-e94e346d071e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:11:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:19.856 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 7af44da8-4a04-4981-9a59-e94e346d071e in datapath 901fb6c4-d3d9-4ccd-b087-e1a254c7cc98 unbound from our chassis#033[00m
Jan 20 10:11:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:19.857 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 901fb6c4-d3d9-4ccd-b087-e1a254c7cc98, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:11:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:19.858 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[50f78dcd-c7d8-42e5-996f-c41239078f1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:19 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:19.859 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98 namespace which is not needed anymore#033[00m
Jan 20 10:11:19 np0005588919 nova_compute[225855]: 2026-01-20 15:11:19.867 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:19 np0005588919 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000af.scope: Deactivated successfully.
Jan 20 10:11:19 np0005588919 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000af.scope: Consumed 6.393s CPU time.
Jan 20 10:11:19 np0005588919 systemd-machined[194361]: Machine qemu-91-instance-000000af terminated.
Jan 20 10:11:19 np0005588919 neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98[298570]: [NOTICE]   (298574) : haproxy version is 2.8.14-c23fe91
Jan 20 10:11:19 np0005588919 neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98[298570]: [NOTICE]   (298574) : path to executable is /usr/sbin/haproxy
Jan 20 10:11:19 np0005588919 neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98[298570]: [WARNING]  (298574) : Exiting Master process...
Jan 20 10:11:19 np0005588919 neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98[298570]: [ALERT]    (298574) : Current worker (298576) exited with code 143 (Terminated)
Jan 20 10:11:19 np0005588919 neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98[298570]: [WARNING]  (298574) : All workers exited. Exiting... (0)
Jan 20 10:11:19 np0005588919 systemd[1]: libpod-39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b.scope: Deactivated successfully.
Jan 20 10:11:19 np0005588919 podman[298795]: 2026-01-20 15:11:19.988452698 +0000 UTC m=+0.044450582 container died 39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:11:20 np0005588919 kernel: tap7af44da8-4a: entered promiscuous mode
Jan 20 10:11:20 np0005588919 NetworkManager[49104]: <info>  [1768921880.0079] manager: (tap7af44da8-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/329)
Jan 20 10:11:20 np0005588919 systemd-udevd[298774]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:11:20 np0005588919 kernel: tap7af44da8-4a (unregistering): left promiscuous mode
Jan 20 10:11:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:20Z|00796|binding|INFO|Claiming lport 7af44da8-4a04-4981-9a59-e94e346d071e for this chassis.
Jan 20 10:11:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:20Z|00797|binding|INFO|7af44da8-4a04-4981-9a59-e94e346d071e: Claiming fa:16:3e:a5:4b:49 10.100.0.8
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.012 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.018 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:4b:49 10.100.0.8'], port_security=['fa:16:3e:a5:4b:49 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b661b8a2-8bea-46be-afe4-537fd2523387', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eac67fc3f12d4e9f9e47de6b79eea88f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd744446f-90e4-4fa0-a2ba-33abbbe03344', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff7bbda1-70af-44d2-a4e3-9c7dbd69983d, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=7af44da8-4a04-4981-9a59-e94e346d071e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:11:20 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b-userdata-shm.mount: Deactivated successfully.
Jan 20 10:11:20 np0005588919 systemd[1]: var-lib-containers-storage-overlay-64736f063f2c1d26d577945ba7fc6e4bee2fe41a524dd9cef1de3e35037e7f23-merged.mount: Deactivated successfully.
Jan 20 10:11:20 np0005588919 podman[298795]: 2026-01-20 15:11:20.028190656 +0000 UTC m=+0.084188540 container cleanup 39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.030 225859 INFO nova.virt.libvirt.driver [-] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Instance destroyed successfully.#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.031 225859 DEBUG nova.objects.instance [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lazy-loading 'resources' on Instance uuid b661b8a2-8bea-46be-afe4-537fd2523387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:20Z|00798|binding|INFO|Setting lport 7af44da8-4a04-4981-9a59-e94e346d071e ovn-installed in OVS
Jan 20 10:11:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:20Z|00799|binding|INFO|Setting lport 7af44da8-4a04-4981-9a59-e94e346d071e up in Southbound
Jan 20 10:11:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:20Z|00800|binding|INFO|Releasing lport 7af44da8-4a04-4981-9a59-e94e346d071e from this chassis (sb_readonly=1)
Jan 20 10:11:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:20Z|00801|if_status|INFO|Dropped 2 log messages in last 103 seconds (most recently, 103 seconds ago) due to excessive rate
Jan 20 10:11:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:20Z|00802|if_status|INFO|Not setting lport 7af44da8-4a04-4981-9a59-e94e346d071e down as sb is readonly
Jan 20 10:11:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:20Z|00803|binding|INFO|Removing iface tap7af44da8-4a ovn-installed in OVS
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.034 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:20 np0005588919 systemd[1]: libpod-conmon-39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b.scope: Deactivated successfully.
Jan 20 10:11:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:20Z|00804|binding|INFO|Releasing lport 7af44da8-4a04-4981-9a59-e94e346d071e from this chassis (sb_readonly=0)
Jan 20 10:11:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:11:20Z|00805|binding|INFO|Setting lport 7af44da8-4a04-4981-9a59-e94e346d071e down in Southbound
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.047 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.049 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:4b:49 10.100.0.8'], port_security=['fa:16:3e:a5:4b:49 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b661b8a2-8bea-46be-afe4-537fd2523387', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eac67fc3f12d4e9f9e47de6b79eea88f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd744446f-90e4-4fa0-a2ba-33abbbe03344', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff7bbda1-70af-44d2-a4e3-9c7dbd69983d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=7af44da8-4a04-4981-9a59-e94e346d071e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.050 225859 DEBUG nova.virt.libvirt.vif [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:11:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-1893768580',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servertagstestjson-server-1893768580',id=175,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:11:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eac67fc3f12d4e9f9e47de6b79eea88f',ramdisk_id='',reservation_id='r-wkdfou8l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name=
'tempest-ServerTagsTestJSON-232429935',owner_user_name='tempest-ServerTagsTestJSON-232429935-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:11:14Z,user_data=None,user_id='17140eb73c0b4236807367396cc4959b',uuid=b661b8a2-8bea-46be-afe4-537fd2523387,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7af44da8-4a04-4981-9a59-e94e346d071e", "address": "fa:16:3e:a5:4b:49", "network": {"id": "901fb6c4-d3d9-4ccd-b087-e1a254c7cc98", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1118919703-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eac67fc3f12d4e9f9e47de6b79eea88f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7af44da8-4a", "ovs_interfaceid": "7af44da8-4a04-4981-9a59-e94e346d071e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.051 225859 DEBUG nova.network.os_vif_util [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Converting VIF {"id": "7af44da8-4a04-4981-9a59-e94e346d071e", "address": "fa:16:3e:a5:4b:49", "network": {"id": "901fb6c4-d3d9-4ccd-b087-e1a254c7cc98", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1118919703-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eac67fc3f12d4e9f9e47de6b79eea88f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7af44da8-4a", "ovs_interfaceid": "7af44da8-4a04-4981-9a59-e94e346d071e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.052 225859 DEBUG nova.network.os_vif_util [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:4b:49,bridge_name='br-int',has_traffic_filtering=True,id=7af44da8-4a04-4981-9a59-e94e346d071e,network=Network(901fb6c4-d3d9-4ccd-b087-e1a254c7cc98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7af44da8-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.052 225859 DEBUG os_vif [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:4b:49,bridge_name='br-int',has_traffic_filtering=True,id=7af44da8-4a04-4981-9a59-e94e346d071e,network=Network(901fb6c4-d3d9-4ccd-b087-e1a254c7cc98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7af44da8-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.054 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.054 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7af44da8-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.056 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.057 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.059 225859 INFO os_vif [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:4b:49,bridge_name='br-int',has_traffic_filtering=True,id=7af44da8-4a04-4981-9a59-e94e346d071e,network=Network(901fb6c4-d3d9-4ccd-b087-e1a254c7cc98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7af44da8-4a')#033[00m
Jan 20 10:11:20 np0005588919 podman[298826]: 2026-01-20 15:11:20.111728466 +0000 UTC m=+0.053761807 container remove 39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.116 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d96bf1-9327-4f6c-aa6d-3665317d1422]: (4, ('Tue Jan 20 03:11:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98 (39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b)\n39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b\nTue Jan 20 03:11:20 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98 (39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b)\n39c10f61ce58f80611efb9597b63357309abfc9bfb51660b161305b57e340c4b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.118 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b26baaad-270f-430b-a6bf-8458abd57bf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.120 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap901fb6c4-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.122 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:20 np0005588919 kernel: tap901fb6c4-d0: left promiscuous mode
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.135 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.136 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[466a7dc3-6841-4a57-a21b-35d24ec98752]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:20 np0005588919 podman[298827]: 2026-01-20 15:11:20.144727722 +0000 UTC m=+0.079904248 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.152 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[34862ac6-1ff7-4815-95e1-0420c8f287f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.153 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[518126f2-93b2-4761-9c4e-8e008b815d29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.169 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[56a79a4c-ddb4-4220-a1da-9934d430ea4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 691255, 'reachable_time': 34255, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298878, 'error': None, 'target': 'ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.172 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-901fb6c4-d3d9-4ccd-b087-e1a254c7cc98 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.172 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[3abd030f-c874-435b-b42d-34abf6f66f68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:20 np0005588919 systemd[1]: run-netns-ovnmeta\x2d901fb6c4\x2dd3d9\x2d4ccd\x2db087\x2de1a254c7cc98.mount: Deactivated successfully.
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.173 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 7af44da8-4a04-4981-9a59-e94e346d071e in datapath 901fb6c4-d3d9-4ccd-b087-e1a254c7cc98 unbound from our chassis#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.175 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 901fb6c4-d3d9-4ccd-b087-e1a254c7cc98, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.176 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bc37c0f2-58ab-4d32-a163-cb185130006e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.176 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 7af44da8-4a04-4981-9a59-e94e346d071e in datapath 901fb6c4-d3d9-4ccd-b087-e1a254c7cc98 unbound from our chassis#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.178 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 901fb6c4-d3d9-4ccd-b087-e1a254c7cc98, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:11:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:11:20.178 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[78db79e2-0a21-453c-9b3d-4e965c7e0e9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.180 225859 DEBUG nova.compute.manager [req-d5a1d988-97f8-4370-a6a9-5c23ce61f4f0 req-63040d03-16ff-4c43-b3de-1aba204e434e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-vif-unplugged-7af44da8-4a04-4981-9a59-e94e346d071e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.180 225859 DEBUG oslo_concurrency.lockutils [req-d5a1d988-97f8-4370-a6a9-5c23ce61f4f0 req-63040d03-16ff-4c43-b3de-1aba204e434e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.180 225859 DEBUG oslo_concurrency.lockutils [req-d5a1d988-97f8-4370-a6a9-5c23ce61f4f0 req-63040d03-16ff-4c43-b3de-1aba204e434e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.181 225859 DEBUG oslo_concurrency.lockutils [req-d5a1d988-97f8-4370-a6a9-5c23ce61f4f0 req-63040d03-16ff-4c43-b3de-1aba204e434e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.181 225859 DEBUG nova.compute.manager [req-d5a1d988-97f8-4370-a6a9-5c23ce61f4f0 req-63040d03-16ff-4c43-b3de-1aba204e434e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] No waiting events found dispatching network-vif-unplugged-7af44da8-4a04-4981-9a59-e94e346d071e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.181 225859 DEBUG nova.compute.manager [req-d5a1d988-97f8-4370-a6a9-5c23ce61f4f0 req-63040d03-16ff-4c43-b3de-1aba204e434e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-vif-unplugged-7af44da8-4a04-4981-9a59-e94e346d071e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.431 225859 INFO nova.virt.libvirt.driver [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Deleting instance files /var/lib/nova/instances/b661b8a2-8bea-46be-afe4-537fd2523387_del
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.431 225859 INFO nova.virt.libvirt.driver [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Deletion of /var/lib/nova/instances/b661b8a2-8bea-46be-afe4-537fd2523387_del complete
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.480 225859 INFO nova.compute.manager [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Took 0.70 seconds to destroy the instance on the hypervisor.
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.480 225859 DEBUG oslo.service.loopingcall [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.481 225859 DEBUG nova.compute.manager [-] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 20 10:11:20 np0005588919 nova_compute[225855]: 2026-01-20 15:11:20.481 225859 DEBUG nova.network.neutron [-] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 20 10:11:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:20.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e389 e389: 3 total, 3 up, 3 in
Jan 20 10:11:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:21.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:21 np0005588919 nova_compute[225855]: 2026-01-20 15:11:21.530 225859 DEBUG nova.network.neutron [-] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:11:21 np0005588919 nova_compute[225855]: 2026-01-20 15:11:21.545 225859 INFO nova.compute.manager [-] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Took 1.06 seconds to deallocate network for instance.
Jan 20 10:11:21 np0005588919 nova_compute[225855]: 2026-01-20 15:11:21.592 225859 DEBUG oslo_concurrency.lockutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:21 np0005588919 nova_compute[225855]: 2026-01-20 15:11:21.593 225859 DEBUG oslo_concurrency.lockutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:21 np0005588919 nova_compute[225855]: 2026-01-20 15:11:21.672 225859 DEBUG oslo_concurrency.processutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:11:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:11:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1813241616' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.120 225859 DEBUG oslo_concurrency.processutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.126 225859 DEBUG nova.compute.provider_tree [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.164 225859 DEBUG nova.scheduler.client.report [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.208 225859 DEBUG oslo_concurrency.lockutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.248 225859 INFO nova.scheduler.client.report [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Deleted allocations for instance b661b8a2-8bea-46be-afe4-537fd2523387
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.299 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.300 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.300 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.300 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.300 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] No waiting events found dispatching network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.300 225859 WARNING nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received unexpected event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e for instance with vm_state deleted and task_state None.
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.301 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.301 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.301 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.301 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.301 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] No waiting events found dispatching network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.302 225859 WARNING nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received unexpected event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e for instance with vm_state deleted and task_state None.
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.302 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.302 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.302 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.302 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.303 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] No waiting events found dispatching network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.303 225859 WARNING nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received unexpected event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e for instance with vm_state deleted and task_state None.
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.303 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-vif-unplugged-7af44da8-4a04-4981-9a59-e94e346d071e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.303 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.303 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.303 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.304 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] No waiting events found dispatching network-vif-unplugged-7af44da8-4a04-4981-9a59-e94e346d071e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.304 225859 WARNING nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received unexpected event network-vif-unplugged-7af44da8-4a04-4981-9a59-e94e346d071e for instance with vm_state deleted and task_state None.
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.304 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.304 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.304 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.305 225859 DEBUG oslo_concurrency.lockutils [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.305 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] No waiting events found dispatching network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.305 225859 WARNING nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received unexpected event network-vif-plugged-7af44da8-4a04-4981-9a59-e94e346d071e for instance with vm_state deleted and task_state None.
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.305 225859 DEBUG nova.compute.manager [req-4d3df9ae-d717-4425-9fae-419135dee5ff req-1deeb906-6935-4fc0-a0cf-d2b33d57ae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Received event network-vif-deleted-7af44da8-4a04-4981-9a59-e94e346d071e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.321 225859 DEBUG oslo_concurrency.lockutils [None req-8519ce3d-f8cd-4ecb-bd73-14d2719b55b8 17140eb73c0b4236807367396cc4959b eac67fc3f12d4e9f9e47de6b79eea88f - - default default] Lock "b661b8a2-8bea-46be-afe4-537fd2523387" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:11:22 np0005588919 nova_compute[225855]: 2026-01-20 15:11:22.625 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:22.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:23.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e389 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:24.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:25.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:25 np0005588919 nova_compute[225855]: 2026-01-20 15:11:25.057 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:11:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:11:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e390 e390: 3 total, 3 up, 3 in
Jan 20 10:11:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:26.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:27.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:27 np0005588919 NetworkManager[49104]: <info>  [1768921887.3456] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/330)
Jan 20 10:11:27 np0005588919 nova_compute[225855]: 2026-01-20 15:11:27.344 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:27 np0005588919 NetworkManager[49104]: <info>  [1768921887.3463] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Jan 20 10:11:27 np0005588919 nova_compute[225855]: 2026-01-20 15:11:27.481 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:27 np0005588919 nova_compute[225855]: 2026-01-20 15:11:27.493 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:27 np0005588919 nova_compute[225855]: 2026-01-20 15:11:27.706 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:27 np0005588919 nova_compute[225855]: 2026-01-20 15:11:27.718 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:28.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:29.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:30 np0005588919 nova_compute[225855]: 2026-01-20 15:11:30.059 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:30.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:31.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:32.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:32 np0005588919 nova_compute[225855]: 2026-01-20 15:11:32.709 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:11:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:33.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:34 np0005588919 nova_compute[225855]: 2026-01-20 15:11:34.504 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "7a223382-86d1-478e-8324-01ef43aef7e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:34 np0005588919 nova_compute[225855]: 2026-01-20 15:11:34.504 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "7a223382-86d1-478e-8324-01ef43aef7e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:34 np0005588919 nova_compute[225855]: 2026-01-20 15:11:34.523 225859 DEBUG nova.compute.manager [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 10:11:34 np0005588919 nova_compute[225855]: 2026-01-20 15:11:34.630 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:11:34 np0005588919 nova_compute[225855]: 2026-01-20 15:11:34.630 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:11:34 np0005588919 nova_compute[225855]: 2026-01-20 15:11:34.638 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:11:34 np0005588919 nova_compute[225855]: 2026-01-20 15:11:34.639 225859 INFO nova.compute.claims [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Claim successful on node compute-1.ctlplane.example.com
Jan 20 10:11:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:34.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:34 np0005588919 nova_compute[225855]: 2026-01-20 15:11:34.757 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:35.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.029 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921880.0275724, b661b8a2-8bea-46be-afe4-537fd2523387 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.029 225859 INFO nova.compute.manager [-] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.062 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.066 225859 DEBUG nova.compute.manager [None req-d2297132-de63-4433-9602-0dcd0c57feff - - - - - -] [instance: b661b8a2-8bea-46be-afe4-537fd2523387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:11:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1087186560' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.186 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.191 225859 DEBUG nova.compute.provider_tree [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.218 225859 DEBUG nova.scheduler.client.report [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.239 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.240 225859 DEBUG nova.compute.manager [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.292 225859 DEBUG nova.compute.manager [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.308 225859 INFO nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.331 225859 DEBUG nova.compute.manager [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.441 225859 DEBUG nova.compute.manager [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.443 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.444 225859 INFO nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Creating image(s)#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.477 225859 DEBUG nova.storage.rbd_utils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.505 225859 DEBUG nova.storage.rbd_utils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.536 225859 DEBUG nova.storage.rbd_utils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.540 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.621 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.622 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.622 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.622 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.651 225859 DEBUG nova.storage.rbd_utils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:35 np0005588919 nova_compute[225855]: 2026-01-20 15:11:35.654 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 7a223382-86d1-478e-8324-01ef43aef7e1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.028 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 7a223382-86d1-478e-8324-01ef43aef7e1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.374s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.125 225859 DEBUG nova.storage.rbd_utils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] resizing rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.237 225859 DEBUG nova.objects.instance [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lazy-loading 'migration_context' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.252 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.253 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Ensure instance console log exists: /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.253 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.253 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.254 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.255 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.259 225859 WARNING nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.265 225859 DEBUG nova.virt.libvirt.host [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.266 225859 DEBUG nova.virt.libvirt.host [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.269 225859 DEBUG nova.virt.libvirt.host [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.269 225859 DEBUG nova.virt.libvirt.host [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.271 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.271 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.271 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.271 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.271 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.272 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.272 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.272 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.272 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.272 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.272 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.273 225859 DEBUG nova.virt.hardware [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.275 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:36.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.779 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.812 225859 DEBUG nova.storage.rbd_utils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:36 np0005588919 nova_compute[225855]: 2026-01-20 15:11:36.817 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:37.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:11:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3661868601' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.266 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.268 225859 DEBUG nova.objects.instance [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lazy-loading 'pci_devices' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.295 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <uuid>7a223382-86d1-478e-8324-01ef43aef7e1</uuid>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <name>instance-000000b1</name>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerShowV254Test-server-1643419890</nova:name>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:11:36</nova:creationTime>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <nova:user uuid="8e699d2accae4b489d779507db44504e">tempest-ServerShowV254Test-782263999-project-member</nova:user>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <nova:project uuid="37c01d33832740c3ba018515e081285b">tempest-ServerShowV254Test-782263999</nova:project>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <entry name="serial">7a223382-86d1-478e-8324-01ef43aef7e1</entry>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <entry name="uuid">7a223382-86d1-478e-8324-01ef43aef7e1</entry>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7a223382-86d1-478e-8324-01ef43aef7e1_disk">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7a223382-86d1-478e-8324-01ef43aef7e1_disk.config">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/console.log" append="off"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:11:37 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:11:37 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:11:37 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:11:37 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.375 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.375 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.376 225859 INFO nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Using config drive#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.406 225859 DEBUG nova.storage.rbd_utils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:37 np0005588919 podman[299212]: 2026-01-20 15:11:37.419688192 +0000 UTC m=+0.084072276 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.688 225859 INFO nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Creating config drive at /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.695 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbyxngv9v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.722 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.829 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbyxngv9v" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.856 225859 DEBUG nova.storage.rbd_utils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.859 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config 7a223382-86d1-478e-8324-01ef43aef7e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.998 225859 DEBUG oslo_concurrency.processutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config 7a223382-86d1-478e-8324-01ef43aef7e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:37 np0005588919 nova_compute[225855]: 2026-01-20 15:11:37.999 225859 INFO nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Deleting local config drive /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config because it was imported into RBD.#033[00m
Jan 20 10:11:38 np0005588919 systemd-machined[194361]: New machine qemu-92-instance-000000b1.
Jan 20 10:11:38 np0005588919 systemd[1]: Started Virtual Machine qemu-92-instance-000000b1.
Jan 20 10:11:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.485 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921898.4844081, 7a223382-86d1-478e-8324-01ef43aef7e1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.486 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.492 225859 DEBUG nova.compute.manager [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.492 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.496 225859 INFO nova.virt.libvirt.driver [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance spawned successfully.#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.496 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.520 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.525 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.528 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.528 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.529 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.529 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.529 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.530 225859 DEBUG nova.virt.libvirt.driver [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.663 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.664 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921898.4855099, 7a223382-86d1-478e-8324-01ef43aef7e1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.664 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] VM Started (Lifecycle Event)#033[00m
Jan 20 10:11:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:38.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.729 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.733 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.748 225859 INFO nova.compute.manager [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Took 3.31 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.749 225859 DEBUG nova.compute.manager [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.757 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.804 225859 INFO nova.compute.manager [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Took 4.21 seconds to build instance.#033[00m
Jan 20 10:11:38 np0005588919 nova_compute[225855]: 2026-01-20 15:11:38.819 225859 DEBUG oslo_concurrency.lockutils [None req-460bb514-b6ca-4c3a-91b8-5945a3a306a8 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "7a223382-86d1-478e-8324-01ef43aef7e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:39.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:40 np0005588919 nova_compute[225855]: 2026-01-20 15:11:40.065 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:40.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:40 np0005588919 nova_compute[225855]: 2026-01-20 15:11:40.751 225859 INFO nova.compute.manager [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Rebuilding instance#033[00m
Jan 20 10:11:41 np0005588919 nova_compute[225855]: 2026-01-20 15:11:41.014 225859 DEBUG nova.objects.instance [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:41 np0005588919 nova_compute[225855]: 2026-01-20 15:11:41.029 225859 DEBUG nova.compute.manager [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:41.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:41 np0005588919 nova_compute[225855]: 2026-01-20 15:11:41.070 225859 DEBUG nova.objects.instance [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lazy-loading 'pci_requests' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:41 np0005588919 nova_compute[225855]: 2026-01-20 15:11:41.083 225859 DEBUG nova.objects.instance [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lazy-loading 'pci_devices' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:41 np0005588919 nova_compute[225855]: 2026-01-20 15:11:41.095 225859 DEBUG nova.objects.instance [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lazy-loading 'resources' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:41 np0005588919 nova_compute[225855]: 2026-01-20 15:11:41.111 225859 DEBUG nova.objects.instance [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lazy-loading 'migration_context' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:41 np0005588919 nova_compute[225855]: 2026-01-20 15:11:41.138 225859 DEBUG nova.objects.instance [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 10:11:41 np0005588919 nova_compute[225855]: 2026-01-20 15:11:41.141 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:11:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:42.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:42 np0005588919 nova_compute[225855]: 2026-01-20 15:11:42.713 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:43.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:44.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:45.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:45 np0005588919 nova_compute[225855]: 2026-01-20 15:11:45.068 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e391 e391: 3 total, 3 up, 3 in
Jan 20 10:11:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 e392: 3 total, 3 up, 3 in
Jan 20 10:11:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:46.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:47.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:47 np0005588919 nova_compute[225855]: 2026-01-20 15:11:47.715 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:48.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:49.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:50 np0005588919 nova_compute[225855]: 2026-01-20 15:11:50.071 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:50.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:51.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:51 np0005588919 podman[299410]: 2026-01-20 15:11:51.050545659 +0000 UTC m=+0.080797214 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 20 10:11:51 np0005588919 nova_compute[225855]: 2026-01-20 15:11:51.184 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:11:51 np0005588919 nova_compute[225855]: 2026-01-20 15:11:51.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:51 np0005588919 nova_compute[225855]: 2026-01-20 15:11:51.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:11:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:52.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:52 np0005588919 nova_compute[225855]: 2026-01-20 15:11:52.717 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:11:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:53.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:11:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:54 np0005588919 nova_compute[225855]: 2026-01-20 15:11:54.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:54 np0005588919 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000b1.scope: Deactivated successfully.
Jan 20 10:11:54 np0005588919 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000b1.scope: Consumed 13.312s CPU time.
Jan 20 10:11:54 np0005588919 systemd-machined[194361]: Machine qemu-92-instance-000000b1 terminated.
Jan 20 10:11:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:54.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:11:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:55.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:11:55 np0005588919 nova_compute[225855]: 2026-01-20 15:11:55.075 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:55 np0005588919 nova_compute[225855]: 2026-01-20 15:11:55.205 225859 INFO nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance shutdown successfully after 14 seconds.#033[00m
Jan 20 10:11:55 np0005588919 nova_compute[225855]: 2026-01-20 15:11:55.210 225859 INFO nova.virt.libvirt.driver [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance destroyed successfully.#033[00m
Jan 20 10:11:55 np0005588919 nova_compute[225855]: 2026-01-20 15:11:55.214 225859 INFO nova.virt.libvirt.driver [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance destroyed successfully.#033[00m
Jan 20 10:11:55 np0005588919 nova_compute[225855]: 2026-01-20 15:11:55.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:55 np0005588919 nova_compute[225855]: 2026-01-20 15:11:55.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:55 np0005588919 nova_compute[225855]: 2026-01-20 15:11:55.614 225859 INFO nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Deleting instance files /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1_del#033[00m
Jan 20 10:11:55 np0005588919 nova_compute[225855]: 2026-01-20 15:11:55.615 225859 INFO nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Deletion of /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1_del complete#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.042 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.043 225859 INFO nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Creating image(s)#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.073 225859 DEBUG nova.storage.rbd_utils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.110 225859 DEBUG nova.storage.rbd_utils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.148 225859 DEBUG nova.storage.rbd_utils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.153 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.225 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.227 225859 DEBUG oslo_concurrency.lockutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.228 225859 DEBUG oslo_concurrency.lockutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.228 225859 DEBUG oslo_concurrency.lockutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.263 225859 DEBUG nova.storage.rbd_utils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.268 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c 7a223382-86d1-478e-8324-01ef43aef7e1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.388 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-7a223382-86d1-478e-8324-01ef43aef7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.389 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-7a223382-86d1-478e-8324-01ef43aef7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.390 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.390 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.548 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c 7a223382-86d1-478e-8324-01ef43aef7e1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.280s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.644 225859 DEBUG nova.storage.rbd_utils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] resizing rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:11:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:56.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.777 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.779 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Ensure instance console log exists: /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.780 225859 DEBUG oslo_concurrency.lockutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.781 225859 DEBUG oslo_concurrency.lockutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.782 225859 DEBUG oslo_concurrency.lockutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.785 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.790 225859 WARNING nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.799 225859 DEBUG nova.virt.libvirt.host [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.801 225859 DEBUG nova.virt.libvirt.host [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.806 225859 DEBUG nova.virt.libvirt.host [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.807 225859 DEBUG nova.virt.libvirt.host [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.810 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.811 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.812 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.812 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.813 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.813 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.813 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.814 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.814 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.815 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.815 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.815 225859 DEBUG nova.virt.hardware [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.816 225859 DEBUG nova.objects.instance [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:56 np0005588919 nova_compute[225855]: 2026-01-20 15:11:56.835 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:57.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.095 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:11:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:11:57 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4078018162' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.263 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.298 225859 DEBUG nova.storage.rbd_utils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.302 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.720 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:11:57 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:11:57 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1131313242' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.804 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.807 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <uuid>7a223382-86d1-478e-8324-01ef43aef7e1</uuid>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <name>instance-000000b1</name>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerShowV254Test-server-1643419890</nova:name>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:11:56</nova:creationTime>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <nova:user uuid="8e699d2accae4b489d779507db44504e">tempest-ServerShowV254Test-782263999-project-member</nova:user>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <nova:project uuid="37c01d33832740c3ba018515e081285b">tempest-ServerShowV254Test-782263999</nova:project>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="26699514-f465-4b50-98b7-36f2cfc6a308"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <entry name="serial">7a223382-86d1-478e-8324-01ef43aef7e1</entry>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <entry name="uuid">7a223382-86d1-478e-8324-01ef43aef7e1</entry>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7a223382-86d1-478e-8324-01ef43aef7e1_disk">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7a223382-86d1-478e-8324-01ef43aef7e1_disk.config">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/console.log" append="off"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:11:57 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:11:57 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:11:57 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:11:57 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.873 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.874 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.874 225859 INFO nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Using config drive#033[00m
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.905 225859 DEBUG nova.storage.rbd_utils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:57 np0005588919 nova_compute[225855]: 2026-01-20 15:11:57.928 225859 DEBUG nova.objects.instance [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lazy-loading 'ec2_ids' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.202 225859 INFO nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Creating config drive at /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config#033[00m
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.206 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyfy5mqls execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.340 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyfy5mqls" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.383 225859 DEBUG nova.storage.rbd_utils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] rbd image 7a223382-86d1-478e-8324-01ef43aef7e1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.388 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config 7a223382-86d1-478e-8324-01ef43aef7e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.581 225859 DEBUG oslo_concurrency.processutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config 7a223382-86d1-478e-8324-01ef43aef7e1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.582 225859 INFO nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Deleting local config drive /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1/disk.config because it was imported into RBD.#033[00m
Jan 20 10:11:58 np0005588919 systemd-machined[194361]: New machine qemu-93-instance-000000b1.
Jan 20 10:11:58 np0005588919 systemd[1]: Started Virtual Machine qemu-93-instance-000000b1.
Jan 20 10:11:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:11:58.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.787 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.806 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-7a223382-86d1-478e-8324-01ef43aef7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.807 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:11:58 np0005588919 nova_compute[225855]: 2026-01-20 15:11:58.807 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.039 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 7a223382-86d1-478e-8324-01ef43aef7e1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.040 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921919.038735, 7a223382-86d1-478e-8324-01ef43aef7e1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.040 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.042 225859 DEBUG nova.compute.manager [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.043 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.046 225859 INFO nova.virt.libvirt.driver [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance spawned successfully.#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.046 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:11:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:11:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:11:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:11:59.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.074 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.080 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.083 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.084 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.084 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.084 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.085 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.085 225859 DEBUG nova.virt.libvirt.driver [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.111 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.112 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921919.0398448, 7a223382-86d1-478e-8324-01ef43aef7e1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.112 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] VM Started (Lifecycle Event)#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.152 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.155 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.172 225859 DEBUG nova.compute.manager [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.203 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.231 225859 DEBUG oslo_concurrency.lockutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.231 225859 DEBUG oslo_concurrency.lockutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.231 225859 DEBUG nova.objects.instance [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.317 225859 DEBUG oslo_concurrency.lockutils [None req-0086cd3f-1509-4f39-9077-b0a04ab25c5f 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.374 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.376 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.376 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:11:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:11:59 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1987985162' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.837 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "7a223382-86d1-478e-8324-01ef43aef7e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.838 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "7a223382-86d1-478e-8324-01ef43aef7e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.838 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "7a223382-86d1-478e-8324-01ef43aef7e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.838 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "7a223382-86d1-478e-8324-01ef43aef7e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.839 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "7a223382-86d1-478e-8324-01ef43aef7e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.841 225859 INFO nova.compute.manager [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Terminating instance#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.842 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "refresh_cache-7a223382-86d1-478e-8324-01ef43aef7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.842 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquired lock "refresh_cache-7a223382-86d1-478e-8324-01ef43aef7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.842 225859 DEBUG nova.network.neutron [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.844 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.939 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000b1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:11:59 np0005588919 nova_compute[225855]: 2026-01-20 15:11:59.940 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000b1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.057 225859 DEBUG nova.network.neutron [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.079 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.084 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.085 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4228MB free_disk=20.937217712402344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.085 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.085 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.177 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 7a223382-86d1-478e-8324-01ef43aef7e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.177 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.178 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.236 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.455 225859 DEBUG nova.network.neutron [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.472 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Releasing lock "refresh_cache-7a223382-86d1-478e-8324-01ef43aef7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.473 225859 DEBUG nova.compute.manager [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:12:00 np0005588919 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000b1.scope: Deactivated successfully.
Jan 20 10:12:00 np0005588919 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000b1.scope: Consumed 1.930s CPU time.
Jan 20 10:12:00 np0005588919 systemd-machined[194361]: Machine qemu-93-instance-000000b1 terminated.
Jan 20 10:12:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:12:00 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/86498293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.659 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.664 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.684 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.692 225859 INFO nova.virt.libvirt.driver [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance destroyed successfully.#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.692 225859 DEBUG nova.objects.instance [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lazy-loading 'resources' on Instance uuid 7a223382-86d1-478e-8324-01ef43aef7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:00.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.728 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:12:00 np0005588919 nova_compute[225855]: 2026-01-20 15:12:00.729 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.054 225859 INFO nova.virt.libvirt.driver [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Deleting instance files /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1_del#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.054 225859 INFO nova.virt.libvirt.driver [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Deletion of /var/lib/nova/instances/7a223382-86d1-478e-8324-01ef43aef7e1_del complete#033[00m
Jan 20 10:12:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:01.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.108 225859 INFO nova.compute.manager [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Took 0.64 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.109 225859 DEBUG oslo.service.loopingcall [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.109 225859 DEBUG nova.compute.manager [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.109 225859 DEBUG nova.network.neutron [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.379 225859 DEBUG nova.network.neutron [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.396 225859 DEBUG nova.network.neutron [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.415 225859 INFO nova.compute.manager [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Took 0.31 seconds to deallocate network for instance.#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.461 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.461 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.502 225859 DEBUG oslo_concurrency.processutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:12:01 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3816362426' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.954 225859 DEBUG oslo_concurrency.processutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.962 225859 DEBUG nova.compute.provider_tree [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.977 225859 DEBUG nova.scheduler.client.report [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:12:01 np0005588919 nova_compute[225855]: 2026-01-20 15:12:01.997 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:02 np0005588919 nova_compute[225855]: 2026-01-20 15:12:02.026 225859 INFO nova.scheduler.client.report [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Deleted allocations for instance 7a223382-86d1-478e-8324-01ef43aef7e1#033[00m
Jan 20 10:12:02 np0005588919 nova_compute[225855]: 2026-01-20 15:12:02.120 225859 DEBUG oslo_concurrency.lockutils [None req-f40fb6eb-0551-43ad-b875-1ad5007a8abc 8e699d2accae4b489d779507db44504e 37c01d33832740c3ba018515e081285b - - default default] Lock "7a223382-86d1-478e-8324-01ef43aef7e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:02.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:02 np0005588919 nova_compute[225855]: 2026-01-20 15:12:02.721 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:03.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:03 np0005588919 nova_compute[225855]: 2026-01-20 15:12:03.729 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:03 np0005588919 nova_compute[225855]: 2026-01-20 15:12:03.729 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:04.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:05.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:05 np0005588919 nova_compute[225855]: 2026-01-20 15:12:05.082 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:06.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:07.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:07 np0005588919 nova_compute[225855]: 2026-01-20 15:12:07.722 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:08 np0005588919 podman[299943]: 2026-01-20 15:12:08.145129421 +0000 UTC m=+0.172783633 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 20 10:12:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:08.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:09.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:10 np0005588919 nova_compute[225855]: 2026-01-20 15:12:10.085 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:10.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:11.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:12.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:12 np0005588919 nova_compute[225855]: 2026-01-20 15:12:12.723 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:13.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:14 np0005588919 nova_compute[225855]: 2026-01-20 15:12:14.537 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "4e7e9bf1-528e-4390-8d23-3ab48889e23c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:14 np0005588919 nova_compute[225855]: 2026-01-20 15:12:14.537 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "4e7e9bf1-528e-4390-8d23-3ab48889e23c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:14 np0005588919 nova_compute[225855]: 2026-01-20 15:12:14.555 225859 DEBUG nova.compute.manager [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:12:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:14.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:14 np0005588919 nova_compute[225855]: 2026-01-20 15:12:14.794 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:14 np0005588919 nova_compute[225855]: 2026-01-20 15:12:14.795 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:14 np0005588919 nova_compute[225855]: 2026-01-20 15:12:14.805 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:12:14 np0005588919 nova_compute[225855]: 2026-01-20 15:12:14.805 225859 INFO nova.compute.claims [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:12:14 np0005588919 nova_compute[225855]: 2026-01-20 15:12:14.951 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:15.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.088 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:12:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/514934175' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.418 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.424 225859 DEBUG nova.compute.provider_tree [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.458 225859 DEBUG nova.scheduler.client.report [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.514 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.719s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.515 225859 DEBUG nova.compute.manager [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.592 225859 DEBUG nova.compute.manager [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.614 225859 INFO nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.634 225859 DEBUG nova.compute.manager [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.690 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921920.6896899, 7a223382-86d1-478e-8324-01ef43aef7e1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.691 225859 INFO nova.compute.manager [-] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.745 225859 DEBUG nova.compute.manager [None req-60a59d7f-7914-488e-b24a-acba2f4e8484 - - - - - -] [instance: 7a223382-86d1-478e-8324-01ef43aef7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.791 225859 DEBUG nova.compute.manager [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.792 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.793 225859 INFO nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Creating image(s)#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.828 225859 DEBUG nova.storage.rbd_utils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.868 225859 DEBUG nova.storage.rbd_utils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.903 225859 DEBUG nova.storage.rbd_utils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.907 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.971 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.973 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.974 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:15 np0005588919 nova_compute[225855]: 2026-01-20 15:12:15.974 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.001 225859 DEBUG nova.storage.rbd_utils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.006 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.336 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.331s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.427 225859 DEBUG nova.storage.rbd_utils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] resizing rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:12:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:12:16.431 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:12:16.432 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:12:16.432 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.536 225859 DEBUG nova.objects.instance [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.563 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.563 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Ensure instance console log exists: /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.564 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.564 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.565 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.566 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.572 225859 WARNING nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.578 225859 DEBUG nova.virt.libvirt.host [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.579 225859 DEBUG nova.virt.libvirt.host [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.584 225859 DEBUG nova.virt.libvirt.host [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.585 225859 DEBUG nova.virt.libvirt.host [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.587 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.588 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.589 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.589 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.590 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.590 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.591 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.591 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.592 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.592 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.593 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.593 225859 DEBUG nova.virt.hardware [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 10:12:16 np0005588919 nova_compute[225855]: 2026-01-20 15:12:16.599 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:12:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:16.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:17.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:12:17 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/394228248' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.105 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.133 225859 DEBUG nova.storage.rbd_utils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.138 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:12:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:12:17 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/636621959' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.607 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.609 225859 DEBUG nova.objects.instance [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.638 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <uuid>4e7e9bf1-528e-4390-8d23-3ab48889e23c</uuid>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <name>instance-000000b4</name>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerShowV257Test-server-1886200183</nova:name>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:12:16</nova:creationTime>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <nova:user uuid="a73d4f13c0bf4d1c9497cd04e5db6724">tempest-ServerShowV257Test-1887808980-project-member</nova:user>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <nova:project uuid="8550ab6f7bdb4d9faa423c65e76a6818">tempest-ServerShowV257Test-1887808980</nova:project>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <entry name="serial">4e7e9bf1-528e-4390-8d23-3ab48889e23c</entry>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <entry name="uuid">4e7e9bf1-528e-4390-8d23-3ab48889e23c</entry>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/console.log" append="off"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:12:17 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:12:17 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:12:17 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:12:17 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.702 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.703 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.704 225859 INFO nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Using config drive
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.746 225859 DEBUG nova.storage.rbd_utils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:12:17 np0005588919 nova_compute[225855]: 2026-01-20 15:12:17.752 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:12:18 np0005588919 nova_compute[225855]: 2026-01-20 15:12:18.260 225859 INFO nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Creating config drive at /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config
Jan 20 10:12:18 np0005588919 nova_compute[225855]: 2026-01-20 15:12:18.266 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoq0lg9sm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:12:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:18 np0005588919 nova_compute[225855]: 2026-01-20 15:12:18.408 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoq0lg9sm" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:12:18 np0005588919 nova_compute[225855]: 2026-01-20 15:12:18.455 225859 DEBUG nova.storage.rbd_utils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:12:18 np0005588919 nova_compute[225855]: 2026-01-20 15:12:18.461 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:12:18 np0005588919 nova_compute[225855]: 2026-01-20 15:12:18.660 225859 DEBUG oslo_concurrency.processutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.199s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:12:18 np0005588919 nova_compute[225855]: 2026-01-20 15:12:18.662 225859 INFO nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Deleting local config drive /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config because it was imported into RBD.
Jan 20 10:12:18 np0005588919 systemd-machined[194361]: New machine qemu-94-instance-000000b4.
Jan 20 10:12:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:18.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:18 np0005588919 systemd[1]: Started Virtual Machine qemu-94-instance-000000b4.
Jan 20 10:12:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:19.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.092 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921939.0919037, 4e7e9bf1-528e-4390-8d23-3ab48889e23c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.094 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] VM Resumed (Lifecycle Event)
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.098 225859 DEBUG nova.compute.manager [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.099 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.105 225859 INFO nova.virt.libvirt.driver [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance spawned successfully.
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.106 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.127 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.136 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.145 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.146 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.147 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.148 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.149 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.150 225859 DEBUG nova.virt.libvirt.driver [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.188 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.189 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921939.093336, 4e7e9bf1-528e-4390-8d23-3ab48889e23c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.189 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] VM Started (Lifecycle Event)
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.226 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.231 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.245 225859 INFO nova.compute.manager [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Took 3.45 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.246 225859 DEBUG nova.compute.manager [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.260 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.333 225859 INFO nova.compute.manager [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Took 4.61 seconds to build instance.#033[00m
Jan 20 10:12:19 np0005588919 nova_compute[225855]: 2026-01-20 15:12:19.394 225859 DEBUG oslo_concurrency.lockutils [None req-c45cd0ee-36ca-4caa-8e42-97cd4c7dc0e0 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "4e7e9bf1-528e-4390-8d23-3ab48889e23c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:20 np0005588919 nova_compute[225855]: 2026-01-20 15:12:20.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:20.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:20 np0005588919 nova_compute[225855]: 2026-01-20 15:12:20.830 225859 INFO nova.compute.manager [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Rebuilding instance#033[00m
Jan 20 10:12:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:21.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:21 np0005588919 nova_compute[225855]: 2026-01-20 15:12:21.253 225859 DEBUG nova.objects.instance [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:21 np0005588919 nova_compute[225855]: 2026-01-20 15:12:21.289 225859 DEBUG nova.compute.manager [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:12:21 np0005588919 nova_compute[225855]: 2026-01-20 15:12:21.428 225859 DEBUG nova.objects.instance [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:21 np0005588919 nova_compute[225855]: 2026-01-20 15:12:21.446 225859 DEBUG nova.objects.instance [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:21 np0005588919 nova_compute[225855]: 2026-01-20 15:12:21.467 225859 DEBUG nova.objects.instance [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'resources' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:21 np0005588919 nova_compute[225855]: 2026-01-20 15:12:21.483 225859 DEBUG nova.objects.instance [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:21 np0005588919 nova_compute[225855]: 2026-01-20 15:12:21.497 225859 DEBUG nova.objects.instance [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 10:12:21 np0005588919 nova_compute[225855]: 2026-01-20 15:12:21.506 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #127. Immutable memtables: 0.
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.595084) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 79] Flushing memtable with next log file: 127
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941595135, "job": 79, "event": "flush_started", "num_memtables": 1, "num_entries": 1634, "num_deletes": 257, "total_data_size": 3389258, "memory_usage": 3443840, "flush_reason": "Manual Compaction"}
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 79] Level-0 flush table #128: started
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941611108, "cf_name": "default", "job": 79, "event": "table_file_creation", "file_number": 128, "file_size": 2233908, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 63436, "largest_seqno": 65065, "table_properties": {"data_size": 2227138, "index_size": 3776, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 15019, "raw_average_key_size": 20, "raw_value_size": 2213210, "raw_average_value_size": 2970, "num_data_blocks": 166, "num_entries": 745, "num_filter_entries": 745, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921818, "oldest_key_time": 1768921818, "file_creation_time": 1768921941, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 128, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 79] Flush lasted 16054 microseconds, and 5042 cpu microseconds.
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.611136) [db/flush_job.cc:967] [default] [JOB 79] Level-0 flush table #128: 2233908 bytes OK
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.611152) [db/memtable_list.cc:519] [default] Level-0 commit table #128 started
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.612403) [db/memtable_list.cc:722] [default] Level-0 commit table #128: memtable #1 done
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.612412) EVENT_LOG_v1 {"time_micros": 1768921941612409, "job": 79, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.612426) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 79] Try to delete WAL files size 3381691, prev total WAL file size 3381691, number of live WAL files 2.
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000124.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.613166) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032323730' seq:72057594037927935, type:22 .. '6C6F676D0032353231' seq:0, type:0; will stop at (end)
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 80] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 79 Base level 0, inputs: [128(2181KB)], [126(11MB)]
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941613246, "job": 80, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [128], "files_L6": [126], "score": -1, "input_data_size": 14489423, "oldest_snapshot_seqno": -1}
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 80] Generated table #129: 9002 keys, 14341651 bytes, temperature: kUnknown
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941789056, "cf_name": "default", "job": 80, "event": "table_file_creation", "file_number": 129, "file_size": 14341651, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14279201, "index_size": 38849, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22533, "raw_key_size": 235900, "raw_average_key_size": 26, "raw_value_size": 14116626, "raw_average_value_size": 1568, "num_data_blocks": 1503, "num_entries": 9002, "num_filter_entries": 9002, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921941, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 129, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.789380) [db/compaction/compaction_job.cc:1663] [default] [JOB 80] Compacted 1@0 + 1@6 files to L6 => 14341651 bytes
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.791032) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 82.4 rd, 81.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 11.7 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(12.9) write-amplify(6.4) OK, records in: 9533, records dropped: 531 output_compression: NoCompression
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.791077) EVENT_LOG_v1 {"time_micros": 1768921941791060, "job": 80, "event": "compaction_finished", "compaction_time_micros": 175919, "compaction_time_cpu_micros": 56538, "output_level": 6, "num_output_files": 1, "total_output_size": 14341651, "num_input_records": 9533, "num_output_records": 9002, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000128.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941791800, "job": 80, "event": "table_file_deletion", "file_number": 128}
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000126.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921941794314, "job": 80, "event": "table_file_deletion", "file_number": 126}
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.613066) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.794454) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.794459) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.794461) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.794462) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:21 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:21.794463) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:22 np0005588919 podman[300393]: 2026-01-20 15:12:22.026075481 +0000 UTC m=+0.068127164 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 20 10:12:22 np0005588919 nova_compute[225855]: 2026-01-20 15:12:22.727 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:22.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:23.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:24.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:25.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:25 np0005588919 nova_compute[225855]: 2026-01-20 15:12:25.094 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:26 np0005588919 podman[300587]: 2026-01-20 15:12:26.496974806 +0000 UTC m=+0.068564906 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:12:26 np0005588919 podman[300587]: 2026-01-20 15:12:26.617212498 +0000 UTC m=+0.188802598 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Jan 20 10:12:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:26.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:27.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:27 np0005588919 podman[300741]: 2026-01-20 15:12:27.220725832 +0000 UTC m=+0.051941665 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:12:27 np0005588919 podman[300741]: 2026-01-20 15:12:27.233186005 +0000 UTC m=+0.064401838 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:12:27 np0005588919 podman[300802]: 2026-01-20 15:12:27.478216207 +0000 UTC m=+0.059670534 container exec e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, architecture=x86_64, io.buildah.version=1.28.2, vcs-type=git, version=2.2.4, description=keepalived for Ceph, name=keepalived, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, build-date=2023-02-22T09:23:20)
Jan 20 10:12:27 np0005588919 podman[300802]: 2026-01-20 15:12:27.489188668 +0000 UTC m=+0.070642995 container exec_died e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=keepalived, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, vendor=Red Hat, Inc.)
Jan 20 10:12:27 np0005588919 nova_compute[225855]: 2026-01-20 15:12:27.729 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:28.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:12:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:12:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:29.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:30 np0005588919 nova_compute[225855]: 2026-01-20 15:12:30.098 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:30.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:31.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:31 np0005588919 nova_compute[225855]: 2026-01-20 15:12:31.556 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 20 10:12:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:32.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:32 np0005588919 nova_compute[225855]: 2026-01-20 15:12:32.756 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:33.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:12:33.316 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:12:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:12:33.317 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:12:33 np0005588919 nova_compute[225855]: 2026-01-20 15:12:33.318 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:34 np0005588919 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000b4.scope: Deactivated successfully.
Jan 20 10:12:34 np0005588919 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000b4.scope: Consumed 12.581s CPU time.
Jan 20 10:12:34 np0005588919 systemd-machined[194361]: Machine qemu-94-instance-000000b4 terminated.
Jan 20 10:12:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:12:34.320 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:12:34 np0005588919 nova_compute[225855]: 2026-01-20 15:12:34.572 225859 INFO nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance shutdown successfully after 13 seconds.#033[00m
Jan 20 10:12:34 np0005588919 nova_compute[225855]: 2026-01-20 15:12:34.578 225859 INFO nova.virt.libvirt.driver [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance destroyed successfully.#033[00m
Jan 20 10:12:34 np0005588919 nova_compute[225855]: 2026-01-20 15:12:34.582 225859 INFO nova.virt.libvirt.driver [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance destroyed successfully.#033[00m
Jan 20 10:12:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:34.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:35 np0005588919 nova_compute[225855]: 2026-01-20 15:12:35.101 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:35.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:35 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.194 225859 INFO nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Deleting instance files /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c_del#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.195 225859 INFO nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Deletion of /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c_del complete#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.439 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.439 225859 INFO nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Creating image(s)#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.473 225859 DEBUG nova.storage.rbd_utils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.510 225859 DEBUG nova.storage.rbd_utils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.544 225859 DEBUG nova.storage.rbd_utils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.549 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.650 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.652 225859 DEBUG oslo_concurrency.lockutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.653 225859 DEBUG oslo_concurrency.lockutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.654 225859 DEBUG oslo_concurrency.lockutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "a4ed0d2b98aa460c005e878d78a49ccb6f511f7c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.691 225859 DEBUG nova.storage.rbd_utils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:36 np0005588919 nova_compute[225855]: 2026-01-20 15:12:36.695 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:36.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:37.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.158 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.232 225859 DEBUG nova.storage.rbd_utils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] resizing rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.332 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.333 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Ensure instance console log exists: /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.334 225859 DEBUG oslo_concurrency.lockutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.334 225859 DEBUG oslo_concurrency.lockutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.335 225859 DEBUG oslo_concurrency.lockutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.337 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.341 225859 WARNING nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.348 225859 DEBUG nova.virt.libvirt.host [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.348 225859 DEBUG nova.virt.libvirt.host [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.352 225859 DEBUG nova.virt.libvirt.host [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.352 225859 DEBUG nova.virt.libvirt.host [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.354 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.354 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:22:02Z,direct_url=<?>,disk_format='qcow2',id=26699514-f465-4b50-98b7-36f2cfc6a308,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.355 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.355 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.356 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.356 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.356 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.356 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.357 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.357 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.357 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.358 225859 DEBUG nova.virt.hardware [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.358 225859 DEBUG nova.objects.instance [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.377 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.758 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:12:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2691945406' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.806 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.834 225859 DEBUG nova.storage.rbd_utils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:37 np0005588919 nova_compute[225855]: 2026-01-20 15:12:37.838 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e393 e393: 3 total, 3 up, 3 in
Jan 20 10:12:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:12:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1378670698' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.298 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.301 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <uuid>4e7e9bf1-528e-4390-8d23-3ab48889e23c</uuid>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <name>instance-000000b4</name>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <nova:name>tempest-ServerShowV257Test-server-1886200183</nova:name>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:12:37</nova:creationTime>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <nova:user uuid="a73d4f13c0bf4d1c9497cd04e5db6724">tempest-ServerShowV257Test-1887808980-project-member</nova:user>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <nova:project uuid="8550ab6f7bdb4d9faa423c65e76a6818">tempest-ServerShowV257Test-1887808980</nova:project>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="26699514-f465-4b50-98b7-36f2cfc6a308"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <nova:ports/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <entry name="serial">4e7e9bf1-528e-4390-8d23-3ab48889e23c</entry>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <entry name="uuid">4e7e9bf1-528e-4390-8d23-3ab48889e23c</entry>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/console.log" append="off"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:12:38 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:12:38 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:12:38 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:12:38 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:12:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e393 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.389 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.390 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.391 225859 INFO nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Using config drive#033[00m
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.421 225859 DEBUG nova.storage.rbd_utils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:38 np0005588919 podman[301271]: 2026-01-20 15:12:38.486813219 +0000 UTC m=+0.142544105 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.492 225859 DEBUG nova.objects.instance [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.526 225859 DEBUG nova.objects.instance [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'keypairs' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.736 225859 INFO nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Creating config drive at /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config#033[00m
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.742 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2iqg0srg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:38.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.875 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2iqg0srg" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.911 225859 DEBUG nova.storage.rbd_utils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] rbd image 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:12:38 np0005588919 nova_compute[225855]: 2026-01-20 15:12:38.915 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:39.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:39 np0005588919 nova_compute[225855]: 2026-01-20 15:12:39.159 225859 DEBUG oslo_concurrency.processutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config 4e7e9bf1-528e-4390-8d23-3ab48889e23c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.244s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:39 np0005588919 nova_compute[225855]: 2026-01-20 15:12:39.160 225859 INFO nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Deleting local config drive /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c/disk.config because it was imported into RBD.#033[00m
Jan 20 10:12:39 np0005588919 systemd-machined[194361]: New machine qemu-95-instance-000000b4.
Jan 20 10:12:39 np0005588919 systemd[1]: Started Virtual Machine qemu-95-instance-000000b4.
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.029 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 4e7e9bf1-528e-4390-8d23-3ab48889e23c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.030 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921960.028581, 4e7e9bf1-528e-4390-8d23-3ab48889e23c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.030 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.033 225859 DEBUG nova.compute.manager [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.033 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.036 225859 INFO nova.virt.libvirt.driver [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance spawned successfully.#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.037 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.056 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.060 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.068 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.069 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.070 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.070 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.071 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.071 225859 DEBUG nova.virt.libvirt.driver [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.099 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.100 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768921960.0287015, 4e7e9bf1-528e-4390-8d23-3ab48889e23c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.100 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] VM Started (Lifecycle Event)#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.104 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.174 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.178 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.193 225859 DEBUG nova.compute.manager [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.208 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.283 225859 DEBUG oslo_concurrency.lockutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.284 225859 DEBUG oslo_concurrency.lockutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.284 225859 DEBUG nova.objects.instance [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 20 10:12:40 np0005588919 nova_compute[225855]: 2026-01-20 15:12:40.413 225859 DEBUG oslo_concurrency.lockutils [None req-ae4beab2-80e3-49ac-a455-5af63bdede94 a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:40.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:41.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #130. Immutable memtables: 0.
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.786437) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 81] Flushing memtable with next log file: 130
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961786473, "job": 81, "event": "flush_started", "num_memtables": 1, "num_entries": 540, "num_deletes": 252, "total_data_size": 825110, "memory_usage": 836160, "flush_reason": "Manual Compaction"}
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 81] Level-0 flush table #131: started
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961792700, "cf_name": "default", "job": 81, "event": "table_file_creation", "file_number": 131, "file_size": 472423, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65070, "largest_seqno": 65605, "table_properties": {"data_size": 469520, "index_size": 874, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7759, "raw_average_key_size": 21, "raw_value_size": 463546, "raw_average_value_size": 1266, "num_data_blocks": 36, "num_entries": 366, "num_filter_entries": 366, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921942, "oldest_key_time": 1768921942, "file_creation_time": 1768921961, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 131, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 81] Flush lasted 6297 microseconds, and 2215 cpu microseconds.
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.792732) [db/flush_job.cc:967] [default] [JOB 81] Level-0 flush table #131: 472423 bytes OK
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.792749) [db/memtable_list.cc:519] [default] Level-0 commit table #131 started
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.794838) [db/memtable_list.cc:722] [default] Level-0 commit table #131: memtable #1 done
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.794897) EVENT_LOG_v1 {"time_micros": 1768921961794888, "job": 81, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.794922) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 81] Try to delete WAL files size 821918, prev total WAL file size 821918, number of live WAL files 2.
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000127.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.795422) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032303035' seq:72057594037927935, type:22 .. '6D6772737461740032323538' seq:0, type:0; will stop at (end)
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 82] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 81 Base level 0, inputs: [131(461KB)], [129(13MB)]
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961795459, "job": 82, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [131], "files_L6": [129], "score": -1, "input_data_size": 14814074, "oldest_snapshot_seqno": -1}
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 82] Generated table #132: 8848 keys, 10986658 bytes, temperature: kUnknown
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961877963, "cf_name": "default", "job": 82, "event": "table_file_creation", "file_number": 132, "file_size": 10986658, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10929787, "index_size": 33654, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22149, "raw_key_size": 232931, "raw_average_key_size": 26, "raw_value_size": 10774459, "raw_average_value_size": 1217, "num_data_blocks": 1287, "num_entries": 8848, "num_filter_entries": 8848, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768921961, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 132, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.878274) [db/compaction/compaction_job.cc:1663] [default] [JOB 82] Compacted 1@0 + 1@6 files to L6 => 10986658 bytes
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.879769) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 179.3 rd, 133.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 13.7 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(54.6) write-amplify(23.3) OK, records in: 9368, records dropped: 520 output_compression: NoCompression
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.879789) EVENT_LOG_v1 {"time_micros": 1768921961879780, "job": 82, "event": "compaction_finished", "compaction_time_micros": 82615, "compaction_time_cpu_micros": 28191, "output_level": 6, "num_output_files": 1, "total_output_size": 10986658, "num_input_records": 9368, "num_output_records": 8848, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000131.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961880014, "job": 82, "event": "table_file_deletion", "file_number": 131}
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000129.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768921961882679, "job": 82, "event": "table_file_deletion", "file_number": 129}
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.795347) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.882776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.882782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.882785) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.882787) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:12:41.882789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.227 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "4e7e9bf1-528e-4390-8d23-3ab48889e23c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.228 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "4e7e9bf1-528e-4390-8d23-3ab48889e23c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.228 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "4e7e9bf1-528e-4390-8d23-3ab48889e23c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.229 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "4e7e9bf1-528e-4390-8d23-3ab48889e23c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.229 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "4e7e9bf1-528e-4390-8d23-3ab48889e23c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.230 225859 INFO nova.compute.manager [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Terminating instance#033[00m
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.231 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "refresh_cache-4e7e9bf1-528e-4390-8d23-3ab48889e23c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.231 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquired lock "refresh_cache-4e7e9bf1-528e-4390-8d23-3ab48889e23c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.232 225859 DEBUG nova.network.neutron [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.702 225859 DEBUG nova.network.neutron [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:12:42 np0005588919 nova_compute[225855]: 2026-01-20 15:12:42.759 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:42.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:12:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:43.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:12:43 np0005588919 nova_compute[225855]: 2026-01-20 15:12:43.167 225859 DEBUG nova.network.neutron [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:12:43 np0005588919 nova_compute[225855]: 2026-01-20 15:12:43.209 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Releasing lock "refresh_cache-4e7e9bf1-528e-4390-8d23-3ab48889e23c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:12:43 np0005588919 nova_compute[225855]: 2026-01-20 15:12:43.210 225859 DEBUG nova.compute.manager [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:12:43 np0005588919 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000b4.scope: Deactivated successfully.
Jan 20 10:12:43 np0005588919 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000b4.scope: Consumed 4.121s CPU time.
Jan 20 10:12:43 np0005588919 systemd-machined[194361]: Machine qemu-95-instance-000000b4 terminated.
Jan 20 10:12:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e393 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:43 np0005588919 nova_compute[225855]: 2026-01-20 15:12:43.433 225859 INFO nova.virt.libvirt.driver [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance destroyed successfully.#033[00m
Jan 20 10:12:43 np0005588919 nova_compute[225855]: 2026-01-20 15:12:43.433 225859 DEBUG nova.objects.instance [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lazy-loading 'resources' on Instance uuid 4e7e9bf1-528e-4390-8d23-3ab48889e23c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:12:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:12:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1969015913' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:12:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:12:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1969015913' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:12:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e394 e394: 3 total, 3 up, 3 in
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.382 225859 INFO nova.virt.libvirt.driver [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Deleting instance files /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c_del#033[00m
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.382 225859 INFO nova.virt.libvirt.driver [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Deletion of /var/lib/nova/instances/4e7e9bf1-528e-4390-8d23-3ab48889e23c_del complete#033[00m
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.451 225859 INFO nova.compute.manager [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Took 1.24 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.452 225859 DEBUG oslo.service.loopingcall [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.452 225859 DEBUG nova.compute.manager [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.452 225859 DEBUG nova.network.neutron [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.591 225859 DEBUG nova.network.neutron [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.619 225859 DEBUG nova.network.neutron [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.634 225859 INFO nova.compute.manager [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Took 0.18 seconds to deallocate network for instance.#033[00m
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.711 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.711 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:44.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:44 np0005588919 nova_compute[225855]: 2026-01-20 15:12:44.798 225859 DEBUG oslo_concurrency.processutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:45 np0005588919 nova_compute[225855]: 2026-01-20 15:12:45.107 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:45.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:45 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:12:45 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/358433427' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:12:45 np0005588919 nova_compute[225855]: 2026-01-20 15:12:45.237 225859 DEBUG oslo_concurrency.processutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:12:45 np0005588919 nova_compute[225855]: 2026-01-20 15:12:45.243 225859 DEBUG nova.compute.provider_tree [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:12:45 np0005588919 nova_compute[225855]: 2026-01-20 15:12:45.273 225859 DEBUG nova.scheduler.client.report [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:12:45 np0005588919 nova_compute[225855]: 2026-01-20 15:12:45.297 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:45 np0005588919 nova_compute[225855]: 2026-01-20 15:12:45.339 225859 INFO nova.scheduler.client.report [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Deleted allocations for instance 4e7e9bf1-528e-4390-8d23-3ab48889e23c#033[00m
Jan 20 10:12:45 np0005588919 nova_compute[225855]: 2026-01-20 15:12:45.411 225859 DEBUG oslo_concurrency.lockutils [None req-5048e29f-dd5f-45f2-b955-36cf3538c28a a73d4f13c0bf4d1c9497cd04e5db6724 8550ab6f7bdb4d9faa423c65e76a6818 - - default default] Lock "4e7e9bf1-528e-4390-8d23-3ab48889e23c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:46.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e395 e395: 3 total, 3 up, 3 in
Jan 20 10:12:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:47.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:47 np0005588919 nova_compute[225855]: 2026-01-20 15:12:47.761 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e395 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:48.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:12:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:49.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:12:50 np0005588919 nova_compute[225855]: 2026-01-20 15:12:50.111 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e396 e396: 3 total, 3 up, 3 in
Jan 20 10:12:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:50.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:51.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:52 np0005588919 nova_compute[225855]: 2026-01-20 15:12:52.763 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:12:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:52.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:12:53 np0005588919 podman[301514]: 2026-01-20 15:12:53.040661683 +0000 UTC m=+0.073560728 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:12:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:53.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:53 np0005588919 nova_compute[225855]: 2026-01-20 15:12:53.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:53 np0005588919 nova_compute[225855]: 2026-01-20 15:12:53.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:12:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:54 np0005588919 nova_compute[225855]: 2026-01-20 15:12:54.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:54.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:55 np0005588919 nova_compute[225855]: 2026-01-20 15:12:55.114 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:55.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:56 np0005588919 nova_compute[225855]: 2026-01-20 15:12:56.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:56 np0005588919 nova_compute[225855]: 2026-01-20 15:12:56.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:12:56 np0005588919 nova_compute[225855]: 2026-01-20 15:12:56.342 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:12:56 np0005588919 nova_compute[225855]: 2026-01-20 15:12:56.373 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:12:56 np0005588919 nova_compute[225855]: 2026-01-20 15:12:56.374 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:56 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 e397: 3 total, 3 up, 3 in
Jan 20 10:12:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:56.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:57.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:57 np0005588919 nova_compute[225855]: 2026-01-20 15:12:57.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:57 np0005588919 nova_compute[225855]: 2026-01-20 15:12:57.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:57 np0005588919 nova_compute[225855]: 2026-01-20 15:12:57.765 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:12:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:12:58 np0005588919 nova_compute[225855]: 2026-01-20 15:12:58.432 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768921963.4311595, 4e7e9bf1-528e-4390-8d23-3ab48889e23c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:12:58 np0005588919 nova_compute[225855]: 2026-01-20 15:12:58.433 225859 INFO nova.compute.manager [-] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:12:58 np0005588919 nova_compute[225855]: 2026-01-20 15:12:58.468 225859 DEBUG nova.compute.manager [None req-79dc6c96-6eff-4694-bad6-b731a6583f0a - - - - - -] [instance: 4e7e9bf1-528e-4390-8d23-3ab48889e23c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:12:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:12:58.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:12:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:12:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:12:59.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:12:59 np0005588919 nova_compute[225855]: 2026-01-20 15:12:59.338 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:12:59 np0005588919 nova_compute[225855]: 2026-01-20 15:12:59.366 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:12:59 np0005588919 nova_compute[225855]: 2026-01-20 15:12:59.367 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:12:59 np0005588919 nova_compute[225855]: 2026-01-20 15:12:59.367 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:12:59 np0005588919 nova_compute[225855]: 2026-01-20 15:12:59.367 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:12:59 np0005588919 nova_compute[225855]: 2026-01-20 15:12:59.368 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:12:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:12:59 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/270906769' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:12:59 np0005588919 nova_compute[225855]: 2026-01-20 15:12:59.879 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.036 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.037 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4266MB free_disk=20.942672729492188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.037 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.037 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.115 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.116 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.121 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.151 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:13:00 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3721231144' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.599 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.604 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.627 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.652 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:13:00 np0005588919 nova_compute[225855]: 2026-01-20 15:13:00.653 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:00.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:01.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:02 np0005588919 nova_compute[225855]: 2026-01-20 15:13:02.769 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:02.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:03.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:04 np0005588919 nova_compute[225855]: 2026-01-20 15:13:04.655 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:04.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:05 np0005588919 nova_compute[225855]: 2026-01-20 15:13:05.124 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:13:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:05.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:13:05 np0005588919 nova_compute[225855]: 2026-01-20 15:13:05.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:13:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:06.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:13:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:07.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:07 np0005588919 nova_compute[225855]: 2026-01-20 15:13:07.771 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:08.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:09 np0005588919 podman[301636]: 2026-01-20 15:13:09.025482158 +0000 UTC m=+0.074832375 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251202)
Jan 20 10:13:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:09.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:10 np0005588919 nova_compute[225855]: 2026-01-20 15:13:10.126 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:10.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:11.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:12 np0005588919 nova_compute[225855]: 2026-01-20 15:13:12.774 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:12.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:13.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:14.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:15 np0005588919 nova_compute[225855]: 2026-01-20 15:13:15.129 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:15.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:16.434 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:16.435 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:16.435 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:13:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:16.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:13:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:17.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:17 np0005588919 nova_compute[225855]: 2026-01-20 15:13:17.336 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:17 np0005588919 nova_compute[225855]: 2026-01-20 15:13:17.831 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:18.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:19.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:20 np0005588919 nova_compute[225855]: 2026-01-20 15:13:20.132 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:20.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:21.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:22.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:22 np0005588919 nova_compute[225855]: 2026-01-20 15:13:22.835 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:23.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:23 np0005588919 podman[301720]: 2026-01-20 15:13:23.999305587 +0000 UTC m=+0.043723102 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:13:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:24.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:25 np0005588919 nova_compute[225855]: 2026-01-20 15:13:25.136 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:25.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:13:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3326666623' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:13:26 np0005588919 nova_compute[225855]: 2026-01-20 15:13:26.377 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:26 np0005588919 nova_compute[225855]: 2026-01-20 15:13:26.377 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:26 np0005588919 nova_compute[225855]: 2026-01-20 15:13:26.396 225859 DEBUG nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:13:26 np0005588919 nova_compute[225855]: 2026-01-20 15:13:26.485 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:26 np0005588919 nova_compute[225855]: 2026-01-20 15:13:26.486 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:26 np0005588919 nova_compute[225855]: 2026-01-20 15:13:26.491 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:13:26 np0005588919 nova_compute[225855]: 2026-01-20 15:13:26.492 225859 INFO nova.compute.claims [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:13:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:26.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.046 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:27.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #133. Immutable memtables: 0.
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.402075) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 83] Flushing memtable with next log file: 133
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007402096, "job": 83, "event": "flush_started", "num_memtables": 1, "num_entries": 771, "num_deletes": 254, "total_data_size": 1275267, "memory_usage": 1297488, "flush_reason": "Manual Compaction"}
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 83] Level-0 flush table #134: started
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007409762, "cf_name": "default", "job": 83, "event": "table_file_creation", "file_number": 134, "file_size": 840017, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65610, "largest_seqno": 66376, "table_properties": {"data_size": 836370, "index_size": 1426, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8828, "raw_average_key_size": 19, "raw_value_size": 828875, "raw_average_value_size": 1871, "num_data_blocks": 63, "num_entries": 443, "num_filter_entries": 443, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768921962, "oldest_key_time": 1768921962, "file_creation_time": 1768922007, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 134, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 83] Flush lasted 7731 microseconds, and 2446 cpu microseconds.
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.409804) [db/flush_job.cc:967] [default] [JOB 83] Level-0 flush table #134: 840017 bytes OK
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.409818) [db/memtable_list.cc:519] [default] Level-0 commit table #134 started
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.410934) [db/memtable_list.cc:722] [default] Level-0 commit table #134: memtable #1 done
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.410945) EVENT_LOG_v1 {"time_micros": 1768922007410941, "job": 83, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.410961) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 83] Try to delete WAL files size 1271164, prev total WAL file size 1271164, number of live WAL files 2.
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000130.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:13:27 np0005588919 ceph-mgr[82135]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.411430) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035353232' seq:72057594037927935, type:22 .. '7061786F730035373734' seq:0, type:0; will stop at (end)
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 84] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 83 Base level 0, inputs: [134(820KB)], [132(10MB)]
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007411525, "job": 84, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [134], "files_L6": [132], "score": -1, "input_data_size": 11826675, "oldest_snapshot_seqno": -1}
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1030759791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.478 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.483 225859 DEBUG nova.compute.provider_tree [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 84] Generated table #135: 8769 keys, 9951772 bytes, temperature: kUnknown
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007497427, "cf_name": "default", "job": 84, "event": "table_file_creation", "file_number": 135, "file_size": 9951772, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9896409, "index_size": 32338, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21957, "raw_key_size": 232044, "raw_average_key_size": 26, "raw_value_size": 9743382, "raw_average_value_size": 1111, "num_data_blocks": 1224, "num_entries": 8769, "num_filter_entries": 8769, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768922007, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 135, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.497744) [db/compaction/compaction_job.cc:1663] [default] [JOB 84] Compacted 1@0 + 1@6 files to L6 => 9951772 bytes
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.499277) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 137.5 rd, 115.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 10.5 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(25.9) write-amplify(11.8) OK, records in: 9291, records dropped: 522 output_compression: NoCompression
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.499304) EVENT_LOG_v1 {"time_micros": 1768922007499291, "job": 84, "event": "compaction_finished", "compaction_time_micros": 86014, "compaction_time_cpu_micros": 32812, "output_level": 6, "num_output_files": 1, "total_output_size": 9951772, "num_input_records": 9291, "num_output_records": 8769, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000134.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007499853, "job": 84, "event": "table_file_deletion", "file_number": 134}
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.500 225859 DEBUG nova.scheduler.client.report [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000132.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922007502306, "job": 84, "event": "table_file_deletion", "file_number": 132}
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.411286) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.502457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.502463) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.502464) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.502466) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:13:27.502467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.536 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.537 225859 DEBUG nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.653 225859 DEBUG nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.653 225859 DEBUG nova.network.neutron [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.830 225859 DEBUG nova.policy [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '442a7a5cb8ea426a82be9762b262d171', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.836 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:27 np0005588919 nova_compute[225855]: 2026-01-20 15:13:27.978 225859 INFO nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.017 225859 DEBUG nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.209 225859 DEBUG nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.212 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.213 225859 INFO nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Creating image(s)#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.252 225859 DEBUG nova.storage.rbd_utils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 128af7d9-155f-468d-9873-98c816f0df9e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.283 225859 DEBUG nova.storage.rbd_utils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 128af7d9-155f-468d-9873-98c816f0df9e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.311 225859 DEBUG nova.storage.rbd_utils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 128af7d9-155f-468d-9873-98c816f0df9e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.315 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.394 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.394 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.395 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.395 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.429 225859 DEBUG nova.storage.rbd_utils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 128af7d9-155f-468d-9873-98c816f0df9e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.433 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 128af7d9-155f-468d-9873-98c816f0df9e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.601 225859 DEBUG nova.network.neutron [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Successfully created port: 9de5453d-b548-429c-8fc2-7b012cb8ebdf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.761 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 128af7d9-155f-468d-9873-98c816f0df9e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.329s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.828 225859 DEBUG nova.storage.rbd_utils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] resizing rbd image 128af7d9-155f-468d-9873-98c816f0df9e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:13:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:28.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.947 225859 DEBUG nova.objects.instance [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'migration_context' on Instance uuid 128af7d9-155f-468d-9873-98c816f0df9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.987 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.988 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Ensure instance console log exists: /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.989 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.989 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:28 np0005588919 nova_compute[225855]: 2026-01-20 15:13:28.989 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:13:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:29.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:13:29 np0005588919 nova_compute[225855]: 2026-01-20 15:13:29.969 225859 DEBUG nova.network.neutron [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Successfully updated port: 9de5453d-b548-429c-8fc2-7b012cb8ebdf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:13:30 np0005588919 nova_compute[225855]: 2026-01-20 15:13:30.047 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:30 np0005588919 nova_compute[225855]: 2026-01-20 15:13:30.047 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquired lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:30 np0005588919 nova_compute[225855]: 2026-01-20 15:13:30.048 225859 DEBUG nova.network.neutron [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:13:30 np0005588919 nova_compute[225855]: 2026-01-20 15:13:30.139 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:30 np0005588919 nova_compute[225855]: 2026-01-20 15:13:30.217 225859 DEBUG nova.network.neutron [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:13:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:30.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:30 np0005588919 nova_compute[225855]: 2026-01-20 15:13:30.940 225859 DEBUG nova.network.neutron [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance_info_cache with network_info: [{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:13:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:31.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.593 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Releasing lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.593 225859 DEBUG nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Instance network_info: |[{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.595 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Start _get_guest_xml network_info=[{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.600 225859 WARNING nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.605 225859 DEBUG nova.virt.libvirt.host [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.606 225859 DEBUG nova.virt.libvirt.host [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.609 225859 DEBUG nova.virt.libvirt.host [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.610 225859 DEBUG nova.virt.libvirt.host [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.611 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.611 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.612 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.612 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.613 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.613 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.613 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.613 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.614 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.614 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.614 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.614 225859 DEBUG nova.virt.hardware [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:13:31 np0005588919 nova_compute[225855]: 2026-01-20 15:13:31.617 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:13:32 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1603427827' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.041 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.065 225859 DEBUG nova.storage.rbd_utils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 128af7d9-155f-468d-9873-98c816f0df9e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.070 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.101 225859 DEBUG nova.compute.manager [req-de2a3c0f-8b93-454c-8732-1b48901607c0 req-dfc04484-6634-4fb7-90c0-e9c764b7da19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-changed-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.102 225859 DEBUG nova.compute.manager [req-de2a3c0f-8b93-454c-8732-1b48901607c0 req-dfc04484-6634-4fb7-90c0-e9c764b7da19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Refreshing instance network info cache due to event network-changed-9de5453d-b548-429c-8fc2-7b012cb8ebdf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.103 225859 DEBUG oslo_concurrency.lockutils [req-de2a3c0f-8b93-454c-8732-1b48901607c0 req-dfc04484-6634-4fb7-90c0-e9c764b7da19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.103 225859 DEBUG oslo_concurrency.lockutils [req-de2a3c0f-8b93-454c-8732-1b48901607c0 req-dfc04484-6634-4fb7-90c0-e9c764b7da19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.103 225859 DEBUG nova.network.neutron [req-de2a3c0f-8b93-454c-8732-1b48901607c0 req-dfc04484-6634-4fb7-90c0-e9c764b7da19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Refreshing network info cache for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:13:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:13:32 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2022380678' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.494 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.496 225859 DEBUG nova.virt.libvirt.vif [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1998945962',display_name='tempest-TestNetworkAdvancedServerOps-server-1998945962',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1998945962',id=183,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQtn9WZwWHpZ18eEtTY9zdPNbJgOayUdrvVmR1brDMxwKaiJ8tf9lOFdht6GjVy3Orpnh5Z5LatI7xEKad9rNtjFmwEczk5s4CmWp5ueE54bJ73h+pph+yq2VHvIP5rgg==',key_name='tempest-TestNetworkAdvancedServerOps-1645169738',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-mvt7thmt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:13:28Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=128af7d9-155f-468d-9873-98c816f0df9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.496 225859 DEBUG nova.network.os_vif_util [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.497 225859 DEBUG nova.network.os_vif_util [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.498 225859 DEBUG nova.objects.instance [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_devices' on Instance uuid 128af7d9-155f-468d-9873-98c816f0df9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.517 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <uuid>128af7d9-155f-468d-9873-98c816f0df9e</uuid>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <name>instance-000000b7</name>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1998945962</nova:name>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:13:31</nova:creationTime>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <nova:port uuid="9de5453d-b548-429c-8fc2-7b012cb8ebdf">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <entry name="serial">128af7d9-155f-468d-9873-98c816f0df9e</entry>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <entry name="uuid">128af7d9-155f-468d-9873-98c816f0df9e</entry>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/128af7d9-155f-468d-9873-98c816f0df9e_disk">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/128af7d9-155f-468d-9873-98c816f0df9e_disk.config">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:a8:1d:e9"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <target dev="tap9de5453d-b5"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/console.log" append="off"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:13:32 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:13:32 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:13:32 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:13:32 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.518 225859 DEBUG nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Preparing to wait for external event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.518 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.519 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.519 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.519 225859 DEBUG nova.virt.libvirt.vif [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1998945962',display_name='tempest-TestNetworkAdvancedServerOps-server-1998945962',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1998945962',id=183,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQtn9WZwWHpZ18eEtTY9zdPNbJgOayUdrvVmR1brDMxwKaiJ8tf9lOFdht6GjVy3Orpnh5Z5LatI7xEKad9rNtjFmwEczk5s4CmWp5ueE54bJ73h+pph+yq2VHvIP5rgg==',key_name='tempest-TestNetworkAdvancedServerOps-1645169738',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-mvt7thmt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:13:28Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=128af7d9-155f-468d-9873-98c816f0df9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.520 225859 DEBUG nova.network.os_vif_util [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.520 225859 DEBUG nova.network.os_vif_util [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.521 225859 DEBUG os_vif [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.521 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.522 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.522 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.525 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.526 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9de5453d-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.526 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9de5453d-b5, col_values=(('external_ids', {'iface-id': '9de5453d-b548-429c-8fc2-7b012cb8ebdf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:1d:e9', 'vm-uuid': '128af7d9-155f-468d-9873-98c816f0df9e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.528 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:32 np0005588919 NetworkManager[49104]: <info>  [1768922012.5289] manager: (tap9de5453d-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/332)
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.530 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.534 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.535 225859 INFO os_vif [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5')#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.632 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.633 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.633 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No VIF found with MAC fa:16:3e:a8:1d:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.634 225859 INFO nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Using config drive#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.656 225859 DEBUG nova.storage.rbd_utils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 128af7d9-155f-468d-9873-98c816f0df9e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:32 np0005588919 nova_compute[225855]: 2026-01-20 15:13:32.838 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:32.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:33.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:33 np0005588919 nova_compute[225855]: 2026-01-20 15:13:33.579 225859 INFO nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Creating config drive at /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/disk.config#033[00m
Jan 20 10:13:33 np0005588919 nova_compute[225855]: 2026-01-20 15:13:33.588 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9xwn_xtb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:33 np0005588919 nova_compute[225855]: 2026-01-20 15:13:33.727 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9xwn_xtb" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:33 np0005588919 nova_compute[225855]: 2026-01-20 15:13:33.758 225859 DEBUG nova.storage.rbd_utils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 128af7d9-155f-468d-9873-98c816f0df9e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:13:33 np0005588919 nova_compute[225855]: 2026-01-20 15:13:33.761 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/disk.config 128af7d9-155f-468d-9873-98c816f0df9e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:13:33 np0005588919 nova_compute[225855]: 2026-01-20 15:13:33.924 225859 DEBUG oslo_concurrency.processutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/disk.config 128af7d9-155f-468d-9873-98c816f0df9e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.163s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:13:33 np0005588919 nova_compute[225855]: 2026-01-20 15:13:33.925 225859 INFO nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Deleting local config drive /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/disk.config because it was imported into RBD.#033[00m
Jan 20 10:13:33 np0005588919 kernel: tap9de5453d-b5: entered promiscuous mode
Jan 20 10:13:33 np0005588919 NetworkManager[49104]: <info>  [1768922013.9714] manager: (tap9de5453d-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/333)
Jan 20 10:13:33 np0005588919 ovn_controller[130490]: 2026-01-20T15:13:33Z|00806|binding|INFO|Claiming lport 9de5453d-b548-429c-8fc2-7b012cb8ebdf for this chassis.
Jan 20 10:13:33 np0005588919 ovn_controller[130490]: 2026-01-20T15:13:33Z|00807|binding|INFO|9de5453d-b548-429c-8fc2-7b012cb8ebdf: Claiming fa:16:3e:a8:1d:e9 10.100.0.4
Jan 20 10:13:33 np0005588919 nova_compute[225855]: 2026-01-20 15:13:33.971 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:33 np0005588919 nova_compute[225855]: 2026-01-20 15:13:33.978 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.002 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:1d:e9 10.100.0.4'], port_security=['fa:16:3e:a8:1d:e9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '128af7d9-155f-468d-9873-98c816f0df9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d07527d3-7363-453c-9902-c562bab626ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4895263-5fc5-4c5a-ab8d-547f570bc095', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4d49b6-1d42-4171-8055-0d823fb37e66, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=9de5453d-b548-429c-8fc2-7b012cb8ebdf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.003 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 9de5453d-b548-429c-8fc2-7b012cb8ebdf in datapath d07527d3-7363-453c-9902-c562bab626ba bound to our chassis#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.005 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d07527d3-7363-453c-9902-c562bab626ba#033[00m
Jan 20 10:13:34 np0005588919 systemd-udevd[302066]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:13:34 np0005588919 systemd-machined[194361]: New machine qemu-96-instance-000000b7.
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.016 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[15eb01ff-dad7-42df-94c3-efe6059dea97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.016 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd07527d3-71 in ovnmeta-d07527d3-7363-453c-9902-c562bab626ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:13:34 np0005588919 NetworkManager[49104]: <info>  [1768922014.0192] device (tap9de5453d-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:13:34 np0005588919 NetworkManager[49104]: <info>  [1768922014.0198] device (tap9de5453d-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.020 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd07527d3-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.020 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7a77e0bf-4ef6-4ff2-a72e-fca71ea8e8f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.021 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[58b76739-b825-4cf3-b09e-9c0debf2b53d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.034 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[365767fb-f121-4c69-995d-b09874daf622]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 systemd[1]: Started Virtual Machine qemu-96-instance-000000b7.
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.045 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:34 np0005588919 ovn_controller[130490]: 2026-01-20T15:13:34Z|00808|binding|INFO|Setting lport 9de5453d-b548-429c-8fc2-7b012cb8ebdf ovn-installed in OVS
Jan 20 10:13:34 np0005588919 ovn_controller[130490]: 2026-01-20T15:13:34Z|00809|binding|INFO|Setting lport 9de5453d-b548-429c-8fc2-7b012cb8ebdf up in Southbound
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.052 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.051 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d84059eb-9b5b-4f5e-9220-b3e427702163]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.084 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[16900d6d-9e90-4be4-a9b3-fe4ea42b97e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.088 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8b16e8e3-3ac6-4cb1-89e2-e464753ff23a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 NetworkManager[49104]: <info>  [1768922014.0897] manager: (tapd07527d3-70): new Veth device (/org/freedesktop/NetworkManager/Devices/334)
Jan 20 10:13:34 np0005588919 systemd-udevd[302070]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.120 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[830638b4-bcf1-47ae-a5e5-53a401312cd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.123 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[da4b25a8-5a94-427f-9626-b72c80cb220c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 NetworkManager[49104]: <info>  [1768922014.1434] device (tapd07527d3-70): carrier: link connected
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.148 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[42d4233e-16a0-473b-a8d7-63ebe3c94f7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.164 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c52064b1-2044-4102-838e-72d7be024d9e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd07527d3-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:33:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705307, 'reachable_time': 39290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302100, 'error': None, 'target': 'ovnmeta-d07527d3-7363-453c-9902-c562bab626ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.179 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[41040360-33d4-4221-8bd4-5e343d382103]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:33a8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 705307, 'tstamp': 705307}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302101, 'error': None, 'target': 'ovnmeta-d07527d3-7363-453c-9902-c562bab626ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.194 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d35769dd-e888-4201-bb69-c18e06e9b6d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd07527d3-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:33:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705307, 'reachable_time': 39290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 302102, 'error': None, 'target': 'ovnmeta-d07527d3-7363-453c-9902-c562bab626ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.225 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4982a898-3f0b-4f2d-b67d-d0804b9c3ded]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.273 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5a69110f-3c62-4e8b-8b6d-7b00e2073557]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.274 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd07527d3-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.274 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.275 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd07527d3-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:34 np0005588919 NetworkManager[49104]: <info>  [1768922014.2772] manager: (tapd07527d3-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.276 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:34 np0005588919 kernel: tapd07527d3-70: entered promiscuous mode
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.280 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd07527d3-70, col_values=(('external_ids', {'iface-id': '311d5bf2-0b44-4ce1-9ec1-e7458d5df232'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:34 np0005588919 ovn_controller[130490]: 2026-01-20T15:13:34Z|00810|binding|INFO|Releasing lport 311d5bf2-0b44-4ce1-9ec1-e7458d5df232 from this chassis (sb_readonly=0)
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.282 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.296 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.302 225859 DEBUG nova.compute.manager [req-b63561c7-a5c3-4289-9bb4-6935aaa8a698 req-f9ba7d1d-5f12-4ddf-9598-d782da4a5b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.303 225859 DEBUG oslo_concurrency.lockutils [req-b63561c7-a5c3-4289-9bb4-6935aaa8a698 req-f9ba7d1d-5f12-4ddf-9598-d782da4a5b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.303 225859 DEBUG oslo_concurrency.lockutils [req-b63561c7-a5c3-4289-9bb4-6935aaa8a698 req-f9ba7d1d-5f12-4ddf-9598-d782da4a5b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.303 225859 DEBUG oslo_concurrency.lockutils [req-b63561c7-a5c3-4289-9bb4-6935aaa8a698 req-f9ba7d1d-5f12-4ddf-9598-d782da4a5b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.303 225859 DEBUG nova.compute.manager [req-b63561c7-a5c3-4289-9bb4-6935aaa8a698 req-f9ba7d1d-5f12-4ddf-9598-d782da4a5b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Processing event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.304 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.305 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d07527d3-7363-453c-9902-c562bab626ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d07527d3-7363-453c-9902-c562bab626ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.306 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[87e6d122-3c60-43aa-b493-4d389d16466c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.307 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-d07527d3-7363-453c-9902-c562bab626ba
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/d07527d3-7363-453c-9902-c562bab626ba.pid.haproxy
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID d07527d3-7363-453c-9902-c562bab626ba
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:13:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:34.309 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d07527d3-7363-453c-9902-c562bab626ba', 'env', 'PROCESS_TAG=haproxy-d07527d3-7363-453c-9902-c562bab626ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d07527d3-7363-453c-9902-c562bab626ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.351 225859 DEBUG nova.network.neutron [req-de2a3c0f-8b93-454c-8732-1b48901607c0 req-dfc04484-6634-4fb7-90c0-e9c764b7da19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updated VIF entry in instance network info cache for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.351 225859 DEBUG nova.network.neutron [req-de2a3c0f-8b93-454c-8732-1b48901607c0 req-dfc04484-6634-4fb7-90c0-e9c764b7da19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance_info_cache with network_info: [{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:34 np0005588919 nova_compute[225855]: 2026-01-20 15:13:34.379 225859 DEBUG oslo_concurrency.lockutils [req-de2a3c0f-8b93-454c-8732-1b48901607c0 req-dfc04484-6634-4fb7-90c0-e9c764b7da19 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:34 np0005588919 podman[302132]: 2026-01-20 15:13:34.64951051 +0000 UTC m=+0.049345991 container create 9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 10:13:34 np0005588919 systemd[1]: Started libpod-conmon-9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155.scope.
Jan 20 10:13:34 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:13:34 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19af57efcf5c610cf3b2724d9be132eb303f0d0071822b6d38a243dd05cdc13e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:13:34 np0005588919 podman[302132]: 2026-01-20 15:13:34.623960655 +0000 UTC m=+0.023796156 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:13:34 np0005588919 podman[302132]: 2026-01-20 15:13:34.721232795 +0000 UTC m=+0.121068296 container init 9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:13:34 np0005588919 podman[302132]: 2026-01-20 15:13:34.727033309 +0000 UTC m=+0.126868800 container start 9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:13:34 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[302147]: [NOTICE]   (302166) : New worker (302169) forked
Jan 20 10:13:34 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[302147]: [NOTICE]   (302166) : Loading success.
Jan 20 10:13:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:34.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:35.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.559 225859 DEBUG nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.561 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922015.5584025, 128af7d9-155f-468d-9873-98c816f0df9e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.561 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] VM Started (Lifecycle Event)#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.564 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.568 225859 INFO nova.virt.libvirt.driver [-] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Instance spawned successfully.#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.568 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.582 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.586 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.591 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.591 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.592 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.592 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.592 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.593 225859 DEBUG nova.virt.libvirt.driver [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.624 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.624 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922015.5586474, 128af7d9-155f-468d-9873-98c816f0df9e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.624 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.666 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.670 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922015.5640373, 128af7d9-155f-468d-9873-98c816f0df9e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.670 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.684 225859 INFO nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Took 7.47 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.685 225859 DEBUG nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.697 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.700 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.725 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.772 225859 INFO nova.compute.manager [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Took 9.31 seconds to build instance.#033[00m
Jan 20 10:13:35 np0005588919 nova_compute[225855]: 2026-01-20 15:13:35.790 225859 DEBUG oslo_concurrency.lockutils [None req-b00a3858-a0ec-49f4-bb4d-e8705b9f8fa3 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:13:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:13:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:13:36 np0005588919 nova_compute[225855]: 2026-01-20 15:13:36.458 225859 DEBUG nova.compute.manager [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:36 np0005588919 nova_compute[225855]: 2026-01-20 15:13:36.459 225859 DEBUG oslo_concurrency.lockutils [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:13:36 np0005588919 nova_compute[225855]: 2026-01-20 15:13:36.459 225859 DEBUG oslo_concurrency.lockutils [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:13:36 np0005588919 nova_compute[225855]: 2026-01-20 15:13:36.459 225859 DEBUG oslo_concurrency.lockutils [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:13:36 np0005588919 nova_compute[225855]: 2026-01-20 15:13:36.460 225859 DEBUG nova.compute.manager [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] No waiting events found dispatching network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:13:36 np0005588919 nova_compute[225855]: 2026-01-20 15:13:36.460 225859 WARNING nova.compute.manager [req-010f4833-51a2-4bc8-aa73-a8e0991b4476 req-79832a42-b5ed-4cf9-a046-684a26d4a521 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received unexpected event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf for instance with vm_state active and task_state None.#033[00m
Jan 20 10:13:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:36.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:37.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:37.512 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:13:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:37.513 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:13:37 np0005588919 nova_compute[225855]: 2026-01-20 15:13:37.515 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:37 np0005588919 nova_compute[225855]: 2026-01-20 15:13:37.527 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:37 np0005588919 nova_compute[225855]: 2026-01-20 15:13:37.841 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:13:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:38.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:13:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:13:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:39.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:13:40 np0005588919 podman[302339]: 2026-01-20 15:13:40.038327829 +0000 UTC m=+0.083820670 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:13:40 np0005588919 nova_compute[225855]: 2026-01-20 15:13:40.050 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:40 np0005588919 NetworkManager[49104]: <info>  [1768922020.0511] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/336)
Jan 20 10:13:40 np0005588919 NetworkManager[49104]: <info>  [1768922020.0523] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/337)
Jan 20 10:13:40 np0005588919 nova_compute[225855]: 2026-01-20 15:13:40.226 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:13:40Z|00811|binding|INFO|Releasing lport 311d5bf2-0b44-4ce1-9ec1-e7458d5df232 from this chassis (sb_readonly=0)
Jan 20 10:13:40 np0005588919 nova_compute[225855]: 2026-01-20 15:13:40.245 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:40 np0005588919 nova_compute[225855]: 2026-01-20 15:13:40.619 225859 DEBUG nova.compute.manager [req-0147313d-c76a-4813-a12e-531025211f20 req-4cd2ed76-adc9-4783-bacf-ab7430034c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-changed-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:13:40 np0005588919 nova_compute[225855]: 2026-01-20 15:13:40.620 225859 DEBUG nova.compute.manager [req-0147313d-c76a-4813-a12e-531025211f20 req-4cd2ed76-adc9-4783-bacf-ab7430034c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Refreshing instance network info cache due to event network-changed-9de5453d-b548-429c-8fc2-7b012cb8ebdf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:13:40 np0005588919 nova_compute[225855]: 2026-01-20 15:13:40.620 225859 DEBUG oslo_concurrency.lockutils [req-0147313d-c76a-4813-a12e-531025211f20 req-4cd2ed76-adc9-4783-bacf-ab7430034c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:40 np0005588919 nova_compute[225855]: 2026-01-20 15:13:40.621 225859 DEBUG oslo_concurrency.lockutils [req-0147313d-c76a-4813-a12e-531025211f20 req-4cd2ed76-adc9-4783-bacf-ab7430034c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:40 np0005588919 nova_compute[225855]: 2026-01-20 15:13:40.621 225859 DEBUG nova.network.neutron [req-0147313d-c76a-4813-a12e-531025211f20 req-4cd2ed76-adc9-4783-bacf-ab7430034c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Refreshing network info cache for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:13:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:40.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:41.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:13:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:13:42 np0005588919 nova_compute[225855]: 2026-01-20 15:13:42.252 225859 DEBUG nova.network.neutron [req-0147313d-c76a-4813-a12e-531025211f20 req-4cd2ed76-adc9-4783-bacf-ab7430034c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updated VIF entry in instance network info cache for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:13:42 np0005588919 nova_compute[225855]: 2026-01-20 15:13:42.253 225859 DEBUG nova.network.neutron [req-0147313d-c76a-4813-a12e-531025211f20 req-4cd2ed76-adc9-4783-bacf-ab7430034c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance_info_cache with network_info: [{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:42 np0005588919 nova_compute[225855]: 2026-01-20 15:13:42.274 225859 DEBUG oslo_concurrency.lockutils [req-0147313d-c76a-4813-a12e-531025211f20 req-4cd2ed76-adc9-4783-bacf-ab7430034c12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:42 np0005588919 nova_compute[225855]: 2026-01-20 15:13:42.530 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:42 np0005588919 nova_compute[225855]: 2026-01-20 15:13:42.843 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:42.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:43.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:13:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:44.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:13:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:45.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:46.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:47.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:13:47.516 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:13:47 np0005588919 nova_compute[225855]: 2026-01-20 15:13:47.534 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:47 np0005588919 ovn_controller[130490]: 2026-01-20T15:13:47Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a8:1d:e9 10.100.0.4
Jan 20 10:13:47 np0005588919 ovn_controller[130490]: 2026-01-20T15:13:47Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a8:1d:e9 10.100.0.4
Jan 20 10:13:47 np0005588919 nova_compute[225855]: 2026-01-20 15:13:47.879 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:48.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:49.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:50.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:13:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:51.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:13:52 np0005588919 nova_compute[225855]: 2026-01-20 15:13:52.537 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:13:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:52.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:13:52 np0005588919 nova_compute[225855]: 2026-01-20 15:13:52.880 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:53.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:53 np0005588919 nova_compute[225855]: 2026-01-20 15:13:53.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:53 np0005588919 nova_compute[225855]: 2026-01-20 15:13:53.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:13:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:53 np0005588919 nova_compute[225855]: 2026-01-20 15:13:53.637 225859 INFO nova.compute.manager [None req-f2b68e5e-7fe4-470f-8e46-d3b7873b26fc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Get console output#033[00m
Jan 20 10:13:53 np0005588919 nova_compute[225855]: 2026-01-20 15:13:53.642 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:13:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:54.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:55 np0005588919 podman[302476]: 2026-01-20 15:13:55.001920929 +0000 UTC m=+0.047256612 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:13:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:55.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:56 np0005588919 nova_compute[225855]: 2026-01-20 15:13:56.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:56 np0005588919 nova_compute[225855]: 2026-01-20 15:13:56.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:13:56 np0005588919 nova_compute[225855]: 2026-01-20 15:13:56.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:13:56 np0005588919 nova_compute[225855]: 2026-01-20 15:13:56.536 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:13:56 np0005588919 nova_compute[225855]: 2026-01-20 15:13:56.536 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:13:56 np0005588919 nova_compute[225855]: 2026-01-20 15:13:56.537 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:13:56 np0005588919 nova_compute[225855]: 2026-01-20 15:13:56.537 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 128af7d9-155f-468d-9873-98c816f0df9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:13:56 np0005588919 nova_compute[225855]: 2026-01-20 15:13:56.640 225859 INFO nova.compute.manager [None req-ac2d34ea-47e2-41f0-9cf1-393c287456c8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Get console output#033[00m
Jan 20 10:13:56 np0005588919 nova_compute[225855]: 2026-01-20 15:13:56.644 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:13:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:56.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:13:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:57.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:13:57 np0005588919 nova_compute[225855]: 2026-01-20 15:13:57.540 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:57 np0005588919 nova_compute[225855]: 2026-01-20 15:13:57.887 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:13:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:13:58 np0005588919 nova_compute[225855]: 2026-01-20 15:13:58.776 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance_info_cache with network_info: [{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:13:58 np0005588919 nova_compute[225855]: 2026-01-20 15:13:58.799 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:13:58 np0005588919 nova_compute[225855]: 2026-01-20 15:13:58.799 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:13:58 np0005588919 nova_compute[225855]: 2026-01-20 15:13:58.799 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:58 np0005588919 nova_compute[225855]: 2026-01-20 15:13:58.800 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:58 np0005588919 nova_compute[225855]: 2026-01-20 15:13:58.800 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:13:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.003000086s ======
Jan 20 10:13:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:13:58.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000086s
Jan 20 10:13:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:13:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:13:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:13:59.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:13:59 np0005588919 nova_compute[225855]: 2026-01-20 15:13:59.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.367 225859 DEBUG oslo_concurrency.lockutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquiring lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.368 225859 DEBUG oslo_concurrency.lockutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquired lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.368 225859 DEBUG nova.network.neutron [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.397 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.397 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.397 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.398 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.399 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:00 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:14:00 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/916155642' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.862 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:14:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:00.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.947 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000b7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:14:00 np0005588919 nova_compute[225855]: 2026-01-20 15:14:00.947 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000b7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.161 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.163 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4069MB free_disk=20.894081115722656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.163 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.164 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:14:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:01.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.356 225859 INFO nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating resource usage from migration c8ea2eca-34f1-4b31-9699-90661d5995f9#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.379 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Migration c8ea2eca-34f1-4b31-9699-90661d5995f9 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.379 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.379 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.475 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.550 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.551 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.567 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.591 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:14:01 np0005588919 nova_compute[225855]: 2026-01-20 15:14:01.627 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:14:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1147698371' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.073 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.076 225859 DEBUG nova.network.neutron [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance_info_cache with network_info: [{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.082 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.101 225859 DEBUG oslo_concurrency.lockutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Releasing lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.105 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.263 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.264 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.320 225859 DEBUG nova.virt.libvirt.driver [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.321 225859 DEBUG nova.virt.libvirt.volume.remotefs [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Creating file /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/3cf538d9a8b949a0901af2d58a3d01eb.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.321 225859 DEBUG oslo_concurrency.processutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/3cf538d9a8b949a0901af2d58a3d01eb.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.543 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.795 225859 DEBUG oslo_concurrency.processutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/3cf538d9a8b949a0901af2d58a3d01eb.tmp" returned: 1 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.796 225859 DEBUG oslo_concurrency.processutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/3cf538d9a8b949a0901af2d58a3d01eb.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.796 225859 DEBUG nova.virt.libvirt.volume.remotefs [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Creating directory /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.797 225859 DEBUG oslo_concurrency.processutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:02 np0005588919 nova_compute[225855]: 2026-01-20 15:14:02.884 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:02.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:03 np0005588919 nova_compute[225855]: 2026-01-20 15:14:03.029 225859 DEBUG oslo_concurrency.processutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e" returned: 0 in 0.233s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:03 np0005588919 nova_compute[225855]: 2026-01-20 15:14:03.033 225859 DEBUG nova.virt.libvirt.driver [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:14:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:03.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:03 np0005588919 nova_compute[225855]: 2026-01-20 15:14:03.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:04 np0005588919 nova_compute[225855]: 2026-01-20 15:14:04.354 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:04.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:05.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:05 np0005588919 kernel: tap9de5453d-b5 (unregistering): left promiscuous mode
Jan 20 10:14:05 np0005588919 NetworkManager[49104]: <info>  [1768922045.6308] device (tap9de5453d-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:14:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:05Z|00812|binding|INFO|Releasing lport 9de5453d-b548-429c-8fc2-7b012cb8ebdf from this chassis (sb_readonly=0)
Jan 20 10:14:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:05Z|00813|binding|INFO|Setting lport 9de5453d-b548-429c-8fc2-7b012cb8ebdf down in Southbound
Jan 20 10:14:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:05Z|00814|binding|INFO|Removing iface tap9de5453d-b5 ovn-installed in OVS
Jan 20 10:14:05 np0005588919 nova_compute[225855]: 2026-01-20 15:14:05.638 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:05 np0005588919 nova_compute[225855]: 2026-01-20 15:14:05.640 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:05.647 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:1d:e9 10.100.0.4'], port_security=['fa:16:3e:a8:1d:e9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '128af7d9-155f-468d-9873-98c816f0df9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d07527d3-7363-453c-9902-c562bab626ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4895263-5fc5-4c5a-ab8d-547f570bc095', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.233'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4d49b6-1d42-4171-8055-0d823fb37e66, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=9de5453d-b548-429c-8fc2-7b012cb8ebdf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:14:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:05.648 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 9de5453d-b548-429c-8fc2-7b012cb8ebdf in datapath d07527d3-7363-453c-9902-c562bab626ba unbound from our chassis#033[00m
Jan 20 10:14:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:05.650 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d07527d3-7363-453c-9902-c562bab626ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:14:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:05.650 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[16852e8d-702b-4036-8ad3-bbfcee7441aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:05.651 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d07527d3-7363-453c-9902-c562bab626ba namespace which is not needed anymore#033[00m
Jan 20 10:14:05 np0005588919 nova_compute[225855]: 2026-01-20 15:14:05.657 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:05 np0005588919 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Jan 20 10:14:05 np0005588919 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000b7.scope: Consumed 14.404s CPU time.
Jan 20 10:14:05 np0005588919 systemd-machined[194361]: Machine qemu-96-instance-000000b7 terminated.
Jan 20 10:14:05 np0005588919 nova_compute[225855]: 2026-01-20 15:14:05.884 225859 DEBUG nova.compute.manager [req-a1eda262-5399-4ec9-924e-5d5597919829 req-789f59f2-89ed-49d3-b1d9-b47f31b6876e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-unplugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:05 np0005588919 nova_compute[225855]: 2026-01-20 15:14:05.885 225859 DEBUG oslo_concurrency.lockutils [req-a1eda262-5399-4ec9-924e-5d5597919829 req-789f59f2-89ed-49d3-b1d9-b47f31b6876e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:05 np0005588919 nova_compute[225855]: 2026-01-20 15:14:05.885 225859 DEBUG oslo_concurrency.lockutils [req-a1eda262-5399-4ec9-924e-5d5597919829 req-789f59f2-89ed-49d3-b1d9-b47f31b6876e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:05 np0005588919 nova_compute[225855]: 2026-01-20 15:14:05.886 225859 DEBUG oslo_concurrency.lockutils [req-a1eda262-5399-4ec9-924e-5d5597919829 req-789f59f2-89ed-49d3-b1d9-b47f31b6876e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:05 np0005588919 nova_compute[225855]: 2026-01-20 15:14:05.886 225859 DEBUG nova.compute.manager [req-a1eda262-5399-4ec9-924e-5d5597919829 req-789f59f2-89ed-49d3-b1d9-b47f31b6876e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] No waiting events found dispatching network-vif-unplugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:14:05 np0005588919 nova_compute[225855]: 2026-01-20 15:14:05.886 225859 WARNING nova.compute.manager [req-a1eda262-5399-4ec9-924e-5d5597919829 req-789f59f2-89ed-49d3-b1d9-b47f31b6876e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received unexpected event network-vif-unplugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 20 10:14:05 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[302147]: [NOTICE]   (302166) : haproxy version is 2.8.14-c23fe91
Jan 20 10:14:05 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[302147]: [NOTICE]   (302166) : path to executable is /usr/sbin/haproxy
Jan 20 10:14:05 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[302147]: [WARNING]  (302166) : Exiting Master process...
Jan 20 10:14:05 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[302147]: [WARNING]  (302166) : Exiting Master process...
Jan 20 10:14:05 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[302147]: [ALERT]    (302166) : Current worker (302169) exited with code 143 (Terminated)
Jan 20 10:14:05 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[302147]: [WARNING]  (302166) : All workers exited. Exiting... (0)
Jan 20 10:14:05 np0005588919 systemd[1]: libpod-9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155.scope: Deactivated successfully.
Jan 20 10:14:05 np0005588919 podman[302622]: 2026-01-20 15:14:05.942883822 +0000 UTC m=+0.204330339 container died 9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 20 10:14:05 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155-userdata-shm.mount: Deactivated successfully.
Jan 20 10:14:05 np0005588919 systemd[1]: var-lib-containers-storage-overlay-19af57efcf5c610cf3b2724d9be132eb303f0d0071822b6d38a243dd05cdc13e-merged.mount: Deactivated successfully.
Jan 20 10:14:05 np0005588919 podman[302622]: 2026-01-20 15:14:05.996935825 +0000 UTC m=+0.258382332 container cleanup 9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:14:06 np0005588919 systemd[1]: libpod-conmon-9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155.scope: Deactivated successfully.
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.049 225859 INFO nova.virt.libvirt.driver [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.054 225859 INFO nova.virt.libvirt.driver [-] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Instance destroyed successfully.#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.055 225859 DEBUG nova.virt.libvirt.vif [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1998945962',display_name='tempest-TestNetworkAdvancedServerOps-server-1998945962',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1998945962',id=183,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQtn9WZwWHpZ18eEtTY9zdPNbJgOayUdrvVmR1brDMxwKaiJ8tf9lOFdht6GjVy3Orpnh5Z5LatI7xEKad9rNtjFmwEczk5s4CmWp5ueE54bJ73h+pph+yq2VHvIP5rgg==',key_name='tempest-TestNetworkAdvancedServerOps-1645169738',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:13:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-mvt7thmt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:13:59Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=128af7d9-155f-468d-9873-98c816f0df9e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1250108698", "vif_mac": "fa:16:3e:a8:1d:e9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.055 225859 DEBUG nova.network.os_vif_util [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Converting VIF {"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1250108698", "vif_mac": "fa:16:3e:a8:1d:e9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.056 225859 DEBUG nova.network.os_vif_util [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.057 225859 DEBUG os_vif [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.058 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.059 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9de5453d-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.095 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:06 np0005588919 podman[302664]: 2026-01-20 15:14:06.096148241 +0000 UTC m=+0.078499259 container remove 9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.098 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.100 225859 INFO os_vif [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5')#033[00m
Jan 20 10:14:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:06.101 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[825b913a-805c-4fe6-a7c5-a3e5b6ca50aa]: (4, ('Tue Jan 20 03:14:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba (9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155)\n9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155\nTue Jan 20 03:14:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba (9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155)\n9e0f985d5a417f01c41231b07aa6e097d3d4253b45c9c2dcd897f5700b809155\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:06.102 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f76bce96-4d3e-43da-977d-4c6d62521cdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:06.103 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd07527d3-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.104 225859 DEBUG nova.virt.libvirt.driver [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] skipping disk for instance-000000b7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.104 225859 DEBUG nova.virt.libvirt.driver [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] skipping disk for instance-000000b7 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:14:06 np0005588919 kernel: tapd07527d3-70: left promiscuous mode
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.109 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.118 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:06.122 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[70092df2-6b4f-4f0d-a652-01e56dca6ada]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:06.137 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[19e707b0-7574-40dd-84d2-e28f6d1eb07b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:06.138 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[04fc1234-ad2c-4ca1-b3eb-02d9c8fdb91d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:06.153 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c186a9-4709-4f54-b1d1-0a426fd0050a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 705300, 'reachable_time': 37664, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302679, 'error': None, 'target': 'ovnmeta-d07527d3-7363-453c-9902-c562bab626ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:06.156 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d07527d3-7363-453c-9902-c562bab626ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:14:06 np0005588919 systemd[1]: run-netns-ovnmeta\x2dd07527d3\x2d7363\x2d453c\x2d9902\x2dc562bab626ba.mount: Deactivated successfully.
Jan 20 10:14:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:06.156 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[76c6007f-f179-4ca2-ad03-617c5876f22f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.299 225859 DEBUG neutronclient.v2_0.client [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.416 225859 DEBUG oslo_concurrency.lockutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.416 225859 DEBUG oslo_concurrency.lockutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:06 np0005588919 nova_compute[225855]: 2026-01-20 15:14:06.416 225859 DEBUG oslo_concurrency.lockutils [None req-285d398b-bc35-41f1-80eb-34f9b43ae976 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:06.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:07.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:07 np0005588919 nova_compute[225855]: 2026-01-20 15:14:07.691 225859 DEBUG nova.compute.manager [req-6c55cbfe-8bca-4b99-b488-8c07175e349c req-10bdd897-4ba8-49bf-b9ce-254d974383ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-changed-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:07 np0005588919 nova_compute[225855]: 2026-01-20 15:14:07.691 225859 DEBUG nova.compute.manager [req-6c55cbfe-8bca-4b99-b488-8c07175e349c req-10bdd897-4ba8-49bf-b9ce-254d974383ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Refreshing instance network info cache due to event network-changed-9de5453d-b548-429c-8fc2-7b012cb8ebdf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:14:07 np0005588919 nova_compute[225855]: 2026-01-20 15:14:07.691 225859 DEBUG oslo_concurrency.lockutils [req-6c55cbfe-8bca-4b99-b488-8c07175e349c req-10bdd897-4ba8-49bf-b9ce-254d974383ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:07 np0005588919 nova_compute[225855]: 2026-01-20 15:14:07.691 225859 DEBUG oslo_concurrency.lockutils [req-6c55cbfe-8bca-4b99-b488-8c07175e349c req-10bdd897-4ba8-49bf-b9ce-254d974383ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:07 np0005588919 nova_compute[225855]: 2026-01-20 15:14:07.692 225859 DEBUG nova.network.neutron [req-6c55cbfe-8bca-4b99-b488-8c07175e349c req-10bdd897-4ba8-49bf-b9ce-254d974383ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Refreshing network info cache for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:14:07 np0005588919 nova_compute[225855]: 2026-01-20 15:14:07.886 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:08 np0005588919 nova_compute[225855]: 2026-01-20 15:14:08.003 225859 DEBUG nova.compute.manager [req-061373c2-4af0-43d6-9bcd-2aa411425f8d req-dd2b9761-cc89-488c-94cf-e4aef802ec64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:08 np0005588919 nova_compute[225855]: 2026-01-20 15:14:08.003 225859 DEBUG oslo_concurrency.lockutils [req-061373c2-4af0-43d6-9bcd-2aa411425f8d req-dd2b9761-cc89-488c-94cf-e4aef802ec64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:08 np0005588919 nova_compute[225855]: 2026-01-20 15:14:08.004 225859 DEBUG oslo_concurrency.lockutils [req-061373c2-4af0-43d6-9bcd-2aa411425f8d req-dd2b9761-cc89-488c-94cf-e4aef802ec64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:08 np0005588919 nova_compute[225855]: 2026-01-20 15:14:08.004 225859 DEBUG oslo_concurrency.lockutils [req-061373c2-4af0-43d6-9bcd-2aa411425f8d req-dd2b9761-cc89-488c-94cf-e4aef802ec64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:08 np0005588919 nova_compute[225855]: 2026-01-20 15:14:08.004 225859 DEBUG nova.compute.manager [req-061373c2-4af0-43d6-9bcd-2aa411425f8d req-dd2b9761-cc89-488c-94cf-e4aef802ec64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] No waiting events found dispatching network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:14:08 np0005588919 nova_compute[225855]: 2026-01-20 15:14:08.004 225859 WARNING nova.compute.manager [req-061373c2-4af0-43d6-9bcd-2aa411425f8d req-dd2b9761-cc89-488c-94cf-e4aef802ec64 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received unexpected event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 10:14:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:08.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:08 np0005588919 nova_compute[225855]: 2026-01-20 15:14:08.994 225859 DEBUG nova.network.neutron [req-6c55cbfe-8bca-4b99-b488-8c07175e349c req-10bdd897-4ba8-49bf-b9ce-254d974383ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updated VIF entry in instance network info cache for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:14:08 np0005588919 nova_compute[225855]: 2026-01-20 15:14:08.995 225859 DEBUG nova.network.neutron [req-6c55cbfe-8bca-4b99-b488-8c07175e349c req-10bdd897-4ba8-49bf-b9ce-254d974383ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance_info_cache with network_info: [{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:09 np0005588919 nova_compute[225855]: 2026-01-20 15:14:09.013 225859 DEBUG oslo_concurrency.lockutils [req-6c55cbfe-8bca-4b99-b488-8c07175e349c req-10bdd897-4ba8-49bf-b9ce-254d974383ed 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:14:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:09.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:14:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e398 e398: 3 total, 3 up, 3 in
Jan 20 10:14:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:10.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:11 np0005588919 podman[302683]: 2026-01-20 15:14:11.061757452 +0000 UTC m=+0.115397664 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:14:11 np0005588919 nova_compute[225855]: 2026-01-20 15:14:11.096 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:11.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:11 np0005588919 nova_compute[225855]: 2026-01-20 15:14:11.928 225859 DEBUG nova.compute.manager [req-93ecd443-7114-407f-9823-5fa6636357e0 req-4295da27-e483-4903-be84-8847c1316b75 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:11 np0005588919 nova_compute[225855]: 2026-01-20 15:14:11.928 225859 DEBUG oslo_concurrency.lockutils [req-93ecd443-7114-407f-9823-5fa6636357e0 req-4295da27-e483-4903-be84-8847c1316b75 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:11 np0005588919 nova_compute[225855]: 2026-01-20 15:14:11.928 225859 DEBUG oslo_concurrency.lockutils [req-93ecd443-7114-407f-9823-5fa6636357e0 req-4295da27-e483-4903-be84-8847c1316b75 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:11 np0005588919 nova_compute[225855]: 2026-01-20 15:14:11.928 225859 DEBUG oslo_concurrency.lockutils [req-93ecd443-7114-407f-9823-5fa6636357e0 req-4295da27-e483-4903-be84-8847c1316b75 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:11 np0005588919 nova_compute[225855]: 2026-01-20 15:14:11.928 225859 DEBUG nova.compute.manager [req-93ecd443-7114-407f-9823-5fa6636357e0 req-4295da27-e483-4903-be84-8847c1316b75 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] No waiting events found dispatching network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:14:11 np0005588919 nova_compute[225855]: 2026-01-20 15:14:11.929 225859 WARNING nova.compute.manager [req-93ecd443-7114-407f-9823-5fa6636357e0 req-4295da27-e483-4903-be84-8847c1316b75 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received unexpected event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf for instance with vm_state resized and task_state None.#033[00m
Jan 20 10:14:12 np0005588919 nova_compute[225855]: 2026-01-20 15:14:12.888 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:12.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:13.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:14 np0005588919 nova_compute[225855]: 2026-01-20 15:14:14.030 225859 DEBUG nova.compute.manager [req-312b01be-d4e9-4906-9088-3b048ff4cb74 req-4f600fd2-4dcf-4a37-a270-b33776175ea4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:14 np0005588919 nova_compute[225855]: 2026-01-20 15:14:14.031 225859 DEBUG oslo_concurrency.lockutils [req-312b01be-d4e9-4906-9088-3b048ff4cb74 req-4f600fd2-4dcf-4a37-a270-b33776175ea4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:14 np0005588919 nova_compute[225855]: 2026-01-20 15:14:14.031 225859 DEBUG oslo_concurrency.lockutils [req-312b01be-d4e9-4906-9088-3b048ff4cb74 req-4f600fd2-4dcf-4a37-a270-b33776175ea4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:14 np0005588919 nova_compute[225855]: 2026-01-20 15:14:14.031 225859 DEBUG oslo_concurrency.lockutils [req-312b01be-d4e9-4906-9088-3b048ff4cb74 req-4f600fd2-4dcf-4a37-a270-b33776175ea4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:14 np0005588919 nova_compute[225855]: 2026-01-20 15:14:14.031 225859 DEBUG nova.compute.manager [req-312b01be-d4e9-4906-9088-3b048ff4cb74 req-4f600fd2-4dcf-4a37-a270-b33776175ea4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] No waiting events found dispatching network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:14:14 np0005588919 nova_compute[225855]: 2026-01-20 15:14:14.031 225859 WARNING nova.compute.manager [req-312b01be-d4e9-4906-9088-3b048ff4cb74 req-4f600fd2-4dcf-4a37-a270-b33776175ea4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received unexpected event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 20 10:14:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:14.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:15.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.121 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.143 225859 DEBUG nova.compute.manager [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-unplugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.144 225859 DEBUG oslo_concurrency.lockutils [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.144 225859 DEBUG oslo_concurrency.lockutils [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.144 225859 DEBUG oslo_concurrency.lockutils [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.145 225859 DEBUG nova.compute.manager [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] No waiting events found dispatching network-vif-unplugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.145 225859 WARNING nova.compute.manager [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received unexpected event network-vif-unplugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.146 225859 DEBUG nova.compute.manager [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.146 225859 DEBUG oslo_concurrency.lockutils [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.146 225859 DEBUG oslo_concurrency.lockutils [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.147 225859 DEBUG oslo_concurrency.lockutils [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.147 225859 DEBUG nova.compute.manager [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] No waiting events found dispatching network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.147 225859 WARNING nova.compute.manager [req-640f54c9-ef6f-4b56-a54a-7f6781e5e70a req-acf46d5b-f416-447d-a5de-eb042192bc60 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received unexpected event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 20 10:14:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:14:16 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3848990267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:14:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:16.436 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:16.436 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:16.437 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.486 225859 INFO nova.compute.manager [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Swapping old allocation on dict_keys(['bbb02880-a710-4ac1-8b2c-5c09765848d1']) held by migration c8ea2eca-34f1-4b31-9699-90661d5995f9 for instance#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.531 225859 DEBUG nova.scheduler.client.report [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Overwriting current allocation {'allocations': {'068db7fd-4bd6-45a9-8bd6-a22cfe7596ed': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, 'generation': 90}}, 'project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'user_id': '442a7a5cb8ea426a82be9762b262d171', 'consumer_generation': 1} on consumer 128af7d9-155f-468d-9873-98c816f0df9e move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 20 10:14:16 np0005588919 nova_compute[225855]: 2026-01-20 15:14:16.710 225859 INFO nova.network.neutron [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating port 9de5453d-b548-429c-8fc2-7b012cb8ebdf with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 10:14:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:16.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:17.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:17 np0005588919 nova_compute[225855]: 2026-01-20 15:14:17.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:17 np0005588919 nova_compute[225855]: 2026-01-20 15:14:17.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:14:17 np0005588919 nova_compute[225855]: 2026-01-20 15:14:17.688 225859 DEBUG oslo_concurrency.lockutils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:17 np0005588919 nova_compute[225855]: 2026-01-20 15:14:17.688 225859 DEBUG oslo_concurrency.lockutils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquired lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:17 np0005588919 nova_compute[225855]: 2026-01-20 15:14:17.689 225859 DEBUG nova.network.neutron [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:14:17 np0005588919 nova_compute[225855]: 2026-01-20 15:14:17.821 225859 DEBUG nova.compute.manager [req-5c8a6c42-d363-41d1-993b-3458673b597a req-e041dbdf-95b4-4d3f-a85c-5f3c6d6d7d01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-changed-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:17 np0005588919 nova_compute[225855]: 2026-01-20 15:14:17.821 225859 DEBUG nova.compute.manager [req-5c8a6c42-d363-41d1-993b-3458673b597a req-e041dbdf-95b4-4d3f-a85c-5f3c6d6d7d01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Refreshing instance network info cache due to event network-changed-9de5453d-b548-429c-8fc2-7b012cb8ebdf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:14:17 np0005588919 nova_compute[225855]: 2026-01-20 15:14:17.822 225859 DEBUG oslo_concurrency.lockutils [req-5c8a6c42-d363-41d1-993b-3458673b597a req-e041dbdf-95b4-4d3f-a85c-5f3c6d6d7d01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:17 np0005588919 nova_compute[225855]: 2026-01-20 15:14:17.890 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:18.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:19.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.072 225859 DEBUG nova.network.neutron [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance_info_cache with network_info: [{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.097 225859 DEBUG oslo_concurrency.lockutils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Releasing lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.098 225859 DEBUG nova.virt.libvirt.driver [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.134 225859 DEBUG oslo_concurrency.lockutils [req-5c8a6c42-d363-41d1-993b-3458673b597a req-e041dbdf-95b4-4d3f-a85c-5f3c6d6d7d01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.135 225859 DEBUG nova.network.neutron [req-5c8a6c42-d363-41d1-993b-3458673b597a req-e041dbdf-95b4-4d3f-a85c-5f3c6d6d7d01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Refreshing network info cache for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.182 225859 DEBUG nova.storage.rbd_utils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rolling back rbd image(128af7d9-155f-468d-9873-98c816f0df9e_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.317 225859 DEBUG nova.storage.rbd_utils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] removing snapshot(nova-resize) on rbd image(128af7d9-155f-468d-9873-98c816f0df9e_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.870 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922045.8693986, 128af7d9-155f-468d-9873-98c816f0df9e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.871 225859 INFO nova.compute.manager [-] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.896 225859 DEBUG nova.compute.manager [None req-f2cf59cd-4a92-4229-92a5-223081d9574d - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.899 225859 DEBUG nova.compute.manager [None req-f2cf59cd-4a92-4229-92a5-223081d9574d - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:14:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:14:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:20.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:14:20 np0005588919 nova_compute[225855]: 2026-01-20 15:14:20.924 225859 INFO nova.compute.manager [None req-f2cf59cd-4a92-4229-92a5-223081d9574d - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.123 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e399 e399: 3 total, 3 up, 3 in
Jan 20 10:14:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:21.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.679 225859 DEBUG nova.network.neutron [req-5c8a6c42-d363-41d1-993b-3458673b597a req-e041dbdf-95b4-4d3f-a85c-5f3c6d6d7d01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updated VIF entry in instance network info cache for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.680 225859 DEBUG nova.network.neutron [req-5c8a6c42-d363-41d1-993b-3458673b597a req-e041dbdf-95b4-4d3f-a85c-5f3c6d6d7d01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance_info_cache with network_info: [{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.698 225859 DEBUG oslo_concurrency.lockutils [req-5c8a6c42-d363-41d1-993b-3458673b597a req-e041dbdf-95b4-4d3f-a85c-5f3c6d6d7d01 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.862 225859 DEBUG nova.virt.libvirt.driver [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Start _get_guest_xml network_info=[{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.866 225859 WARNING nova.virt.libvirt.driver [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.872 225859 DEBUG nova.virt.libvirt.host [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.873 225859 DEBUG nova.virt.libvirt.host [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.876 225859 DEBUG nova.virt.libvirt.host [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.876 225859 DEBUG nova.virt.libvirt.host [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.878 225859 DEBUG nova.virt.libvirt.driver [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.878 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.879 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.879 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.879 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.879 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.880 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.880 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.880 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.881 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.881 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.881 225859 DEBUG nova.virt.hardware [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.882 225859 DEBUG nova.objects.instance [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'vcpu_model' on Instance uuid 128af7d9-155f-468d-9873-98c816f0df9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:21 np0005588919 nova_compute[225855]: 2026-01-20 15:14:21.900 225859 DEBUG oslo_concurrency.processutils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:14:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/714890073' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.326 225859 DEBUG oslo_concurrency.processutils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.361 225859 DEBUG oslo_concurrency.processutils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:14:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4287932142' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.843 225859 DEBUG oslo_concurrency.processutils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.847 225859 DEBUG nova.virt.libvirt.vif [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1998945962',display_name='tempest-TestNetworkAdvancedServerOps-server-1998945962',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1998945962',id=183,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQtn9WZwWHpZ18eEtTY9zdPNbJgOayUdrvVmR1brDMxwKaiJ8tf9lOFdht6GjVy3Orpnh5Z5LatI7xEKad9rNtjFmwEczk5s4CmWp5ueE54bJ73h+pph+yq2VHvIP5rgg==',key_name='tempest-TestNetworkAdvancedServerOps-1645169738',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:14:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-mvt7thmt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:14:13Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=128af7d9-155f-468d-9873-98c816f0df9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.848 225859 DEBUG nova.network.os_vif_util [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.849 225859 DEBUG nova.network.os_vif_util [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.855 225859 DEBUG nova.virt.libvirt.driver [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <uuid>128af7d9-155f-468d-9873-98c816f0df9e</uuid>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <name>instance-000000b7</name>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1998945962</nova:name>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:14:21</nova:creationTime>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <nova:port uuid="9de5453d-b548-429c-8fc2-7b012cb8ebdf">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <entry name="serial">128af7d9-155f-468d-9873-98c816f0df9e</entry>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <entry name="uuid">128af7d9-155f-468d-9873-98c816f0df9e</entry>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/128af7d9-155f-468d-9873-98c816f0df9e_disk">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/128af7d9-155f-468d-9873-98c816f0df9e_disk.config">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:a8:1d:e9"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <target dev="tap9de5453d-b5"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e/console.log" append="off"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <input type="keyboard" bus="usb"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:14:22 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:14:22 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:14:22 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:14:22 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.857 225859 DEBUG nova.compute.manager [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Preparing to wait for external event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.858 225859 DEBUG oslo_concurrency.lockutils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.858 225859 DEBUG oslo_concurrency.lockutils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.858 225859 DEBUG oslo_concurrency.lockutils [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.859 225859 DEBUG nova.virt.libvirt.vif [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1998945962',display_name='tempest-TestNetworkAdvancedServerOps-server-1998945962',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1998945962',id=183,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQtn9WZwWHpZ18eEtTY9zdPNbJgOayUdrvVmR1brDMxwKaiJ8tf9lOFdht6GjVy3Orpnh5Z5LatI7xEKad9rNtjFmwEczk5s4CmWp5ueE54bJ73h+pph+yq2VHvIP5rgg==',key_name='tempest-TestNetworkAdvancedServerOps-1645169738',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:14:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-mvt7thmt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:14:13Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=128af7d9-155f-468d-9873-98c816f0df9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.859 225859 DEBUG nova.network.os_vif_util [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.860 225859 DEBUG nova.network.os_vif_util [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.861 225859 DEBUG os_vif [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.861 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.862 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.862 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.865 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.865 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9de5453d-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.866 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9de5453d-b5, col_values=(('external_ids', {'iface-id': '9de5453d-b548-429c-8fc2-7b012cb8ebdf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:1d:e9', 'vm-uuid': '128af7d9-155f-468d-9873-98c816f0df9e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.867 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:22 np0005588919 NetworkManager[49104]: <info>  [1768922062.8688] manager: (tap9de5453d-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/338)
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.870 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.874 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.875 225859 INFO os_vif [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5')
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.892 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:22.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:22 np0005588919 kernel: tap9de5453d-b5: entered promiscuous mode
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.960 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:22 np0005588919 NetworkManager[49104]: <info>  [1768922062.9621] manager: (tap9de5453d-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/339)
Jan 20 10:14:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:22Z|00815|binding|INFO|Claiming lport 9de5453d-b548-429c-8fc2-7b012cb8ebdf for this chassis.
Jan 20 10:14:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:22Z|00816|binding|INFO|9de5453d-b548-429c-8fc2-7b012cb8ebdf: Claiming fa:16:3e:a8:1d:e9 10.100.0.4
Jan 20 10:14:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:22.972 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:1d:e9 10.100.0.4'], port_security=['fa:16:3e:a8:1d:e9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '128af7d9-155f-468d-9873-98c816f0df9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d07527d3-7363-453c-9902-c562bab626ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'b4895263-5fc5-4c5a-ab8d-547f570bc095', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.233'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4d49b6-1d42-4171-8055-0d823fb37e66, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=9de5453d-b548-429c-8fc2-7b012cb8ebdf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:14:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:22.973 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 9de5453d-b548-429c-8fc2-7b012cb8ebdf in datapath d07527d3-7363-453c-9902-c562bab626ba bound to our chassis
Jan 20 10:14:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:22.975 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d07527d3-7363-453c-9902-c562bab626ba
Jan 20 10:14:22 np0005588919 systemd-udevd[302896]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:14:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:22Z|00817|binding|INFO|Setting lport 9de5453d-b548-429c-8fc2-7b012cb8ebdf ovn-installed in OVS
Jan 20 10:14:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:22Z|00818|binding|INFO|Setting lport 9de5453d-b548-429c-8fc2-7b012cb8ebdf up in Southbound
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.988 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:22 np0005588919 nova_compute[225855]: 2026-01-20 15:14:22.990 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:22.987 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba0a4ae-117a-4940-aa10-6e15a30c274d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:22.991 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd07527d3-71 in ovnmeta-d07527d3-7363-453c-9902-c562bab626ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 20 10:14:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:22.993 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd07527d3-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 20 10:14:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:22.993 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4be3eb9a-31e4-4701-9800-c056157a30f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:22.995 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2d8bf542-035a-45b9-a6cb-2aca81b5a439]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 NetworkManager[49104]: <info>  [1768922063.0005] device (tap9de5453d-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:14:23 np0005588919 NetworkManager[49104]: <info>  [1768922063.0012] device (tap9de5453d-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:14:23 np0005588919 systemd-machined[194361]: New machine qemu-97-instance-000000b7.
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.009 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[b3578506-587d-491b-adcd-d90e8514071f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 systemd[1]: Started Virtual Machine qemu-97-instance-000000b7.
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.034 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a79f83fb-bb32-4f45-a772-420547a81fe7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.063 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[621d68da-4b89-4216-901a-b42184ff9f65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 NetworkManager[49104]: <info>  [1768922063.0685] manager: (tapd07527d3-70): new Veth device (/org/freedesktop/NetworkManager/Devices/340)
Jan 20 10:14:23 np0005588919 systemd-udevd[302900]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.069 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2d8dd71e-e502-414a-9204-25f217da1890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.097 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[bd03d32e-3b00-44c9-9e11-81634d633973]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.100 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8f166daf-3e31-4aad-a944-9462fd663beb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 NetworkManager[49104]: <info>  [1768922063.1232] device (tapd07527d3-70): carrier: link connected
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.128 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[13ef60ac-615e-4436-bac4-ee8986b5af93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.143 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a46163fb-474f-422b-bbde-e41b415f8985]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd07527d3-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:33:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710205, 'reachable_time': 44044, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302931, 'error': None, 'target': 'ovnmeta-d07527d3-7363-453c-9902-c562bab626ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.157 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f9dca58b-438c-4316-bd19-b08faf9e464c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:33a8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710205, 'tstamp': 710205}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302932, 'error': None, 'target': 'ovnmeta-d07527d3-7363-453c-9902-c562bab626ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.173 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6eecd527-66e0-47de-ab90-3a831be619d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd07527d3-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:33:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710205, 'reachable_time': 44044, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 302933, 'error': None, 'target': 'ovnmeta-d07527d3-7363-453c-9902-c562bab626ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.200 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e801fbd1-7e7f-47e1-b0c0-4e3ed6a6c9ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:23.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.260 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4ce71dea-1e4a-41a5-ab05-e7e8f755b70c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.262 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd07527d3-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.262 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.263 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd07527d3-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.264 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:23 np0005588919 NetworkManager[49104]: <info>  [1768922063.2652] manager: (tapd07527d3-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Jan 20 10:14:23 np0005588919 kernel: tapd07527d3-70: entered promiscuous mode
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.268 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd07527d3-70, col_values=(('external_ids', {'iface-id': '311d5bf2-0b44-4ce1-9ec1-e7458d5df232'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.269 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:23 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:23Z|00819|binding|INFO|Releasing lport 311d5bf2-0b44-4ce1-9ec1-e7458d5df232 from this chassis (sb_readonly=0)
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.269 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.270 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d07527d3-7363-453c-9902-c562bab626ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d07527d3-7363-453c-9902-c562bab626ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.271 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[23d87075-2e18-4799-ab87-b71a39919ffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.271 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-d07527d3-7363-453c-9902-c562bab626ba
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/d07527d3-7363-453c-9902-c562bab626ba.pid.haproxy
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID d07527d3-7363-453c-9902-c562bab626ba
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 20 10:14:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:23.272 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d07527d3-7363-453c-9902-c562bab626ba', 'env', 'PROCESS_TAG=haproxy-d07527d3-7363-453c-9902-c562bab626ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d07527d3-7363-453c-9902-c562bab626ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.282 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.421 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922063.4213808, 128af7d9-155f-468d-9873-98c816f0df9e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.422 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] VM Started (Lifecycle Event)
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.451 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.455 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922063.4215171, 128af7d9-155f-468d-9873-98c816f0df9e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.455 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] VM Paused (Lifecycle Event)
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.475 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.478 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.504 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 20 10:14:23 np0005588919 podman[303007]: 2026-01-20 15:14:23.586530574 +0000 UTC m=+0.022832299 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.687 225859 DEBUG nova.compute.manager [req-0bb21d69-33bb-4488-b61e-e71fda433142 req-af923cfc-861a-47cf-8b85-26ef2f466681 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.687 225859 DEBUG oslo_concurrency.lockutils [req-0bb21d69-33bb-4488-b61e-e71fda433142 req-af923cfc-861a-47cf-8b85-26ef2f466681 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.687 225859 DEBUG oslo_concurrency.lockutils [req-0bb21d69-33bb-4488-b61e-e71fda433142 req-af923cfc-861a-47cf-8b85-26ef2f466681 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.687 225859 DEBUG oslo_concurrency.lockutils [req-0bb21d69-33bb-4488-b61e-e71fda433142 req-af923cfc-861a-47cf-8b85-26ef2f466681 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.688 225859 DEBUG nova.compute.manager [req-0bb21d69-33bb-4488-b61e-e71fda433142 req-af923cfc-861a-47cf-8b85-26ef2f466681 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Processing event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.688 225859 DEBUG nova.compute.manager [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.691 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922063.6912546, 128af7d9-155f-468d-9873-98c816f0df9e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.691 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] VM Resumed (Lifecycle Event)
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.695 225859 INFO nova.virt.libvirt.driver [-] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Instance running successfully.
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.695 225859 DEBUG nova.virt.libvirt.driver [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.733 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.737 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:14:23 np0005588919 podman[303007]: 2026-01-20 15:14:23.765035419 +0000 UTC m=+0.201337124 container create b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.777 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 20 10:14:23 np0005588919 nova_compute[225855]: 2026-01-20 15:14:23.801 225859 INFO nova.compute.manager [None req-ee866e29-0f1e-4cbf-b426-74bd65a7b594 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance to original state: 'active'
Jan 20 10:14:23 np0005588919 systemd[1]: Started libpod-conmon-b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142.scope.
Jan 20 10:14:23 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:14:23 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a54fb807962a544aa2982518e645db6ec417e79968889ef69caab0fed7c38d3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:14:23 np0005588919 podman[303007]: 2026-01-20 15:14:23.861483306 +0000 UTC m=+0.297785021 container init b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 20 10:14:23 np0005588919 podman[303007]: 2026-01-20 15:14:23.866905419 +0000 UTC m=+0.303207124 container start b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 20 10:14:23 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[303024]: [NOTICE]   (303028) : New worker (303030) forked
Jan 20 10:14:23 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[303024]: [NOTICE]   (303028) : Loading success.
Jan 20 10:14:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:24.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:25.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:25 np0005588919 nova_compute[225855]: 2026-01-20 15:14:25.361 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:14:25 np0005588919 nova_compute[225855]: 2026-01-20 15:14:25.361 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 20 10:14:25 np0005588919 nova_compute[225855]: 2026-01-20 15:14:25.379 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 20 10:14:25 np0005588919 nova_compute[225855]: 2026-01-20 15:14:25.795 225859 DEBUG nova.compute.manager [req-3f502ff3-ee76-489f-a057-779cfb8e42b0 req-ec796712-68a0-4732-85d5-dd79241a8bf1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:14:25 np0005588919 nova_compute[225855]: 2026-01-20 15:14:25.795 225859 DEBUG oslo_concurrency.lockutils [req-3f502ff3-ee76-489f-a057-779cfb8e42b0 req-ec796712-68a0-4732-85d5-dd79241a8bf1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:14:25 np0005588919 nova_compute[225855]: 2026-01-20 15:14:25.795 225859 DEBUG oslo_concurrency.lockutils [req-3f502ff3-ee76-489f-a057-779cfb8e42b0 req-ec796712-68a0-4732-85d5-dd79241a8bf1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:14:25 np0005588919 nova_compute[225855]: 2026-01-20 15:14:25.796 225859 DEBUG oslo_concurrency.lockutils [req-3f502ff3-ee76-489f-a057-779cfb8e42b0 req-ec796712-68a0-4732-85d5-dd79241a8bf1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:14:25 np0005588919 nova_compute[225855]: 2026-01-20 15:14:25.796 225859 DEBUG nova.compute.manager [req-3f502ff3-ee76-489f-a057-779cfb8e42b0 req-ec796712-68a0-4732-85d5-dd79241a8bf1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] No waiting events found dispatching network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:14:25 np0005588919 nova_compute[225855]: 2026-01-20 15:14:25.796 225859 WARNING nova.compute.manager [req-3f502ff3-ee76-489f-a057-779cfb8e42b0 req-ec796712-68a0-4732-85d5-dd79241a8bf1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received unexpected event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf for instance with vm_state active and task_state None.
Jan 20 10:14:26 np0005588919 podman[303040]: 2026-01-20 15:14:26.003653116 +0000 UTC m=+0.045250855 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:14:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e400 e400: 3 total, 3 up, 3 in
Jan 20 10:14:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:26.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:27.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:27 np0005588919 nova_compute[225855]: 2026-01-20 15:14:27.869 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:27 np0005588919 nova_compute[225855]: 2026-01-20 15:14:27.896 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:28.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:14:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:29.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:14:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:30.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:31.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 e401: 3 total, 3 up, 3 in
Jan 20 10:14:32 np0005588919 nova_compute[225855]: 2026-01-20 15:14:32.872 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:32 np0005588919 nova_compute[225855]: 2026-01-20 15:14:32.898 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:32.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:33.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:34.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:35.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:36.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:14:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:37.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:14:37 np0005588919 nova_compute[225855]: 2026-01-20 15:14:37.754 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "185fbaf7-4372-4e7c-b053-df9c4022514f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:14:37 np0005588919 nova_compute[225855]: 2026-01-20 15:14:37.754 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:14:37 np0005588919 nova_compute[225855]: 2026-01-20 15:14:37.773 225859 DEBUG nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 10:14:37 np0005588919 nova_compute[225855]: 2026-01-20 15:14:37.839 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:14:37 np0005588919 nova_compute[225855]: 2026-01-20 15:14:37.840 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:14:37 np0005588919 nova_compute[225855]: 2026-01-20 15:14:37.846 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:14:37 np0005588919 nova_compute[225855]: 2026-01-20 15:14:37.846 225859 INFO nova.compute.claims [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Claim successful on node compute-1.ctlplane.example.com
Jan 20 10:14:37 np0005588919 nova_compute[225855]: 2026-01-20 15:14:37.875 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:37 np0005588919 nova_compute[225855]: 2026-01-20 15:14:37.898 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:37 np0005588919 nova_compute[225855]: 2026-01-20 15:14:37.987 225859 DEBUG oslo_concurrency.processutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:14:38 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:38Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a8:1d:e9 10.100.0.4
Jan 20 10:14:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:14:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3112583804' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.512 225859 DEBUG oslo_concurrency.processutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.519 225859 DEBUG nova.compute.provider_tree [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.535 225859 DEBUG nova.scheduler.client.report [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.563 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.564 225859 DEBUG nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.622 225859 DEBUG nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.623 225859 DEBUG nova.network.neutron [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.644 225859 INFO nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.671 225859 DEBUG nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.731 225859 INFO nova.virt.block_device [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Booting with volume 4994c109-f7d8-4642-bf6a-2f796e3851ba at /dev/vda
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.911 225859 DEBUG os_brick.utils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.912 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.926 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.926 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[42f04a66-1c2c-4bdc-830f-62e978504540]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.928 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.937 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.937 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[03d58a46-de6b-478a-bb5d-ed2120511c31]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.940 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:14:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:38.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.951 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.952 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca4cba2-a11d-4d67-8d94-d6c96858c8b3]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.953 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[bdce2a29-abaa-4c98-9bcd-eb5bdbd4f81f]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.954 225859 DEBUG oslo_concurrency.processutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.989 225859 DEBUG oslo_concurrency.processutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "nvme version" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.992 225859 DEBUG os_brick.initiator.connectors.lightos [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.992 225859 DEBUG os_brick.initiator.connectors.lightos [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.992 225859 DEBUG os_brick.initiator.connectors.lightos [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.992 225859 DEBUG os_brick.utils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] <== get_connector_properties: return (81ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:14:38 np0005588919 nova_compute[225855]: 2026-01-20 15:14:38.993 225859 DEBUG nova.virt.block_device [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updating existing volume attachment record: b71943fe-2a88-4bcf-8d86-3af9f4ede56c _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:14:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:39.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:39 np0005588919 nova_compute[225855]: 2026-01-20 15:14:39.488 225859 DEBUG nova.policy [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf422e55e158420cbdae75f07a3bb97a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a49638950e1543fa8e0d251af5479623', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:14:40 np0005588919 nova_compute[225855]: 2026-01-20 15:14:40.014 225859 DEBUG nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:14:40 np0005588919 nova_compute[225855]: 2026-01-20 15:14:40.016 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:14:40 np0005588919 nova_compute[225855]: 2026-01-20 15:14:40.017 225859 INFO nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Creating image(s)#033[00m
Jan 20 10:14:40 np0005588919 nova_compute[225855]: 2026-01-20 15:14:40.017 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:14:40 np0005588919 nova_compute[225855]: 2026-01-20 15:14:40.018 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Ensure instance console log exists: /var/lib/nova/instances/185fbaf7-4372-4e7c-b053-df9c4022514f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:14:40 np0005588919 nova_compute[225855]: 2026-01-20 15:14:40.019 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:40 np0005588919 nova_compute[225855]: 2026-01-20 15:14:40.019 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:40 np0005588919 nova_compute[225855]: 2026-01-20 15:14:40.020 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:40 np0005588919 nova_compute[225855]: 2026-01-20 15:14:40.236 225859 DEBUG nova.network.neutron [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Successfully created port: 2cfaf09f-1f9e-489f-b7d3-43166c005796 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:14:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:40.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:41.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:41 np0005588919 nova_compute[225855]: 2026-01-20 15:14:41.386 225859 DEBUG nova.network.neutron [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Successfully updated port: 2cfaf09f-1f9e-489f-b7d3-43166c005796 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:14:41 np0005588919 nova_compute[225855]: 2026-01-20 15:14:41.411 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:41 np0005588919 nova_compute[225855]: 2026-01-20 15:14:41.411 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquired lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:41 np0005588919 nova_compute[225855]: 2026-01-20 15:14:41.411 225859 DEBUG nova.network.neutron [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:14:41 np0005588919 nova_compute[225855]: 2026-01-20 15:14:41.477 225859 DEBUG nova.compute.manager [req-63d90287-066a-4018-8e15-22f333d63078 req-85c42175-22fc-44e7-a425-2b7eb594f8fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received event network-changed-2cfaf09f-1f9e-489f-b7d3-43166c005796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:41 np0005588919 nova_compute[225855]: 2026-01-20 15:14:41.478 225859 DEBUG nova.compute.manager [req-63d90287-066a-4018-8e15-22f333d63078 req-85c42175-22fc-44e7-a425-2b7eb594f8fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Refreshing instance network info cache due to event network-changed-2cfaf09f-1f9e-489f-b7d3-43166c005796. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:14:41 np0005588919 nova_compute[225855]: 2026-01-20 15:14:41.478 225859 DEBUG oslo_concurrency.lockutils [req-63d90287-066a-4018-8e15-22f333d63078 req-85c42175-22fc-44e7-a425-2b7eb594f8fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:41 np0005588919 nova_compute[225855]: 2026-01-20 15:14:41.548 225859 DEBUG nova.network.neutron [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:14:42 np0005588919 podman[303147]: 2026-01-20 15:14:42.03844322 +0000 UTC m=+0.085710073 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.567 225859 DEBUG nova.network.neutron [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updating instance_info_cache with network_info: [{"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.588 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Releasing lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.589 225859 DEBUG nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Instance network_info: |[{"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.589 225859 DEBUG oslo_concurrency.lockutils [req-63d90287-066a-4018-8e15-22f333d63078 req-85c42175-22fc-44e7-a425-2b7eb594f8fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.590 225859 DEBUG nova.network.neutron [req-63d90287-066a-4018-8e15-22f333d63078 req-85c42175-22fc-44e7-a425-2b7eb594f8fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Refreshing network info cache for port 2cfaf09f-1f9e-489f-b7d3-43166c005796 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.594 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Start _get_guest_xml network_info=[{"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-4994c109-f7d8-4642-bf6a-2f796e3851ba', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '4994c109-f7d8-4642-bf6a-2f796e3851ba', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '185fbaf7-4372-4e7c-b053-df9c4022514f', 'attached_at': '', 'detached_at': '', 'volume_id': '4994c109-f7d8-4642-bf6a-2f796e3851ba', 'serial': '4994c109-f7d8-4642-bf6a-2f796e3851ba'}, 'guest_format': None, 'boot_index': 0, 'mount_device': '/dev/vda', 'attachment_id': 'b71943fe-2a88-4bcf-8d86-3af9f4ede56c', 'disk_bus': 'virtio', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.599 225859 WARNING nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.605 225859 DEBUG nova.virt.libvirt.host [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.606 225859 DEBUG nova.virt.libvirt.host [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.615 225859 DEBUG nova.virt.libvirt.host [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.616 225859 DEBUG nova.virt.libvirt.host [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.617 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.617 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.618 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.618 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.618 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.618 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.619 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.619 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.619 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.619 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.619 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.620 225859 DEBUG nova.virt.hardware [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.879 225859 DEBUG nova.storage.rbd_utils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 185fbaf7-4372-4e7c-b053-df9c4022514f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.883 225859 DEBUG oslo_concurrency.processutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.907 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:42 np0005588919 nova_compute[225855]: 2026-01-20 15:14:42.909 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:14:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:42.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:43.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:14:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2934712307' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.314 225859 DEBUG oslo_concurrency.processutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.341 225859 DEBUG nova.virt.libvirt.vif [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:14:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1033173342',display_name='tempest-TestVolumeBootPattern-server-1033173342',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1033173342',id=188,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBjkv3PM31l7/LOeidHCDov4vvdGwqOT15IVVbWearXBCn3jQz2xB6ix8iz1XP+iiPXyhWuw0LpMPT9jQN2b0mvhqeZTHErGcz1VZLskRcT6iqcekmFxWykFxr44bv68XA==',key_name='tempest-TestVolumeBootPattern-474773317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-pz9crh7i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:14:38Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=185fbaf7-4372-4e7c-b053-df9c4022514f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.341 225859 DEBUG nova.network.os_vif_util [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.342 225859 DEBUG nova.network.os_vif_util [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=2cfaf09f-1f9e-489f-b7d3-43166c005796,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cfaf09f-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.344 225859 DEBUG nova.objects.instance [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lazy-loading 'pci_devices' on Instance uuid 185fbaf7-4372-4e7c-b053-df9c4022514f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.362 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <uuid>185fbaf7-4372-4e7c-b053-df9c4022514f</uuid>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <name>instance-000000bc</name>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestVolumeBootPattern-server-1033173342</nova:name>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:14:42</nova:creationTime>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <nova:user uuid="bf422e55e158420cbdae75f07a3bb97a">tempest-TestVolumeBootPattern-194644003-project-member</nova:user>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <nova:project uuid="a49638950e1543fa8e0d251af5479623">tempest-TestVolumeBootPattern-194644003</nova:project>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <nova:port uuid="2cfaf09f-1f9e-489f-b7d3-43166c005796">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <entry name="serial">185fbaf7-4372-4e7c-b053-df9c4022514f</entry>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <entry name="uuid">185fbaf7-4372-4e7c-b053-df9c4022514f</entry>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/185fbaf7-4372-4e7c-b053-df9c4022514f_disk.config">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-4994c109-f7d8-4642-bf6a-2f796e3851ba">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <serial>4994c109-f7d8-4642-bf6a-2f796e3851ba</serial>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:6b:b0:3d"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <target dev="tap2cfaf09f-1f"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/185fbaf7-4372-4e7c-b053-df9c4022514f/console.log" append="off"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:14:43 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:14:43 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:14:43 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:14:43 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.364 225859 DEBUG nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Preparing to wait for external event network-vif-plugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.364 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.364 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.365 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.366 225859 DEBUG nova.virt.libvirt.vif [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:14:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1033173342',display_name='tempest-TestVolumeBootPattern-server-1033173342',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1033173342',id=188,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBjkv3PM31l7/LOeidHCDov4vvdGwqOT15IVVbWearXBCn3jQz2xB6ix8iz1XP+iiPXyhWuw0LpMPT9jQN2b0mvhqeZTHErGcz1VZLskRcT6iqcekmFxWykFxr44bv68XA==',key_name='tempest-TestVolumeBootPattern-474773317',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-pz9crh7i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:14:38Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=185fbaf7-4372-4e7c-b053-df9c4022514f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.366 225859 DEBUG nova.network.os_vif_util [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.367 225859 DEBUG nova.network.os_vif_util [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=2cfaf09f-1f9e-489f-b7d3-43166c005796,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cfaf09f-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.367 225859 DEBUG os_vif [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=2cfaf09f-1f9e-489f-b7d3-43166c005796,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cfaf09f-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.367 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.368 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.368 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.371 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.371 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2cfaf09f-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.371 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2cfaf09f-1f, col_values=(('external_ids', {'iface-id': '2cfaf09f-1f9e-489f-b7d3-43166c005796', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:b0:3d', 'vm-uuid': '185fbaf7-4372-4e7c-b053-df9c4022514f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.373 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:43 np0005588919 NetworkManager[49104]: <info>  [1768922083.3741] manager: (tap2cfaf09f-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.375 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.379 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.380 225859 INFO os_vif [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=2cfaf09f-1f9e-489f-b7d3-43166c005796,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cfaf09f-1f')#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.455 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.455 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.455 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] No VIF found with MAC fa:16:3e:6b:b0:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.456 225859 INFO nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Using config drive#033[00m
Jan 20 10:14:43 np0005588919 nova_compute[225855]: 2026-01-20 15:14:43.480 225859 DEBUG nova.storage.rbd_utils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 185fbaf7-4372-4e7c-b053-df9c4022514f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:43 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:14:43 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:14:43 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.153 225859 INFO nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Creating config drive at /var/lib/nova/instances/185fbaf7-4372-4e7c-b053-df9c4022514f/disk.config#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.158 225859 DEBUG oslo_concurrency.processutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/185fbaf7-4372-4e7c-b053-df9c4022514f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpznc8anpu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.294 225859 DEBUG oslo_concurrency.processutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/185fbaf7-4372-4e7c-b053-df9c4022514f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpznc8anpu" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.323 225859 DEBUG nova.storage.rbd_utils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] rbd image 185fbaf7-4372-4e7c-b053-df9c4022514f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.326 225859 DEBUG oslo_concurrency.processutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/185fbaf7-4372-4e7c-b053-df9c4022514f/disk.config 185fbaf7-4372-4e7c-b053-df9c4022514f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.487 225859 DEBUG oslo_concurrency.processutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/185fbaf7-4372-4e7c-b053-df9c4022514f/disk.config 185fbaf7-4372-4e7c-b053-df9c4022514f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.488 225859 INFO nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Deleting local config drive /var/lib/nova/instances/185fbaf7-4372-4e7c-b053-df9c4022514f/disk.config because it was imported into RBD.#033[00m
Jan 20 10:14:44 np0005588919 kernel: tap2cfaf09f-1f: entered promiscuous mode
Jan 20 10:14:44 np0005588919 NetworkManager[49104]: <info>  [1768922084.5334] manager: (tap2cfaf09f-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/343)
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.534 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:44Z|00820|binding|INFO|Claiming lport 2cfaf09f-1f9e-489f-b7d3-43166c005796 for this chassis.
Jan 20 10:14:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:44Z|00821|binding|INFO|2cfaf09f-1f9e-489f-b7d3-43166c005796: Claiming fa:16:3e:6b:b0:3d 10.100.0.3
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.543 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:b0:3d 10.100.0.3'], port_security=['fa:16:3e:6b:b0:3d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '185fbaf7-4372-4e7c-b053-df9c4022514f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a49638950e1543fa8e0d251af5479623', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c29da5ec-6cb2-4047-ba89-70fa67a96476', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ec1139-009f-49fe-bfde-07c0ef9e8b12, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=2cfaf09f-1f9e-489f-b7d3-43166c005796) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.544 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 2cfaf09f-1f9e-489f-b7d3-43166c005796 in datapath b677f1a9-dbaa-4373-8466-bd9ccf067b91 bound to our chassis#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.545 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b677f1a9-dbaa-4373-8466-bd9ccf067b91#033[00m
Jan 20 10:14:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:44Z|00822|binding|INFO|Setting lport 2cfaf09f-1f9e-489f-b7d3-43166c005796 ovn-installed in OVS
Jan 20 10:14:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:44Z|00823|binding|INFO|Setting lport 2cfaf09f-1f9e-489f-b7d3-43166c005796 up in Southbound
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.550 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.553 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.555 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[332f0062-fa47-4da9-a50f-5c588acccd04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.556 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb677f1a9-d1 in ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.558 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb677f1a9-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.558 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b99b4ee8-d743-4d7a-84b3-af594a981f6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.558 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c3d6c7-218b-4057-8d73-4d8141573af1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 systemd-udevd[303418]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:14:44 np0005588919 systemd-machined[194361]: New machine qemu-98-instance-000000bc.
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.569 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[8a6a9d6c-f6b5-4295-9cd8-72d8c8e2890b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 NetworkManager[49104]: <info>  [1768922084.5775] device (tap2cfaf09f-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:14:44 np0005588919 NetworkManager[49104]: <info>  [1768922084.5781] device (tap2cfaf09f-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:14:44 np0005588919 systemd[1]: Started Virtual Machine qemu-98-instance-000000bc.
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.594 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2255af-eee8-40a5-a661-a3b6da2e4730]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.618 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[658aa737-f868-470a-945b-f8326ca78a74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 systemd-udevd[303421]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:14:44 np0005588919 NetworkManager[49104]: <info>  [1768922084.6240] manager: (tapb677f1a9-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/344)
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.624 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a809a2ea-a696-47e5-a53e-6ec8e28c5a70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.627 225859 INFO nova.compute.manager [None req-983ad604-747f-4a92-9aae-4e849d5f72a5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Get console output#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.637 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.653 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[27588c71-c4f7-47b6-ab63-1abfe88bbee3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.655 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5a6310-4607-4a26-8914-42def5f87971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 NetworkManager[49104]: <info>  [1768922084.6769] device (tapb677f1a9-d0): carrier: link connected
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.684 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[fcb25818-0689-40aa-8297-04d38ea65062]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.701 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f1da6dc7-fb23-49d4-a447-108c2eea9ce8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb677f1a9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:c8:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 712360, 'reachable_time': 20858, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303449, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.717 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc33ea8-d16c-4f52-a40b-2057843d50e0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:c834'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 712360, 'tstamp': 712360}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 303450, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.736 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[20db29bd-417e-4989-a1fc-eabe04c2188b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb677f1a9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:c8:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 712360, 'reachable_time': 20858, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 303451, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.764 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ee7b70a8-afc1-432b-9e0d-9cfa4b8f3e82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.822 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[45494294-a7fc-41b1-8f84-cf24f069232d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.823 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb677f1a9-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.824 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.824 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb677f1a9-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.826 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:44 np0005588919 kernel: tapb677f1a9-d0: entered promiscuous mode
Jan 20 10:14:44 np0005588919 NetworkManager[49104]: <info>  [1768922084.8270] manager: (tapb677f1a9-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/345)
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.830 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.831 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb677f1a9-d0, col_values=(('external_ids', {'iface-id': '1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.832 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:44Z|00824|binding|INFO|Releasing lport 1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65 from this chassis (sb_readonly=0)
Jan 20 10:14:44 np0005588919 nova_compute[225855]: 2026-01-20 15:14:44.846 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.847 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.849 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[39b99b1e-2203-4641-bf55-cfb0710d0f78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.850 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-b677f1a9-dbaa-4373-8466-bd9ccf067b91
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/b677f1a9-dbaa-4373-8466-bd9ccf067b91.pid.haproxy
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID b677f1a9-dbaa-4373-8466-bd9ccf067b91
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 20 10:14:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:44.851 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'env', 'PROCESS_TAG=haproxy-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b677f1a9-dbaa-4373-8466-bd9ccf067b91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 20 10:14:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:44.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.095 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922085.0951176, 185fbaf7-4372-4e7c-b053-df9c4022514f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.096 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] VM Started (Lifecycle Event)
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.118 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.122 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922085.095241, 185fbaf7-4372-4e7c-b053-df9c4022514f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.122 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] VM Paused (Lifecycle Event)
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.139 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.142 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.157 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 10:14:45 np0005588919 podman[303523]: 2026-01-20 15:14:45.20338617 +0000 UTC m=+0.050862804 container create f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:14:45 np0005588919 systemd[1]: Started libpod-conmon-f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9.scope.
Jan 20 10:14:45 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:14:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:45 np0005588919 podman[303523]: 2026-01-20 15:14:45.17554265 +0000 UTC m=+0.023019304 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:14:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:45.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:45 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67b20a5a2fd7710889925edaaefc37d2cd36bbde3ed5e1de89395074d35e2d88/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:14:45 np0005588919 podman[303523]: 2026-01-20 15:14:45.281909178 +0000 UTC m=+0.129385832 container init f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:14:45 np0005588919 podman[303523]: 2026-01-20 15:14:45.289499323 +0000 UTC m=+0.136975947 container start f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 10:14:45 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[303538]: [NOTICE]   (303542) : New worker (303544) forked
Jan 20 10:14:45 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[303538]: [NOTICE]   (303542) : Loading success.
Jan 20 10:14:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:45.717 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.718 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:45.719 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 10:14:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:45.720 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.802 225859 DEBUG nova.network.neutron [req-63d90287-066a-4018-8e15-22f333d63078 req-85c42175-22fc-44e7-a425-2b7eb594f8fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updated VIF entry in instance network info cache for port 2cfaf09f-1f9e-489f-b7d3-43166c005796. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.803 225859 DEBUG nova.network.neutron [req-63d90287-066a-4018-8e15-22f333d63078 req-85c42175-22fc-44e7-a425-2b7eb594f8fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updating instance_info_cache with network_info: [{"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:14:45 np0005588919 nova_compute[225855]: 2026-01-20 15:14:45.817 225859 DEBUG oslo_concurrency.lockutils [req-63d90287-066a-4018-8e15-22f333d63078 req-85c42175-22fc-44e7-a425-2b7eb594f8fa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.652 225859 DEBUG nova.compute.manager [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received event network-vif-plugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.654 225859 DEBUG oslo_concurrency.lockutils [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.654 225859 DEBUG oslo_concurrency.lockutils [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.655 225859 DEBUG oslo_concurrency.lockutils [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.655 225859 DEBUG nova.compute.manager [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Processing event network-vif-plugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.656 225859 DEBUG nova.compute.manager [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received event network-vif-plugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.657 225859 DEBUG oslo_concurrency.lockutils [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.657 225859 DEBUG oslo_concurrency.lockutils [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.658 225859 DEBUG oslo_concurrency.lockutils [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.658 225859 DEBUG nova.compute.manager [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] No waiting events found dispatching network-vif-plugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.659 225859 WARNING nova.compute.manager [req-d4c28f7f-a0b8-45b8-8a81-dbcf691e9a7e req-9c1f92a3-b6ae-4f83-a076-e866f67d8b74 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received unexpected event network-vif-plugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 for instance with vm_state building and task_state spawning.
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.660 225859 DEBUG nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.665 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922086.6655588, 185fbaf7-4372-4e7c-b053-df9c4022514f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.666 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] VM Resumed (Lifecycle Event)
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.668 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.671 225859 INFO nova.virt.libvirt.driver [-] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Instance spawned successfully.
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.672 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.690 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.700 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.703 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.703 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.704 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.704 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.704 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.705 225859 DEBUG nova.virt.libvirt.driver [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.781 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.824 225859 INFO nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Took 6.81 seconds to spawn the instance on the hypervisor.
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.825 225859 DEBUG nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.881 225859 INFO nova.compute.manager [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Took 9.06 seconds to build instance.
Jan 20 10:14:46 np0005588919 nova_compute[225855]: 2026-01-20 15:14:46.897 225859 DEBUG oslo_concurrency.lockutils [None req-a239317b-94da-476d-aa13-27e5dba4e4e5 bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:14:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:46.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.180 225859 DEBUG nova.compute.manager [req-cbd2da10-50c5-41f0-b250-bac139562e5f req-0bd37af1-7b64-4bc0-a1c0-d1e0c9eeae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-changed-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.180 225859 DEBUG nova.compute.manager [req-cbd2da10-50c5-41f0-b250-bac139562e5f req-0bd37af1-7b64-4bc0-a1c0-d1e0c9eeae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Refreshing instance network info cache due to event network-changed-9de5453d-b548-429c-8fc2-7b012cb8ebdf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.181 225859 DEBUG oslo_concurrency.lockutils [req-cbd2da10-50c5-41f0-b250-bac139562e5f req-0bd37af1-7b64-4bc0-a1c0-d1e0c9eeae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.181 225859 DEBUG oslo_concurrency.lockutils [req-cbd2da10-50c5-41f0-b250-bac139562e5f req-0bd37af1-7b64-4bc0-a1c0-d1e0c9eeae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.181 225859 DEBUG nova.network.neutron [req-cbd2da10-50c5-41f0-b250-bac139562e5f req-0bd37af1-7b64-4bc0-a1c0-d1e0c9eeae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Refreshing network info cache for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.225 225859 DEBUG oslo_concurrency.lockutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.226 225859 DEBUG oslo_concurrency.lockutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.226 225859 DEBUG oslo_concurrency.lockutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.226 225859 DEBUG oslo_concurrency.lockutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.226 225859 DEBUG oslo_concurrency.lockutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.227 225859 INFO nova.compute.manager [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Terminating instance
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.228 225859 DEBUG nova.compute.manager [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 10:14:47 np0005588919 kernel: tap9de5453d-b5 (unregistering): left promiscuous mode
Jan 20 10:14:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:47.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:47 np0005588919 NetworkManager[49104]: <info>  [1768922087.2763] device (tap9de5453d-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.288 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.293 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.296 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:1d:e9 10.100.0.4'], port_security=['fa:16:3e:a8:1d:e9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '128af7d9-155f-468d-9873-98c816f0df9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d07527d3-7363-453c-9902-c562bab626ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'b4895263-5fc5-4c5a-ab8d-547f570bc095', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4d49b6-1d42-4171-8055-0d823fb37e66, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=9de5453d-b548-429c-8fc2-7b012cb8ebdf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.297 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 9de5453d-b548-429c-8fc2-7b012cb8ebdf in datapath d07527d3-7363-453c-9902-c562bab626ba unbound from our chassis
Jan 20 10:14:47 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:47Z|00825|binding|INFO|Releasing lport 9de5453d-b548-429c-8fc2-7b012cb8ebdf from this chassis (sb_readonly=0)
Jan 20 10:14:47 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:47Z|00826|binding|INFO|Setting lport 9de5453d-b548-429c-8fc2-7b012cb8ebdf down in Southbound
Jan 20 10:14:47 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:47Z|00827|binding|INFO|Removing iface tap9de5453d-b5 ovn-installed in OVS
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.298 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d07527d3-7363-453c-9902-c562bab626ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.300 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4ee2a644-f1a1-41a2-b368-c1f5690994fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.300 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d07527d3-7363-453c-9902-c562bab626ba namespace which is not needed anymore#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.345 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:47 np0005588919 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Jan 20 10:14:47 np0005588919 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000b7.scope: Consumed 13.516s CPU time.
Jan 20 10:14:47 np0005588919 systemd-machined[194361]: Machine qemu-97-instance-000000b7 terminated.
Jan 20 10:14:47 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[303024]: [NOTICE]   (303028) : haproxy version is 2.8.14-c23fe91
Jan 20 10:14:47 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[303024]: [NOTICE]   (303028) : path to executable is /usr/sbin/haproxy
Jan 20 10:14:47 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[303024]: [WARNING]  (303028) : Exiting Master process...
Jan 20 10:14:47 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[303024]: [ALERT]    (303028) : Current worker (303030) exited with code 143 (Terminated)
Jan 20 10:14:47 np0005588919 neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba[303024]: [WARNING]  (303028) : All workers exited. Exiting... (0)
Jan 20 10:14:47 np0005588919 systemd[1]: libpod-b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142.scope: Deactivated successfully.
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.449 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:47 np0005588919 podman[303575]: 2026-01-20 15:14:47.455073538 +0000 UTC m=+0.044346859 container died b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.454 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.459 225859 INFO nova.virt.libvirt.driver [-] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Instance destroyed successfully.#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.460 225859 DEBUG nova.objects.instance [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'resources' on Instance uuid 128af7d9-155f-468d-9873-98c816f0df9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.475 225859 DEBUG nova.virt.libvirt.vif [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1998945962',display_name='tempest-TestNetworkAdvancedServerOps-server-1998945962',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1998945962',id=183,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQtn9WZwWHpZ18eEtTY9zdPNbJgOayUdrvVmR1brDMxwKaiJ8tf9lOFdht6GjVy3Orpnh5Z5LatI7xEKad9rNtjFmwEczk5s4CmWp5ueE54bJ73h+pph+yq2VHvIP5rgg==',key_name='tempest-TestNetworkAdvancedServerOps-1645169738',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:14:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-mvt7thmt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:14:23Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=128af7d9-155f-468d-9873-98c816f0df9e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.476 225859 DEBUG nova.network.os_vif_util [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.233", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.478 225859 DEBUG nova.network.os_vif_util [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.479 225859 DEBUG os_vif [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.481 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:47 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142-userdata-shm.mount: Deactivated successfully.
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.481 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9de5453d-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:47 np0005588919 systemd[1]: var-lib-containers-storage-overlay-2a54fb807962a544aa2982518e645db6ec417e79968889ef69caab0fed7c38d3-merged.mount: Deactivated successfully.
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.490 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.492 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.495 225859 INFO os_vif [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:1d:e9,bridge_name='br-int',has_traffic_filtering=True,id=9de5453d-b548-429c-8fc2-7b012cb8ebdf,network=Network(d07527d3-7363-453c-9902-c562bab626ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9de5453d-b5')#033[00m
Jan 20 10:14:47 np0005588919 podman[303575]: 2026-01-20 15:14:47.50201792 +0000 UTC m=+0.091291211 container cleanup b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 20 10:14:47 np0005588919 systemd[1]: libpod-conmon-b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142.scope: Deactivated successfully.
Jan 20 10:14:47 np0005588919 podman[303633]: 2026-01-20 15:14:47.570582445 +0000 UTC m=+0.039628315 container remove b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.575 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e73dcf67-7b08-46d2-944e-7a1ac96fa862]: (4, ('Tue Jan 20 03:14:47 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba (b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142)\nb478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142\nTue Jan 20 03:14:47 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d07527d3-7363-453c-9902-c562bab626ba (b478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142)\nb478c0ef6bae9eea7c30e6f5fef3da31b6628e87efed91da6febae4d8fa3f142\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.577 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d4cc0d-ba4a-4c30-ac7d-546245c3cf14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.578 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd07527d3-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:14:47 np0005588919 kernel: tapd07527d3-70: left promiscuous mode
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.587 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[99e8ddc0-8138-4d70-ae59-12f5f6eac65a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.586 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.600 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.605 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0c678576-28d3-49e4-8c5a-02551033c470]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.606 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[93c5362c-ea2d-4748-91f6-fc742b25955f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.621 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6800e9d9-882d-4a1d-bf3d-70ad38317da7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710198, 'reachable_time': 44596, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303649, 'error': None, 'target': 'ovnmeta-d07527d3-7363-453c-9902-c562bab626ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:47 np0005588919 systemd[1]: run-netns-ovnmeta\x2dd07527d3\x2d7363\x2d453c\x2d9902\x2dc562bab626ba.mount: Deactivated successfully.
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.630 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d07527d3-7363-453c-9902-c562bab626ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:14:47 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:14:47.630 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[9ab4fb60-7896-4556-906a-432d4ca1c1cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.877 225859 INFO nova.virt.libvirt.driver [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Deleting instance files /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e_del#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.879 225859 INFO nova.virt.libvirt.driver [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Deletion of /var/lib/nova/instances/128af7d9-155f-468d-9873-98c816f0df9e_del complete#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.902 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.922 225859 DEBUG nova.compute.manager [req-1868f13c-2345-42d8-8d9a-6b0772297bc8 req-0d36d7ce-b979-476d-91d2-c122ffa55740 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-unplugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.923 225859 DEBUG oslo_concurrency.lockutils [req-1868f13c-2345-42d8-8d9a-6b0772297bc8 req-0d36d7ce-b979-476d-91d2-c122ffa55740 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.924 225859 DEBUG oslo_concurrency.lockutils [req-1868f13c-2345-42d8-8d9a-6b0772297bc8 req-0d36d7ce-b979-476d-91d2-c122ffa55740 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.924 225859 DEBUG oslo_concurrency.lockutils [req-1868f13c-2345-42d8-8d9a-6b0772297bc8 req-0d36d7ce-b979-476d-91d2-c122ffa55740 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.924 225859 DEBUG nova.compute.manager [req-1868f13c-2345-42d8-8d9a-6b0772297bc8 req-0d36d7ce-b979-476d-91d2-c122ffa55740 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] No waiting events found dispatching network-vif-unplugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.925 225859 DEBUG nova.compute.manager [req-1868f13c-2345-42d8-8d9a-6b0772297bc8 req-0d36d7ce-b979-476d-91d2-c122ffa55740 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-unplugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.936 225859 INFO nova.compute.manager [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Took 0.71 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.937 225859 DEBUG oslo.service.loopingcall [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.938 225859 DEBUG nova.compute.manager [-] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:14:47 np0005588919 nova_compute[225855]: 2026-01-20 15:14:47.938 225859 DEBUG nova.network.neutron [-] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:14:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:48.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:14:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:49.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:14:49 np0005588919 nova_compute[225855]: 2026-01-20 15:14:49.541 225859 DEBUG nova.network.neutron [req-cbd2da10-50c5-41f0-b250-bac139562e5f req-0bd37af1-7b64-4bc0-a1c0-d1e0c9eeae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updated VIF entry in instance network info cache for port 9de5453d-b548-429c-8fc2-7b012cb8ebdf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:14:49 np0005588919 nova_compute[225855]: 2026-01-20 15:14:49.541 225859 DEBUG nova.network.neutron [req-cbd2da10-50c5-41f0-b250-bac139562e5f req-0bd37af1-7b64-4bc0-a1c0-d1e0c9eeae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance_info_cache with network_info: [{"id": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "address": "fa:16:3e:a8:1d:e9", "network": {"id": "d07527d3-7363-453c-9902-c562bab626ba", "bridge": "br-int", "label": "tempest-network-smoke--1250108698", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9de5453d-b5", "ovs_interfaceid": "9de5453d-b548-429c-8fc2-7b012cb8ebdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:49 np0005588919 nova_compute[225855]: 2026-01-20 15:14:49.565 225859 DEBUG oslo_concurrency.lockutils [req-cbd2da10-50c5-41f0-b250-bac139562e5f req-0bd37af1-7b64-4bc0-a1c0-d1e0c9eeae43 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-128af7d9-155f-468d-9873-98c816f0df9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:14:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:14:49 np0005588919 nova_compute[225855]: 2026-01-20 15:14:49.696 225859 DEBUG nova.network.neutron [-] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:49 np0005588919 nova_compute[225855]: 2026-01-20 15:14:49.731 225859 INFO nova.compute.manager [-] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Took 1.79 seconds to deallocate network for instance.#033[00m
Jan 20 10:14:49 np0005588919 nova_compute[225855]: 2026-01-20 15:14:49.771 225859 DEBUG oslo_concurrency.lockutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:49 np0005588919 nova_compute[225855]: 2026-01-20 15:14:49.771 225859 DEBUG oslo_concurrency.lockutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:49 np0005588919 nova_compute[225855]: 2026-01-20 15:14:49.871 225859 DEBUG oslo_concurrency.processutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.017 225859 DEBUG nova.compute.manager [req-a4f3f775-f582-4a97-bfc3-0608095dbe7b req-75459025-193d-4d8c-8806-0c5ac05d5b5d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.018 225859 DEBUG oslo_concurrency.lockutils [req-a4f3f775-f582-4a97-bfc3-0608095dbe7b req-75459025-193d-4d8c-8806-0c5ac05d5b5d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "128af7d9-155f-468d-9873-98c816f0df9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.019 225859 DEBUG oslo_concurrency.lockutils [req-a4f3f775-f582-4a97-bfc3-0608095dbe7b req-75459025-193d-4d8c-8806-0c5ac05d5b5d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.019 225859 DEBUG oslo_concurrency.lockutils [req-a4f3f775-f582-4a97-bfc3-0608095dbe7b req-75459025-193d-4d8c-8806-0c5ac05d5b5d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.020 225859 DEBUG nova.compute.manager [req-a4f3f775-f582-4a97-bfc3-0608095dbe7b req-75459025-193d-4d8c-8806-0c5ac05d5b5d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] No waiting events found dispatching network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.020 225859 WARNING nova.compute.manager [req-a4f3f775-f582-4a97-bfc3-0608095dbe7b req-75459025-193d-4d8c-8806-0c5ac05d5b5d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received unexpected event network-vif-plugged-9de5453d-b548-429c-8fc2-7b012cb8ebdf for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.020 225859 DEBUG nova.compute.manager [req-a4f3f775-f582-4a97-bfc3-0608095dbe7b req-75459025-193d-4d8c-8806-0c5ac05d5b5d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Received event network-vif-deleted-9de5453d-b548-429c-8fc2-7b012cb8ebdf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:50 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:14:50 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1825238396' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.315 225859 DEBUG oslo_concurrency.processutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.321 225859 DEBUG nova.compute.provider_tree [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.344 225859 DEBUG nova.scheduler.client.report [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.378 225859 DEBUG oslo_concurrency.lockutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.462 225859 INFO nova.scheduler.client.report [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Deleted allocations for instance 128af7d9-155f-468d-9873-98c816f0df9e#033[00m
Jan 20 10:14:50 np0005588919 nova_compute[225855]: 2026-01-20 15:14:50.530 225859 DEBUG oslo_concurrency.lockutils [None req-0d7f3870-f13b-4e1a-8e57-f04035bf7dfc 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "128af7d9-155f-468d-9873-98c816f0df9e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:14:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:14:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:50.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:14:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:14:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:51.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:14:51 np0005588919 nova_compute[225855]: 2026-01-20 15:14:51.652 225859 DEBUG nova.compute.manager [req-78436652-f690-4cd1-8fa2-b27b3fcd05c7 req-0b0acd81-bb75-4631-bf60-3fe37544e669 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received event network-changed-2cfaf09f-1f9e-489f-b7d3-43166c005796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:14:51 np0005588919 nova_compute[225855]: 2026-01-20 15:14:51.653 225859 DEBUG nova.compute.manager [req-78436652-f690-4cd1-8fa2-b27b3fcd05c7 req-0b0acd81-bb75-4631-bf60-3fe37544e669 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Refreshing instance network info cache due to event network-changed-2cfaf09f-1f9e-489f-b7d3-43166c005796. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:14:51 np0005588919 nova_compute[225855]: 2026-01-20 15:14:51.653 225859 DEBUG oslo_concurrency.lockutils [req-78436652-f690-4cd1-8fa2-b27b3fcd05c7 req-0b0acd81-bb75-4631-bf60-3fe37544e669 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:51 np0005588919 nova_compute[225855]: 2026-01-20 15:14:51.654 225859 DEBUG oslo_concurrency.lockutils [req-78436652-f690-4cd1-8fa2-b27b3fcd05c7 req-0b0acd81-bb75-4631-bf60-3fe37544e669 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:51 np0005588919 nova_compute[225855]: 2026-01-20 15:14:51.654 225859 DEBUG nova.network.neutron [req-78436652-f690-4cd1-8fa2-b27b3fcd05c7 req-0b0acd81-bb75-4631-bf60-3fe37544e669 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Refreshing network info cache for port 2cfaf09f-1f9e-489f-b7d3-43166c005796 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:14:52 np0005588919 nova_compute[225855]: 2026-01-20 15:14:52.488 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:52 np0005588919 nova_compute[225855]: 2026-01-20 15:14:52.904 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:52.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:53.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:54 np0005588919 nova_compute[225855]: 2026-01-20 15:14:54.261 225859 DEBUG nova.network.neutron [req-78436652-f690-4cd1-8fa2-b27b3fcd05c7 req-0b0acd81-bb75-4631-bf60-3fe37544e669 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updated VIF entry in instance network info cache for port 2cfaf09f-1f9e-489f-b7d3-43166c005796. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:14:54 np0005588919 nova_compute[225855]: 2026-01-20 15:14:54.262 225859 DEBUG nova.network.neutron [req-78436652-f690-4cd1-8fa2-b27b3fcd05c7 req-0b0acd81-bb75-4631-bf60-3fe37544e669 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updating instance_info_cache with network_info: [{"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:14:54 np0005588919 nova_compute[225855]: 2026-01-20 15:14:54.294 225859 DEBUG oslo_concurrency.lockutils [req-78436652-f690-4cd1-8fa2-b27b3fcd05c7 req-0b0acd81-bb75-4631-bf60-3fe37544e669 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:14:54 np0005588919 nova_compute[225855]: 2026-01-20 15:14:54.357 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:54 np0005588919 nova_compute[225855]: 2026-01-20 15:14:54.358 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:14:54 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:54Z|00828|binding|INFO|Releasing lport 1aa285ce-a9ae-4d1e-b4b9-c72f4e0b8d65 from this chassis (sb_readonly=0)
Jan 20 10:14:54 np0005588919 nova_compute[225855]: 2026-01-20 15:14:54.524 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:54.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:55.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:56.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:57 np0005588919 podman[303732]: 2026-01-20 15:14:57.01511448 +0000 UTC m=+0.052318825 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:14:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:57.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:57 np0005588919 nova_compute[225855]: 2026-01-20 15:14:57.342 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:57 np0005588919 nova_compute[225855]: 2026-01-20 15:14:57.542 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:57 np0005588919 nova_compute[225855]: 2026-01-20 15:14:57.907 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:14:58 np0005588919 nova_compute[225855]: 2026-01-20 15:14:58.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:14:58 np0005588919 nova_compute[225855]: 2026-01-20 15:14:58.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:14:58 np0005588919 nova_compute[225855]: 2026-01-20 15:14:58.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:14:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:14:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:14:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:14:58.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:14:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:14:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:14:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:14:59.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:14:59 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:59Z|00091|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.12 does not match offer 10.100.0.3
Jan 20 10:14:59 np0005588919 ovn_controller[130490]: 2026-01-20T15:14:59Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:6b:b0:3d 10.100.0.3
Jan 20 10:14:59 np0005588919 nova_compute[225855]: 2026-01-20 15:14:59.822 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:14:59 np0005588919 nova_compute[225855]: 2026-01-20 15:14:59.823 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:14:59 np0005588919 nova_compute[225855]: 2026-01-20 15:14:59.823 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:14:59 np0005588919 nova_compute[225855]: 2026-01-20 15:14:59.823 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 185fbaf7-4372-4e7c-b053-df9c4022514f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:00.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:01.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:02 np0005588919 nova_compute[225855]: 2026-01-20 15:15:02.459 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922087.4578772, 128af7d9-155f-468d-9873-98c816f0df9e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:15:02 np0005588919 nova_compute[225855]: 2026-01-20 15:15:02.459 225859 INFO nova.compute.manager [-] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:15:02 np0005588919 nova_compute[225855]: 2026-01-20 15:15:02.545 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:02 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:02Z|00093|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.12 does not match offer 10.100.0.3
Jan 20 10:15:02 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:02Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:6b:b0:3d 10.100.0.3
Jan 20 10:15:02 np0005588919 nova_compute[225855]: 2026-01-20 15:15:02.707 225859 DEBUG nova.compute.manager [None req-96e89b3a-c676-48c2-b355-b8d154d8ef67 - - - - - -] [instance: 128af7d9-155f-468d-9873-98c816f0df9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:15:02 np0005588919 nova_compute[225855]: 2026-01-20 15:15:02.781 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updating instance_info_cache with network_info: [{"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:15:02 np0005588919 nova_compute[225855]: 2026-01-20 15:15:02.909 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:02.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 20 10:15:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:03.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 20 10:15:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:04 np0005588919 nova_compute[225855]: 2026-01-20 15:15:04.147 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:15:04 np0005588919 nova_compute[225855]: 2026-01-20 15:15:04.147 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:15:04 np0005588919 nova_compute[225855]: 2026-01-20 15:15:04.148 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:04 np0005588919 nova_compute[225855]: 2026-01-20 15:15:04.149 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:04 np0005588919 nova_compute[225855]: 2026-01-20 15:15:04.149 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:04 np0005588919 nova_compute[225855]: 2026-01-20 15:15:04.150 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:04Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6b:b0:3d 10.100.0.3
Jan 20 10:15:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:04Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6b:b0:3d 10.100.0.3
Jan 20 10:15:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:04.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:05.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:05 np0005588919 nova_compute[225855]: 2026-01-20 15:15:05.520 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:05 np0005588919 nova_compute[225855]: 2026-01-20 15:15:05.521 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:05 np0005588919 nova_compute[225855]: 2026-01-20 15:15:05.521 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:05 np0005588919 nova_compute[225855]: 2026-01-20 15:15:05.521 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:15:05 np0005588919 nova_compute[225855]: 2026-01-20 15:15:05.522 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:15:05 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3818984052' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:15:05 np0005588919 nova_compute[225855]: 2026-01-20 15:15:05.998 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.084 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000bc as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.085 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000bc as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.247 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.249 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4037MB free_disk=20.830204010009766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.250 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.250 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.472 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 185fbaf7-4372-4e7c-b053-df9c4022514f actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.473 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.473 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.517 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:06 np0005588919 ceph-osd[79119]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 20 10:15:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:15:06 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3239820126' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.954 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:06 np0005588919 nova_compute[225855]: 2026-01-20 15:15:06.960 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:15:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:06.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:07 np0005588919 nova_compute[225855]: 2026-01-20 15:15:07.009 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:15:07 np0005588919 nova_compute[225855]: 2026-01-20 15:15:07.042 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:15:07 np0005588919 nova_compute[225855]: 2026-01-20 15:15:07.043 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.793s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:07.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:07 np0005588919 nova_compute[225855]: 2026-01-20 15:15:07.549 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:07 np0005588919 nova_compute[225855]: 2026-01-20 15:15:07.911 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:15:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:08.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:15:09 np0005588919 nova_compute[225855]: 2026-01-20 15:15:09.234 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:09 np0005588919 nova_compute[225855]: 2026-01-20 15:15:09.235 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:09.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:10.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:11.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:12 np0005588919 nova_compute[225855]: 2026-01-20 15:15:12.552 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:12 np0005588919 nova_compute[225855]: 2026-01-20 15:15:12.914 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:15:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:13.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:15:13 np0005588919 podman[303855]: 2026-01-20 15:15:13.065762473 +0000 UTC m=+0.106264337 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 20 10:15:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:13.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:15:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:15.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:15:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:15.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:16.437 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:16.437 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:16.438 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:15:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:17.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:15:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:15:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:17.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:15:17 np0005588919 nova_compute[225855]: 2026-01-20 15:15:17.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:17 np0005588919 nova_compute[225855]: 2026-01-20 15:15:17.554 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:17 np0005588919 nova_compute[225855]: 2026-01-20 15:15:17.915 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:19.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:19.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:15:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:21.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:15:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:21.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:22 np0005588919 nova_compute[225855]: 2026-01-20 15:15:22.590 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:22 np0005588919 nova_compute[225855]: 2026-01-20 15:15:22.917 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:23.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:23.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:23 np0005588919 nova_compute[225855]: 2026-01-20 15:15:23.871 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:25.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:25.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:27.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:27.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.334 225859 DEBUG nova.compute.manager [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.453 225859 DEBUG oslo_concurrency.lockutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.454 225859 DEBUG oslo_concurrency.lockutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.494 225859 DEBUG nova.objects.instance [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'pci_requests' on Instance uuid f1ded131-d9a3-4e93-ad99-53ee2695d5c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.541 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.541 225859 INFO nova.compute.claims [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.542 225859 DEBUG nova.objects.instance [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'resources' on Instance uuid f1ded131-d9a3-4e93-ad99-53ee2695d5c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.546 225859 DEBUG nova.compute.manager [req-a5758f26-8c43-4950-bed0-502a1521c662 req-ad93c421-dc8a-4d87-ab2b-38a939e3c537 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received event network-changed-2cfaf09f-1f9e-489f-b7d3-43166c005796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.546 225859 DEBUG nova.compute.manager [req-a5758f26-8c43-4950-bed0-502a1521c662 req-ad93c421-dc8a-4d87-ab2b-38a939e3c537 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Refreshing instance network info cache due to event network-changed-2cfaf09f-1f9e-489f-b7d3-43166c005796. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.547 225859 DEBUG oslo_concurrency.lockutils [req-a5758f26-8c43-4950-bed0-502a1521c662 req-ad93c421-dc8a-4d87-ab2b-38a939e3c537 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.547 225859 DEBUG oslo_concurrency.lockutils [req-a5758f26-8c43-4950-bed0-502a1521c662 req-ad93c421-dc8a-4d87-ab2b-38a939e3c537 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.547 225859 DEBUG nova.network.neutron [req-a5758f26-8c43-4950-bed0-502a1521c662 req-ad93c421-dc8a-4d87-ab2b-38a939e3c537 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Refreshing network info cache for port 2cfaf09f-1f9e-489f-b7d3-43166c005796 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.562 225859 DEBUG nova.objects.instance [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'pci_devices' on Instance uuid f1ded131-d9a3-4e93-ad99-53ee2695d5c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.636 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.657 225859 INFO nova.compute.resource_tracker [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updating resource usage from migration 57f22c5f-c3c6-4f11-afbc-5b3fc1752f60#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.658 225859 DEBUG nova.compute.resource_tracker [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Starting to track incoming migration 57f22c5f-c3c6-4f11-afbc-5b3fc1752f60 with flavor 30c26a27-d918-46d8-a512-4ef3b4ce5955 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.741 225859 DEBUG oslo_concurrency.lockutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "185fbaf7-4372-4e7c-b053-df9c4022514f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.742 225859 DEBUG oslo_concurrency.lockutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.742 225859 DEBUG oslo_concurrency.lockutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.743 225859 DEBUG oslo_concurrency.lockutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.743 225859 DEBUG oslo_concurrency.lockutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.744 225859 INFO nova.compute.manager [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Terminating instance#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.745 225859 DEBUG nova.compute.manager [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.919 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:27 np0005588919 kernel: tap2cfaf09f-1f (unregistering): left promiscuous mode
Jan 20 10:15:27 np0005588919 NetworkManager[49104]: <info>  [1768922127.9345] device (tap2cfaf09f-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:15:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:27Z|00829|binding|INFO|Releasing lport 2cfaf09f-1f9e-489f-b7d3-43166c005796 from this chassis (sb_readonly=0)
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.949 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:27Z|00830|binding|INFO|Setting lport 2cfaf09f-1f9e-489f-b7d3-43166c005796 down in Southbound
Jan 20 10:15:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:27Z|00831|binding|INFO|Removing iface tap2cfaf09f-1f ovn-installed in OVS
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.951 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.969 225859 DEBUG oslo_concurrency.processutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:27 np0005588919 nova_compute[225855]: 2026-01-20 15:15:27.996 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:28 np0005588919 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Jan 20 10:15:28 np0005588919 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000bc.scope: Consumed 14.688s CPU time.
Jan 20 10:15:28 np0005588919 systemd-machined[194361]: Machine qemu-98-instance-000000bc terminated.
Jan 20 10:15:28 np0005588919 podman[303940]: 2026-01-20 15:15:28.042882936 +0000 UTC m=+0.089424528 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.053 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:b0:3d 10.100.0.3'], port_security=['fa:16:3e:6b:b0:3d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '185fbaf7-4372-4e7c-b053-df9c4022514f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a49638950e1543fa8e0d251af5479623', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c29da5ec-6cb2-4047-ba89-70fa67a96476', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76ec1139-009f-49fe-bfde-07c0ef9e8b12, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=2cfaf09f-1f9e-489f-b7d3-43166c005796) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.055 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 2cfaf09f-1f9e-489f-b7d3-43166c005796 in datapath b677f1a9-dbaa-4373-8466-bd9ccf067b91 unbound from our chassis#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.057 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b677f1a9-dbaa-4373-8466-bd9ccf067b91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.058 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2230c423-ed9f-470f-a031-461398f19813]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.059 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 namespace which is not needed anymore#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.164 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.168 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:28 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[303538]: [NOTICE]   (303542) : haproxy version is 2.8.14-c23fe91
Jan 20 10:15:28 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[303538]: [NOTICE]   (303542) : path to executable is /usr/sbin/haproxy
Jan 20 10:15:28 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[303538]: [WARNING]  (303542) : Exiting Master process...
Jan 20 10:15:28 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[303538]: [ALERT]    (303542) : Current worker (303544) exited with code 143 (Terminated)
Jan 20 10:15:28 np0005588919 neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91[303538]: [WARNING]  (303542) : All workers exited. Exiting... (0)
Jan 20 10:15:28 np0005588919 systemd[1]: libpod-f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9.scope: Deactivated successfully.
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.185 225859 INFO nova.virt.libvirt.driver [-] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Instance destroyed successfully.#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.186 225859 DEBUG nova.objects.instance [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lazy-loading 'resources' on Instance uuid 185fbaf7-4372-4e7c-b053-df9c4022514f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:28 np0005588919 conmon[303538]: conmon f51b1e2ffd0c0523ee19 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9.scope/container/memory.events
Jan 20 10:15:28 np0005588919 podman[304002]: 2026-01-20 15:15:28.192857061 +0000 UTC m=+0.052283184 container died f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:15:28 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9-userdata-shm.mount: Deactivated successfully.
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.219 225859 DEBUG nova.virt.libvirt.vif [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:14:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-1033173342',display_name='tempest-TestVolumeBootPattern-server-1033173342',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-1033173342',id=188,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBjkv3PM31l7/LOeidHCDov4vvdGwqOT15IVVbWearXBCn3jQz2xB6ix8iz1XP+iiPXyhWuw0LpMPT9jQN2b0mvhqeZTHErGcz1VZLskRcT6iqcekmFxWykFxr44bv68XA==',key_name='tempest-TestVolumeBootPattern-474773317',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:14:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a49638950e1543fa8e0d251af5479623',ramdisk_id='',reservation_id='r-pz9crh7i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-194644003',owner_user_name='tempest-TestVolumeBootPattern-194644003-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:14:46Z,user_data=None,user_id='bf422e55e158420cbdae75f07a3bb97a',uuid=185fbaf7-4372-4e7c-b053-df9c4022514f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.219 225859 DEBUG nova.network.os_vif_util [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converting VIF {"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.220 225859 DEBUG nova.network.os_vif_util [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6b:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=2cfaf09f-1f9e-489f-b7d3-43166c005796,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cfaf09f-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.221 225859 DEBUG os_vif [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6b:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=2cfaf09f-1f9e-489f-b7d3-43166c005796,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cfaf09f-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.224 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.224 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cfaf09f-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:28 np0005588919 systemd[1]: var-lib-containers-storage-overlay-67b20a5a2fd7710889925edaaefc37d2cd36bbde3ed5e1de89395074d35e2d88-merged.mount: Deactivated successfully.
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.228 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.231 225859 INFO os_vif [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6b:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=2cfaf09f-1f9e-489f-b7d3-43166c005796,network=Network(b677f1a9-dbaa-4373-8466-bd9ccf067b91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2cfaf09f-1f')#033[00m
Jan 20 10:15:28 np0005588919 podman[304002]: 2026-01-20 15:15:28.233138504 +0000 UTC m=+0.092564627 container cleanup f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:15:28 np0005588919 systemd[1]: libpod-conmon-f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9.scope: Deactivated successfully.
Jan 20 10:15:28 np0005588919 podman[304042]: 2026-01-20 15:15:28.28937295 +0000 UTC m=+0.038611177 container remove f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.295 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d2f96571-6de8-4d4d-95cc-1e3e12c633cf]: (4, ('Tue Jan 20 03:15:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 (f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9)\nf51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9\nTue Jan 20 03:15:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 (f51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9)\nf51b1e2ffd0c0523ee19615079e9e512b8c099800046323e66291635c3cc77d9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.296 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[32d993c1-a136-429d-9eae-8b57ef159865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.297 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb677f1a9-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.299 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:28 np0005588919 kernel: tapb677f1a9-d0: left promiscuous mode
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.315 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.318 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[80d8b198-b73d-41cb-a7e5-a50ec2fc94b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.341 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[da457d9e-244f-4855-8c00-8ef788afed6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.343 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cafabbb1-23c6-410c-859e-33ce23ba55fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.359 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a121d298-58b0-4158-8cde-2181705426f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 712354, 'reachable_time': 39672, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304075, 'error': None, 'target': 'ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.361 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b677f1a9-dbaa-4373-8466-bd9ccf067b91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:15:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:28.362 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[c510100a-175f-44b4-9407-b2c06a13506a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:28 np0005588919 systemd[1]: run-netns-ovnmeta\x2db677f1a9\x2ddbaa\x2d4373\x2d8466\x2dbd9ccf067b91.mount: Deactivated successfully.
Jan 20 10:15:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:15:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3564192023' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.430 225859 DEBUG oslo_concurrency.processutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.438 225859 DEBUG nova.compute.provider_tree [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.457 225859 DEBUG nova.scheduler.client.report [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.476 225859 DEBUG oslo_concurrency.lockutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.477 225859 INFO nova.compute.manager [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Migrating#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.492 225859 INFO nova.virt.libvirt.driver [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Deleting instance files /var/lib/nova/instances/185fbaf7-4372-4e7c-b053-df9c4022514f_del#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.493 225859 INFO nova.virt.libvirt.driver [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Deletion of /var/lib/nova/instances/185fbaf7-4372-4e7c-b053-df9c4022514f_del complete#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.561 225859 INFO nova.compute.manager [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Took 0.82 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.562 225859 DEBUG oslo.service.loopingcall [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.562 225859 DEBUG nova.compute.manager [-] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:15:28 np0005588919 nova_compute[225855]: 2026-01-20 15:15:28.562 225859 DEBUG nova.network.neutron [-] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:15:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:29.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:29.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:29 np0005588919 nova_compute[225855]: 2026-01-20 15:15:29.680 225859 DEBUG nova.compute.manager [req-a30b1e01-a015-4f15-b1c3-e3e7be1acc94 req-8c0c5996-d02e-4a33-ac08-ac0007df7acc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received event network-vif-unplugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:29 np0005588919 nova_compute[225855]: 2026-01-20 15:15:29.681 225859 DEBUG oslo_concurrency.lockutils [req-a30b1e01-a015-4f15-b1c3-e3e7be1acc94 req-8c0c5996-d02e-4a33-ac08-ac0007df7acc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:29 np0005588919 nova_compute[225855]: 2026-01-20 15:15:29.681 225859 DEBUG oslo_concurrency.lockutils [req-a30b1e01-a015-4f15-b1c3-e3e7be1acc94 req-8c0c5996-d02e-4a33-ac08-ac0007df7acc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:29 np0005588919 nova_compute[225855]: 2026-01-20 15:15:29.681 225859 DEBUG oslo_concurrency.lockutils [req-a30b1e01-a015-4f15-b1c3-e3e7be1acc94 req-8c0c5996-d02e-4a33-ac08-ac0007df7acc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:29 np0005588919 nova_compute[225855]: 2026-01-20 15:15:29.681 225859 DEBUG nova.compute.manager [req-a30b1e01-a015-4f15-b1c3-e3e7be1acc94 req-8c0c5996-d02e-4a33-ac08-ac0007df7acc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] No waiting events found dispatching network-vif-unplugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:15:29 np0005588919 nova_compute[225855]: 2026-01-20 15:15:29.682 225859 DEBUG nova.compute.manager [req-a30b1e01-a015-4f15-b1c3-e3e7be1acc94 req-8c0c5996-d02e-4a33-ac08-ac0007df7acc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received event network-vif-unplugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:15:30 np0005588919 nova_compute[225855]: 2026-01-20 15:15:30.461 225859 DEBUG nova.network.neutron [req-a5758f26-8c43-4950-bed0-502a1521c662 req-ad93c421-dc8a-4d87-ab2b-38a939e3c537 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updated VIF entry in instance network info cache for port 2cfaf09f-1f9e-489f-b7d3-43166c005796. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:15:30 np0005588919 nova_compute[225855]: 2026-01-20 15:15:30.462 225859 DEBUG nova.network.neutron [req-a5758f26-8c43-4950-bed0-502a1521c662 req-ad93c421-dc8a-4d87-ab2b-38a939e3c537 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updating instance_info_cache with network_info: [{"id": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "address": "fa:16:3e:6b:b0:3d", "network": {"id": "b677f1a9-dbaa-4373-8466-bd9ccf067b91", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-408170906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a49638950e1543fa8e0d251af5479623", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2cfaf09f-1f", "ovs_interfaceid": "2cfaf09f-1f9e-489f-b7d3-43166c005796", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:15:30 np0005588919 nova_compute[225855]: 2026-01-20 15:15:30.481 225859 DEBUG oslo_concurrency.lockutils [req-a5758f26-8c43-4950-bed0-502a1521c662 req-ad93c421-dc8a-4d87-ab2b-38a939e3c537 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-185fbaf7-4372-4e7c-b053-df9c4022514f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:15:30 np0005588919 nova_compute[225855]: 2026-01-20 15:15:30.896 225859 DEBUG nova.network.neutron [-] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:15:30 np0005588919 nova_compute[225855]: 2026-01-20 15:15:30.932 225859 INFO nova.compute.manager [-] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Took 2.37 seconds to deallocate network for instance.#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.019 225859 DEBUG nova.compute.manager [req-1a4ac1b9-799b-490e-99dd-8771c3668b30 req-5fa23e5a-f663-4d70-9f86-d3b073bf09d5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received event network-vif-deleted-2cfaf09f-1f9e-489f-b7d3-43166c005796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:15:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:31.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:15:31 np0005588919 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 10:15:31 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 10:15:31 np0005588919 systemd-logind[783]: New session 69 of user nova.
Jan 20 10:15:31 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 10:15:31 np0005588919 systemd[1]: Starting User Manager for UID 42436...
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.205 225859 INFO nova.compute.manager [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Took 0.27 seconds to detach 1 volumes for instance.#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.253 225859 DEBUG oslo_concurrency.lockutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.253 225859 DEBUG oslo_concurrency.lockutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:31 np0005588919 systemd[304084]: Queued start job for default target Main User Target.
Jan 20 10:15:31 np0005588919 systemd[304084]: Created slice User Application Slice.
Jan 20 10:15:31 np0005588919 systemd[304084]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 10:15:31 np0005588919 systemd[304084]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 10:15:31 np0005588919 systemd[304084]: Reached target Paths.
Jan 20 10:15:31 np0005588919 systemd[304084]: Reached target Timers.
Jan 20 10:15:31 np0005588919 systemd[304084]: Starting D-Bus User Message Bus Socket...
Jan 20 10:15:31 np0005588919 systemd[304084]: Starting Create User's Volatile Files and Directories...
Jan 20 10:15:31 np0005588919 systemd[304084]: Finished Create User's Volatile Files and Directories.
Jan 20 10:15:31 np0005588919 systemd[304084]: Listening on D-Bus User Message Bus Socket.
Jan 20 10:15:31 np0005588919 systemd[304084]: Reached target Sockets.
Jan 20 10:15:31 np0005588919 systemd[304084]: Reached target Basic System.
Jan 20 10:15:31 np0005588919 systemd[304084]: Reached target Main User Target.
Jan 20 10:15:31 np0005588919 systemd[304084]: Startup finished in 138ms.
Jan 20 10:15:31 np0005588919 systemd[1]: Started User Manager for UID 42436.
Jan 20 10:15:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:31.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:31 np0005588919 systemd[1]: Started Session 69 of User nova.
Jan 20 10:15:31 np0005588919 systemd[1]: session-69.scope: Deactivated successfully.
Jan 20 10:15:31 np0005588919 systemd-logind[783]: Session 69 logged out. Waiting for processes to exit.
Jan 20 10:15:31 np0005588919 systemd-logind[783]: Removed session 69.
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.433 225859 DEBUG oslo_concurrency.processutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:31 np0005588919 systemd-logind[783]: New session 71 of user nova.
Jan 20 10:15:31 np0005588919 systemd[1]: Started Session 71 of User nova.
Jan 20 10:15:31 np0005588919 systemd[1]: session-71.scope: Deactivated successfully.
Jan 20 10:15:31 np0005588919 systemd-logind[783]: Session 71 logged out. Waiting for processes to exit.
Jan 20 10:15:31 np0005588919 systemd-logind[783]: Removed session 71.
Jan 20 10:15:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:15:31 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1592438209' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.869 225859 DEBUG nova.compute.manager [req-920ca308-7218-4c6e-9fd8-7259f8d53d90 req-08b69dc9-7e07-40cf-8719-7a697d37c122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received event network-vif-plugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.870 225859 DEBUG oslo_concurrency.lockutils [req-920ca308-7218-4c6e-9fd8-7259f8d53d90 req-08b69dc9-7e07-40cf-8719-7a697d37c122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.870 225859 DEBUG oslo_concurrency.lockutils [req-920ca308-7218-4c6e-9fd8-7259f8d53d90 req-08b69dc9-7e07-40cf-8719-7a697d37c122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.870 225859 DEBUG oslo_concurrency.lockutils [req-920ca308-7218-4c6e-9fd8-7259f8d53d90 req-08b69dc9-7e07-40cf-8719-7a697d37c122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.871 225859 DEBUG nova.compute.manager [req-920ca308-7218-4c6e-9fd8-7259f8d53d90 req-08b69dc9-7e07-40cf-8719-7a697d37c122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] No waiting events found dispatching network-vif-plugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.871 225859 WARNING nova.compute.manager [req-920ca308-7218-4c6e-9fd8-7259f8d53d90 req-08b69dc9-7e07-40cf-8719-7a697d37c122 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Received unexpected event network-vif-plugged-2cfaf09f-1f9e-489f-b7d3-43166c005796 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.873 225859 DEBUG oslo_concurrency.processutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.879 225859 DEBUG nova.compute.provider_tree [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.922 225859 DEBUG nova.scheduler.client.report [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.943 225859 DEBUG oslo_concurrency.lockutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:31 np0005588919 nova_compute[225855]: 2026-01-20 15:15:31.966 225859 INFO nova.scheduler.client.report [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Deleted allocations for instance 185fbaf7-4372-4e7c-b053-df9c4022514f#033[00m
Jan 20 10:15:32 np0005588919 nova_compute[225855]: 2026-01-20 15:15:32.068 225859 DEBUG oslo_concurrency.lockutils [None req-22a2616d-74c4-4fc5-8de0-a4e568bbbb7a bf422e55e158420cbdae75f07a3bb97a a49638950e1543fa8e0d251af5479623 - - default default] Lock "185fbaf7-4372-4e7c-b053-df9c4022514f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:32 np0005588919 nova_compute[225855]: 2026-01-20 15:15:32.921 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:33.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:33 np0005588919 nova_compute[225855]: 2026-01-20 15:15:33.226 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:33.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:34 np0005588919 nova_compute[225855]: 2026-01-20 15:15:34.287 225859 DEBUG nova.compute.manager [req-e4c946d2-8be3-403c-961a-3a75924fbe4e req-46ac77b8-0c6a-420a-bb6c-49d1a41ef030 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-vif-unplugged-0e93d1de-671e-4e37-8e79-44bed7981254 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:34 np0005588919 nova_compute[225855]: 2026-01-20 15:15:34.288 225859 DEBUG oslo_concurrency.lockutils [req-e4c946d2-8be3-403c-961a-3a75924fbe4e req-46ac77b8-0c6a-420a-bb6c-49d1a41ef030 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:34 np0005588919 nova_compute[225855]: 2026-01-20 15:15:34.288 225859 DEBUG oslo_concurrency.lockutils [req-e4c946d2-8be3-403c-961a-3a75924fbe4e req-46ac77b8-0c6a-420a-bb6c-49d1a41ef030 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:15:34 np0005588919 nova_compute[225855]: 2026-01-20 15:15:34.288 225859 DEBUG oslo_concurrency.lockutils [req-e4c946d2-8be3-403c-961a-3a75924fbe4e req-46ac77b8-0c6a-420a-bb6c-49d1a41ef030 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:15:34 np0005588919 nova_compute[225855]: 2026-01-20 15:15:34.288 225859 DEBUG nova.compute.manager [req-e4c946d2-8be3-403c-961a-3a75924fbe4e req-46ac77b8-0c6a-420a-bb6c-49d1a41ef030 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] No waiting events found dispatching network-vif-unplugged-0e93d1de-671e-4e37-8e79-44bed7981254 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:15:34 np0005588919 nova_compute[225855]: 2026-01-20 15:15:34.288 225859 WARNING nova.compute.manager [req-e4c946d2-8be3-403c-961a-3a75924fbe4e req-46ac77b8-0c6a-420a-bb6c-49d1a41ef030 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received unexpected event network-vif-unplugged-0e93d1de-671e-4e37-8e79-44bed7981254 for instance with vm_state active and task_state resize_migrating.
Jan 20 10:15:34 np0005588919 nova_compute[225855]: 2026-01-20 15:15:34.637 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:15:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:35.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:35.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:35 np0005588919 nova_compute[225855]: 2026-01-20 15:15:35.819 225859 INFO nova.network.neutron [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updating port 0e93d1de-671e-4e37-8e79-44bed7981254 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 20 10:15:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:37.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:37.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:37 np0005588919 nova_compute[225855]: 2026-01-20 15:15:37.541 225859 DEBUG nova.compute.manager [req-1000111b-15d0-45d2-ac83-d49fd1cbbc37 req-33a57693-0eff-4d7f-8520-32926828c084 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:15:37 np0005588919 nova_compute[225855]: 2026-01-20 15:15:37.542 225859 DEBUG oslo_concurrency.lockutils [req-1000111b-15d0-45d2-ac83-d49fd1cbbc37 req-33a57693-0eff-4d7f-8520-32926828c084 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:15:37 np0005588919 nova_compute[225855]: 2026-01-20 15:15:37.542 225859 DEBUG oslo_concurrency.lockutils [req-1000111b-15d0-45d2-ac83-d49fd1cbbc37 req-33a57693-0eff-4d7f-8520-32926828c084 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:15:37 np0005588919 nova_compute[225855]: 2026-01-20 15:15:37.543 225859 DEBUG oslo_concurrency.lockutils [req-1000111b-15d0-45d2-ac83-d49fd1cbbc37 req-33a57693-0eff-4d7f-8520-32926828c084 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:15:37 np0005588919 nova_compute[225855]: 2026-01-20 15:15:37.543 225859 DEBUG nova.compute.manager [req-1000111b-15d0-45d2-ac83-d49fd1cbbc37 req-33a57693-0eff-4d7f-8520-32926828c084 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] No waiting events found dispatching network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:15:37 np0005588919 nova_compute[225855]: 2026-01-20 15:15:37.543 225859 WARNING nova.compute.manager [req-1000111b-15d0-45d2-ac83-d49fd1cbbc37 req-33a57693-0eff-4d7f-8520-32926828c084 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received unexpected event network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 for instance with vm_state active and task_state resize_migrated.
Jan 20 10:15:37 np0005588919 nova_compute[225855]: 2026-01-20 15:15:37.923 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:15:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e402 e402: 3 total, 3 up, 3 in
Jan 20 10:15:38 np0005588919 nova_compute[225855]: 2026-01-20 15:15:38.076 225859 DEBUG oslo_concurrency.lockutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:15:38 np0005588919 nova_compute[225855]: 2026-01-20 15:15:38.077 225859 DEBUG oslo_concurrency.lockutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquired lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:15:38 np0005588919 nova_compute[225855]: 2026-01-20 15:15:38.077 225859 DEBUG nova.network.neutron [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 10:15:38 np0005588919 nova_compute[225855]: 2026-01-20 15:15:38.228 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:15:38 np0005588919 nova_compute[225855]: 2026-01-20 15:15:38.296 225859 DEBUG nova.compute.manager [req-87897e1d-ecd1-4510-b9de-886f9ed8c75b req-f4ce1150-6021-408f-8ead-d1dfd55983da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-changed-0e93d1de-671e-4e37-8e79-44bed7981254 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:15:38 np0005588919 nova_compute[225855]: 2026-01-20 15:15:38.297 225859 DEBUG nova.compute.manager [req-87897e1d-ecd1-4510-b9de-886f9ed8c75b req-f4ce1150-6021-408f-8ead-d1dfd55983da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Refreshing instance network info cache due to event network-changed-0e93d1de-671e-4e37-8e79-44bed7981254. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 10:15:38 np0005588919 nova_compute[225855]: 2026-01-20 15:15:38.297 225859 DEBUG oslo_concurrency.lockutils [req-87897e1d-ecd1-4510-b9de-886f9ed8c75b req-f4ce1150-6021-408f-8ead-d1dfd55983da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:15:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:39.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:15:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:39.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:15:39 np0005588919 nova_compute[225855]: 2026-01-20 15:15:39.893 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:15:39 np0005588919 nova_compute[225855]: 2026-01-20 15:15:39.894 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:15:39 np0005588919 nova_compute[225855]: 2026-01-20 15:15:39.923 225859 DEBUG nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.021 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.022 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.031 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.031 225859 INFO nova.compute.claims [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Claim successful on node compute-1.ctlplane.example.com
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.171 225859 DEBUG nova.network.neutron [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updating instance_info_cache with network_info: [{"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.193 225859 DEBUG oslo_concurrency.lockutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Releasing lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.197 225859 DEBUG oslo_concurrency.lockutils [req-87897e1d-ecd1-4510-b9de-886f9ed8c75b req-f4ce1150-6021-408f-8ead-d1dfd55983da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.197 225859 DEBUG nova.network.neutron [req-87897e1d-ecd1-4510-b9de-886f9ed8c75b req-f4ce1150-6021-408f-8ead-d1dfd55983da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Refreshing network info cache for port 0e93d1de-671e-4e37-8e79-44bed7981254 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.263 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.293 225859 DEBUG os_brick.utils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.294 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.306 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.306 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4ddb58-46db-4a9a-b09d-13bc6c7c15e1]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.308 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.315 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.315 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[05f87087-9d58-45bc-bcb7-d528b0636940]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.317 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.324 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.324 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ea9234-5873-4e73-a1ff-c871fe1322c2]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.325 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[2dbdb80e-a84a-47ae-90f3-88020214fa9e]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.326 225859 DEBUG oslo_concurrency.processutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.353 225859 DEBUG oslo_concurrency.processutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "nvme version" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.355 225859 DEBUG os_brick.initiator.connectors.lightos [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.355 225859 DEBUG os_brick.initiator.connectors.lightos [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.356 225859 DEBUG os_brick.initiator.connectors.lightos [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.356 225859 DEBUG os_brick.utils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] <== get_connector_properties: return (61ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 20 10:15:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:15:40 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1691770121' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.701 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.708 225859 DEBUG nova.compute.provider_tree [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.730 225859 DEBUG nova.scheduler.client.report [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.755 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.756 225859 DEBUG nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.808 225859 DEBUG nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.809 225859 DEBUG nova.network.neutron [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.832 225859 INFO nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.854 225859 DEBUG nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.975 225859 DEBUG nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.976 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 10:15:40 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.976 225859 INFO nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Creating image(s)
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:40.999 225859 DEBUG nova.storage.rbd_utils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image b5656c1b-5ac7-4b93-a25d-420e1e294678_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.023 225859 DEBUG nova.storage.rbd_utils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image b5656c1b-5ac7-4b93-a25d-420e1e294678_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:15:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:15:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:41.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.047 225859 DEBUG nova.storage.rbd_utils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image b5656c1b-5ac7-4b93-a25d-420e1e294678_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.050 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.077 225859 DEBUG nova.policy [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '442a7a5cb8ea426a82be9762b262d171', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.113 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.114 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.114 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.115 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.138 225859 DEBUG nova.storage.rbd_utils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image b5656c1b-5ac7-4b93-a25d-420e1e294678_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.142 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 b5656c1b-5ac7-4b93-a25d-420e1e294678_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:15:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:41.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.410 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 b5656c1b-5ac7-4b93-a25d-420e1e294678_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.268s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.491 225859 DEBUG nova.storage.rbd_utils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] resizing rbd image b5656c1b-5ac7-4b93-a25d-420e1e294678_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.608 225859 DEBUG nova.objects.instance [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'migration_context' on Instance uuid b5656c1b-5ac7-4b93-a25d-420e1e294678 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.633 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.634 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Ensure instance console log exists: /var/lib/nova/instances/b5656c1b-5ac7-4b93-a25d-420e1e294678/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.635 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.635 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:41 np0005588919 nova_compute[225855]: 2026-01-20 15:15:41.635 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:41 np0005588919 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 10:15:41 np0005588919 systemd[304084]: Activating special unit Exit the Session...
Jan 20 10:15:41 np0005588919 systemd[304084]: Stopped target Main User Target.
Jan 20 10:15:41 np0005588919 systemd[304084]: Stopped target Basic System.
Jan 20 10:15:41 np0005588919 systemd[304084]: Stopped target Paths.
Jan 20 10:15:41 np0005588919 systemd[304084]: Stopped target Sockets.
Jan 20 10:15:41 np0005588919 systemd[304084]: Stopped target Timers.
Jan 20 10:15:41 np0005588919 systemd[304084]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 10:15:41 np0005588919 systemd[304084]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 10:15:41 np0005588919 systemd[304084]: Closed D-Bus User Message Bus Socket.
Jan 20 10:15:41 np0005588919 systemd[304084]: Stopped Create User's Volatile Files and Directories.
Jan 20 10:15:41 np0005588919 systemd[304084]: Removed slice User Application Slice.
Jan 20 10:15:41 np0005588919 systemd[304084]: Reached target Shutdown.
Jan 20 10:15:41 np0005588919 systemd[304084]: Finished Exit the Session.
Jan 20 10:15:41 np0005588919 systemd[304084]: Reached target Exit the Session.
Jan 20 10:15:41 np0005588919 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 10:15:41 np0005588919 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 10:15:41 np0005588919 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 10:15:41 np0005588919 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 10:15:41 np0005588919 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 10:15:41 np0005588919 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 10:15:41 np0005588919 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.058 225859 DEBUG nova.network.neutron [req-87897e1d-ecd1-4510-b9de-886f9ed8c75b req-f4ce1150-6021-408f-8ead-d1dfd55983da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updated VIF entry in instance network info cache for port 0e93d1de-671e-4e37-8e79-44bed7981254. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.059 225859 DEBUG nova.network.neutron [req-87897e1d-ecd1-4510-b9de-886f9ed8c75b req-f4ce1150-6021-408f-8ead-d1dfd55983da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updating instance_info_cache with network_info: [{"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.074 225859 DEBUG oslo_concurrency.lockutils [req-87897e1d-ecd1-4510-b9de-886f9ed8c75b req-f4ce1150-6021-408f-8ead-d1dfd55983da 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.273 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.274 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.275 225859 INFO nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Creating image(s)#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.303 225859 DEBUG nova.storage.rbd_utils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] creating snapshot(nova-resize) on rbd image(f1ded131-d9a3-4e93-ad99-53ee2695d5c8_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.576 225859 DEBUG nova.network.neutron [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Successfully created port: ebbe6083-de9d-43ca-9ab2-cf306ea0be4d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:15:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e403 e403: 3 total, 3 up, 3 in
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.699 225859 DEBUG nova.objects.instance [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f1ded131-d9a3-4e93-ad99-53ee2695d5c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.809 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.809 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Ensure instance console log exists: /var/lib/nova/instances/f1ded131-d9a3-4e93-ad99-53ee2695d5c8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.810 225859 DEBUG oslo_concurrency.lockutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.810 225859 DEBUG oslo_concurrency.lockutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.810 225859 DEBUG oslo_concurrency.lockutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.813 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Start _get_guest_xml network_info=[{"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "vif_mac": "fa:16:3e:99:5e:ed"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': False, 'device_type': 'disk', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-933c5c7a-f496-4bcc-b304-68156c235fe5', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '933c5c7a-f496-4bcc-b304-68156c235fe5', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attaching', 'instance': 'f1ded131-d9a3-4e93-ad99-53ee2695d5c8', 'attached_at': '2026-01-20T15:15:41.000000', 'detached_at': '', 'volume_id': '933c5c7a-f496-4bcc-b304-68156c235fe5', 'multiattach': True, 'serial': '933c5c7a-f496-4bcc-b304-68156c235fe5'}, 'guest_format': None, 'boot_index': None, 'mount_device': '/dev/vdb', 'attachment_id': '2d78d259-3019-4a60-a542-4278ca487610', 'disk_bus': 'virtio', 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.817 225859 WARNING nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.820 225859 DEBUG nova.virt.libvirt.host [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.821 225859 DEBUG nova.virt.libvirt.host [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.827 225859 DEBUG nova.virt.libvirt.host [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.829 225859 DEBUG nova.virt.libvirt.host [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.830 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.830 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.831 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.831 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.831 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.831 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.832 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.832 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.832 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.832 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.833 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.833 225859 DEBUG nova.virt.hardware [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.833 225859 DEBUG nova.objects.instance [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f1ded131-d9a3-4e93-ad99-53ee2695d5c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.864 225859 DEBUG oslo_concurrency.processutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:42 np0005588919 nova_compute[225855]: 2026-01-20 15:15:42.925 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:43.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.183 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922128.182088, 185fbaf7-4372-4e7c-b053-df9c4022514f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.183 225859 INFO nova.compute.manager [-] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.226 225859 DEBUG nova.compute.manager [None req-d3b5c919-5fc0-4576-85c1-b1c081b9ec73 - - - - - -] [instance: 185fbaf7-4372-4e7c-b053-df9c4022514f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.229 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:15:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2943588033' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.296 225859 DEBUG oslo_concurrency.processutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.324 225859 DEBUG oslo_concurrency.processutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:43.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e403 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:15:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1384480633' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.736 225859 DEBUG oslo_concurrency.processutils [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.780 225859 DEBUG nova.virt.libvirt.vif [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:14:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-0',id=187,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:14:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-h927541v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:15:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=f1ded131-d9a3-4e93-ad99-53ee2695d5c8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "vif_mac": "fa:16:3e:99:5e:ed"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.780 225859 DEBUG nova.network.os_vif_util [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "vif_mac": "fa:16:3e:99:5e:ed"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.781 225859 DEBUG nova.network.os_vif_util [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:5e:ed,bridge_name='br-int',has_traffic_filtering=True,id=0e93d1de-671e-4e37-8e79-44bed7981254,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e93d1de-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.783 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <uuid>f1ded131-d9a3-4e93-ad99-53ee2695d5c8</uuid>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <name>instance-000000bb</name>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <memory>196608</memory>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <nova:name>multiattach-server-0</nova:name>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:15:42</nova:creationTime>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.micro">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <nova:memory>192</nova:memory>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <nova:user uuid="e9cc4ce3e069479ba9c789b378a68a1d">tempest-AttachVolumeMultiAttachTest-418194625-project-member</nova:user>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <nova:project uuid="fff727019f86407498e83d7948d54962">tempest-AttachVolumeMultiAttachTest-418194625</nova:project>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <nova:port uuid="0e93d1de-671e-4e37-8e79-44bed7981254">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <entry name="serial">f1ded131-d9a3-4e93-ad99-53ee2695d5c8</entry>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <entry name="uuid">f1ded131-d9a3-4e93-ad99-53ee2695d5c8</entry>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/f1ded131-d9a3-4e93-ad99-53ee2695d5c8_disk">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/f1ded131-d9a3-4e93-ad99-53ee2695d5c8_disk.config">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="volumes/volume-933c5c7a-f496-4bcc-b304-68156c235fe5">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <target dev="vdb" bus="virtio"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <serial>933c5c7a-f496-4bcc-b304-68156c235fe5</serial>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <shareable/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:99:5e:ed"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <target dev="tap0e93d1de-67"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/f1ded131-d9a3-4e93-ad99-53ee2695d5c8/console.log" append="off"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:15:43 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:15:43 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:15:43 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:15:43 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.785 225859 DEBUG nova.virt.libvirt.vif [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:14:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-0',id=187,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:14:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-h927541v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='vir
tio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:15:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=f1ded131-d9a3-4e93-ad99-53ee2695d5c8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "vif_mac": "fa:16:3e:99:5e:ed"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.785 225859 DEBUG nova.network.os_vif_util [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "vif_mac": "fa:16:3e:99:5e:ed"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.785 225859 DEBUG nova.network.os_vif_util [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:5e:ed,bridge_name='br-int',has_traffic_filtering=True,id=0e93d1de-671e-4e37-8e79-44bed7981254,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e93d1de-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.786 225859 DEBUG os_vif [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:5e:ed,bridge_name='br-int',has_traffic_filtering=True,id=0e93d1de-671e-4e37-8e79-44bed7981254,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e93d1de-67') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.786 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.787 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.787 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.790 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.790 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e93d1de-67, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.790 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0e93d1de-67, col_values=(('external_ids', {'iface-id': '0e93d1de-671e-4e37-8e79-44bed7981254', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:5e:ed', 'vm-uuid': 'f1ded131-d9a3-4e93-ad99-53ee2695d5c8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:43 np0005588919 NetworkManager[49104]: <info>  [1768922143.7930] manager: (tap0e93d1de-67): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.792 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.794 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.798 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.799 225859 INFO os_vif [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:5e:ed,bridge_name='br-int',has_traffic_filtering=True,id=0e93d1de-671e-4e37-8e79-44bed7981254,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e93d1de-67')#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.885 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.885 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.885 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.886 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] No VIF found with MAC fa:16:3e:99:5e:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.886 225859 INFO nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Using config drive#033[00m
Jan 20 10:15:43 np0005588919 podman[304518]: 2026-01-20 15:15:43.910843835 +0000 UTC m=+0.077927732 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 20 10:15:43 np0005588919 kernel: tap0e93d1de-67: entered promiscuous mode
Jan 20 10:15:43 np0005588919 NetworkManager[49104]: <info>  [1768922143.9633] manager: (tap0e93d1de-67): new Tun device (/org/freedesktop/NetworkManager/Devices/347)
Jan 20 10:15:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:43Z|00832|binding|INFO|Claiming lport 0e93d1de-671e-4e37-8e79-44bed7981254 for this chassis.
Jan 20 10:15:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:43Z|00833|binding|INFO|0e93d1de-671e-4e37-8e79-44bed7981254: Claiming fa:16:3e:99:5e:ed 10.100.0.3
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.963 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:43.969 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:5e:ed 10.100.0.3'], port_security=['fa:16:3e:99:5e:ed 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f1ded131-d9a3-4e93-ad99-53ee2695d5c8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5ace6a2f-56c6-4679-bb81-70ccb27ab312', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=0e93d1de-671e-4e37-8e79-44bed7981254) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:15:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:43.971 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 0e93d1de-671e-4e37-8e79-44bed7981254 in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab bound to our chassis#033[00m
Jan 20 10:15:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:43.972 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab#033[00m
Jan 20 10:15:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:43Z|00834|binding|INFO|Setting lport 0e93d1de-671e-4e37-8e79-44bed7981254 ovn-installed in OVS
Jan 20 10:15:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:43Z|00835|binding|INFO|Setting lport 0e93d1de-671e-4e37-8e79-44bed7981254 up in Southbound
Jan 20 10:15:43 np0005588919 nova_compute[225855]: 2026-01-20 15:15:43.982 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:43.984 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cee19937-a8d1-4125-8148-5570c394880b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:43.985 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc1f4a971-01 in ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:15:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:43.987 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc1f4a971-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:15:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:43.987 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9272b8d5-9544-4c5c-bce2-f47c29c01b82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:43.987 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c1894f2b-b0c9-489c-99b6-68e01dc0c47a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:43 np0005588919 systemd-udevd[304576]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:15:43 np0005588919 systemd-machined[194361]: New machine qemu-99-instance-000000bb.
Jan 20 10:15:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:43.997 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[8e898592-c06b-481a-a529-ff7141035c1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 NetworkManager[49104]: <info>  [1768922144.0072] device (tap0e93d1de-67): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:15:44 np0005588919 NetworkManager[49104]: <info>  [1768922144.0077] device (tap0e93d1de-67): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:15:44 np0005588919 systemd[1]: Started Virtual Machine qemu-99-instance-000000bb.
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.010 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bc2f34a5-a482-40bf-9b34-6b5a8928adc2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.034 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[2463f21c-5c83-49f3-8b1f-dcf9eaf46e61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.040 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[95fe8f50-e685-4e84-8154-cb1842bb5ac8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 NetworkManager[49104]: <info>  [1768922144.0414] manager: (tapc1f4a971-00): new Veth device (/org/freedesktop/NetworkManager/Devices/348)
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.067 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[35dc046f-8634-4a6b-9cd0-7b2e146cc2c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.070 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[1967d06c-4fa7-4e46-82d7-a603a08a5b58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 NetworkManager[49104]: <info>  [1768922144.0906] device (tapc1f4a971-00): carrier: link connected
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.094 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c52fb6-ef6d-43a4-97d2-2382fc3c919c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.111 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ac2d5c59-72bb-43b7-ac53-8b0f4e42b034]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718301, 'reachable_time': 36622, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304608, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.127 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[08a4688f-fdc2-4c84-a25f-e7f4b8a4d7cd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:30f0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 718301, 'tstamp': 718301}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304609, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.144 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7f2875-ce1d-45a5-a655-e0d9f09fd4dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1f4a971-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:30:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718301, 'reachable_time': 36622, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304610, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.177 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb85b8e-b912-4610-8145-4a186458c8d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.239 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a149067e-08d1-4fba-99e5-364c93539071]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.241 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.241 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.241 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1f4a971-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.243 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:44 np0005588919 NetworkManager[49104]: <info>  [1768922144.2441] manager: (tapc1f4a971-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/349)
Jan 20 10:15:44 np0005588919 kernel: tapc1f4a971-00: entered promiscuous mode
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.246 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.249 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1f4a971-00, col_values=(('external_ids', {'iface-id': 'b20b0e27-0b08-4316-b6df-6784416f44c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:44Z|00836|binding|INFO|Releasing lport b20b0e27-0b08-4316-b6df-6784416f44c0 from this chassis (sb_readonly=0)
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.250 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.251 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.254 225859 DEBUG nova.network.neutron [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Successfully updated port: ebbe6083-de9d-43ca-9ab2-cf306ea0be4d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.257 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.258 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[00c03140-d17f-4ca2-acd2-143d5eba8e29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.259 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab.pid.haproxy
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:15:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:44.259 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'env', 'PROCESS_TAG=haproxy-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.265 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.277 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.277 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquired lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.278 225859 DEBUG nova.network.neutron [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.467 225859 DEBUG nova.compute.manager [req-077d4813-c385-4e23-9c9b-e6f2eae66aed req-b7052b2a-aba9-4bfd-96f1-6621372e8d7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-changed-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.467 225859 DEBUG nova.compute.manager [req-077d4813-c385-4e23-9c9b-e6f2eae66aed req-b7052b2a-aba9-4bfd-96f1-6621372e8d7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Refreshing instance network info cache due to event network-changed-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.468 225859 DEBUG oslo_concurrency.lockutils [req-077d4813-c385-4e23-9c9b-e6f2eae66aed req-b7052b2a-aba9-4bfd-96f1-6621372e8d7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.568 225859 DEBUG nova.network.neutron [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:15:44 np0005588919 podman[304692]: 2026-01-20 15:15:44.616147967 +0000 UTC m=+0.050589987 container create a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 20 10:15:44 np0005588919 systemd[1]: Started libpod-conmon-a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259.scope.
Jan 20 10:15:44 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:15:44 np0005588919 podman[304692]: 2026-01-20 15:15:44.586921437 +0000 UTC m=+0.021363477 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:15:44 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb008d1ba0088a00e33db5fdd52a317528e9420e9947ad1e20b1f6e8e7235013/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.695 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922144.694481, f1ded131-d9a3-4e93-ad99-53ee2695d5c8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.695 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.698 225859 DEBUG nova.compute.manager [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.701 225859 INFO nova.virt.libvirt.driver [-] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Instance running successfully.#033[00m
Jan 20 10:15:44 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 10:15:44 np0005588919 podman[304692]: 2026-01-20 15:15:44.703783823 +0000 UTC m=+0.138225873 container init a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.704 225859 DEBUG nova.virt.libvirt.guest [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.704 225859 DEBUG nova.virt.libvirt.driver [None req-b335d546-cb29-41df-a4fc-7fc39a94104e e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 10:15:44 np0005588919 podman[304692]: 2026-01-20 15:15:44.710956227 +0000 UTC m=+0.145398247 container start a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.726 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.729 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:15:44 np0005588919 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[304717]: [NOTICE]   (304722) : New worker (304724) forked
Jan 20 10:15:44 np0005588919 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[304717]: [NOTICE]   (304722) : Loading success.
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.775 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.775 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922144.695603, f1ded131-d9a3-4e93-ad99-53ee2695d5c8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.776 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] VM Started (Lifecycle Event)#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.835 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.839 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:15:44 np0005588919 nova_compute[225855]: 2026-01-20 15:15:44.873 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 10:15:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:45.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:45.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.646 225859 DEBUG nova.compute.manager [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.646 225859 DEBUG oslo_concurrency.lockutils [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.647 225859 DEBUG oslo_concurrency.lockutils [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.647 225859 DEBUG oslo_concurrency.lockutils [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.647 225859 DEBUG nova.compute.manager [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] No waiting events found dispatching network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.648 225859 WARNING nova.compute.manager [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received unexpected event network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 for instance with vm_state resized and task_state None.#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.648 225859 DEBUG nova.compute.manager [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.648 225859 DEBUG oslo_concurrency.lockutils [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.648 225859 DEBUG oslo_concurrency.lockutils [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.649 225859 DEBUG oslo_concurrency.lockutils [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.649 225859 DEBUG nova.compute.manager [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] No waiting events found dispatching network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.649 225859 WARNING nova.compute.manager [req-4e636554-da77-4e96-a836-a9a962da2e40 req-cf8118bb-0a23-4f14-878f-090c53e009ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received unexpected event network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 for instance with vm_state resized and task_state None.#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.679 225859 DEBUG nova.network.neutron [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Updating instance_info_cache with network_info: [{"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.708 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Releasing lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.709 225859 DEBUG nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Instance network_info: |[{"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.709 225859 DEBUG oslo_concurrency.lockutils [req-077d4813-c385-4e23-9c9b-e6f2eae66aed req-b7052b2a-aba9-4bfd-96f1-6621372e8d7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.709 225859 DEBUG nova.network.neutron [req-077d4813-c385-4e23-9c9b-e6f2eae66aed req-b7052b2a-aba9-4bfd-96f1-6621372e8d7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Refreshing network info cache for port ebbe6083-de9d-43ca-9ab2-cf306ea0be4d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.712 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Start _get_guest_xml network_info=[{"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.715 225859 WARNING nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.720 225859 DEBUG nova.virt.libvirt.host [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.720 225859 DEBUG nova.virt.libvirt.host [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.724 225859 DEBUG nova.virt.libvirt.host [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.725 225859 DEBUG nova.virt.libvirt.host [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.726 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.726 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.727 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.727 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.727 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.727 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.728 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.728 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.728 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.728 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.729 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.729 225859 DEBUG nova.virt.hardware [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:15:46 np0005588919 nova_compute[225855]: 2026-01-20 15:15:46.731 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e404 e404: 3 total, 3 up, 3 in
Jan 20 10:15:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:47.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:15:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3505090184' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.199 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.225 225859 DEBUG nova.storage.rbd_utils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image b5656c1b-5ac7-4b93-a25d-420e1e294678_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.229 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:47.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:15:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/257267576' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.682 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.685 225859 DEBUG nova.virt.libvirt.vif [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:15:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-470752205',display_name='tempest-TestNetworkAdvancedServerOps-server-470752205',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-470752205',id=190,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAmgIhRv7SylyyDoXsoPKcXNSjcMbMCF7/IROo74ZCmTo9LbWE2Sanv271vjV+ounImSggkddfPFTxsQsqOAeGJ63UOB1CVRLYAEgvPLI8ngnO4k9hlNWAKjL0F7Yejx9A==',key_name='tempest-TestNetworkAdvancedServerOps-131135683',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-kjhk913w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:15:40Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=b5656c1b-5ac7-4b93-a25d-420e1e294678,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.686 225859 DEBUG nova.network.os_vif_util [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.687 225859 DEBUG nova.network.os_vif_util [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:77:ea,bridge_name='br-int',has_traffic_filtering=True,id=ebbe6083-de9d-43ca-9ab2-cf306ea0be4d,network=Network(be008398-8f36-4967-9cc8-6412553c79f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebbe6083-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.689 225859 DEBUG nova.objects.instance [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_devices' on Instance uuid b5656c1b-5ac7-4b93-a25d-420e1e294678 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.728 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <uuid>b5656c1b-5ac7-4b93-a25d-420e1e294678</uuid>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <name>instance-000000be</name>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-470752205</nova:name>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:15:46</nova:creationTime>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <nova:port uuid="ebbe6083-de9d-43ca-9ab2-cf306ea0be4d">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <entry name="serial">b5656c1b-5ac7-4b93-a25d-420e1e294678</entry>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <entry name="uuid">b5656c1b-5ac7-4b93-a25d-420e1e294678</entry>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/b5656c1b-5ac7-4b93-a25d-420e1e294678_disk">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/b5656c1b-5ac7-4b93-a25d-420e1e294678_disk.config">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:a9:77:ea"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <target dev="tapebbe6083-de"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/b5656c1b-5ac7-4b93-a25d-420e1e294678/console.log" append="off"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:15:47 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:15:47 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:15:47 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:15:47 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.731 225859 DEBUG nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Preparing to wait for external event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.731 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.732 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.732 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.733 225859 DEBUG nova.virt.libvirt.vif [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:15:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-470752205',display_name='tempest-TestNetworkAdvancedServerOps-server-470752205',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-470752205',id=190,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAmgIhRv7SylyyDoXsoPKcXNSjcMbMCF7/IROo74ZCmTo9LbWE2Sanv271vjV+ounImSggkddfPFTxsQsqOAeGJ63UOB1CVRLYAEgvPLI8ngnO4k9hlNWAKjL0F7Yejx9A==',key_name='tempest-TestNetworkAdvancedServerOps-131135683',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-kjhk913w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:15:40Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=b5656c1b-5ac7-4b93-a25d-420e1e294678,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.734 225859 DEBUG nova.network.os_vif_util [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.735 225859 DEBUG nova.network.os_vif_util [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:77:ea,bridge_name='br-int',has_traffic_filtering=True,id=ebbe6083-de9d-43ca-9ab2-cf306ea0be4d,network=Network(be008398-8f36-4967-9cc8-6412553c79f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebbe6083-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.736 225859 DEBUG os_vif [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:77:ea,bridge_name='br-int',has_traffic_filtering=True,id=ebbe6083-de9d-43ca-9ab2-cf306ea0be4d,network=Network(be008398-8f36-4967-9cc8-6412553c79f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebbe6083-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.737 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.737 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.738 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.742 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.743 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebbe6083-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.743 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebbe6083-de, col_values=(('external_ids', {'iface-id': 'ebbe6083-de9d-43ca-9ab2-cf306ea0be4d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:77:ea', 'vm-uuid': 'b5656c1b-5ac7-4b93-a25d-420e1e294678'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.746 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:47 np0005588919 NetworkManager[49104]: <info>  [1768922147.7466] manager: (tapebbe6083-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.749 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.754 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.755 225859 INFO os_vif [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:77:ea,bridge_name='br-int',has_traffic_filtering=True,id=ebbe6083-de9d-43ca-9ab2-cf306ea0be4d,network=Network(be008398-8f36-4967-9cc8-6412553c79f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebbe6083-de')#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.827 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.827 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.828 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No VIF found with MAC fa:16:3e:a9:77:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.828 225859 INFO nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Using config drive#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.854 225859 DEBUG nova.storage.rbd_utils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image b5656c1b-5ac7-4b93-a25d-420e1e294678_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:15:47 np0005588919 nova_compute[225855]: 2026-01-20 15:15:47.926 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:48 np0005588919 nova_compute[225855]: 2026-01-20 15:15:48.162 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:48.162 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:15:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:48.163 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:15:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:48 np0005588919 nova_compute[225855]: 2026-01-20 15:15:48.598 225859 INFO nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Creating config drive at /var/lib/nova/instances/b5656c1b-5ac7-4b93-a25d-420e1e294678/disk.config#033[00m
Jan 20 10:15:48 np0005588919 nova_compute[225855]: 2026-01-20 15:15:48.603 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b5656c1b-5ac7-4b93-a25d-420e1e294678/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppumrw33v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:48 np0005588919 nova_compute[225855]: 2026-01-20 15:15:48.734 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b5656c1b-5ac7-4b93-a25d-420e1e294678/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppumrw33v" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:48 np0005588919 nova_compute[225855]: 2026-01-20 15:15:48.761 225859 DEBUG nova.storage.rbd_utils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image b5656c1b-5ac7-4b93-a25d-420e1e294678_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:15:48 np0005588919 nova_compute[225855]: 2026-01-20 15:15:48.765 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b5656c1b-5ac7-4b93-a25d-420e1e294678/disk.config b5656c1b-5ac7-4b93-a25d-420e1e294678_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:15:48 np0005588919 nova_compute[225855]: 2026-01-20 15:15:48.934 225859 DEBUG oslo_concurrency.processutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b5656c1b-5ac7-4b93-a25d-420e1e294678/disk.config b5656c1b-5ac7-4b93-a25d-420e1e294678_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.169s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:15:48 np0005588919 nova_compute[225855]: 2026-01-20 15:15:48.936 225859 INFO nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Deleting local config drive /var/lib/nova/instances/b5656c1b-5ac7-4b93-a25d-420e1e294678/disk.config because it was imported into RBD.#033[00m
Jan 20 10:15:48 np0005588919 kernel: tapebbe6083-de: entered promiscuous mode
Jan 20 10:15:48 np0005588919 NetworkManager[49104]: <info>  [1768922148.9774] manager: (tapebbe6083-de): new Tun device (/org/freedesktop/NetworkManager/Devices/351)
Jan 20 10:15:48 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:48Z|00837|binding|INFO|Claiming lport ebbe6083-de9d-43ca-9ab2-cf306ea0be4d for this chassis.
Jan 20 10:15:48 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:48Z|00838|binding|INFO|ebbe6083-de9d-43ca-9ab2-cf306ea0be4d: Claiming fa:16:3e:a9:77:ea 10.100.0.5
Jan 20 10:15:48 np0005588919 nova_compute[225855]: 2026-01-20 15:15:48.983 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:48.990 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:77:ea 10.100.0.5'], port_security=['fa:16:3e:a9:77:ea 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b5656c1b-5ac7-4b93-a25d-420e1e294678', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-be008398-8f36-4967-9cc8-6412553c79f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1f7c21ab-d630-47d9-a822-01d8ee3b1d55', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cdafc2c8-f418-454c-b49a-dbb24d8d2298, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=ebbe6083-de9d-43ca-9ab2-cf306ea0be4d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:15:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:48.991 140354 INFO neutron.agent.ovn.metadata.agent [-] Port ebbe6083-de9d-43ca-9ab2-cf306ea0be4d in datapath be008398-8f36-4967-9cc8-6412553c79f3 bound to our chassis#033[00m
Jan 20 10:15:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:48.994 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network be008398-8f36-4967-9cc8-6412553c79f3#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:48.999 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:49 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:49Z|00839|binding|INFO|Setting lport ebbe6083-de9d-43ca-9ab2-cf306ea0be4d ovn-installed in OVS
Jan 20 10:15:49 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:49Z|00840|binding|INFO|Setting lport ebbe6083-de9d-43ca-9ab2-cf306ea0be4d up in Southbound
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.001 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.008 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a52f33eb-93c2-445f-b6e4-762ec4468701]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.009 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbe008398-81 in ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.011 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbe008398-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.011 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c64956f4-a7ae-4a56-98e6-3adb7873b01c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.012 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2b48f718-0e90-4d80-896a-4e2fd911541b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 systemd-udevd[304872]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.022 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[a9d90ee5-0099-45ad-b478-e7800d15ae01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 systemd-machined[194361]: New machine qemu-100-instance-000000be.
Jan 20 10:15:49 np0005588919 NetworkManager[49104]: <info>  [1768922149.0353] device (tapebbe6083-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:15:49 np0005588919 NetworkManager[49104]: <info>  [1768922149.0365] device (tapebbe6083-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:15:49 np0005588919 systemd[1]: Started Virtual Machine qemu-100-instance-000000be.
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.044 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[387b4d8a-cd5f-4d6d-952a-6356619b6d11]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:49.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.078 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f063d9-8ede-481f-a208-97467f7b8f12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 NetworkManager[49104]: <info>  [1768922149.0848] manager: (tapbe008398-80): new Veth device (/org/freedesktop/NetworkManager/Devices/352)
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.083 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6c69ff28-eff0-462d-af33-833de441e937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.125 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[46f42e1a-0b1e-4364-b848-71db05860a31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.128 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[adee28e8-acc1-4e95-934e-e8a0b0f2797e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 NetworkManager[49104]: <info>  [1768922149.1532] device (tapbe008398-80): carrier: link connected
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.158 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[94df023d-0229-463d-8499-eb7c6307cbe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.173 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[087ceea3-6986-4012-9e24-005c061b2803]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbe008398-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:1d:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718808, 'reachable_time': 42207, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304904, 'error': None, 'target': 'ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.188 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9b0d97ac-c75c-49fb-928f-21fdb45805bc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:1d8a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 718808, 'tstamp': 718808}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304905, 'error': None, 'target': 'ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.204 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f10b351b-3f51-446f-9fdb-471769d963e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbe008398-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:1d:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718808, 'reachable_time': 42207, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304906, 'error': None, 'target': 'ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.233 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d820fd-e2ac-4723-80c0-698dda733e72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.297 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[516ee1b3-8b60-41e3-a796-7c03c2e939c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.298 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe008398-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.299 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.299 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe008398-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.302 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:49 np0005588919 NetworkManager[49104]: <info>  [1768922149.3033] manager: (tapbe008398-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Jan 20 10:15:49 np0005588919 kernel: tapbe008398-80: entered promiscuous mode
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.305 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.305 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbe008398-80, col_values=(('external_ids', {'iface-id': 'f3fd8b5d-b152-40f2-b571-88de4b49c77e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:49 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:49Z|00841|binding|INFO|Releasing lport f3fd8b5d-b152-40f2-b571-88de4b49c77e from this chassis (sb_readonly=0)
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.308 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.309 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/be008398-8f36-4967-9cc8-6412553c79f3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/be008398-8f36-4967-9cc8-6412553c79f3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.310 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[28ea90c1-939b-48f2-86d0-4860afc1ff27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.311 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-be008398-8f36-4967-9cc8-6412553c79f3
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/be008398-8f36-4967-9cc8-6412553c79f3.pid.haproxy
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID be008398-8f36-4967-9cc8-6412553c79f3
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:15:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:49.311 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3', 'env', 'PROCESS_TAG=haproxy-be008398-8f36-4967-9cc8-6412553c79f3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/be008398-8f36-4967-9cc8-6412553c79f3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.322 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:49.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.456 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922149.4554558, b5656c1b-5ac7-4b93-a25d-420e1e294678 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.457 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] VM Started (Lifecycle Event)#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.476 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.481 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922149.4555695, b5656c1b-5ac7-4b93-a25d-420e1e294678 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.481 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.507 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.511 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.531 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.759 225859 DEBUG nova.network.neutron [req-077d4813-c385-4e23-9c9b-e6f2eae66aed req-b7052b2a-aba9-4bfd-96f1-6621372e8d7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Updated VIF entry in instance network info cache for port ebbe6083-de9d-43ca-9ab2-cf306ea0be4d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.760 225859 DEBUG nova.network.neutron [req-077d4813-c385-4e23-9c9b-e6f2eae66aed req-b7052b2a-aba9-4bfd-96f1-6621372e8d7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Updating instance_info_cache with network_info: [{"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:15:49 np0005588919 podman[305081]: 2026-01-20 15:15:49.772695467 +0000 UTC m=+0.072191410 container create ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Jan 20 10:15:49 np0005588919 nova_compute[225855]: 2026-01-20 15:15:49.779 225859 DEBUG oslo_concurrency.lockutils [req-077d4813-c385-4e23-9c9b-e6f2eae66aed req-b7052b2a-aba9-4bfd-96f1-6621372e8d7c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:15:49 np0005588919 systemd[1]: Started libpod-conmon-ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f.scope.
Jan 20 10:15:49 np0005588919 podman[305081]: 2026-01-20 15:15:49.738560818 +0000 UTC m=+0.038056791 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:15:49 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:15:49 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62df8bc671769c9f3b38059ef802cd2fd43ec5ee8411e92bf650967517928ab7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:15:49 np0005588919 podman[305081]: 2026-01-20 15:15:49.88172103 +0000 UTC m=+0.181216993 container init ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:15:49 np0005588919 podman[305081]: 2026-01-20 15:15:49.889438499 +0000 UTC m=+0.188934442 container start ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 10:15:49 np0005588919 neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3[305107]: [NOTICE]   (305112) : New worker (305116) forked
Jan 20 10:15:49 np0005588919 neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3[305107]: [NOTICE]   (305112) : Loading success.
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.246 225859 DEBUG nova.compute.manager [req-1fca2a33-5a1b-4c85-9f99-aac2b77e1378 req-f60cf546-c7ab-444d-95eb-ff92e70a6215 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.246 225859 DEBUG oslo_concurrency.lockutils [req-1fca2a33-5a1b-4c85-9f99-aac2b77e1378 req-f60cf546-c7ab-444d-95eb-ff92e70a6215 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.247 225859 DEBUG oslo_concurrency.lockutils [req-1fca2a33-5a1b-4c85-9f99-aac2b77e1378 req-f60cf546-c7ab-444d-95eb-ff92e70a6215 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.247 225859 DEBUG oslo_concurrency.lockutils [req-1fca2a33-5a1b-4c85-9f99-aac2b77e1378 req-f60cf546-c7ab-444d-95eb-ff92e70a6215 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.247 225859 DEBUG nova.compute.manager [req-1fca2a33-5a1b-4c85-9f99-aac2b77e1378 req-f60cf546-c7ab-444d-95eb-ff92e70a6215 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Processing event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.248 225859 DEBUG nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.253 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922150.2529113, b5656c1b-5ac7-4b93-a25d-420e1e294678 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.253 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.255 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.259 225859 INFO nova.virt.libvirt.driver [-] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Instance spawned successfully.#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.260 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.275 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.278 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.286 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.286 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.287 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.287 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.288 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.288 225859 DEBUG nova.virt.libvirt.driver [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.370 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:15:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:15:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:15:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.437 225859 INFO nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Took 9.46 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.438 225859 DEBUG nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.494 225859 INFO nova.compute.manager [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Took 10.50 seconds to build instance.#033[00m
Jan 20 10:15:50 np0005588919 nova_compute[225855]: 2026-01-20 15:15:50.522 225859 DEBUG oslo_concurrency.lockutils [None req-bbd8c924-2ea1-4107-b500-c19534f19b83 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:51.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:51.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:52 np0005588919 nova_compute[225855]: 2026-01-20 15:15:52.473 225859 DEBUG nova.compute.manager [req-129a25d7-9b39-4079-8a28-fcb383acee96 req-7db61976-0aa0-43d1-ba1b-60ed67796bb3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:52 np0005588919 nova_compute[225855]: 2026-01-20 15:15:52.474 225859 DEBUG oslo_concurrency.lockutils [req-129a25d7-9b39-4079-8a28-fcb383acee96 req-7db61976-0aa0-43d1-ba1b-60ed67796bb3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:15:52 np0005588919 nova_compute[225855]: 2026-01-20 15:15:52.474 225859 DEBUG oslo_concurrency.lockutils [req-129a25d7-9b39-4079-8a28-fcb383acee96 req-7db61976-0aa0-43d1-ba1b-60ed67796bb3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:15:52 np0005588919 nova_compute[225855]: 2026-01-20 15:15:52.474 225859 DEBUG oslo_concurrency.lockutils [req-129a25d7-9b39-4079-8a28-fcb383acee96 req-7db61976-0aa0-43d1-ba1b-60ed67796bb3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:15:52 np0005588919 nova_compute[225855]: 2026-01-20 15:15:52.474 225859 DEBUG nova.compute.manager [req-129a25d7-9b39-4079-8a28-fcb383acee96 req-7db61976-0aa0-43d1-ba1b-60ed67796bb3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] No waiting events found dispatching network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:15:52 np0005588919 nova_compute[225855]: 2026-01-20 15:15:52.475 225859 WARNING nova.compute.manager [req-129a25d7-9b39-4079-8a28-fcb383acee96 req-7db61976-0aa0-43d1-ba1b-60ed67796bb3 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received unexpected event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d for instance with vm_state active and task_state None.#033[00m
Jan 20 10:15:52 np0005588919 nova_compute[225855]: 2026-01-20 15:15:52.747 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:52 np0005588919 nova_compute[225855]: 2026-01-20 15:15:52.931 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:53.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:53.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:53Z|00842|binding|INFO|Releasing lport f3fd8b5d-b152-40f2-b571-88de4b49c77e from this chassis (sb_readonly=0)
Jan 20 10:15:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:53Z|00843|binding|INFO|Releasing lport b20b0e27-0b08-4316-b6df-6784416f44c0 from this chassis (sb_readonly=0)
Jan 20 10:15:53 np0005588919 nova_compute[225855]: 2026-01-20 15:15:53.655 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e405 e405: 3 total, 3 up, 3 in
Jan 20 10:15:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:15:54.165 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:15:54 np0005588919 nova_compute[225855]: 2026-01-20 15:15:54.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:54 np0005588919 nova_compute[225855]: 2026-01-20 15:15:54.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:15:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:15:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:55.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:15:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:55.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:56 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #54. Immutable memtables: 10.
Jan 20 10:15:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:57 np0005588919 nova_compute[225855]: 2026-01-20 15:15:57.067 225859 DEBUG nova.compute.manager [req-0686ac1a-8332-48ba-8d0c-3e4a33aa6e0f req-383c81a0-ff9a-47f4-8564-e63c5d0dd20c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-changed-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:15:57 np0005588919 nova_compute[225855]: 2026-01-20 15:15:57.067 225859 DEBUG nova.compute.manager [req-0686ac1a-8332-48ba-8d0c-3e4a33aa6e0f req-383c81a0-ff9a-47f4-8564-e63c5d0dd20c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Refreshing instance network info cache due to event network-changed-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:15:57 np0005588919 nova_compute[225855]: 2026-01-20 15:15:57.068 225859 DEBUG oslo_concurrency.lockutils [req-0686ac1a-8332-48ba-8d0c-3e4a33aa6e0f req-383c81a0-ff9a-47f4-8564-e63c5d0dd20c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:15:57 np0005588919 nova_compute[225855]: 2026-01-20 15:15:57.068 225859 DEBUG oslo_concurrency.lockutils [req-0686ac1a-8332-48ba-8d0c-3e4a33aa6e0f req-383c81a0-ff9a-47f4-8564-e63c5d0dd20c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:15:57 np0005588919 nova_compute[225855]: 2026-01-20 15:15:57.068 225859 DEBUG nova.network.neutron [req-0686ac1a-8332-48ba-8d0c-3e4a33aa6e0f req-383c81a0-ff9a-47f4-8564-e63c5d0dd20c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Refreshing network info cache for port ebbe6083-de9d-43ca-9ab2-cf306ea0be4d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:15:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:57.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:57 np0005588919 ovn_controller[130490]: 2026-01-20T15:15:57Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:99:5e:ed 10.100.0.3
Jan 20 10:15:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:15:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:15:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:57.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:57 np0005588919 nova_compute[225855]: 2026-01-20 15:15:57.750 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:57 np0005588919 nova_compute[225855]: 2026-01-20 15:15:57.957 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:15:58 np0005588919 nova_compute[225855]: 2026-01-20 15:15:58.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e405 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:15:59 np0005588919 podman[305193]: 2026-01-20 15:15:59.028196337 +0000 UTC m=+0.067241329 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:15:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:15:59.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:59 np0005588919 nova_compute[225855]: 2026-01-20 15:15:59.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:15:59 np0005588919 nova_compute[225855]: 2026-01-20 15:15:59.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:15:59 np0005588919 nova_compute[225855]: 2026-01-20 15:15:59.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:15:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:15:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:15:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:15:59.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:15:59 np0005588919 nova_compute[225855]: 2026-01-20 15:15:59.762 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:15:59 np0005588919 nova_compute[225855]: 2026-01-20 15:15:59.763 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:15:59 np0005588919 nova_compute[225855]: 2026-01-20 15:15:59.763 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:15:59 np0005588919 nova_compute[225855]: 2026-01-20 15:15:59.763 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f1ded131-d9a3-4e93-ad99-53ee2695d5c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:16:00 np0005588919 nova_compute[225855]: 2026-01-20 15:16:00.905 225859 DEBUG nova.network.neutron [req-0686ac1a-8332-48ba-8d0c-3e4a33aa6e0f req-383c81a0-ff9a-47f4-8564-e63c5d0dd20c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Updated VIF entry in instance network info cache for port ebbe6083-de9d-43ca-9ab2-cf306ea0be4d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:16:00 np0005588919 nova_compute[225855]: 2026-01-20 15:16:00.908 225859 DEBUG nova.network.neutron [req-0686ac1a-8332-48ba-8d0c-3e4a33aa6e0f req-383c81a0-ff9a-47f4-8564-e63c5d0dd20c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Updating instance_info_cache with network_info: [{"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:16:00 np0005588919 nova_compute[225855]: 2026-01-20 15:16:00.947 225859 DEBUG oslo_concurrency.lockutils [req-0686ac1a-8332-48ba-8d0c-3e4a33aa6e0f req-383c81a0-ff9a-47f4-8564-e63c5d0dd20c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:16:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:01.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:01.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e406 e406: 3 total, 3 up, 3 in
Jan 20 10:16:01 np0005588919 ovn_controller[130490]: 2026-01-20T15:16:01Z|00844|binding|INFO|Releasing lport f3fd8b5d-b152-40f2-b571-88de4b49c77e from this chassis (sb_readonly=0)
Jan 20 10:16:01 np0005588919 ovn_controller[130490]: 2026-01-20T15:16:01Z|00845|binding|INFO|Releasing lport b20b0e27-0b08-4316-b6df-6784416f44c0 from this chassis (sb_readonly=0)
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.032 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.413 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updating instance_info_cache with network_info: [{"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.482 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.483 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.483 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.484 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.484 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.484 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.539 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.540 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.540 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.541 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.541 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.752 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:02 np0005588919 nova_compute[225855]: 2026-01-20 15:16:02.958 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:03.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:16:03 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2411011844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.158 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.250 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.251 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.251 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.254 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000be as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.254 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000be as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:03.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.403 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.404 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3873MB free_disk=20.78484344482422GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.404 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.404 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.527 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance f1ded131-d9a3-4e93-ad99-53ee2695d5c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.527 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance b5656c1b-5ac7-4b93-a25d-420e1e294678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.527 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.528 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.538 225859 DEBUG nova.compute.manager [req-d35e570b-3627-4498-80c7-1f9ae9e1c065 req-fbe58f85-a193-4be0-b7d3-31164e136147 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-changed-0e93d1de-671e-4e37-8e79-44bed7981254 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.539 225859 DEBUG nova.compute.manager [req-d35e570b-3627-4498-80c7-1f9ae9e1c065 req-fbe58f85-a193-4be0-b7d3-31164e136147 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Refreshing instance network info cache due to event network-changed-0e93d1de-671e-4e37-8e79-44bed7981254. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.539 225859 DEBUG oslo_concurrency.lockutils [req-d35e570b-3627-4498-80c7-1f9ae9e1c065 req-fbe58f85-a193-4be0-b7d3-31164e136147 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.539 225859 DEBUG oslo_concurrency.lockutils [req-d35e570b-3627-4498-80c7-1f9ae9e1c065 req-fbe58f85-a193-4be0-b7d3-31164e136147 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.539 225859 DEBUG nova.network.neutron [req-d35e570b-3627-4498-80c7-1f9ae9e1c065 req-fbe58f85-a193-4be0-b7d3-31164e136147 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Refreshing network info cache for port 0e93d1de-671e-4e37-8e79-44bed7981254 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:16:03 np0005588919 nova_compute[225855]: 2026-01-20 15:16:03.804 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:16:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:16:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4156676619' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:16:04 np0005588919 nova_compute[225855]: 2026-01-20 15:16:04.290 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:16:04 np0005588919 nova_compute[225855]: 2026-01-20 15:16:04.298 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:16:04 np0005588919 nova_compute[225855]: 2026-01-20 15:16:04.335 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:16:04 np0005588919 nova_compute[225855]: 2026-01-20 15:16:04.390 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:16:04 np0005588919 nova_compute[225855]: 2026-01-20 15:16:04.390 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.986s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:05.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:16:05Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:77:ea 10.100.0.5
Jan 20 10:16:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:16:05Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:77:ea 10.100.0.5
Jan 20 10:16:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:05.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:07.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:07.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:07 np0005588919 nova_compute[225855]: 2026-01-20 15:16:07.590 225859 DEBUG nova.network.neutron [req-d35e570b-3627-4498-80c7-1f9ae9e1c065 req-fbe58f85-a193-4be0-b7d3-31164e136147 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updated VIF entry in instance network info cache for port 0e93d1de-671e-4e37-8e79-44bed7981254. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:16:07 np0005588919 nova_compute[225855]: 2026-01-20 15:16:07.591 225859 DEBUG nova.network.neutron [req-d35e570b-3627-4498-80c7-1f9ae9e1c065 req-fbe58f85-a193-4be0-b7d3-31164e136147 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updating instance_info_cache with network_info: [{"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:16:07 np0005588919 nova_compute[225855]: 2026-01-20 15:16:07.738 225859 DEBUG oslo_concurrency.lockutils [req-d35e570b-3627-4498-80c7-1f9ae9e1c065 req-fbe58f85-a193-4be0-b7d3-31164e136147 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:16:07 np0005588919 nova_compute[225855]: 2026-01-20 15:16:07.756 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:07 np0005588919 nova_compute[225855]: 2026-01-20 15:16:07.960 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:08 np0005588919 nova_compute[225855]: 2026-01-20 15:16:08.247 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:09.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:09.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:10 np0005588919 nova_compute[225855]: 2026-01-20 15:16:10.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:11.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:11.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:11 np0005588919 nova_compute[225855]: 2026-01-20 15:16:11.902 225859 INFO nova.compute.manager [None req-b1256381-e841-4313-bd8e-e323e10725ac 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Get console output#033[00m
Jan 20 10:16:11 np0005588919 nova_compute[225855]: 2026-01-20 15:16:11.907 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:16:12 np0005588919 nova_compute[225855]: 2026-01-20 15:16:12.759 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:12 np0005588919 nova_compute[225855]: 2026-01-20 15:16:12.962 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:13.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:13.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:13 np0005588919 nova_compute[225855]: 2026-01-20 15:16:13.898 225859 INFO nova.compute.manager [None req-98a0bc9a-d2ef-4c53-8d93-0b20dce27c54 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Get console output#033[00m
Jan 20 10:16:13 np0005588919 nova_compute[225855]: 2026-01-20 15:16:13.901 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:16:14 np0005588919 podman[305317]: 2026-01-20 15:16:14.039443068 +0000 UTC m=+0.086085233 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:16:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:15.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:15.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:16.438 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:16.438 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:16.439 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:17.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:17.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:17 np0005588919 nova_compute[225855]: 2026-01-20 15:16:17.770 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:17 np0005588919 nova_compute[225855]: 2026-01-20 15:16:17.964 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:19.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:19.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:19 np0005588919 nova_compute[225855]: 2026-01-20 15:16:19.683 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Check if temp file /var/lib/nova/instances/tmpwqr60y7t exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 20 10:16:19 np0005588919 nova_compute[225855]: 2026-01-20 15:16:19.685 225859 DEBUG nova.compute.manager [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwqr60y7t',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='b5656c1b-5ac7-4b93-a25d-420e1e294678',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 20 10:16:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:21.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:21.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:22 np0005588919 nova_compute[225855]: 2026-01-20 15:16:22.771 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:22 np0005588919 nova_compute[225855]: 2026-01-20 15:16:22.966 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:23.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:23.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:24 np0005588919 nova_compute[225855]: 2026-01-20 15:16:24.751 225859 DEBUG nova.compute.manager [req-baadcd49-f99d-4409-b8cb-1898bcabb6b2 req-116f8276-7526-4b07-bf86-2841ad94df00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-unplugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:24 np0005588919 nova_compute[225855]: 2026-01-20 15:16:24.751 225859 DEBUG oslo_concurrency.lockutils [req-baadcd49-f99d-4409-b8cb-1898bcabb6b2 req-116f8276-7526-4b07-bf86-2841ad94df00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:24 np0005588919 nova_compute[225855]: 2026-01-20 15:16:24.752 225859 DEBUG oslo_concurrency.lockutils [req-baadcd49-f99d-4409-b8cb-1898bcabb6b2 req-116f8276-7526-4b07-bf86-2841ad94df00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:24 np0005588919 nova_compute[225855]: 2026-01-20 15:16:24.752 225859 DEBUG oslo_concurrency.lockutils [req-baadcd49-f99d-4409-b8cb-1898bcabb6b2 req-116f8276-7526-4b07-bf86-2841ad94df00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:24 np0005588919 nova_compute[225855]: 2026-01-20 15:16:24.752 225859 DEBUG nova.compute.manager [req-baadcd49-f99d-4409-b8cb-1898bcabb6b2 req-116f8276-7526-4b07-bf86-2841ad94df00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] No waiting events found dispatching network-vif-unplugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:24 np0005588919 nova_compute[225855]: 2026-01-20 15:16:24.752 225859 DEBUG nova.compute.manager [req-baadcd49-f99d-4409-b8cb-1898bcabb6b2 req-116f8276-7526-4b07-bf86-2841ad94df00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-unplugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:16:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:25.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:25.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.734 225859 INFO nova.compute.manager [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Took 5.05 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.735 225859 DEBUG nova.compute.manager [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.752 225859 DEBUG nova.compute.manager [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwqr60y7t',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='b5656c1b-5ac7-4b93-a25d-420e1e294678',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(9b925066-c218-4b07-910d-90dd336bf952),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.755 225859 DEBUG nova.objects.instance [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lazy-loading 'migration_context' on Instance uuid b5656c1b-5ac7-4b93-a25d-420e1e294678 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.757 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.759 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.759 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.781 225859 DEBUG nova.virt.libvirt.vif [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:15:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-470752205',display_name='tempest-TestNetworkAdvancedServerOps-server-470752205',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-470752205',id=190,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAmgIhRv7SylyyDoXsoPKcXNSjcMbMCF7/IROo74ZCmTo9LbWE2Sanv271vjV+ounImSggkddfPFTxsQsqOAeGJ63UOB1CVRLYAEgvPLI8ngnO4k9hlNWAKjL0F7Yejx9A==',key_name='tempest-TestNetworkAdvancedServerOps-131135683',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:15:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-kjhk913w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:15:50Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=b5656c1b-5ac7-4b93-a25d-420e1e294678,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.782 225859 DEBUG nova.network.os_vif_util [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Converting VIF {"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.783 225859 DEBUG nova.network.os_vif_util [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:77:ea,bridge_name='br-int',has_traffic_filtering=True,id=ebbe6083-de9d-43ca-9ab2-cf306ea0be4d,network=Network(be008398-8f36-4967-9cc8-6412553c79f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebbe6083-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.783 225859 DEBUG nova.virt.libvirt.migration [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Updating guest XML with vif config: <interface type="ethernet">
Jan 20 10:16:25 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:a9:77:ea"/>
Jan 20 10:16:25 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 10:16:25 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:16:25 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 10:16:25 np0005588919 nova_compute[225855]:  <target dev="tapebbe6083-de"/>
Jan 20 10:16:25 np0005588919 nova_compute[225855]: </interface>
Jan 20 10:16:25 np0005588919 nova_compute[225855]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 20 10:16:25 np0005588919 nova_compute[225855]: 2026-01-20 15:16:25.784 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 20 10:16:26 np0005588919 nova_compute[225855]: 2026-01-20 15:16:26.262 225859 DEBUG nova.virt.libvirt.migration [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 20 10:16:26 np0005588919 nova_compute[225855]: 2026-01-20 15:16:26.262 225859 INFO nova.virt.libvirt.migration [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 20 10:16:26 np0005588919 nova_compute[225855]: 2026-01-20 15:16:26.359 225859 INFO nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 20 10:16:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e407 e407: 3 total, 3 up, 3 in
Jan 20 10:16:26 np0005588919 nova_compute[225855]: 2026-01-20 15:16:26.861 225859 DEBUG nova.virt.libvirt.migration [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 20 10:16:26 np0005588919 nova_compute[225855]: 2026-01-20 15:16:26.862 225859 DEBUG nova.virt.libvirt.migration [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.066 225859 DEBUG nova.compute.manager [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.066 225859 DEBUG oslo_concurrency.lockutils [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.066 225859 DEBUG oslo_concurrency.lockutils [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.066 225859 DEBUG oslo_concurrency.lockutils [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.067 225859 DEBUG nova.compute.manager [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] No waiting events found dispatching network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.067 225859 WARNING nova.compute.manager [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received unexpected event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d for instance with vm_state active and task_state migrating.#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.067 225859 DEBUG nova.compute.manager [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-changed-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.067 225859 DEBUG nova.compute.manager [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Refreshing instance network info cache due to event network-changed-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.068 225859 DEBUG oslo_concurrency.lockutils [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.068 225859 DEBUG oslo_concurrency.lockutils [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.068 225859 DEBUG nova.network.neutron [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Refreshing network info cache for port ebbe6083-de9d-43ca-9ab2-cf306ea0be4d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:16:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:27.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.365 225859 DEBUG nova.virt.libvirt.migration [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.365 225859 DEBUG nova.virt.libvirt.migration [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 20 10:16:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:27.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.504 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922187.503702, b5656c1b-5ac7-4b93-a25d-420e1e294678 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.504 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.531 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.534 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.561 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 20 10:16:27 np0005588919 kernel: tapebbe6083-de (unregistering): left promiscuous mode
Jan 20 10:16:27 np0005588919 NetworkManager[49104]: <info>  [1768922187.6920] device (tapebbe6083-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:16:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:16:27Z|00846|binding|INFO|Releasing lport ebbe6083-de9d-43ca-9ab2-cf306ea0be4d from this chassis (sb_readonly=0)
Jan 20 10:16:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:16:27Z|00847|binding|INFO|Setting lport ebbe6083-de9d-43ca-9ab2-cf306ea0be4d down in Southbound
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.703 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:16:27Z|00848|binding|INFO|Removing iface tapebbe6083-de ovn-installed in OVS
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.708 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:27.710 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:77:ea 10.100.0.5'], port_security=['fa:16:3e:a9:77:ea 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '367c1a2c-b16a-4828-ab5a-626bb50023b4'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b5656c1b-5ac7-4b93-a25d-420e1e294678', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-be008398-8f36-4967-9cc8-6412553c79f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1f7c21ab-d630-47d9-a822-01d8ee3b1d55', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.206'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cdafc2c8-f418-454c-b49a-dbb24d8d2298, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=ebbe6083-de9d-43ca-9ab2-cf306ea0be4d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:16:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:27.712 140354 INFO neutron.agent.ovn.metadata.agent [-] Port ebbe6083-de9d-43ca-9ab2-cf306ea0be4d in datapath be008398-8f36-4967-9cc8-6412553c79f3 unbound from our chassis#033[00m
Jan 20 10:16:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:27.713 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network be008398-8f36-4967-9cc8-6412553c79f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:16:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:27.715 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0526d38d-2014-4215-8682-e374b55a8733]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:27.715 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3 namespace which is not needed anymore#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.720 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:27 np0005588919 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000be.scope: Deactivated successfully.
Jan 20 10:16:27 np0005588919 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000be.scope: Consumed 14.856s CPU time.
Jan 20 10:16:27 np0005588919 systemd-machined[194361]: Machine qemu-100-instance-000000be terminated.
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.773 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:27 np0005588919 neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3[305107]: [NOTICE]   (305112) : haproxy version is 2.8.14-c23fe91
Jan 20 10:16:27 np0005588919 neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3[305107]: [NOTICE]   (305112) : path to executable is /usr/sbin/haproxy
Jan 20 10:16:27 np0005588919 neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3[305107]: [WARNING]  (305112) : Exiting Master process...
Jan 20 10:16:27 np0005588919 neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3[305107]: [ALERT]    (305112) : Current worker (305116) exited with code 143 (Terminated)
Jan 20 10:16:27 np0005588919 neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3[305107]: [WARNING]  (305112) : All workers exited. Exiting... (0)
Jan 20 10:16:27 np0005588919 systemd[1]: libpod-ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f.scope: Deactivated successfully.
Jan 20 10:16:27 np0005588919 conmon[305107]: conmon ab35cb6b69ea79e9a1b0 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f.scope/container/memory.events
Jan 20 10:16:27 np0005588919 virtqemud[225396]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/b5656c1b-5ac7-4b93-a25d-420e1e294678_disk: No such file or directory
Jan 20 10:16:27 np0005588919 virtqemud[225396]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/b5656c1b-5ac7-4b93-a25d-420e1e294678_disk: No such file or directory
Jan 20 10:16:27 np0005588919 podman[305426]: 2026-01-20 15:16:27.850481156 +0000 UTC m=+0.043374902 container died ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.858 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.863 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.869 225859 DEBUG nova.virt.libvirt.guest [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.870 225859 INFO nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Migration operation has completed#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.871 225859 INFO nova.compute.manager [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] _post_live_migration() is started..#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.878 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 20 10:16:27 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f-userdata-shm.mount: Deactivated successfully.
Jan 20 10:16:27 np0005588919 systemd[1]: var-lib-containers-storage-overlay-62df8bc671769c9f3b38059ef802cd2fd43ec5ee8411e92bf650967517928ab7-merged.mount: Deactivated successfully.
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.883 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.883 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 20 10:16:27 np0005588919 podman[305426]: 2026-01-20 15:16:27.890645895 +0000 UTC m=+0.083539641 container cleanup ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:16:27 np0005588919 systemd[1]: libpod-conmon-ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f.scope: Deactivated successfully.
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.928 225859 DEBUG nova.compute.manager [req-74b297c1-99b7-4c81-a622-9d433b32d090 req-5e9c3042-3d5c-48df-89fb-5a6d6fb803fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-unplugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.928 225859 DEBUG oslo_concurrency.lockutils [req-74b297c1-99b7-4c81-a622-9d433b32d090 req-5e9c3042-3d5c-48df-89fb-5a6d6fb803fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.928 225859 DEBUG oslo_concurrency.lockutils [req-74b297c1-99b7-4c81-a622-9d433b32d090 req-5e9c3042-3d5c-48df-89fb-5a6d6fb803fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.928 225859 DEBUG oslo_concurrency.lockutils [req-74b297c1-99b7-4c81-a622-9d433b32d090 req-5e9c3042-3d5c-48df-89fb-5a6d6fb803fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.929 225859 DEBUG nova.compute.manager [req-74b297c1-99b7-4c81-a622-9d433b32d090 req-5e9c3042-3d5c-48df-89fb-5a6d6fb803fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] No waiting events found dispatching network-vif-unplugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.929 225859 DEBUG nova.compute.manager [req-74b297c1-99b7-4c81-a622-9d433b32d090 req-5e9c3042-3d5c-48df-89fb-5a6d6fb803fc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-unplugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:16:27 np0005588919 podman[305465]: 2026-01-20 15:16:27.952762148 +0000 UTC m=+0.040596983 container remove ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:16:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:27.958 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[75412e8e-d7a9-4727-85eb-638894d5e966]: (4, ('Tue Jan 20 03:16:27 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3 (ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f)\nab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f\nTue Jan 20 03:16:27 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3 (ab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f)\nab35cb6b69ea79e9a1b020ef3386e02f717ac202f0cbeea2b52a921e4f6d2e7f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:27.959 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3af62ac2-0dcb-46cf-a460-b3becd85ee68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:27.960 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe008398-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.961 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:27 np0005588919 kernel: tapbe008398-80: left promiscuous mode
Jan 20 10:16:27 np0005588919 nova_compute[225855]: 2026-01-20 15:16:27.977 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:27.981 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3c7fbecf-842e-4eb6-a944-59e56293d655]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:28.106 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7b97ed47-813a-487e-8c42-0022f9742a4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:28.107 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[60e4e794-682e-4196-bb96-0ba5d5d75ec6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:28.122 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dff27390-58ba-44d3-8ecc-eaa7be7eb18a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718799, 'reachable_time': 16028, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305482, 'error': None, 'target': 'ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:28 np0005588919 systemd[1]: run-netns-ovnmeta\x2dbe008398\x2d8f36\x2d4967\x2d9cc8\x2d6412553c79f3.mount: Deactivated successfully.
Jan 20 10:16:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:28.125 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-be008398-8f36-4967-9cc8-6412553c79f3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:16:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:28.126 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[3edcbcfd-6545-46e5-a0e8-51734aa7af2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:16:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.100 225859 DEBUG nova.network.neutron [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Updated VIF entry in instance network info cache for port ebbe6083-de9d-43ca-9ab2-cf306ea0be4d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.101 225859 DEBUG nova.network.neutron [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Updating instance_info_cache with network_info: [{"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:16:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:29.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.131 225859 DEBUG oslo_concurrency.lockutils [req-e2371e9d-1e43-4664-ae25-506d85674ee2 req-fbc64c1e-eb84-4fd1-bc9e-1277f63d1f62 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-b5656c1b-5ac7-4b93-a25d-420e1e294678" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:16:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:29.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.941 225859 DEBUG nova.network.neutron [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Activated binding for port ebbe6083-de9d-43ca-9ab2-cf306ea0be4d and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.942 225859 DEBUG nova.compute.manager [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.943 225859 DEBUG nova.virt.libvirt.vif [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:15:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-470752205',display_name='tempest-TestNetworkAdvancedServerOps-server-470752205',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-470752205',id=190,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAmgIhRv7SylyyDoXsoPKcXNSjcMbMCF7/IROo74ZCmTo9LbWE2Sanv271vjV+ounImSggkddfPFTxsQsqOAeGJ63UOB1CVRLYAEgvPLI8ngnO4k9hlNWAKjL0F7Yejx9A==',key_name='tempest-TestNetworkAdvancedServerOps-131135683',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:15:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-kjhk913w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:16:15Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=b5656c1b-5ac7-4b93-a25d-420e1e294678,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.943 225859 DEBUG nova.network.os_vif_util [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Converting VIF {"id": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "address": "fa:16:3e:a9:77:ea", "network": {"id": "be008398-8f36-4967-9cc8-6412553c79f3", "bridge": "br-int", "label": "tempest-network-smoke--259749923", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebbe6083-de", "ovs_interfaceid": "ebbe6083-de9d-43ca-9ab2-cf306ea0be4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.943 225859 DEBUG nova.network.os_vif_util [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:77:ea,bridge_name='br-int',has_traffic_filtering=True,id=ebbe6083-de9d-43ca-9ab2-cf306ea0be4d,network=Network(be008398-8f36-4967-9cc8-6412553c79f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebbe6083-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.944 225859 DEBUG os_vif [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:77:ea,bridge_name='br-int',has_traffic_filtering=True,id=ebbe6083-de9d-43ca-9ab2-cf306ea0be4d,network=Network(be008398-8f36-4967-9cc8-6412553c79f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebbe6083-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.946 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.946 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebbe6083-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.947 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.950 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.952 225859 INFO os_vif [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:77:ea,bridge_name='br-int',has_traffic_filtering=True,id=ebbe6083-de9d-43ca-9ab2-cf306ea0be4d,network=Network(be008398-8f36-4967-9cc8-6412553c79f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebbe6083-de')#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.953 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.953 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.953 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.953 225859 DEBUG nova.compute.manager [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.954 225859 INFO nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Deleting instance files /var/lib/nova/instances/b5656c1b-5ac7-4b93-a25d-420e1e294678_del#033[00m
Jan 20 10:16:29 np0005588919 nova_compute[225855]: 2026-01-20 15:16:29.954 225859 INFO nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Deletion of /var/lib/nova/instances/b5656c1b-5ac7-4b93-a25d-420e1e294678_del complete#033[00m
Jan 20 10:16:30 np0005588919 podman[305484]: 2026-01-20 15:16:30.010288088 +0000 UTC m=+0.055468905 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.047 225859 DEBUG nova.compute.manager [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.047 225859 DEBUG oslo_concurrency.lockutils [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.047 225859 DEBUG oslo_concurrency.lockutils [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.047 225859 DEBUG oslo_concurrency.lockutils [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.048 225859 DEBUG nova.compute.manager [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] No waiting events found dispatching network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.048 225859 WARNING nova.compute.manager [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received unexpected event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d for instance with vm_state active and task_state migrating.#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.048 225859 DEBUG nova.compute.manager [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.048 225859 DEBUG oslo_concurrency.lockutils [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.048 225859 DEBUG oslo_concurrency.lockutils [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.048 225859 DEBUG oslo_concurrency.lockutils [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.049 225859 DEBUG nova.compute.manager [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] No waiting events found dispatching network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.049 225859 WARNING nova.compute.manager [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received unexpected event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d for instance with vm_state active and task_state migrating.#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.049 225859 DEBUG nova.compute.manager [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-unplugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.049 225859 DEBUG oslo_concurrency.lockutils [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.049 225859 DEBUG oslo_concurrency.lockutils [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.050 225859 DEBUG oslo_concurrency.lockutils [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.050 225859 DEBUG nova.compute.manager [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] No waiting events found dispatching network-vif-unplugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:30 np0005588919 nova_compute[225855]: 2026-01-20 15:16:30.050 225859 DEBUG nova.compute.manager [req-8d4ad645-2470-43a7-9aba-b1c97acd08b5 req-d4712228-3f09-4525-9367-f5eb27524d53 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-unplugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:16:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:31.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:31.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.221 225859 DEBUG nova.compute.manager [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.221 225859 DEBUG oslo_concurrency.lockutils [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.221 225859 DEBUG oslo_concurrency.lockutils [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.221 225859 DEBUG oslo_concurrency.lockutils [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.221 225859 DEBUG nova.compute.manager [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] No waiting events found dispatching network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.222 225859 WARNING nova.compute.manager [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received unexpected event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d for instance with vm_state active and task_state migrating.#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.222 225859 DEBUG nova.compute.manager [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.222 225859 DEBUG oslo_concurrency.lockutils [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.222 225859 DEBUG oslo_concurrency.lockutils [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.222 225859 DEBUG oslo_concurrency.lockutils [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.222 225859 DEBUG nova.compute.manager [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] No waiting events found dispatching network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.223 225859 WARNING nova.compute.manager [req-2b858c79-1f31-4de4-8073-358f2e98d753 req-0cc8db62-b365-425b-8dd1-4c407516161a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Received unexpected event network-vif-plugged-ebbe6083-de9d-43ca-9ab2-cf306ea0be4d for instance with vm_state active and task_state migrating.#033[00m
Jan 20 10:16:32 np0005588919 nova_compute[225855]: 2026-01-20 15:16:32.984 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:33.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:33 np0005588919 ovn_controller[130490]: 2026-01-20T15:16:33Z|00849|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 20 10:16:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:33.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e408 e408: 3 total, 3 up, 3 in
Jan 20 10:16:34 np0005588919 nova_compute[225855]: 2026-01-20 15:16:34.948 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:35.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:35.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:35 np0005588919 nova_compute[225855]: 2026-01-20 15:16:35.783 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquiring lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:35 np0005588919 nova_compute[225855]: 2026-01-20 15:16:35.784 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:35 np0005588919 nova_compute[225855]: 2026-01-20 15:16:35.784 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "b5656c1b-5ac7-4b93-a25d-420e1e294678-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:35 np0005588919 nova_compute[225855]: 2026-01-20 15:16:35.801 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:35 np0005588919 nova_compute[225855]: 2026-01-20 15:16:35.802 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:35 np0005588919 nova_compute[225855]: 2026-01-20 15:16:35.802 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:35 np0005588919 nova_compute[225855]: 2026-01-20 15:16:35.802 225859 DEBUG nova.compute.resource_tracker [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:16:35 np0005588919 nova_compute[225855]: 2026-01-20 15:16:35.803 225859 DEBUG oslo_concurrency.processutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:16:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:16:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2719345878' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.246 225859 DEBUG oslo_concurrency.processutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.328 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.329 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.329 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.506 225859 WARNING nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.507 225859 DEBUG nova.compute.resource_tracker [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4011MB free_disk=20.760086059570312GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": 
"0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.507 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.507 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.547 225859 DEBUG nova.compute.resource_tracker [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Migration for instance b5656c1b-5ac7-4b93-a25d-420e1e294678 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.567 225859 DEBUG nova.compute.resource_tracker [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.977 225859 DEBUG nova.compute.resource_tracker [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Instance f1ded131-d9a3-4e93-ad99-53ee2695d5c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.977 225859 DEBUG nova.compute.resource_tracker [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Migration 9b925066-c218-4b07-910d-90dd336bf952 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.977 225859 DEBUG nova.compute.resource_tracker [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:16:36 np0005588919 nova_compute[225855]: 2026-01-20 15:16:36.978 225859 DEBUG nova.compute.resource_tracker [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:16:37 np0005588919 nova_compute[225855]: 2026-01-20 15:16:37.075 225859 DEBUG oslo_concurrency.processutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:16:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:37.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:37.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:16:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/512128344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:16:37 np0005588919 nova_compute[225855]: 2026-01-20 15:16:37.519 225859 DEBUG oslo_concurrency.processutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:16:37 np0005588919 nova_compute[225855]: 2026-01-20 15:16:37.525 225859 DEBUG nova.compute.provider_tree [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:16:37 np0005588919 nova_compute[225855]: 2026-01-20 15:16:37.549 225859 DEBUG nova.scheduler.client.report [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:16:37 np0005588919 nova_compute[225855]: 2026-01-20 15:16:37.576 225859 DEBUG nova.compute.resource_tracker [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:16:37 np0005588919 nova_compute[225855]: 2026-01-20 15:16:37.577 225859 DEBUG oslo_concurrency.lockutils [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:37 np0005588919 nova_compute[225855]: 2026-01-20 15:16:37.584 225859 INFO nova.compute.manager [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Jan 20 10:16:37 np0005588919 nova_compute[225855]: 2026-01-20 15:16:37.684 225859 INFO nova.scheduler.client.report [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] Deleted allocation for migration 9b925066-c218-4b07-910d-90dd336bf952#033[00m
Jan 20 10:16:37 np0005588919 nova_compute[225855]: 2026-01-20 15:16:37.684 225859 DEBUG nova.virt.libvirt.driver [None req-e9459132-ba13-41d9-a7cf-f952f09929e6 1998f6e29a51438c82e65b66da23d380 22d14e9a73254c8981e4a13fa61158c4 - - default default] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 20 10:16:37 np0005588919 nova_compute[225855]: 2026-01-20 15:16:37.986 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e408 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:39.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:39.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:39 np0005588919 nova_compute[225855]: 2026-01-20 15:16:39.960 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:40 np0005588919 nova_compute[225855]: 2026-01-20 15:16:40.831 225859 DEBUG nova.compute.manager [req-bad45644-521c-4271-8d87-180ff0e8abd4 req-3c157dd2-3016-4cf1-8659-8e525fbba106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-changed-0e93d1de-671e-4e37-8e79-44bed7981254 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:16:40 np0005588919 nova_compute[225855]: 2026-01-20 15:16:40.831 225859 DEBUG nova.compute.manager [req-bad45644-521c-4271-8d87-180ff0e8abd4 req-3c157dd2-3016-4cf1-8659-8e525fbba106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Refreshing instance network info cache due to event network-changed-0e93d1de-671e-4e37-8e79-44bed7981254. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:16:40 np0005588919 nova_compute[225855]: 2026-01-20 15:16:40.831 225859 DEBUG oslo_concurrency.lockutils [req-bad45644-521c-4271-8d87-180ff0e8abd4 req-3c157dd2-3016-4cf1-8659-8e525fbba106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:16:40 np0005588919 nova_compute[225855]: 2026-01-20 15:16:40.832 225859 DEBUG oslo_concurrency.lockutils [req-bad45644-521c-4271-8d87-180ff0e8abd4 req-3c157dd2-3016-4cf1-8659-8e525fbba106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:16:40 np0005588919 nova_compute[225855]: 2026-01-20 15:16:40.832 225859 DEBUG nova.network.neutron [req-bad45644-521c-4271-8d87-180ff0e8abd4 req-3c157dd2-3016-4cf1-8659-8e525fbba106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Refreshing network info cache for port 0e93d1de-671e-4e37-8e79-44bed7981254 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:16:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:41.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:41.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e409 e409: 3 total, 3 up, 3 in
Jan 20 10:16:42 np0005588919 nova_compute[225855]: 2026-01-20 15:16:42.870 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922187.8690405, b5656c1b-5ac7-4b93-a25d-420e1e294678 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:16:42 np0005588919 nova_compute[225855]: 2026-01-20 15:16:42.870 225859 INFO nova.compute.manager [-] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #136. Immutable memtables: 0.
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.877775) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 85] Flushing memtable with next log file: 136
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202877834, "job": 85, "event": "flush_started", "num_memtables": 1, "num_entries": 2434, "num_deletes": 255, "total_data_size": 5483757, "memory_usage": 5560424, "flush_reason": "Manual Compaction"}
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 85] Level-0 flush table #137: started
Jan 20 10:16:42 np0005588919 nova_compute[225855]: 2026-01-20 15:16:42.893 225859 DEBUG nova.compute.manager [None req-f7e24bc1-ec39-4929-9e42-80bac89b93b5 - - - - - -] [instance: b5656c1b-5ac7-4b93-a25d-420e1e294678] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202901979, "cf_name": "default", "job": 85, "event": "table_file_creation", "file_number": 137, "file_size": 3592662, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 66381, "largest_seqno": 68810, "table_properties": {"data_size": 3582980, "index_size": 6047, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20909, "raw_average_key_size": 20, "raw_value_size": 3563296, "raw_average_value_size": 3524, "num_data_blocks": 263, "num_entries": 1011, "num_filter_entries": 1011, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922008, "oldest_key_time": 1768922008, "file_creation_time": 1768922202, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 137, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 85] Flush lasted 24246 microseconds, and 8853 cpu microseconds.
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.902017) [db/flush_job.cc:967] [default] [JOB 85] Level-0 flush table #137: 3592662 bytes OK
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.902036) [db/memtable_list.cc:519] [default] Level-0 commit table #137 started
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.904360) [db/memtable_list.cc:722] [default] Level-0 commit table #137: memtable #1 done
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.904413) EVENT_LOG_v1 {"time_micros": 1768922202904402, "job": 85, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.904442) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 85] Try to delete WAL files size 5472980, prev total WAL file size 5472980, number of live WAL files 2.
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000133.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.905750) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035373733' seq:72057594037927935, type:22 .. '7061786F730036303235' seq:0, type:0; will stop at (end)
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 86] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 85 Base level 0, inputs: [137(3508KB)], [135(9718KB)]
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202905785, "job": 86, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [137], "files_L6": [135], "score": -1, "input_data_size": 13544434, "oldest_snapshot_seqno": -1}
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 86] Generated table #138: 9255 keys, 11673229 bytes, temperature: kUnknown
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202994010, "cf_name": "default", "job": 86, "event": "table_file_creation", "file_number": 138, "file_size": 11673229, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11613200, "index_size": 35788, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23173, "raw_key_size": 243101, "raw_average_key_size": 26, "raw_value_size": 11450388, "raw_average_value_size": 1237, "num_data_blocks": 1364, "num_entries": 9255, "num_filter_entries": 9255, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768922202, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 138, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:16:42 np0005588919 nova_compute[225855]: 2026-01-20 15:16:42.995 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.994261) [db/compaction/compaction_job.cc:1663] [default] [JOB 86] Compacted 1@0 + 1@6 files to L6 => 11673229 bytes
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.995997) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.4 rd, 132.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 9.5 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(7.0) write-amplify(3.2) OK, records in: 9780, records dropped: 525 output_compression: NoCompression
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.996058) EVENT_LOG_v1 {"time_micros": 1768922202996036, "job": 86, "event": "compaction_finished", "compaction_time_micros": 88303, "compaction_time_cpu_micros": 27975, "output_level": 6, "num_output_files": 1, "total_output_size": 11673229, "num_input_records": 9780, "num_output_records": 9255, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000137.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202996901, "job": 86, "event": "table_file_deletion", "file_number": 137}
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000135.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922202999125, "job": 86, "event": "table_file_deletion", "file_number": 135}
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.905682) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.999234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.999241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.999243) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.999245) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:42 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:16:42.999247) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:16:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:43.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:43.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:44 np0005588919 nova_compute[225855]: 2026-01-20 15:16:44.191 225859 DEBUG nova.network.neutron [req-bad45644-521c-4271-8d87-180ff0e8abd4 req-3c157dd2-3016-4cf1-8659-8e525fbba106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updated VIF entry in instance network info cache for port 0e93d1de-671e-4e37-8e79-44bed7981254. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:16:44 np0005588919 nova_compute[225855]: 2026-01-20 15:16:44.192 225859 DEBUG nova.network.neutron [req-bad45644-521c-4271-8d87-180ff0e8abd4 req-3c157dd2-3016-4cf1-8659-8e525fbba106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updating instance_info_cache with network_info: [{"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:16:44 np0005588919 nova_compute[225855]: 2026-01-20 15:16:44.251 225859 DEBUG oslo_concurrency.lockutils [req-bad45644-521c-4271-8d87-180ff0e8abd4 req-3c157dd2-3016-4cf1-8659-8e525fbba106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-f1ded131-d9a3-4e93-ad99-53ee2695d5c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:16:44 np0005588919 nova_compute[225855]: 2026-01-20 15:16:44.963 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:45 np0005588919 podman[305608]: 2026-01-20 15:16:45.021990571 +0000 UTC m=+0.074473614 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:16:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:45.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:45.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:47.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:47.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:47 np0005588919 nova_compute[225855]: 2026-01-20 15:16:47.998 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.586 225859 DEBUG oslo_concurrency.lockutils [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.587 225859 DEBUG oslo_concurrency.lockutils [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.600 225859 INFO nova.compute.manager [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Detaching volume 933c5c7a-f496-4bcc-b304-68156c235fe5#033[00m
Jan 20 10:16:48 np0005588919 ovn_controller[130490]: 2026-01-20T15:16:48Z|00850|binding|INFO|Releasing lport b20b0e27-0b08-4316-b6df-6784416f44c0 from this chassis (sb_readonly=0)
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.688 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.787 225859 INFO nova.virt.block_device [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Attempting to driver detach volume 933c5c7a-f496-4bcc-b304-68156c235fe5 from mountpoint /dev/vdb#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.794 225859 DEBUG nova.virt.libvirt.driver [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Attempting to detach device vdb from instance f1ded131-d9a3-4e93-ad99-53ee2695d5c8 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.794 225859 DEBUG nova.virt.libvirt.guest [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-933c5c7a-f496-4bcc-b304-68156c235fe5">
Jan 20 10:16:48 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <serial>933c5c7a-f496-4bcc-b304-68156c235fe5</serial>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <shareable/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:16:48 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.800 225859 INFO nova.virt.libvirt.driver [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully detached device vdb from instance f1ded131-d9a3-4e93-ad99-53ee2695d5c8 from the persistent domain config.#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.800 225859 DEBUG nova.virt.libvirt.driver [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance f1ded131-d9a3-4e93-ad99-53ee2695d5c8 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.801 225859 DEBUG nova.virt.libvirt.guest [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-933c5c7a-f496-4bcc-b304-68156c235fe5">
Jan 20 10:16:48 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <serial>933c5c7a-f496-4bcc-b304-68156c235fe5</serial>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <shareable/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 20 10:16:48 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:16:48 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.902 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768922208.9024026, f1ded131-d9a3-4e93-ad99-53ee2695d5c8 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.905 225859 DEBUG nova.virt.libvirt.driver [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance f1ded131-d9a3-4e93-ad99-53ee2695d5c8 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:16:48 np0005588919 nova_compute[225855]: 2026-01-20 15:16:48.907 225859 INFO nova.virt.libvirt.driver [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully detached device vdb from instance f1ded131-d9a3-4e93-ad99-53ee2695d5c8 from the live domain config.#033[00m
Jan 20 10:16:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:49.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:49 np0005588919 nova_compute[225855]: 2026-01-20 15:16:49.298 225859 DEBUG nova.objects.instance [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'flavor' on Instance uuid f1ded131-d9a3-4e93-ad99-53ee2695d5c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:16:49 np0005588919 nova_compute[225855]: 2026-01-20 15:16:49.348 225859 DEBUG oslo_concurrency.lockutils [None req-1902b019-7a1b-4a94-924f-2ba27a4743a5 e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:16:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:49.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:49 np0005588919 nova_compute[225855]: 2026-01-20 15:16:49.966 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:50.536 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:16:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:50.537 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:16:50 np0005588919 nova_compute[225855]: 2026-01-20 15:16:50.536 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:16:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:51.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:16:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:51.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:53 np0005588919 nova_compute[225855]: 2026-01-20 15:16:53.000 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:53.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:53 np0005588919 nova_compute[225855]: 2026-01-20 15:16:53.386 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:53.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:54 np0005588919 nova_compute[225855]: 2026-01-20 15:16:54.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:54 np0005588919 nova_compute[225855]: 2026-01-20 15:16:54.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:16:54 np0005588919 nova_compute[225855]: 2026-01-20 15:16:54.968 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:55.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:55.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:16:55.538 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:16:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:57.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:57.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:16:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:16:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:16:57 np0005588919 nova_compute[225855]: 2026-01-20 15:16:57.826 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:58 np0005588919 nova_compute[225855]: 2026-01-20 15:16:58.001 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:16:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:16:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:16:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:16:59.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:16:59 np0005588919 nova_compute[225855]: 2026-01-20 15:16:59.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:16:59 np0005588919 nova_compute[225855]: 2026-01-20 15:16:59.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:16:59 np0005588919 nova_compute[225855]: 2026-01-20 15:16:59.434 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:16:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:16:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:16:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:16:59.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:16:59 np0005588919 nova_compute[225855]: 2026-01-20 15:16:59.970 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:00 np0005588919 nova_compute[225855]: 2026-01-20 15:17:00.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:00 np0005588919 ceph-osd[79119]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 20 10:17:01 np0005588919 podman[305775]: 2026-01-20 15:17:01.009564655 +0000 UTC m=+0.050919995 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:17:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:01.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:01 np0005588919 nova_compute[225855]: 2026-01-20 15:17:01.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:01 np0005588919 nova_compute[225855]: 2026-01-20 15:17:01.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:01.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:02 np0005588919 nova_compute[225855]: 2026-01-20 15:17:02.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.004 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:03.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.383 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.384 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.384 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.384 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.384 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:03.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.452 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.452 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.471 225859 DEBUG nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.552 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.553 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.562 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.563 225859 INFO nova.compute.claims [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.667 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:17:03 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:17:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:03 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2580642773' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.807 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.873 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:17:03 np0005588919 nova_compute[225855]: 2026-01-20 15:17:03.873 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.038 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.040 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3985MB free_disk=20.805618286132812GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.040 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4051330510' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.124 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.128 225859 DEBUG nova.compute.provider_tree [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.148 225859 DEBUG nova.scheduler.client.report [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.171 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.172 225859 DEBUG nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.175 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.246 225859 DEBUG nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.247 225859 DEBUG nova.network.neutron [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.407 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance f1ded131-d9a3-4e93-ad99-53ee2695d5c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.408 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 4f38d24a-3458-4c59-8480-8094ffcbb5aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.408 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.409 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.417 225859 INFO nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.468 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.540 225859 DEBUG nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.631 225859 DEBUG nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.633 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.634 225859 INFO nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Creating image(s)#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.660 225859 DEBUG nova.storage.rbd_utils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.685 225859 DEBUG nova.storage.rbd_utils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.716 225859 DEBUG nova.storage.rbd_utils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.720 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.790 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.792 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.792 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.793 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.818 225859 DEBUG nova.storage.rbd_utils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.822 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.866 225859 DEBUG nova.policy [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '442a7a5cb8ea426a82be9762b262d171', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:17:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2900254021' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.907 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.913 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.928 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.930 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.930 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.755s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:04 np0005588919 nova_compute[225855]: 2026-01-20 15:17:04.972 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:05 np0005588919 nova_compute[225855]: 2026-01-20 15:17:05.080 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.258s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:05 np0005588919 nova_compute[225855]: 2026-01-20 15:17:05.140 225859 DEBUG nova.storage.rbd_utils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] resizing rbd image 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:17:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:05.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:05 np0005588919 nova_compute[225855]: 2026-01-20 15:17:05.239 225859 DEBUG nova.objects.instance [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'migration_context' on Instance uuid 4f38d24a-3458-4c59-8480-8094ffcbb5aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:17:05 np0005588919 nova_compute[225855]: 2026-01-20 15:17:05.254 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:17:05 np0005588919 nova_compute[225855]: 2026-01-20 15:17:05.254 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Ensure instance console log exists: /var/lib/nova/instances/4f38d24a-3458-4c59-8480-8094ffcbb5aa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:17:05 np0005588919 nova_compute[225855]: 2026-01-20 15:17:05.254 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:05 np0005588919 nova_compute[225855]: 2026-01-20 15:17:05.255 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:05 np0005588919 nova_compute[225855]: 2026-01-20 15:17:05.255 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:05.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:05 np0005588919 nova_compute[225855]: 2026-01-20 15:17:05.661 225859 DEBUG nova.network.neutron [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Successfully created port: d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:17:07 np0005588919 nova_compute[225855]: 2026-01-20 15:17:07.059 225859 DEBUG nova.network.neutron [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Successfully updated port: d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:17:07 np0005588919 nova_compute[225855]: 2026-01-20 15:17:07.077 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:17:07 np0005588919 nova_compute[225855]: 2026-01-20 15:17:07.077 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquired lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:17:07 np0005588919 nova_compute[225855]: 2026-01-20 15:17:07.078 225859 DEBUG nova.network.neutron [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:17:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:07.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:07 np0005588919 nova_compute[225855]: 2026-01-20 15:17:07.213 225859 DEBUG nova.compute.manager [req-3a457955-0c78-4aed-99d2-34e09e8a012d req-2c4a8951-cfd2-46a4-9ed0-73a56d2e357e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received event network-changed-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:07 np0005588919 nova_compute[225855]: 2026-01-20 15:17:07.213 225859 DEBUG nova.compute.manager [req-3a457955-0c78-4aed-99d2-34e09e8a012d req-2c4a8951-cfd2-46a4-9ed0-73a56d2e357e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Refreshing instance network info cache due to event network-changed-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:17:07 np0005588919 nova_compute[225855]: 2026-01-20 15:17:07.214 225859 DEBUG oslo_concurrency.lockutils [req-3a457955-0c78-4aed-99d2-34e09e8a012d req-2c4a8951-cfd2-46a4-9ed0-73a56d2e357e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:17:07 np0005588919 nova_compute[225855]: 2026-01-20 15:17:07.330 225859 DEBUG nova.network.neutron [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:17:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:07.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:07 np0005588919 nova_compute[225855]: 2026-01-20 15:17:07.931 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.006 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.533 225859 DEBUG nova.network.neutron [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Updating instance_info_cache with network_info: [{"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.553 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Releasing lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.553 225859 DEBUG nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Instance network_info: |[{"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.553 225859 DEBUG oslo_concurrency.lockutils [req-3a457955-0c78-4aed-99d2-34e09e8a012d req-2c4a8951-cfd2-46a4-9ed0-73a56d2e357e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.554 225859 DEBUG nova.network.neutron [req-3a457955-0c78-4aed-99d2-34e09e8a012d req-2c4a8951-cfd2-46a4-9ed0-73a56d2e357e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Refreshing network info cache for port d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.556 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Start _get_guest_xml network_info=[{"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.561 225859 WARNING nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.565 225859 DEBUG nova.virt.libvirt.host [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.566 225859 DEBUG nova.virt.libvirt.host [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.569 225859 DEBUG nova.virt.libvirt.host [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.570 225859 DEBUG nova.virt.libvirt.host [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.572 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.572 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.573 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.573 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.573 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.573 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.574 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.574 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.574 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.574 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.574 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.575 225859 DEBUG nova.virt.hardware [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:17:08 np0005588919 nova_compute[225855]: 2026-01-20 15:17:08.578 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:17:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/630892543' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.018 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.043 225859 DEBUG nova.storage.rbd_utils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.047 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:09.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:09.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:17:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1661905444' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.481 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.482 225859 DEBUG nova.virt.libvirt.vif [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:17:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2057447015',display_name='tempest-TestNetworkAdvancedServerOps-server-2057447015',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2057447015',id=191,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJPwC5MyMqyTrbrSBwBOpxBpSiLPbu1nzobGp6ktmxE+oIlgwGH9ZkqYZyAjLxwv50DDSq5iaSNQxNoKrNJWo+FdRObJJTJ5JQ9hbj5JsMfLfRRZUDmDAFS5rhSXxsMyYg==',key_name='tempest-TestNetworkAdvancedServerOps-386377958',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-foa05j4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:17:04Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=4f38d24a-3458-4c59-8480-8094ffcbb5aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.483 225859 DEBUG nova.network.os_vif_util [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.484 225859 DEBUG nova.network.os_vif_util [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:6b:ff,bridge_name='br-int',has_traffic_filtering=True,id=d3a1fab4-7d4e-40cd-bdbb-b337196adbc5,network=Network(4512e7e3-1668-4e98-8240-843256180395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a1fab4-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.485 225859 DEBUG nova.objects.instance [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_devices' on Instance uuid 4f38d24a-3458-4c59-8480-8094ffcbb5aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.499 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <uuid>4f38d24a-3458-4c59-8480-8094ffcbb5aa</uuid>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <name>instance-000000bf</name>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-2057447015</nova:name>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:17:08</nova:creationTime>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <nova:port uuid="d3a1fab4-7d4e-40cd-bdbb-b337196adbc5">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <entry name="serial">4f38d24a-3458-4c59-8480-8094ffcbb5aa</entry>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <entry name="uuid">4f38d24a-3458-4c59-8480-8094ffcbb5aa</entry>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk.config">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:20:6b:ff"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <target dev="tapd3a1fab4-7d"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/4f38d24a-3458-4c59-8480-8094ffcbb5aa/console.log" append="off"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:17:09 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:17:09 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:17:09 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:17:09 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.501 225859 DEBUG nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Preparing to wait for external event network-vif-plugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.501 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.502 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.502 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.503 225859 DEBUG nova.virt.libvirt.vif [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:17:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2057447015',display_name='tempest-TestNetworkAdvancedServerOps-server-2057447015',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2057447015',id=191,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJPwC5MyMqyTrbrSBwBOpxBpSiLPbu1nzobGp6ktmxE+oIlgwGH9ZkqYZyAjLxwv50DDSq5iaSNQxNoKrNJWo+FdRObJJTJ5JQ9hbj5JsMfLfRRZUDmDAFS5rhSXxsMyYg==',key_name='tempest-TestNetworkAdvancedServerOps-386377958',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-foa05j4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:17:04Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=4f38d24a-3458-4c59-8480-8094ffcbb5aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.503 225859 DEBUG nova.network.os_vif_util [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.504 225859 DEBUG nova.network.os_vif_util [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:6b:ff,bridge_name='br-int',has_traffic_filtering=True,id=d3a1fab4-7d4e-40cd-bdbb-b337196adbc5,network=Network(4512e7e3-1668-4e98-8240-843256180395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a1fab4-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.504 225859 DEBUG os_vif [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:6b:ff,bridge_name='br-int',has_traffic_filtering=True,id=d3a1fab4-7d4e-40cd-bdbb-b337196adbc5,network=Network(4512e7e3-1668-4e98-8240-843256180395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a1fab4-7d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.505 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.506 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.506 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.509 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.510 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3a1fab4-7d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.510 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3a1fab4-7d, col_values=(('external_ids', {'iface-id': 'd3a1fab4-7d4e-40cd-bdbb-b337196adbc5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:6b:ff', 'vm-uuid': '4f38d24a-3458-4c59-8480-8094ffcbb5aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.511 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:09 np0005588919 NetworkManager[49104]: <info>  [1768922229.5127] manager: (tapd3a1fab4-7d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.514 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.520 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.521 225859 INFO os_vif [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:6b:ff,bridge_name='br-int',has_traffic_filtering=True,id=d3a1fab4-7d4e-40cd-bdbb-b337196adbc5,network=Network(4512e7e3-1668-4e98-8240-843256180395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a1fab4-7d')#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.567 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.568 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.568 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No VIF found with MAC fa:16:3e:20:6b:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.569 225859 INFO nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Using config drive#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.599 225859 DEBUG nova.storage.rbd_utils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.954 225859 INFO nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Creating config drive at /var/lib/nova/instances/4f38d24a-3458-4c59-8480-8094ffcbb5aa/disk.config#033[00m
Jan 20 10:17:09 np0005588919 nova_compute[225855]: 2026-01-20 15:17:09.959 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4f38d24a-3458-4c59-8480-8094ffcbb5aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp107hlhkm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.095 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4f38d24a-3458-4c59-8480-8094ffcbb5aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp107hlhkm" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.121 225859 DEBUG nova.storage.rbd_utils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.124 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4f38d24a-3458-4c59-8480-8094ffcbb5aa/disk.config 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.152 225859 DEBUG nova.network.neutron [req-3a457955-0c78-4aed-99d2-34e09e8a012d req-2c4a8951-cfd2-46a4-9ed0-73a56d2e357e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Updated VIF entry in instance network info cache for port d3a1fab4-7d4e-40cd-bdbb-b337196adbc5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.153 225859 DEBUG nova.network.neutron [req-3a457955-0c78-4aed-99d2-34e09e8a012d req-2c4a8951-cfd2-46a4-9ed0-73a56d2e357e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Updating instance_info_cache with network_info: [{"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.169 225859 DEBUG oslo_concurrency.lockutils [req-3a457955-0c78-4aed-99d2-34e09e8a012d req-2c4a8951-cfd2-46a4-9ed0-73a56d2e357e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.311 225859 DEBUG oslo_concurrency.processutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4f38d24a-3458-4c59-8480-8094ffcbb5aa/disk.config 4f38d24a-3458-4c59-8480-8094ffcbb5aa_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.312 225859 INFO nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Deleting local config drive /var/lib/nova/instances/4f38d24a-3458-4c59-8480-8094ffcbb5aa/disk.config because it was imported into RBD.#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:10 np0005588919 kernel: tapd3a1fab4-7d: entered promiscuous mode
Jan 20 10:17:10 np0005588919 NetworkManager[49104]: <info>  [1768922230.3616] manager: (tapd3a1fab4-7d): new Tun device (/org/freedesktop/NetworkManager/Devices/355)
Jan 20 10:17:10 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:10Z|00851|binding|INFO|Claiming lport d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 for this chassis.
Jan 20 10:17:10 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:10Z|00852|binding|INFO|d3a1fab4-7d4e-40cd-bdbb-b337196adbc5: Claiming fa:16:3e:20:6b:ff 10.100.0.9
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.362 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.369 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:6b:ff 10.100.0.9'], port_security=['fa:16:3e:20:6b:ff 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4f38d24a-3458-4c59-8480-8094ffcbb5aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4512e7e3-1668-4e98-8240-843256180395', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a9f2f56b-e7c7-475c-b1af-94303aad79ef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0147113d-42da-4762-ae51-c60dbf8c0dd2, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=d3a1fab4-7d4e-40cd-bdbb-b337196adbc5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.370 140354 INFO neutron.agent.ovn.metadata.agent [-] Port d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 in datapath 4512e7e3-1668-4e98-8240-843256180395 bound to our chassis#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.371 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4512e7e3-1668-4e98-8240-843256180395#033[00m
Jan 20 10:17:10 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:10Z|00853|binding|INFO|Setting lport d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 ovn-installed in OVS
Jan 20 10:17:10 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:10Z|00854|binding|INFO|Setting lport d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 up in Southbound
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.379 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.381 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5dafce14-7887-4752-8a44-f01980a09709]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.382 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4512e7e3-11 in ovnmeta-4512e7e3-1668-4e98-8240-843256180395 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.384 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.383 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4512e7e3-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.384 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8690532a-4829-4164-8813-cce223c42e50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.385 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[be09299e-e72b-45bf-a3d9-5b8146e23569]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 systemd-udevd[306267]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.395 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[eecd0db3-284d-4c23-8dcf-45b4d25b0b75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 systemd-machined[194361]: New machine qemu-101-instance-000000bf.
Jan 20 10:17:10 np0005588919 NetworkManager[49104]: <info>  [1768922230.4024] device (tapd3a1fab4-7d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:17:10 np0005588919 NetworkManager[49104]: <info>  [1768922230.4034] device (tapd3a1fab4-7d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.409 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[949a998c-7ec1-43e7-8dfb-188aaeda62c4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 systemd[1]: Started Virtual Machine qemu-101-instance-000000bf.
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.435 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[68f43f74-60f3-4c00-bf77-2f893d3c26de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 systemd-udevd[306270]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:17:10 np0005588919 NetworkManager[49104]: <info>  [1768922230.4409] manager: (tap4512e7e3-10): new Veth device (/org/freedesktop/NetworkManager/Devices/356)
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.440 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bff152f5-b7e7-4232-ad3c-d3b0abf82787]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.473 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f08db15f-5352-461c-9f5c-d50234d4b55d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.475 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c7f4f7-9f50-4bcb-8407-3c631579e548]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 NetworkManager[49104]: <info>  [1768922230.4956] device (tap4512e7e3-10): carrier: link connected
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.500 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ecf6bc-d47d-4cfa-ba15-cd35655c9870]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.516 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[15e8ec1e-4edb-4ce8-b6d9-03b4e2b2e1fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4512e7e3-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:e6:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 726942, 'reachable_time': 26848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306298, 'error': None, 'target': 'ovnmeta-4512e7e3-1668-4e98-8240-843256180395', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.529 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[99a4138b-44b6-4cf7-aa97-037d9ae2e65d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5f:e6d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 726942, 'tstamp': 726942}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 306299, 'error': None, 'target': 'ovnmeta-4512e7e3-1668-4e98-8240-843256180395', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.544 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4b8fee-2d11-4514-af70-545e04c9aef6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4512e7e3-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:e6:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 726942, 'reachable_time': 26848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 306300, 'error': None, 'target': 'ovnmeta-4512e7e3-1668-4e98-8240-843256180395', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.567 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7daab9f3-1220-4d42-b820-933cb49174fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.601 225859 DEBUG nova.compute.manager [req-15ab5063-6d81-4fc8-bec7-e77680b454fc req-832952ca-769a-423f-a49e-b0734d99d421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received event network-vif-plugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.602 225859 DEBUG oslo_concurrency.lockutils [req-15ab5063-6d81-4fc8-bec7-e77680b454fc req-832952ca-769a-423f-a49e-b0734d99d421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.602 225859 DEBUG oslo_concurrency.lockutils [req-15ab5063-6d81-4fc8-bec7-e77680b454fc req-832952ca-769a-423f-a49e-b0734d99d421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.602 225859 DEBUG oslo_concurrency.lockutils [req-15ab5063-6d81-4fc8-bec7-e77680b454fc req-832952ca-769a-423f-a49e-b0734d99d421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.602 225859 DEBUG nova.compute.manager [req-15ab5063-6d81-4fc8-bec7-e77680b454fc req-832952ca-769a-423f-a49e-b0734d99d421 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Processing event network-vif-plugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.620 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5f591dac-ec3e-483c-b9e5-eb70f60d605d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.621 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4512e7e3-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.621 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.622 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4512e7e3-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.623 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:10 np0005588919 NetworkManager[49104]: <info>  [1768922230.6245] manager: (tap4512e7e3-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/357)
Jan 20 10:17:10 np0005588919 kernel: tap4512e7e3-10: entered promiscuous mode
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.626 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4512e7e3-10, col_values=(('external_ids', {'iface-id': '3415a30d-278b-4c15-be57-f12804d2b50c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:10 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:10Z|00855|binding|INFO|Releasing lport 3415a30d-278b-4c15-be57-f12804d2b50c from this chassis (sb_readonly=0)
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.640 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.641 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4512e7e3-1668-4e98-8240-843256180395.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4512e7e3-1668-4e98-8240-843256180395.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.643 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[af41a529-d02e-44f1-9fd6-c8e10cf07650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.644 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-4512e7e3-1668-4e98-8240-843256180395
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/4512e7e3-1668-4e98-8240-843256180395.pid.haproxy
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 4512e7e3-1668-4e98-8240-843256180395
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:17:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:10.644 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4512e7e3-1668-4e98-8240-843256180395', 'env', 'PROCESS_TAG=haproxy-4512e7e3-1668-4e98-8240-843256180395', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4512e7e3-1668-4e98-8240-843256180395.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.880 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922230.8800206, 4f38d24a-3458-4c59-8480-8094ffcbb5aa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.880 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] VM Started (Lifecycle Event)#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.883 225859 DEBUG nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.888 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.891 225859 INFO nova.virt.libvirt.driver [-] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Instance spawned successfully.#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.891 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.899 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.901 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.911 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.912 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.912 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.913 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.913 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.914 225859 DEBUG nova.virt.libvirt.driver [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.924 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.924 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922230.8830743, 4f38d24a-3458-4c59-8480-8094ffcbb5aa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.924 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.956 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.960 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922230.8855433, 4f38d24a-3458-4c59-8480-8094ffcbb5aa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.960 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.978 225859 INFO nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Took 6.35 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.979 225859 DEBUG nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.981 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:10 np0005588919 nova_compute[225855]: 2026-01-20 15:17:10.987 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:17:11 np0005588919 podman[306374]: 2026-01-20 15:17:11.013330407 +0000 UTC m=+0.057116521 container create e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 10:17:11 np0005588919 nova_compute[225855]: 2026-01-20 15:17:11.022 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:17:11 np0005588919 nova_compute[225855]: 2026-01-20 15:17:11.064 225859 INFO nova.compute.manager [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Took 7.54 seconds to build instance.#033[00m
Jan 20 10:17:11 np0005588919 nova_compute[225855]: 2026-01-20 15:17:11.079 225859 DEBUG oslo_concurrency.lockutils [None req-18fdf58e-e22e-47f3-bcd4-3c053d6c44ca 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:11 np0005588919 podman[306374]: 2026-01-20 15:17:10.990489779 +0000 UTC m=+0.034275913 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:17:11 np0005588919 systemd[1]: Started libpod-conmon-e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce.scope.
Jan 20 10:17:11 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:17:11 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7810c804e6835b7fafc97e2298621921e0f31c1da0eaafddfa82d1449fccd1e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:17:11 np0005588919 podman[306374]: 2026-01-20 15:17:11.130103001 +0000 UTC m=+0.173889145 container init e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:17:11 np0005588919 podman[306374]: 2026-01-20 15:17:11.137166221 +0000 UTC m=+0.180952335 container start e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 20 10:17:11 np0005588919 neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395[306389]: [NOTICE]   (306393) : New worker (306395) forked
Jan 20 10:17:11 np0005588919 neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395[306389]: [NOTICE]   (306393) : Loading success.
Jan 20 10:17:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:11.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:11.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:12 np0005588919 nova_compute[225855]: 2026-01-20 15:17:12.689 225859 DEBUG nova.compute.manager [req-328b45a0-2136-40bd-93b6-bf15a9a6247c req-16a73e29-54b3-4ed3-b3d9-f147e46b61ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received event network-vif-plugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:12 np0005588919 nova_compute[225855]: 2026-01-20 15:17:12.690 225859 DEBUG oslo_concurrency.lockutils [req-328b45a0-2136-40bd-93b6-bf15a9a6247c req-16a73e29-54b3-4ed3-b3d9-f147e46b61ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:12 np0005588919 nova_compute[225855]: 2026-01-20 15:17:12.690 225859 DEBUG oslo_concurrency.lockutils [req-328b45a0-2136-40bd-93b6-bf15a9a6247c req-16a73e29-54b3-4ed3-b3d9-f147e46b61ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:12 np0005588919 nova_compute[225855]: 2026-01-20 15:17:12.690 225859 DEBUG oslo_concurrency.lockutils [req-328b45a0-2136-40bd-93b6-bf15a9a6247c req-16a73e29-54b3-4ed3-b3d9-f147e46b61ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:12 np0005588919 nova_compute[225855]: 2026-01-20 15:17:12.690 225859 DEBUG nova.compute.manager [req-328b45a0-2136-40bd-93b6-bf15a9a6247c req-16a73e29-54b3-4ed3-b3d9-f147e46b61ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] No waiting events found dispatching network-vif-plugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:12 np0005588919 nova_compute[225855]: 2026-01-20 15:17:12.691 225859 WARNING nova.compute.manager [req-328b45a0-2136-40bd-93b6-bf15a9a6247c req-16a73e29-54b3-4ed3-b3d9-f147e46b61ce 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received unexpected event network-vif-plugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:17:13 np0005588919 nova_compute[225855]: 2026-01-20 15:17:13.008 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:13.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:13.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:17:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1693259685' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:17:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:17:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1693259685' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:17:14 np0005588919 nova_compute[225855]: 2026-01-20 15:17:14.512 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:14 np0005588919 nova_compute[225855]: 2026-01-20 15:17:14.849 225859 DEBUG nova.compute.manager [req-048d33d9-813e-4b04-ad2d-2ff2f9ebbd77 req-d742de8c-8460-4452-90a4-d1c37bd20db4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received event network-changed-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:14 np0005588919 nova_compute[225855]: 2026-01-20 15:17:14.849 225859 DEBUG nova.compute.manager [req-048d33d9-813e-4b04-ad2d-2ff2f9ebbd77 req-d742de8c-8460-4452-90a4-d1c37bd20db4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Refreshing instance network info cache due to event network-changed-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:17:14 np0005588919 nova_compute[225855]: 2026-01-20 15:17:14.849 225859 DEBUG oslo_concurrency.lockutils [req-048d33d9-813e-4b04-ad2d-2ff2f9ebbd77 req-d742de8c-8460-4452-90a4-d1c37bd20db4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:17:14 np0005588919 nova_compute[225855]: 2026-01-20 15:17:14.849 225859 DEBUG oslo_concurrency.lockutils [req-048d33d9-813e-4b04-ad2d-2ff2f9ebbd77 req-d742de8c-8460-4452-90a4-d1c37bd20db4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:17:14 np0005588919 nova_compute[225855]: 2026-01-20 15:17:14.849 225859 DEBUG nova.network.neutron [req-048d33d9-813e-4b04-ad2d-2ff2f9ebbd77 req-d742de8c-8460-4452-90a4-d1c37bd20db4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Refreshing network info cache for port d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:17:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:15.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:15.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e410 e410: 3 total, 3 up, 3 in
Jan 20 10:17:16 np0005588919 podman[306407]: 2026-01-20 15:17:16.036480472 +0000 UTC m=+0.084963812 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 10:17:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:16.439 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:16.439 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:16.440 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e411 e411: 3 total, 3 up, 3 in
Jan 20 10:17:17 np0005588919 nova_compute[225855]: 2026-01-20 15:17:17.069 225859 DEBUG nova.network.neutron [req-048d33d9-813e-4b04-ad2d-2ff2f9ebbd77 req-d742de8c-8460-4452-90a4-d1c37bd20db4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Updated VIF entry in instance network info cache for port d3a1fab4-7d4e-40cd-bdbb-b337196adbc5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:17:17 np0005588919 nova_compute[225855]: 2026-01-20 15:17:17.070 225859 DEBUG nova.network.neutron [req-048d33d9-813e-4b04-ad2d-2ff2f9ebbd77 req-d742de8c-8460-4452-90a4-d1c37bd20db4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Updating instance_info_cache with network_info: [{"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:17 np0005588919 nova_compute[225855]: 2026-01-20 15:17:17.088 225859 DEBUG oslo_concurrency.lockutils [req-048d33d9-813e-4b04-ad2d-2ff2f9ebbd77 req-d742de8c-8460-4452-90a4-d1c37bd20db4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:17:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:17.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:17.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:18 np0005588919 nova_compute[225855]: 2026-01-20 15:17:18.010 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:18 np0005588919 nova_compute[225855]: 2026-01-20 15:17:18.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:19.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:19.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:19 np0005588919 nova_compute[225855]: 2026-01-20 15:17:19.562 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:21.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:21.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:17:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 60K writes, 233K keys, 60K commit groups, 1.0 writes per commit group, ingest: 0.22 GB, 0.05 MB/s#012Cumulative WAL: 60K writes, 22K syncs, 2.65 writes per sync, written: 0.22 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8255 writes, 31K keys, 8255 commit groups, 1.0 writes per commit group, ingest: 29.33 MB, 0.05 MB/s#012Interval WAL: 8255 writes, 3312 syncs, 2.49 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:17:23 np0005588919 nova_compute[225855]: 2026-01-20 15:17:23.013 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:23.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e411 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:23.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:24 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:24Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:20:6b:ff 10.100.0.9
Jan 20 10:17:24 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:24Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:20:6b:ff 10.100.0.9
Jan 20 10:17:24 np0005588919 nova_compute[225855]: 2026-01-20 15:17:24.564 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e412 e412: 3 total, 3 up, 3 in
Jan 20 10:17:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:25.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:25.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:27.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e413 e413: 3 total, 3 up, 3 in
Jan 20 10:17:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:27.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:28 np0005588919 nova_compute[225855]: 2026-01-20 15:17:28.016 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:29.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:29.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:29 np0005588919 nova_compute[225855]: 2026-01-20 15:17:29.607 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:30 np0005588919 nova_compute[225855]: 2026-01-20 15:17:30.658 225859 INFO nova.compute.manager [None req-11b20fbe-a70c-479b-baae-7120a8121549 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Get console output#033[00m
Jan 20 10:17:30 np0005588919 nova_compute[225855]: 2026-01-20 15:17:30.664 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:17:30 np0005588919 nova_compute[225855]: 2026-01-20 15:17:30.950 225859 INFO nova.compute.manager [None req-4014658b-d089-43f2-9dc5-9d47a60b1e0a 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Pausing#033[00m
Jan 20 10:17:30 np0005588919 nova_compute[225855]: 2026-01-20 15:17:30.951 225859 DEBUG nova.objects.instance [None req-4014658b-d089-43f2-9dc5-9d47a60b1e0a 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'flavor' on Instance uuid 4f38d24a-3458-4c59-8480-8094ffcbb5aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:17:30 np0005588919 nova_compute[225855]: 2026-01-20 15:17:30.988 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922250.9882522, 4f38d24a-3458-4c59-8480-8094ffcbb5aa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:30 np0005588919 nova_compute[225855]: 2026-01-20 15:17:30.989 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:17:30 np0005588919 nova_compute[225855]: 2026-01-20 15:17:30.990 225859 DEBUG nova.compute.manager [None req-4014658b-d089-43f2-9dc5-9d47a60b1e0a 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:30 np0005588919 nova_compute[225855]: 2026-01-20 15:17:30.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:30.993 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:30.995 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:17:31 np0005588919 nova_compute[225855]: 2026-01-20 15:17:31.026 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:31 np0005588919 nova_compute[225855]: 2026-01-20 15:17:31.028 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:17:31 np0005588919 nova_compute[225855]: 2026-01-20 15:17:31.071 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 20 10:17:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:31.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:31.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e414 e414: 3 total, 3 up, 3 in
Jan 20 10:17:32 np0005588919 podman[306491]: 2026-01-20 15:17:32.01648817 +0000 UTC m=+0.055536697 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.627 225859 DEBUG oslo_concurrency.lockutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.628 225859 DEBUG oslo_concurrency.lockutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.628 225859 DEBUG oslo_concurrency.lockutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.628 225859 DEBUG oslo_concurrency.lockutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.628 225859 DEBUG oslo_concurrency.lockutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.629 225859 INFO nova.compute.manager [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Terminating instance#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.630 225859 DEBUG nova.compute.manager [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:17:32 np0005588919 kernel: tap0e93d1de-67 (unregistering): left promiscuous mode
Jan 20 10:17:32 np0005588919 NetworkManager[49104]: <info>  [1768922252.6747] device (tap0e93d1de-67): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:17:32 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:32Z|00856|binding|INFO|Releasing lport 0e93d1de-671e-4e37-8e79-44bed7981254 from this chassis (sb_readonly=0)
Jan 20 10:17:32 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:32Z|00857|binding|INFO|Setting lport 0e93d1de-671e-4e37-8e79-44bed7981254 down in Southbound
Jan 20 10:17:32 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:32Z|00858|binding|INFO|Removing iface tap0e93d1de-67 ovn-installed in OVS
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.683 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.685 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.692 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:5e:ed 10.100.0.3'], port_security=['fa:16:3e:99:5e:ed 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f1ded131-d9a3-4e93-ad99-53ee2695d5c8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fff727019f86407498e83d7948d54962', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5ace6a2f-56c6-4679-bb81-70ccb27ab312', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87d69a20-7690-494a-ac16-7c600840561a, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=0e93d1de-671e-4e37-8e79-44bed7981254) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.693 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 0e93d1de-671e-4e37-8e79-44bed7981254 in datapath c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab unbound from our chassis#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.694 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.695 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7abbcf0b-8193-4a4e-be8f-d1a5770ed658]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.696 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab namespace which is not needed anymore#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.699 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:32 np0005588919 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000bb.scope: Deactivated successfully.
Jan 20 10:17:32 np0005588919 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000bb.scope: Consumed 17.614s CPU time.
Jan 20 10:17:32 np0005588919 systemd-machined[194361]: Machine qemu-99-instance-000000bb terminated.
Jan 20 10:17:32 np0005588919 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[304717]: [NOTICE]   (304722) : haproxy version is 2.8.14-c23fe91
Jan 20 10:17:32 np0005588919 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[304717]: [NOTICE]   (304722) : path to executable is /usr/sbin/haproxy
Jan 20 10:17:32 np0005588919 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[304717]: [WARNING]  (304722) : Exiting Master process...
Jan 20 10:17:32 np0005588919 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[304717]: [ALERT]    (304722) : Current worker (304724) exited with code 143 (Terminated)
Jan 20 10:17:32 np0005588919 neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab[304717]: [WARNING]  (304722) : All workers exited. Exiting... (0)
Jan 20 10:17:32 np0005588919 systemd[1]: libpod-a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259.scope: Deactivated successfully.
Jan 20 10:17:32 np0005588919 podman[306533]: 2026-01-20 15:17:32.843278479 +0000 UTC m=+0.050865094 container died a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.865 225859 INFO nova.virt.libvirt.driver [-] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Instance destroyed successfully.#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.866 225859 DEBUG nova.objects.instance [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lazy-loading 'resources' on Instance uuid f1ded131-d9a3-4e93-ad99-53ee2695d5c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:17:32 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259-userdata-shm.mount: Deactivated successfully.
Jan 20 10:17:32 np0005588919 systemd[1]: var-lib-containers-storage-overlay-cb008d1ba0088a00e33db5fdd52a317528e9420e9947ad1e20b1f6e8e7235013-merged.mount: Deactivated successfully.
Jan 20 10:17:32 np0005588919 podman[306533]: 2026-01-20 15:17:32.89088399 +0000 UTC m=+0.098470615 container cleanup a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.891 225859 DEBUG nova.virt.libvirt.vif [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:14:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-0',id=187,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL5L2o6o5dLcQyaIfhCZ5CKxQlecqNGmP68oHIQEsVoKIC2qfrMKjObT9GdMU8oznX9LVUwIWCShhlEJu9ZqPiutEL2afEJ1hQQamjERNcx9wWS2NfOgykA4yugQphfOtA==',key_name='tempest-keypair-1568469072',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:15:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fff727019f86407498e83d7948d54962',ramdisk_id='',reservation_id='r-h927541v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35'
,image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeMultiAttachTest-418194625',owner_user_name='tempest-AttachVolumeMultiAttachTest-418194625-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:15:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e9cc4ce3e069479ba9c789b378a68a1d',uuid=f1ded131-d9a3-4e93-ad99-53ee2695d5c8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.892 225859 DEBUG nova.network.os_vif_util [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converting VIF {"id": "0e93d1de-671e-4e37-8e79-44bed7981254", "address": "fa:16:3e:99:5e:ed", "network": {"id": "c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1423306001-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fff727019f86407498e83d7948d54962", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e93d1de-67", "ovs_interfaceid": "0e93d1de-671e-4e37-8e79-44bed7981254", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.893 225859 DEBUG nova.network.os_vif_util [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:99:5e:ed,bridge_name='br-int',has_traffic_filtering=True,id=0e93d1de-671e-4e37-8e79-44bed7981254,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e93d1de-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.894 225859 DEBUG os_vif [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:5e:ed,bridge_name='br-int',has_traffic_filtering=True,id=0e93d1de-671e-4e37-8e79-44bed7981254,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e93d1de-67') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.896 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.896 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e93d1de-67, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:32 np0005588919 systemd[1]: libpod-conmon-a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259.scope: Deactivated successfully.
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.898 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.900 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.904 225859 INFO os_vif [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:5e:ed,bridge_name='br-int',has_traffic_filtering=True,id=0e93d1de-671e-4e37-8e79-44bed7981254,network=Network(c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e93d1de-67')#033[00m
Jan 20 10:17:32 np0005588919 podman[306575]: 2026-01-20 15:17:32.95574982 +0000 UTC m=+0.042950549 container remove a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.963 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7159670f-b479-4da7-92cb-b3d8d6c35832]: (4, ('Tue Jan 20 03:17:32 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab (a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259)\na439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259\nTue Jan 20 03:17:32 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab (a439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259)\na439ab3be6e54a4c6d705368f861d75da7d479824ae2fd478adfd2b64bc06259\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.965 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a8cfb054-6c31-40f1-b0b9-cfcca559c481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.966 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1f4a971-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.968 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:32 np0005588919 kernel: tapc1f4a971-00: left promiscuous mode
Jan 20 10:17:32 np0005588919 nova_compute[225855]: 2026-01-20 15:17:32.981 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.984 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[25a74703-86be-4da1-9708-a1f61edc7eab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.996 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.997 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe210d1-8c90-47c9-8373-7587d9fd83cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:32.998 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b19bbf-5509-4a20-949e-6dc29609c6f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:33.015 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb05141-9a37-48b0-b29f-79c46d7e6ed0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718295, 'reachable_time': 26716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306609, 'error': None, 'target': 'ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:33.019 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c1f4a971-0bd7-41ce-bdf6-5acb2b1b4bab deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.019 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:33.019 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[17d577be-bea8-436e-8b20-ad6f77447e0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:33 np0005588919 systemd[1]: run-netns-ovnmeta\x2dc1f4a971\x2d0bd7\x2d41ce\x2dbdf6\x2d5acb2b1b4bab.mount: Deactivated successfully.
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.043 225859 DEBUG nova.compute.manager [req-eeb9de47-d613-4b05-9cec-e42ecc6a7fbd req-ccb5514f-a2de-48bd-93f3-a3f5d62fd5b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-vif-unplugged-0e93d1de-671e-4e37-8e79-44bed7981254 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.043 225859 DEBUG oslo_concurrency.lockutils [req-eeb9de47-d613-4b05-9cec-e42ecc6a7fbd req-ccb5514f-a2de-48bd-93f3-a3f5d62fd5b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.045 225859 DEBUG oslo_concurrency.lockutils [req-eeb9de47-d613-4b05-9cec-e42ecc6a7fbd req-ccb5514f-a2de-48bd-93f3-a3f5d62fd5b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.045 225859 DEBUG oslo_concurrency.lockutils [req-eeb9de47-d613-4b05-9cec-e42ecc6a7fbd req-ccb5514f-a2de-48bd-93f3-a3f5d62fd5b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.045 225859 DEBUG nova.compute.manager [req-eeb9de47-d613-4b05-9cec-e42ecc6a7fbd req-ccb5514f-a2de-48bd-93f3-a3f5d62fd5b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] No waiting events found dispatching network-vif-unplugged-0e93d1de-671e-4e37-8e79-44bed7981254 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.045 225859 DEBUG nova.compute.manager [req-eeb9de47-d613-4b05-9cec-e42ecc6a7fbd req-ccb5514f-a2de-48bd-93f3-a3f5d62fd5b4 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-vif-unplugged-0e93d1de-671e-4e37-8e79-44bed7981254 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:17:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:33.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.320 225859 INFO nova.virt.libvirt.driver [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Deleting instance files /var/lib/nova/instances/f1ded131-d9a3-4e93-ad99-53ee2695d5c8_del#033[00m
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.322 225859 INFO nova.virt.libvirt.driver [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Deletion of /var/lib/nova/instances/f1ded131-d9a3-4e93-ad99-53ee2695d5c8_del complete#033[00m
Jan 20 10:17:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e414 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:33.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.815 225859 INFO nova.compute.manager [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Took 1.18 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.816 225859 DEBUG oslo.service.loopingcall [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.816 225859 DEBUG nova.compute.manager [-] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:17:33 np0005588919 nova_compute[225855]: 2026-01-20 15:17:33.817 225859 DEBUG nova.network.neutron [-] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.294 225859 INFO nova.compute.manager [None req-504840e6-6cce-4d9f-9319-c681f6ed5fc8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Get console output#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.302 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.482 225859 INFO nova.compute.manager [None req-cae5a840-e40e-499d-812b-c11e11f159ae 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Unpausing#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.484 225859 DEBUG nova.objects.instance [None req-cae5a840-e40e-499d-812b-c11e11f159ae 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'flavor' on Instance uuid 4f38d24a-3458-4c59-8480-8094ffcbb5aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.514 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922254.5136197, 4f38d24a-3458-4c59-8480-8094ffcbb5aa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.514 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:17:34 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.519 225859 DEBUG nova.virt.libvirt.guest [None req-cae5a840-e40e-499d-812b-c11e11f159ae 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.519 225859 DEBUG nova.compute.manager [None req-cae5a840-e40e-499d-812b-c11e11f159ae 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.546 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.550 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.584 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.661 225859 DEBUG nova.network.neutron [-] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.677 225859 INFO nova.compute.manager [-] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Took 0.86 seconds to deallocate network for instance.#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.727 225859 DEBUG oslo_concurrency.lockutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.728 225859 DEBUG oslo_concurrency.lockutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.743 225859 DEBUG nova.compute.manager [req-96087b76-7c97-4fac-bfa4-8059dafbac9c req-919d7204-fa65-45c7-bd9e-3e0bf8e85106 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-vif-deleted-0e93d1de-671e-4e37-8e79-44bed7981254 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:34 np0005588919 nova_compute[225855]: 2026-01-20 15:17:34.800 225859 DEBUG oslo_concurrency.processutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.153 225859 DEBUG nova.compute.manager [req-987b788c-5af9-45c8-bc34-f16a7311b8a8 req-fcbca577-c859-4281-81a7-13ef4f764aa6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received event network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.154 225859 DEBUG oslo_concurrency.lockutils [req-987b788c-5af9-45c8-bc34-f16a7311b8a8 req-fcbca577-c859-4281-81a7-13ef4f764aa6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.154 225859 DEBUG oslo_concurrency.lockutils [req-987b788c-5af9-45c8-bc34-f16a7311b8a8 req-fcbca577-c859-4281-81a7-13ef4f764aa6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.154 225859 DEBUG oslo_concurrency.lockutils [req-987b788c-5af9-45c8-bc34-f16a7311b8a8 req-fcbca577-c859-4281-81a7-13ef4f764aa6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.154 225859 DEBUG nova.compute.manager [req-987b788c-5af9-45c8-bc34-f16a7311b8a8 req-fcbca577-c859-4281-81a7-13ef4f764aa6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] No waiting events found dispatching network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.155 225859 WARNING nova.compute.manager [req-987b788c-5af9-45c8-bc34-f16a7311b8a8 req-fcbca577-c859-4281-81a7-13ef4f764aa6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Received unexpected event network-vif-plugged-0e93d1de-671e-4e37-8e79-44bed7981254 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:17:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:35.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3058422568' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.253 225859 DEBUG oslo_concurrency.processutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.260 225859 DEBUG nova.compute.provider_tree [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.285 225859 DEBUG nova.scheduler.client.report [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.336 225859 DEBUG oslo_concurrency.lockutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.382 225859 INFO nova.scheduler.client.report [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Deleted allocations for instance f1ded131-d9a3-4e93-ad99-53ee2695d5c8#033[00m
Jan 20 10:17:35 np0005588919 nova_compute[225855]: 2026-01-20 15:17:35.462 225859 DEBUG oslo_concurrency.lockutils [None req-8a54e417-8d3c-4bf9-889d-182b61853b5d e9cc4ce3e069479ba9c789b378a68a1d fff727019f86407498e83d7948d54962 - - default default] Lock "f1ded131-d9a3-4e93-ad99-53ee2695d5c8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.834s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:35.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:17:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1154417224' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:17:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:17:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1154417224' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:17:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 e415: 3 total, 3 up, 3 in
Jan 20 10:17:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:37.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:37 np0005588919 nova_compute[225855]: 2026-01-20 15:17:37.285 225859 INFO nova.compute.manager [None req-e8dfab0f-c800-4f16-a2ef-bbcce3513308 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Get console output#033[00m
Jan 20 10:17:37 np0005588919 nova_compute[225855]: 2026-01-20 15:17:37.291 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:17:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:37.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:37 np0005588919 nova_compute[225855]: 2026-01-20 15:17:37.899 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.022 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.332 225859 DEBUG nova.compute.manager [req-072b75bc-eee6-4b63-b059-6894187d9313 req-aad19ad7-7ce0-402e-b026-17f3b61374c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received event network-changed-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.333 225859 DEBUG nova.compute.manager [req-072b75bc-eee6-4b63-b059-6894187d9313 req-aad19ad7-7ce0-402e-b026-17f3b61374c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Refreshing instance network info cache due to event network-changed-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.333 225859 DEBUG oslo_concurrency.lockutils [req-072b75bc-eee6-4b63-b059-6894187d9313 req-aad19ad7-7ce0-402e-b026-17f3b61374c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.333 225859 DEBUG oslo_concurrency.lockutils [req-072b75bc-eee6-4b63-b059-6894187d9313 req-aad19ad7-7ce0-402e-b026-17f3b61374c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.334 225859 DEBUG nova.network.neutron [req-072b75bc-eee6-4b63-b059-6894187d9313 req-aad19ad7-7ce0-402e-b026-17f3b61374c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Refreshing network info cache for port d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:17:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.413 225859 DEBUG oslo_concurrency.lockutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.413 225859 DEBUG oslo_concurrency.lockutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.413 225859 DEBUG oslo_concurrency.lockutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.413 225859 DEBUG oslo_concurrency.lockutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.414 225859 DEBUG oslo_concurrency.lockutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.415 225859 INFO nova.compute.manager [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Terminating instance#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.416 225859 DEBUG nova.compute.manager [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:17:38 np0005588919 kernel: tapd3a1fab4-7d (unregistering): left promiscuous mode
Jan 20 10:17:38 np0005588919 NetworkManager[49104]: <info>  [1768922258.4858] device (tapd3a1fab4-7d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:17:38 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:38Z|00859|binding|INFO|Releasing lport d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 from this chassis (sb_readonly=0)
Jan 20 10:17:38 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:38Z|00860|binding|INFO|Setting lport d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 down in Southbound
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.491 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:38 np0005588919 ovn_controller[130490]: 2026-01-20T15:17:38Z|00861|binding|INFO|Removing iface tapd3a1fab4-7d ovn-installed in OVS
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.493 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.498 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:6b:ff 10.100.0.9'], port_security=['fa:16:3e:20:6b:ff 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4f38d24a-3458-4c59-8480-8094ffcbb5aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4512e7e3-1668-4e98-8240-843256180395', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a9f2f56b-e7c7-475c-b1af-94303aad79ef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0147113d-42da-4762-ae51-c60dbf8c0dd2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=d3a1fab4-7d4e-40cd-bdbb-b337196adbc5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.499 140354 INFO neutron.agent.ovn.metadata.agent [-] Port d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 in datapath 4512e7e3-1668-4e98-8240-843256180395 unbound from our chassis#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.500 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4512e7e3-1668-4e98-8240-843256180395, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.501 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a5997726-2c09-4d4a-9285-b7b32e0657f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.502 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4512e7e3-1668-4e98-8240-843256180395 namespace which is not needed anymore#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.512 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:38 np0005588919 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000bf.scope: Deactivated successfully.
Jan 20 10:17:38 np0005588919 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000bf.scope: Consumed 13.944s CPU time.
Jan 20 10:17:38 np0005588919 systemd-machined[194361]: Machine qemu-101-instance-000000bf terminated.
Jan 20 10:17:38 np0005588919 neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395[306389]: [NOTICE]   (306393) : haproxy version is 2.8.14-c23fe91
Jan 20 10:17:38 np0005588919 neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395[306389]: [NOTICE]   (306393) : path to executable is /usr/sbin/haproxy
Jan 20 10:17:38 np0005588919 neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395[306389]: [WARNING]  (306393) : Exiting Master process...
Jan 20 10:17:38 np0005588919 neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395[306389]: [ALERT]    (306393) : Current worker (306395) exited with code 143 (Terminated)
Jan 20 10:17:38 np0005588919 neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395[306389]: [WARNING]  (306393) : All workers exited. Exiting... (0)
Jan 20 10:17:38 np0005588919 systemd[1]: libpod-e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce.scope: Deactivated successfully.
Jan 20 10:17:38 np0005588919 podman[306661]: 2026-01-20 15:17:38.635754383 +0000 UTC m=+0.046683676 container died e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.651 225859 INFO nova.virt.libvirt.driver [-] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Instance destroyed successfully.#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.651 225859 DEBUG nova.objects.instance [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'resources' on Instance uuid 4f38d24a-3458-4c59-8480-8094ffcbb5aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:17:38 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce-userdata-shm.mount: Deactivated successfully.
Jan 20 10:17:38 np0005588919 systemd[1]: var-lib-containers-storage-overlay-e7810c804e6835b7fafc97e2298621921e0f31c1da0eaafddfa82d1449fccd1e-merged.mount: Deactivated successfully.
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.672 225859 DEBUG nova.virt.libvirt.vif [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:17:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2057447015',display_name='tempest-TestNetworkAdvancedServerOps-server-2057447015',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2057447015',id=191,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJPwC5MyMqyTrbrSBwBOpxBpSiLPbu1nzobGp6ktmxE+oIlgwGH9ZkqYZyAjLxwv50DDSq5iaSNQxNoKrNJWo+FdRObJJTJ5JQ9hbj5JsMfLfRRZUDmDAFS5rhSXxsMyYg==',key_name='tempest-TestNetworkAdvancedServerOps-386377958',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:17:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-foa05j4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:17:34Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=4f38d24a-3458-4c59-8480-8094ffcbb5aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.673 225859 DEBUG nova.network.os_vif_util [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.673 225859 DEBUG nova.network.os_vif_util [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:6b:ff,bridge_name='br-int',has_traffic_filtering=True,id=d3a1fab4-7d4e-40cd-bdbb-b337196adbc5,network=Network(4512e7e3-1668-4e98-8240-843256180395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a1fab4-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.674 225859 DEBUG os_vif [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:6b:ff,bridge_name='br-int',has_traffic_filtering=True,id=d3a1fab4-7d4e-40cd-bdbb-b337196adbc5,network=Network(4512e7e3-1668-4e98-8240-843256180395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a1fab4-7d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:17:38 np0005588919 podman[306661]: 2026-01-20 15:17:38.676120897 +0000 UTC m=+0.087050190 container cleanup e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.676 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.677 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3a1fab4-7d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.722 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:38 np0005588919 systemd[1]: libpod-conmon-e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce.scope: Deactivated successfully.
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.725 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.727 225859 INFO os_vif [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:6b:ff,bridge_name='br-int',has_traffic_filtering=True,id=d3a1fab4-7d4e-40cd-bdbb-b337196adbc5,network=Network(4512e7e3-1668-4e98-8240-843256180395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3a1fab4-7d')#033[00m
Jan 20 10:17:38 np0005588919 podman[306701]: 2026-01-20 15:17:38.73613803 +0000 UTC m=+0.040652214 container remove e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.738 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae5c318-1c66-4e48-9ce1-a5fac5791f65]: (4, ('Tue Jan 20 03:17:38 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395 (e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce)\ne60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce\nTue Jan 20 03:17:38 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4512e7e3-1668-4e98-8240-843256180395 (e60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce)\ne60f633dc8d25180e6187f9fd54256e0e27093030965e6a20f3d5598fbea10ce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.740 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[23075dba-bf29-4b74-8718-3d7fdf869613]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.741 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4512e7e3-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:17:38 np0005588919 kernel: tap4512e7e3-10: left promiscuous mode
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.746 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:38 np0005588919 nova_compute[225855]: 2026-01-20 15:17:38.757 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.760 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ed5d8bec-a388-4f43-be91-b8b51cfcbbee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.775 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ff35ea24-931d-4e20-a90b-4b36910c2bde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.776 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ac41e65e-5619-4ca8-bc86-57b07c42f708]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.792 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bbfabdc8-d2de-4794-a0be-e3802e131ef1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 726935, 'reachable_time': 34330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306734, 'error': None, 'target': 'ovnmeta-4512e7e3-1668-4e98-8240-843256180395', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:38 np0005588919 systemd[1]: run-netns-ovnmeta\x2d4512e7e3\x2d1668\x2d4e98\x2d8240\x2d843256180395.mount: Deactivated successfully.
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.794 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4512e7e3-1668-4e98-8240-843256180395 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:17:38 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:17:38.795 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6a8b0f-adb3-4b19-a5d3-870cfd83275a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:17:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:39.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:39 np0005588919 nova_compute[225855]: 2026-01-20 15:17:39.462 225859 INFO nova.virt.libvirt.driver [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Deleting instance files /var/lib/nova/instances/4f38d24a-3458-4c59-8480-8094ffcbb5aa_del#033[00m
Jan 20 10:17:39 np0005588919 nova_compute[225855]: 2026-01-20 15:17:39.463 225859 INFO nova.virt.libvirt.driver [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Deletion of /var/lib/nova/instances/4f38d24a-3458-4c59-8480-8094ffcbb5aa_del complete#033[00m
Jan 20 10:17:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:39.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:39 np0005588919 nova_compute[225855]: 2026-01-20 15:17:39.521 225859 INFO nova.compute.manager [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Took 1.11 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:17:39 np0005588919 nova_compute[225855]: 2026-01-20 15:17:39.522 225859 DEBUG oslo.service.loopingcall [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:17:39 np0005588919 nova_compute[225855]: 2026-01-20 15:17:39.522 225859 DEBUG nova.compute.manager [-] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:17:39 np0005588919 nova_compute[225855]: 2026-01-20 15:17:39.522 225859 DEBUG nova.network.neutron [-] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:17:39 np0005588919 nova_compute[225855]: 2026-01-20 15:17:39.728 225859 DEBUG nova.network.neutron [req-072b75bc-eee6-4b63-b059-6894187d9313 req-aad19ad7-7ce0-402e-b026-17f3b61374c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Updated VIF entry in instance network info cache for port d3a1fab4-7d4e-40cd-bdbb-b337196adbc5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:17:39 np0005588919 nova_compute[225855]: 2026-01-20 15:17:39.729 225859 DEBUG nova.network.neutron [req-072b75bc-eee6-4b63-b059-6894187d9313 req-aad19ad7-7ce0-402e-b026-17f3b61374c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Updating instance_info_cache with network_info: [{"id": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "address": "fa:16:3e:20:6b:ff", "network": {"id": "4512e7e3-1668-4e98-8240-843256180395", "bridge": "br-int", "label": "tempest-network-smoke--434600618", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3a1fab4-7d", "ovs_interfaceid": "d3a1fab4-7d4e-40cd-bdbb-b337196adbc5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:39 np0005588919 nova_compute[225855]: 2026-01-20 15:17:39.751 225859 DEBUG oslo_concurrency.lockutils [req-072b75bc-eee6-4b63-b059-6894187d9313 req-aad19ad7-7ce0-402e-b026-17f3b61374c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-4f38d24a-3458-4c59-8480-8094ffcbb5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.062 225859 DEBUG nova.network.neutron [-] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.084 225859 INFO nova.compute.manager [-] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Took 0.56 seconds to deallocate network for instance.#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.156 225859 DEBUG oslo_concurrency.lockutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.156 225859 DEBUG oslo_concurrency.lockutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.200 225859 DEBUG oslo_concurrency.processutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.565 225859 DEBUG nova.compute.manager [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received event network-vif-unplugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.566 225859 DEBUG oslo_concurrency.lockutils [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.566 225859 DEBUG oslo_concurrency.lockutils [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.566 225859 DEBUG oslo_concurrency.lockutils [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.566 225859 DEBUG nova.compute.manager [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] No waiting events found dispatching network-vif-unplugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.566 225859 WARNING nova.compute.manager [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received unexpected event network-vif-unplugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.567 225859 DEBUG nova.compute.manager [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received event network-vif-plugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.567 225859 DEBUG oslo_concurrency.lockutils [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.567 225859 DEBUG oslo_concurrency.lockutils [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.567 225859 DEBUG oslo_concurrency.lockutils [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.567 225859 DEBUG nova.compute.manager [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] No waiting events found dispatching network-vif-plugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.567 225859 WARNING nova.compute.manager [req-8c67f0ba-14cf-46e8-8818-c41feb18a543 req-cf345f5f-0ad5-4654-877e-3240e6bcab88 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received unexpected event network-vif-plugged-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:17:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:17:40 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3071866721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.676 225859 DEBUG oslo_concurrency.processutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.684 225859 DEBUG nova.compute.provider_tree [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.706 225859 DEBUG nova.scheduler.client.report [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.729 225859 DEBUG oslo_concurrency.lockutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.930 225859 INFO nova.scheduler.client.report [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Deleted allocations for instance 4f38d24a-3458-4c59-8480-8094ffcbb5aa#033[00m
Jan 20 10:17:40 np0005588919 nova_compute[225855]: 2026-01-20 15:17:40.997 225859 DEBUG oslo_concurrency.lockutils [None req-e22fce45-8ff1-4e21-ae11-6ec524f9b68d 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4f38d24a-3458-4c59-8480-8094ffcbb5aa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:17:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:41.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:41.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:41 np0005588919 nova_compute[225855]: 2026-01-20 15:17:41.770 225859 DEBUG nova.compute.manager [req-03de410d-9cad-41ed-b307-4538a94ce33d req-069102a6-e658-4613-a128-87e1a562f364 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Received event network-vif-deleted-d3a1fab4-7d4e-40cd-bdbb-b337196adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:17:43 np0005588919 nova_compute[225855]: 2026-01-20 15:17:43.024 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:43.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:43.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:43 np0005588919 nova_compute[225855]: 2026-01-20 15:17:43.722 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:45.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:45.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:46 np0005588919 nova_compute[225855]: 2026-01-20 15:17:46.980 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:47 np0005588919 podman[306813]: 2026-01-20 15:17:47.103962794 +0000 UTC m=+0.147610019 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:17:47 np0005588919 nova_compute[225855]: 2026-01-20 15:17:47.105 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:47.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:47.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:47 np0005588919 nova_compute[225855]: 2026-01-20 15:17:47.863 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922252.8621945, f1ded131-d9a3-4e93-ad99-53ee2695d5c8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:47 np0005588919 nova_compute[225855]: 2026-01-20 15:17:47.864 225859 INFO nova.compute.manager [-] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:17:47 np0005588919 nova_compute[225855]: 2026-01-20 15:17:47.895 225859 DEBUG nova.compute.manager [None req-6b3f2232-0298-4c86-98cf-d5f63d88e1a3 - - - - - -] [instance: f1ded131-d9a3-4e93-ad99-53ee2695d5c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:48 np0005588919 nova_compute[225855]: 2026-01-20 15:17:48.025 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:48 np0005588919 nova_compute[225855]: 2026-01-20 15:17:48.725 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:49.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:49.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:51.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:51.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:53 np0005588919 nova_compute[225855]: 2026-01-20 15:17:53.029 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:53.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:53.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:53 np0005588919 nova_compute[225855]: 2026-01-20 15:17:53.649 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922258.648655, 4f38d24a-3458-4c59-8480-8094ffcbb5aa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:17:53 np0005588919 nova_compute[225855]: 2026-01-20 15:17:53.650 225859 INFO nova.compute.manager [-] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:17:53 np0005588919 nova_compute[225855]: 2026-01-20 15:17:53.667 225859 DEBUG nova.compute.manager [None req-5d729218-ed0d-4f2d-993e-61f024a9f9fa - - - - - -] [instance: 4f38d24a-3458-4c59-8480-8094ffcbb5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:17:53 np0005588919 nova_compute[225855]: 2026-01-20 15:17:53.748 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:17:54 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1456547554' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:17:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:17:54 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1456547554' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:17:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:17:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.0 total, 600.0 interval#012Cumulative writes: 13K writes, 69K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.13 GB, 0.03 MB/s#012Cumulative WAL: 13K writes, 13K syncs, 1.00 writes per sync, written: 0.13 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1715 writes, 8442 keys, 1715 commit groups, 1.0 writes per commit group, ingest: 16.52 MB, 0.03 MB/s#012Interval WAL: 1715 writes, 1715 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     79.4      1.05              0.28        43    0.024       0      0       0.0       0.0#012  L6      1/0   11.13 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   4.9    104.2     88.6      4.65              1.34        42    0.111    293K    22K       0.0       0.0#012 Sum      1/0   11.13 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   5.9     84.9     86.9      5.70              1.62        85    0.067    293K    22K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.6     98.5     98.6      0.79              0.26        12    0.066     56K   3149       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0    104.2     88.6      4.65              1.34        42    0.111    293K    22K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     79.6      1.05              0.28        42    0.025       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4800.0 total, 600.0 interval#012Flush(GB): cumulative 0.082, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.48 GB write, 0.10 MB/s write, 0.47 GB read, 0.10 MB/s read, 5.7 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 304.00 MB usage: 54.47 MB table_size: 0 occupancy: 18446744073709551615 collections: 9 last_copies: 0 last_secs: 0.000531 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3135,52.31 MB,17.2083%) FilterBlock(85,821.23 KB,0.263811%) IndexBlock(85,1.35 MB,0.444026%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 10:17:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:55.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:55.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:56 np0005588919 nova_compute[225855]: 2026-01-20 15:17:56.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:17:56 np0005588919 nova_compute[225855]: 2026-01-20 15:17:56.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:17:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:57.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:17:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:17:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:57.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:17:58 np0005588919 nova_compute[225855]: 2026-01-20 15:17:58.032 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:17:58 np0005588919 nova_compute[225855]: 2026-01-20 15:17:58.749 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:17:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:17:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:17:59.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:17:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:17:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:17:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:17:59.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:00 np0005588919 nova_compute[225855]: 2026-01-20 15:18:00.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:00 np0005588919 nova_compute[225855]: 2026-01-20 15:18:00.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:18:00 np0005588919 nova_compute[225855]: 2026-01-20 15:18:00.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:18:00 np0005588919 nova_compute[225855]: 2026-01-20 15:18:00.508 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:18:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:01.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:18:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:01.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:18:02 np0005588919 nova_compute[225855]: 2026-01-20 15:18:02.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:02 np0005588919 nova_compute[225855]: 2026-01-20 15:18:02.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:02 np0005588919 podman[306869]: 2026-01-20 15:18:02.639908503 +0000 UTC m=+0.055378382 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:18:03 np0005588919 nova_compute[225855]: 2026-01-20 15:18:03.032 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:18:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:03.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:18:03 np0005588919 nova_compute[225855]: 2026-01-20 15:18:03.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:03.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:03 np0005588919 nova_compute[225855]: 2026-01-20 15:18:03.751 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.376 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.376 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:18:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:18:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:18:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:18:04 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/593386575' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.841 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.992 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.994 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4249MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.994 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:04 np0005588919 nova_compute[225855]: 2026-01-20 15:18:04.994 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:05 np0005588919 nova_compute[225855]: 2026-01-20 15:18:05.050 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:18:05 np0005588919 nova_compute[225855]: 2026-01-20 15:18:05.050 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:18:05 np0005588919 nova_compute[225855]: 2026-01-20 15:18:05.069 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:05.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:05.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:18:05 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3326096982' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:18:05 np0005588919 nova_compute[225855]: 2026-01-20 15:18:05.582 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:05 np0005588919 nova_compute[225855]: 2026-01-20 15:18:05.588 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:18:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:07.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:07.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:07 np0005588919 nova_compute[225855]: 2026-01-20 15:18:07.571 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:18:07 np0005588919 nova_compute[225855]: 2026-01-20 15:18:07.617 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:18:07 np0005588919 nova_compute[225855]: 2026-01-20 15:18:07.618 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:08 np0005588919 nova_compute[225855]: 2026-01-20 15:18:08.034 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:08 np0005588919 nova_compute[225855]: 2026-01-20 15:18:08.788 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:09.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:09.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:09 np0005588919 nova_compute[225855]: 2026-01-20 15:18:09.618 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:09 np0005588919 nova_compute[225855]: 2026-01-20 15:18:09.619 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:10 np0005588919 nova_compute[225855]: 2026-01-20 15:18:10.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:18:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:18:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:11.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:11.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:13 np0005588919 nova_compute[225855]: 2026-01-20 15:18:13.035 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:13.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:13.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:13 np0005588919 nova_compute[225855]: 2026-01-20 15:18:13.789 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:15.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:15.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:16.439 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:16.440 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:16.440 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:17.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:17.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:18 np0005588919 nova_compute[225855]: 2026-01-20 15:18:18.036 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:18 np0005588919 podman[307147]: 2026-01-20 15:18:18.040717497 +0000 UTC m=+0.088863021 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 20 10:18:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:18 np0005588919 nova_compute[225855]: 2026-01-20 15:18:18.791 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:19.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:19.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:21.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:21.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:23 np0005588919 nova_compute[225855]: 2026-01-20 15:18:23.038 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:23.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:23.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:23 np0005588919 nova_compute[225855]: 2026-01-20 15:18:23.793 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:25.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:25.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:27.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:27.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:28 np0005588919 nova_compute[225855]: 2026-01-20 15:18:28.040 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:28 np0005588919 nova_compute[225855]: 2026-01-20 15:18:28.796 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:29 np0005588919 ovn_controller[130490]: 2026-01-20T15:18:29Z|00862|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 20 10:18:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:29.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:29.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:18:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:31.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:18:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:31.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #139. Immutable memtables: 0.
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:31.911669) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 87] Flushing memtable with next log file: 139
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922311911771, "job": 87, "event": "flush_started", "num_memtables": 1, "num_entries": 1434, "num_deletes": 258, "total_data_size": 3019222, "memory_usage": 3059816, "flush_reason": "Manual Compaction"}
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 87] Level-0 flush table #140: started
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922311923473, "cf_name": "default", "job": 87, "event": "table_file_creation", "file_number": 140, "file_size": 1979813, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 68816, "largest_seqno": 70244, "table_properties": {"data_size": 1973758, "index_size": 3257, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13517, "raw_average_key_size": 20, "raw_value_size": 1961240, "raw_average_value_size": 2905, "num_data_blocks": 144, "num_entries": 675, "num_filter_entries": 675, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922204, "oldest_key_time": 1768922204, "file_creation_time": 1768922311, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 140, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 87] Flush lasted 11835 microseconds, and 4697 cpu microseconds.
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:31.923503) [db/flush_job.cc:967] [default] [JOB 87] Level-0 flush table #140: 1979813 bytes OK
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:31.923519) [db/memtable_list.cc:519] [default] Level-0 commit table #140 started
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:31.924423) [db/memtable_list.cc:722] [default] Level-0 commit table #140: memtable #1 done
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:31.924434) EVENT_LOG_v1 {"time_micros": 1768922311924430, "job": 87, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:31.924450) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 87] Try to delete WAL files size 3012430, prev total WAL file size 3012430, number of live WAL files 2.
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000136.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:31.925191) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032353230' seq:72057594037927935, type:22 .. '6C6F676D0032373732' seq:0, type:0; will stop at (end)
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 88] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 87 Base level 0, inputs: [140(1933KB)], [138(11MB)]
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922311925221, "job": 88, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [140], "files_L6": [138], "score": -1, "input_data_size": 13653042, "oldest_snapshot_seqno": -1}
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 88] Generated table #141: 9397 keys, 13514620 bytes, temperature: kUnknown
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922311995958, "cf_name": "default", "job": 88, "event": "table_file_creation", "file_number": 141, "file_size": 13514620, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13451444, "index_size": 38551, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23557, "raw_key_size": 247143, "raw_average_key_size": 26, "raw_value_size": 13284059, "raw_average_value_size": 1413, "num_data_blocks": 1478, "num_entries": 9397, "num_filter_entries": 9397, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768922311, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 141, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:18:31 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:31.996227) [db/compaction/compaction_job.cc:1663] [default] [JOB 88] Compacted 1@0 + 1@6 files to L6 => 13514620 bytes
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:32.002105) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 192.8 rd, 190.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.1 +0.0 blob) out(12.9 +0.0 blob), read-write-amplify(13.7) write-amplify(6.8) OK, records in: 9930, records dropped: 533 output_compression: NoCompression
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:32.002145) EVENT_LOG_v1 {"time_micros": 1768922312002132, "job": 88, "event": "compaction_finished", "compaction_time_micros": 70816, "compaction_time_cpu_micros": 30981, "output_level": 6, "num_output_files": 1, "total_output_size": 13514620, "num_input_records": 9930, "num_output_records": 9397, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000140.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922312002780, "job": 88, "event": "table_file_deletion", "file_number": 140}
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000138.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922312005203, "job": 88, "event": "table_file_deletion", "file_number": 138}
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:31.925123) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:32.005233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:32.005237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:32.005238) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:32.005240) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:32 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:18:32.005241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:18:33 np0005588919 podman[307230]: 2026-01-20 15:18:33.003636927 +0000 UTC m=+0.051494672 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:18:33 np0005588919 nova_compute[225855]: 2026-01-20 15:18:33.042 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:33.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:33 np0005588919 nova_compute[225855]: 2026-01-20 15:18:33.416 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:33 np0005588919 nova_compute[225855]: 2026-01-20 15:18:33.417 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:33 np0005588919 nova_compute[225855]: 2026-01-20 15:18:33.436 225859 DEBUG nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:18:33 np0005588919 nova_compute[225855]: 2026-01-20 15:18:33.512 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:33 np0005588919 nova_compute[225855]: 2026-01-20 15:18:33.513 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:33 np0005588919 nova_compute[225855]: 2026-01-20 15:18:33.533 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:18:33 np0005588919 nova_compute[225855]: 2026-01-20 15:18:33.534 225859 INFO nova.compute.claims [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:18:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:33.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:33 np0005588919 nova_compute[225855]: 2026-01-20 15:18:33.670 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:33 np0005588919 nova_compute[225855]: 2026-01-20 15:18:33.797 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:18:34 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2134992538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.096 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.101 225859 DEBUG nova.compute.provider_tree [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.126 225859 DEBUG nova.scheduler.client.report [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.147 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.148 225859 DEBUG nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.218 225859 DEBUG nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.219 225859 DEBUG nova.network.neutron [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.268 225859 INFO nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.289 225859 DEBUG nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.434 225859 DEBUG nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.436 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.436 225859 INFO nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Creating image(s)#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.731 225859 DEBUG nova.storage.rbd_utils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.758 225859 DEBUG nova.storage.rbd_utils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.786 225859 DEBUG nova.storage.rbd_utils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.789 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.821 225859 DEBUG nova.policy [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd9a8f26b71f4631a387e555e6b18428', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9156c0a9920c4721843416b9a44404f9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.857 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.858 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.859 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.859 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:34 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.887 225859 DEBUG nova.storage.rbd_utils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:18:35 np0005588919 nova_compute[225855]: 2026-01-20 15:18:34.999 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:35 np0005588919 nova_compute[225855]: 2026-01-20 15:18:35.261 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.262s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:35.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:35 np0005588919 nova_compute[225855]: 2026-01-20 15:18:35.316 225859 DEBUG nova.storage.rbd_utils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] resizing rbd image 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:18:35 np0005588919 nova_compute[225855]: 2026-01-20 15:18:35.395 225859 DEBUG nova.objects.instance [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'migration_context' on Instance uuid 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:18:35 np0005588919 nova_compute[225855]: 2026-01-20 15:18:35.419 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:18:35 np0005588919 nova_compute[225855]: 2026-01-20 15:18:35.419 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Ensure instance console log exists: /var/lib/nova/instances/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:18:35 np0005588919 nova_compute[225855]: 2026-01-20 15:18:35.420 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:35 np0005588919 nova_compute[225855]: 2026-01-20 15:18:35.420 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:35 np0005588919 nova_compute[225855]: 2026-01-20 15:18:35.420 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:35.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:36.916 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:18:36 np0005588919 nova_compute[225855]: 2026-01-20 15:18:36.916 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:36.917 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:18:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:37.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:37.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:37 np0005588919 nova_compute[225855]: 2026-01-20 15:18:37.886 225859 DEBUG nova.network.neutron [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Successfully created port: bd99d3a5-54e0-4e70-9a02-3543631281a6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:18:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:37.919 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:38 np0005588919 nova_compute[225855]: 2026-01-20 15:18:38.044 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:38 np0005588919 nova_compute[225855]: 2026-01-20 15:18:38.799 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:39.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:39.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:39 np0005588919 nova_compute[225855]: 2026-01-20 15:18:39.591 225859 DEBUG nova.network.neutron [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Successfully updated port: bd99d3a5-54e0-4e70-9a02-3543631281a6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:18:39 np0005588919 nova_compute[225855]: 2026-01-20 15:18:39.615 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:18:39 np0005588919 nova_compute[225855]: 2026-01-20 15:18:39.615 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquired lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:18:39 np0005588919 nova_compute[225855]: 2026-01-20 15:18:39.615 225859 DEBUG nova.network.neutron [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:18:39 np0005588919 nova_compute[225855]: 2026-01-20 15:18:39.705 225859 DEBUG nova.compute.manager [req-51605533-b7f0-4242-9d13-054db74dd614 req-24ae14e8-fdfd-494a-9899-0c18fd537a5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Received event network-changed-bd99d3a5-54e0-4e70-9a02-3543631281a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:39 np0005588919 nova_compute[225855]: 2026-01-20 15:18:39.706 225859 DEBUG nova.compute.manager [req-51605533-b7f0-4242-9d13-054db74dd614 req-24ae14e8-fdfd-494a-9899-0c18fd537a5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Refreshing instance network info cache due to event network-changed-bd99d3a5-54e0-4e70-9a02-3543631281a6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:18:39 np0005588919 nova_compute[225855]: 2026-01-20 15:18:39.706 225859 DEBUG oslo_concurrency.lockutils [req-51605533-b7f0-4242-9d13-054db74dd614 req-24ae14e8-fdfd-494a-9899-0c18fd537a5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:18:39 np0005588919 nova_compute[225855]: 2026-01-20 15:18:39.928 225859 DEBUG nova.network.neutron [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.123 225859 DEBUG nova.network.neutron [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updating instance_info_cache with network_info: [{"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.153 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Releasing lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.153 225859 DEBUG nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Instance network_info: |[{"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.153 225859 DEBUG oslo_concurrency.lockutils [req-51605533-b7f0-4242-9d13-054db74dd614 req-24ae14e8-fdfd-494a-9899-0c18fd537a5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.154 225859 DEBUG nova.network.neutron [req-51605533-b7f0-4242-9d13-054db74dd614 req-24ae14e8-fdfd-494a-9899-0c18fd537a5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Refreshing network info cache for port bd99d3a5-54e0-4e70-9a02-3543631281a6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.156 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Start _get_guest_xml network_info=[{"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.161 225859 WARNING nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.166 225859 DEBUG nova.virt.libvirt.host [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.166 225859 DEBUG nova.virt.libvirt.host [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.169 225859 DEBUG nova.virt.libvirt.host [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.170 225859 DEBUG nova.virt.libvirt.host [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.171 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.172 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.172 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.172 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.172 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.173 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.173 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.173 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.173 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.174 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.174 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.174 225859 DEBUG nova.virt.hardware [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.177 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:41.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:41.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:18:41 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2986932180' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.591 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.618 225859 DEBUG nova.storage.rbd_utils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:18:41 np0005588919 nova_compute[225855]: 2026-01-20 15:18:41.623 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:18:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2338561239' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.050 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.052 225859 DEBUG nova.virt.libvirt.vif [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:18:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1816292644',display_name='tempest-AttachVolumeNegativeTest-server-1816292644',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1816292644',id=194,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGw8qLSULQliDvwKbeTpbt4J6LHSzE0FcgbAkXI3mB449DNtZV5vtYZtKqW3qflHvvMvcmL7nd1rBiXHEUgRgW71fE/QnzR597lXioriSvOlFWdxXwkMYduhCAWqw/sG6A==',key_name='tempest-keypair-235858913',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9156c0a9920c4721843416b9a44404f9',ramdisk_id='',reservation_id='r-acjnzwzw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1505789262',owner_user_name='tempest-AttachVolumeNegativeTest-1505789262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:18:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd9a8f26b71f4631a387e555e6b18428',uuid=2b31f3d7-81bd-4712-bcb1-98afd2dc0f44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.052 225859 DEBUG nova.network.os_vif_util [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converting VIF {"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.053 225859 DEBUG nova.network.os_vif_util [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:23:fd,bridge_name='br-int',has_traffic_filtering=True,id=bd99d3a5-54e0-4e70-9a02-3543631281a6,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd99d3a5-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.054 225859 DEBUG nova.objects.instance [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.072 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <uuid>2b31f3d7-81bd-4712-bcb1-98afd2dc0f44</uuid>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <name>instance-000000c2</name>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <nova:name>tempest-AttachVolumeNegativeTest-server-1816292644</nova:name>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:18:41</nova:creationTime>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <nova:user uuid="cd9a8f26b71f4631a387e555e6b18428">tempest-AttachVolumeNegativeTest-1505789262-project-member</nova:user>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <nova:project uuid="9156c0a9920c4721843416b9a44404f9">tempest-AttachVolumeNegativeTest-1505789262</nova:project>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <nova:port uuid="bd99d3a5-54e0-4e70-9a02-3543631281a6">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <entry name="serial">2b31f3d7-81bd-4712-bcb1-98afd2dc0f44</entry>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <entry name="uuid">2b31f3d7-81bd-4712-bcb1-98afd2dc0f44</entry>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk.config">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:44:23:fd"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <target dev="tapbd99d3a5-54"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44/console.log" append="off"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:18:42 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:18:42 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:18:42 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:18:42 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.074 225859 DEBUG nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Preparing to wait for external event network-vif-plugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.074 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.074 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.074 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.075 225859 DEBUG nova.virt.libvirt.vif [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:18:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1816292644',display_name='tempest-AttachVolumeNegativeTest-server-1816292644',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1816292644',id=194,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGw8qLSULQliDvwKbeTpbt4J6LHSzE0FcgbAkXI3mB449DNtZV5vtYZtKqW3qflHvvMvcmL7nd1rBiXHEUgRgW71fE/QnzR597lXioriSvOlFWdxXwkMYduhCAWqw/sG6A==',key_name='tempest-keypair-235858913',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9156c0a9920c4721843416b9a44404f9',ramdisk_id='',reservation_id='r-acjnzwzw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1505789262',owner_user_name='tempest-AttachVolumeNegativeTest-1505789262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:18:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd9a8f26b71f4631a387e555e6b18428',uuid=2b31f3d7-81bd-4712-bcb1-98afd2dc0f44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.075 225859 DEBUG nova.network.os_vif_util [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converting VIF {"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.076 225859 DEBUG nova.network.os_vif_util [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:23:fd,bridge_name='br-int',has_traffic_filtering=True,id=bd99d3a5-54e0-4e70-9a02-3543631281a6,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd99d3a5-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.076 225859 DEBUG os_vif [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:23:fd,bridge_name='br-int',has_traffic_filtering=True,id=bd99d3a5-54e0-4e70-9a02-3543631281a6,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd99d3a5-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.076 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.077 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.077 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.080 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.080 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd99d3a5-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.081 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbd99d3a5-54, col_values=(('external_ids', {'iface-id': 'bd99d3a5-54e0-4e70-9a02-3543631281a6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:23:fd', 'vm-uuid': '2b31f3d7-81bd-4712-bcb1-98afd2dc0f44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.082 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:42 np0005588919 NetworkManager[49104]: <info>  [1768922322.0850] manager: (tapbd99d3a5-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.085 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.091 225859 INFO os_vif [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:23:fd,bridge_name='br-int',has_traffic_filtering=True,id=bd99d3a5-54e0-4e70-9a02-3543631281a6,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd99d3a5-54')#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.153 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.153 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.153 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No VIF found with MAC fa:16:3e:44:23:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.154 225859 INFO nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Using config drive#033[00m
Jan 20 10:18:42 np0005588919 nova_compute[225855]: 2026-01-20 15:18:42.182 225859 DEBUG nova.storage.rbd_utils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:18:43 np0005588919 nova_compute[225855]: 2026-01-20 15:18:43.046 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:43.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:43 np0005588919 nova_compute[225855]: 2026-01-20 15:18:43.564 225859 INFO nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Creating config drive at /var/lib/nova/instances/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44/disk.config#033[00m
Jan 20 10:18:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:43.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:43 np0005588919 nova_compute[225855]: 2026-01-20 15:18:43.569 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp9vnxc9g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:43 np0005588919 nova_compute[225855]: 2026-01-20 15:18:43.708 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp9vnxc9g" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:43 np0005588919 nova_compute[225855]: 2026-01-20 15:18:43.736 225859 DEBUG nova.storage.rbd_utils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:18:43 np0005588919 nova_compute[225855]: 2026-01-20 15:18:43.740 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44/disk.config 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:18:43 np0005588919 nova_compute[225855]: 2026-01-20 15:18:43.913 225859 DEBUG oslo_concurrency.processutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44/disk.config 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.174s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:18:43 np0005588919 nova_compute[225855]: 2026-01-20 15:18:43.914 225859 INFO nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Deleting local config drive /var/lib/nova/instances/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44/disk.config because it was imported into RBD.#033[00m
Jan 20 10:18:43 np0005588919 kernel: tapbd99d3a5-54: entered promiscuous mode
Jan 20 10:18:43 np0005588919 NetworkManager[49104]: <info>  [1768922323.9634] manager: (tapbd99d3a5-54): new Tun device (/org/freedesktop/NetworkManager/Devices/359)
Jan 20 10:18:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:18:43Z|00863|binding|INFO|Claiming lport bd99d3a5-54e0-4e70-9a02-3543631281a6 for this chassis.
Jan 20 10:18:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:18:43Z|00864|binding|INFO|bd99d3a5-54e0-4e70-9a02-3543631281a6: Claiming fa:16:3e:44:23:fd 10.100.0.13
Jan 20 10:18:43 np0005588919 nova_compute[225855]: 2026-01-20 15:18:43.964 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588919 nova_compute[225855]: 2026-01-20 15:18:43.969 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:43.980 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:23:fd 10.100.0.13'], port_security=['fa:16:3e:44:23:fd 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '2b31f3d7-81bd-4712-bcb1-98afd2dc0f44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9156c0a9920c4721843416b9a44404f9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd945b282-623c-4da9-a940-ac04c971b57b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bfc4e2a-eeed-480e-aa18-68fc6c8f2cc2, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=bd99d3a5-54e0-4e70-9a02-3543631281a6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:18:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:43.981 140354 INFO neutron.agent.ovn.metadata.agent [-] Port bd99d3a5-54e0-4e70-9a02-3543631281a6 in datapath 76c2d716-7d14-4bc1-b83b-a3290ee99d9a bound to our chassis#033[00m
Jan 20 10:18:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:43.982 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 76c2d716-7d14-4bc1-b83b-a3290ee99d9a#033[00m
Jan 20 10:18:43 np0005588919 systemd-machined[194361]: New machine qemu-102-instance-000000c2.
Jan 20 10:18:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:43.994 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5b08f621-da96-4265-a683-a0bc73825283]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:43.995 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap76c2d716-71 in ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:18:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:43.996 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap76c2d716-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:18:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:43.996 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[faecfa86-f32b-4e39-b900-8ee036c9c549]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:43.997 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb1e272-1c56-4123-8dca-da3dc141cd39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.008 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[e243bb0f-26dc-4fde-8041-26c4003f91fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.028 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588919 systemd[1]: Started Virtual Machine qemu-102-instance-000000c2.
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.033 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.034 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d33b9485-dfbf-4cae-abfd-04c8305ef411]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:18:44Z|00865|binding|INFO|Setting lport bd99d3a5-54e0-4e70-9a02-3543631281a6 ovn-installed in OVS
Jan 20 10:18:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:18:44Z|00866|binding|INFO|Setting lport bd99d3a5-54e0-4e70-9a02-3543631281a6 up in Southbound
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.039 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588919 systemd-udevd[307631]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:18:44 np0005588919 NetworkManager[49104]: <info>  [1768922324.0543] device (tapbd99d3a5-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:18:44 np0005588919 NetworkManager[49104]: <info>  [1768922324.0551] device (tapbd99d3a5-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.065 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5e96c39b-5fff-4ec0-b163-54b1370a2a65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.070 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc456f5-0cdd-4d90-9822-8bd33c59e038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 NetworkManager[49104]: <info>  [1768922324.0714] manager: (tap76c2d716-70): new Veth device (/org/freedesktop/NetworkManager/Devices/360)
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.102 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[2c807fa5-1e4c-472f-a8c6-c887f38df508]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.105 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d4ed49-ee05-4850-ae4e-b2c2665322ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 NetworkManager[49104]: <info>  [1768922324.1247] device (tap76c2d716-70): carrier: link connected
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.129 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ea39e7-b3df-4bde-b60c-d1d4ac562725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.145 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4c4f9515-9fc7-41fb-a205-7659e1a579a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76c2d716-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:44:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 245], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736305, 'reachable_time': 19770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307661, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.164 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e789b55f-105a-4590-9c88-adbfc6c3ba47]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:44ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736305, 'tstamp': 736305}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 307662, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.179 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d125f67e-d8f1-429f-a105-53927c91b7dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76c2d716-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:44:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 245], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736305, 'reachable_time': 19770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 307663, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.217 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[94834a92-1cc9-45fd-bddb-2502b491ab58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.272 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd648a2-7d3a-4c04-bf01-6f8e093b5afc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.273 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76c2d716-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.273 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.274 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap76c2d716-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.275 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588919 NetworkManager[49104]: <info>  [1768922324.2761] manager: (tap76c2d716-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Jan 20 10:18:44 np0005588919 kernel: tap76c2d716-70: entered promiscuous mode
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.277 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.278 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap76c2d716-70, col_values=(('external_ids', {'iface-id': '2c0bba0e-e9b6-4ece-8349-62642b94d91d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.279 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:18:44Z|00867|binding|INFO|Releasing lport 2c0bba0e-e9b6-4ece-8349-62642b94d91d from this chassis (sb_readonly=0)
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.295 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.296 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.297 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[57704dc8-1030-488e-874f-5829e3d2407a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.298 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-76c2d716-7d14-4bc1-b83b-a3290ee99d9a
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.pid.haproxy
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 76c2d716-7d14-4bc1-b83b-a3290ee99d9a
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:18:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:18:44.300 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'env', 'PROCESS_TAG=haproxy-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.369 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922324.368992, 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.369 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] VM Started (Lifecycle Event)#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.466 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.471 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922324.3691351, 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.471 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.538 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.542 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:18:44 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.586 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:18:44 np0005588919 podman[307737]: 2026-01-20 15:18:44.63535801 +0000 UTC m=+0.050022173 container create 5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 10:18:44 np0005588919 systemd[1]: Started libpod-conmon-5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7.scope.
Jan 20 10:18:44 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:18:44 np0005588919 podman[307737]: 2026-01-20 15:18:44.610623411 +0000 UTC m=+0.025287374 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:18:44 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c7b4fcd771e27f8b153744804f05a30b7615c256cc9d63d2f67498c5cbee4f4e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:18:44 np0005588919 podman[307737]: 2026-01-20 15:18:44.720240615 +0000 UTC m=+0.134904548 container init 5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:18:44 np0005588919 podman[307737]: 2026-01-20 15:18:44.725963726 +0000 UTC m=+0.140627659 container start 5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:18:44 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[307752]: [NOTICE]   (307756) : New worker (307758) forked
Jan 20 10:18:44 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[307752]: [NOTICE]   (307756) : Loading success.
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.999 225859 DEBUG nova.compute.manager [req-dcea4326-d029-43cc-a973-ac713a700454 req-e0d8c5ff-8711-4796-b8b1-c559c8c0e4ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Received event network-vif-plugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:44.999 225859 DEBUG oslo_concurrency.lockutils [req-dcea4326-d029-43cc-a973-ac713a700454 req-e0d8c5ff-8711-4796-b8b1-c559c8c0e4ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.000 225859 DEBUG oslo_concurrency.lockutils [req-dcea4326-d029-43cc-a973-ac713a700454 req-e0d8c5ff-8711-4796-b8b1-c559c8c0e4ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.000 225859 DEBUG oslo_concurrency.lockutils [req-dcea4326-d029-43cc-a973-ac713a700454 req-e0d8c5ff-8711-4796-b8b1-c559c8c0e4ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.000 225859 DEBUG nova.compute.manager [req-dcea4326-d029-43cc-a973-ac713a700454 req-e0d8c5ff-8711-4796-b8b1-c559c8c0e4ef 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Processing event network-vif-plugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.001 225859 DEBUG nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.004 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922325.00469, 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.005 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.006 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.010 225859 INFO nova.virt.libvirt.driver [-] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Instance spawned successfully.#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.010 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.036 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.038 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.038 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.038 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.039 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.039 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.040 225859 DEBUG nova.virt.libvirt.driver [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.045 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.087 225859 DEBUG nova.network.neutron [req-51605533-b7f0-4242-9d13-054db74dd614 req-24ae14e8-fdfd-494a-9899-0c18fd537a5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updated VIF entry in instance network info cache for port bd99d3a5-54e0-4e70-9a02-3543631281a6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.087 225859 DEBUG nova.network.neutron [req-51605533-b7f0-4242-9d13-054db74dd614 req-24ae14e8-fdfd-494a-9899-0c18fd537a5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updating instance_info_cache with network_info: [{"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.123 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.169 225859 DEBUG oslo_concurrency.lockutils [req-51605533-b7f0-4242-9d13-054db74dd614 req-24ae14e8-fdfd-494a-9899-0c18fd537a5c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.198 225859 INFO nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Took 10.76 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.199 225859 DEBUG nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.277 225859 INFO nova.compute.manager [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Took 11.79 seconds to build instance.#033[00m
Jan 20 10:18:45 np0005588919 nova_compute[225855]: 2026-01-20 15:18:45.309 225859 DEBUG oslo_concurrency.lockutils [None req-bf228ae6-1931-4ebb-8ef0-3778a1908505 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:18:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:45.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:18:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:45.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:47 np0005588919 nova_compute[225855]: 2026-01-20 15:18:47.082 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:47 np0005588919 nova_compute[225855]: 2026-01-20 15:18:47.144 225859 DEBUG nova.compute.manager [req-3f702505-207e-4d05-ab7c-5ef6075e758c req-e77c0c8a-a2a5-42ed-bc11-a5836889e298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Received event network-vif-plugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:47 np0005588919 nova_compute[225855]: 2026-01-20 15:18:47.145 225859 DEBUG oslo_concurrency.lockutils [req-3f702505-207e-4d05-ab7c-5ef6075e758c req-e77c0c8a-a2a5-42ed-bc11-a5836889e298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:18:47 np0005588919 nova_compute[225855]: 2026-01-20 15:18:47.145 225859 DEBUG oslo_concurrency.lockutils [req-3f702505-207e-4d05-ab7c-5ef6075e758c req-e77c0c8a-a2a5-42ed-bc11-a5836889e298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:18:47 np0005588919 nova_compute[225855]: 2026-01-20 15:18:47.145 225859 DEBUG oslo_concurrency.lockutils [req-3f702505-207e-4d05-ab7c-5ef6075e758c req-e77c0c8a-a2a5-42ed-bc11-a5836889e298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:18:47 np0005588919 nova_compute[225855]: 2026-01-20 15:18:47.145 225859 DEBUG nova.compute.manager [req-3f702505-207e-4d05-ab7c-5ef6075e758c req-e77c0c8a-a2a5-42ed-bc11-a5836889e298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] No waiting events found dispatching network-vif-plugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:18:47 np0005588919 nova_compute[225855]: 2026-01-20 15:18:47.146 225859 WARNING nova.compute.manager [req-3f702505-207e-4d05-ab7c-5ef6075e758c req-e77c0c8a-a2a5-42ed-bc11-a5836889e298 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Received unexpected event network-vif-plugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:18:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:47.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:47.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:48 np0005588919 nova_compute[225855]: 2026-01-20 15:18:48.048 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:48 np0005588919 NetworkManager[49104]: <info>  [1768922328.4408] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Jan 20 10:18:48 np0005588919 NetworkManager[49104]: <info>  [1768922328.4415] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/363)
Jan 20 10:18:48 np0005588919 nova_compute[225855]: 2026-01-20 15:18:48.440 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:48 np0005588919 nova_compute[225855]: 2026-01-20 15:18:48.516 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:48 np0005588919 ovn_controller[130490]: 2026-01-20T15:18:48Z|00868|binding|INFO|Releasing lport 2c0bba0e-e9b6-4ece-8349-62642b94d91d from this chassis (sb_readonly=0)
Jan 20 10:18:48 np0005588919 nova_compute[225855]: 2026-01-20 15:18:48.529 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:49 np0005588919 podman[307770]: 2026-01-20 15:18:49.029637174 +0000 UTC m=+0.077473887 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:18:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:49.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:49 np0005588919 nova_compute[225855]: 2026-01-20 15:18:49.561 225859 DEBUG nova.compute.manager [req-1f9601fb-1d10-4b0a-b38d-3f9aebe84f58 req-5284322e-f98e-4fa9-bb30-355eff4d1fc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Received event network-changed-bd99d3a5-54e0-4e70-9a02-3543631281a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:18:49 np0005588919 nova_compute[225855]: 2026-01-20 15:18:49.562 225859 DEBUG nova.compute.manager [req-1f9601fb-1d10-4b0a-b38d-3f9aebe84f58 req-5284322e-f98e-4fa9-bb30-355eff4d1fc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Refreshing instance network info cache due to event network-changed-bd99d3a5-54e0-4e70-9a02-3543631281a6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:18:49 np0005588919 nova_compute[225855]: 2026-01-20 15:18:49.562 225859 DEBUG oslo_concurrency.lockutils [req-1f9601fb-1d10-4b0a-b38d-3f9aebe84f58 req-5284322e-f98e-4fa9-bb30-355eff4d1fc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:18:49 np0005588919 nova_compute[225855]: 2026-01-20 15:18:49.562 225859 DEBUG oslo_concurrency.lockutils [req-1f9601fb-1d10-4b0a-b38d-3f9aebe84f58 req-5284322e-f98e-4fa9-bb30-355eff4d1fc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:18:49 np0005588919 nova_compute[225855]: 2026-01-20 15:18:49.562 225859 DEBUG nova.network.neutron [req-1f9601fb-1d10-4b0a-b38d-3f9aebe84f58 req-5284322e-f98e-4fa9-bb30-355eff4d1fc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Refreshing network info cache for port bd99d3a5-54e0-4e70-9a02-3543631281a6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:18:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:49.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:49 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 20 10:18:49 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 20 10:18:49 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 20 10:18:49 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 20 10:18:49 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 20 10:18:49 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 20 10:18:49 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 20 10:18:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:51.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:51 np0005588919 nova_compute[225855]: 2026-01-20 15:18:51.353 225859 DEBUG nova.network.neutron [req-1f9601fb-1d10-4b0a-b38d-3f9aebe84f58 req-5284322e-f98e-4fa9-bb30-355eff4d1fc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updated VIF entry in instance network info cache for port bd99d3a5-54e0-4e70-9a02-3543631281a6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:18:51 np0005588919 nova_compute[225855]: 2026-01-20 15:18:51.353 225859 DEBUG nova.network.neutron [req-1f9601fb-1d10-4b0a-b38d-3f9aebe84f58 req-5284322e-f98e-4fa9-bb30-355eff4d1fc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updating instance_info_cache with network_info: [{"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:18:51 np0005588919 nova_compute[225855]: 2026-01-20 15:18:51.394 225859 DEBUG oslo_concurrency.lockutils [req-1f9601fb-1d10-4b0a-b38d-3f9aebe84f58 req-5284322e-f98e-4fa9-bb30-355eff4d1fc8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:18:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:51.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:52 np0005588919 nova_compute[225855]: 2026-01-20 15:18:52.083 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:53 np0005588919 nova_compute[225855]: 2026-01-20 15:18:53.050 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:53.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:53.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:55.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:55.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:57 np0005588919 nova_compute[225855]: 2026-01-20 15:18:57.085 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:57.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:18:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:57.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:18:57 np0005588919 ovn_controller[130490]: 2026-01-20T15:18:57Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:44:23:fd 10.100.0.13
Jan 20 10:18:57 np0005588919 ovn_controller[130490]: 2026-01-20T15:18:57Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:23:fd 10.100.0.13
Jan 20 10:18:58 np0005588919 nova_compute[225855]: 2026-01-20 15:18:58.052 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:18:58 np0005588919 nova_compute[225855]: 2026-01-20 15:18:58.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:18:58 np0005588919 nova_compute[225855]: 2026-01-20 15:18:58.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:18:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:18:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:18:59.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:18:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:18:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:18:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:18:59.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:01 np0005588919 nova_compute[225855]: 2026-01-20 15:19:01.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:01 np0005588919 nova_compute[225855]: 2026-01-20 15:19:01.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:19:01 np0005588919 nova_compute[225855]: 2026-01-20 15:19:01.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:19:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:19:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:01.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:19:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:01.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:01 np0005588919 nova_compute[225855]: 2026-01-20 15:19:01.887 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:19:01 np0005588919 nova_compute[225855]: 2026-01-20 15:19:01.888 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:19:01 np0005588919 nova_compute[225855]: 2026-01-20 15:19:01.888 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:19:01 np0005588919 nova_compute[225855]: 2026-01-20 15:19:01.888 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:19:02 np0005588919 nova_compute[225855]: 2026-01-20 15:19:02.086 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:03 np0005588919 nova_compute[225855]: 2026-01-20 15:19:03.054 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:03 np0005588919 podman[307854]: 2026-01-20 15:19:03.165625213 +0000 UTC m=+0.052380249 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Jan 20 10:19:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:03.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:03 np0005588919 nova_compute[225855]: 2026-01-20 15:19:03.460 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updating instance_info_cache with network_info: [{"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:19:03 np0005588919 nova_compute[225855]: 2026-01-20 15:19:03.482 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:19:03 np0005588919 nova_compute[225855]: 2026-01-20 15:19:03.483 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:19:03 np0005588919 nova_compute[225855]: 2026-01-20 15:19:03.483 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:03 np0005588919 nova_compute[225855]: 2026-01-20 15:19:03.484 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:03.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:05 np0005588919 nova_compute[225855]: 2026-01-20 15:19:05.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:05.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:05.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.391 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.391 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.391 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.391 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.391 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:19:06 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3146294823' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.820 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.894 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000c2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:19:06 np0005588919 nova_compute[225855]: 2026-01-20 15:19:06.895 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000c2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.038 225859 DEBUG oslo_concurrency.lockutils [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.039 225859 DEBUG oslo_concurrency.lockutils [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.043 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.045 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4077MB free_disk=20.897098541259766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.045 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.045 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.053 225859 DEBUG nova.objects.instance [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'flavor' on Instance uuid 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.089 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.112 225859 DEBUG oslo_concurrency.lockutils [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.169 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.170 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.170 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.196 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.240 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.241 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.281 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.334 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:19:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:19:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:07.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.417 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:07.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.607 225859 DEBUG oslo_concurrency.lockutils [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.608 225859 DEBUG oslo_concurrency.lockutils [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.609 225859 INFO nova.compute.manager [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Attaching volume 9f3cabb8-d51f-4db9-97b3-c5b764893ee2 to /dev/vdb#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.840 225859 DEBUG os_brick.utils [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.842 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.853 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.853 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[508418f6-bd0c-4f63-9106-0a2b71094171]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:19:07 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/222608440' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.855 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.863 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.863 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2f6834-4335-4dda-b7cb-09424e226a92]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.865 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.874 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.874 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[17759ec5-00cc-49fc-8bab-7dd2d493eaf7]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.876 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.876 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[0e775201-dea4-41eb-a5ef-6b1709ec86dd]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.877 225859 DEBUG oslo_concurrency.processutils [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.906 225859 DEBUG oslo_concurrency.processutils [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.910 225859 DEBUG os_brick.initiator.connectors.lightos [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.910 225859 DEBUG os_brick.initiator.connectors.lightos [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.911 225859 DEBUG os_brick.initiator.connectors.lightos [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.911 225859 DEBUG os_brick.utils [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] <== get_connector_properties: return (69ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.911 225859 DEBUG nova.virt.block_device [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updating existing volume attachment record: 42dce34d-a725-459d-9faf-f052d4783cbb _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.917 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:19:07 np0005588919 nova_compute[225855]: 2026-01-20 15:19:07.950 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:19:08 np0005588919 nova_compute[225855]: 2026-01-20 15:19:08.013 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:19:08 np0005588919 nova_compute[225855]: 2026-01-20 15:19:08.013 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:08 np0005588919 nova_compute[225855]: 2026-01-20 15:19:08.014 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:08 np0005588919 nova_compute[225855]: 2026-01-20 15:19:08.057 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:08 np0005588919 nova_compute[225855]: 2026-01-20 15:19:08.876 225859 DEBUG nova.objects.instance [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'flavor' on Instance uuid 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:19:08 np0005588919 nova_compute[225855]: 2026-01-20 15:19:08.899 225859 DEBUG nova.virt.libvirt.driver [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Attempting to attach volume 9f3cabb8-d51f-4db9-97b3-c5b764893ee2 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:19:08 np0005588919 nova_compute[225855]: 2026-01-20 15:19:08.902 225859 DEBUG nova.virt.libvirt.guest [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:19:08 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:19:08 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-9f3cabb8-d51f-4db9-97b3-c5b764893ee2">
Jan 20 10:19:08 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:19:08 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:19:08 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:19:08 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:19:08 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 10:19:08 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:19:08 np0005588919 nova_compute[225855]:  </auth>
Jan 20 10:19:08 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:19:08 np0005588919 nova_compute[225855]:  <serial>9f3cabb8-d51f-4db9-97b3-c5b764893ee2</serial>
Jan 20 10:19:08 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:19:08 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:19:09 np0005588919 nova_compute[225855]: 2026-01-20 15:19:09.026 225859 DEBUG nova.virt.libvirt.driver [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:19:09 np0005588919 nova_compute[225855]: 2026-01-20 15:19:09.026 225859 DEBUG nova.virt.libvirt.driver [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:19:09 np0005588919 nova_compute[225855]: 2026-01-20 15:19:09.027 225859 DEBUG nova.virt.libvirt.driver [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:19:09 np0005588919 nova_compute[225855]: 2026-01-20 15:19:09.027 225859 DEBUG nova.virt.libvirt.driver [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No VIF found with MAC fa:16:3e:44:23:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:19:09 np0005588919 nova_compute[225855]: 2026-01-20 15:19:09.303 225859 DEBUG oslo_concurrency.lockutils [None req-1efd3742-3065-40d0-a299-80c5de58551f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:09.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:19:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:09.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:19:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:19:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:19:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:11.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:11.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:19:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:19:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:19:12 np0005588919 nova_compute[225855]: 2026-01-20 15:19:12.091 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:13 np0005588919 nova_compute[225855]: 2026-01-20 15:19:13.038 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:13 np0005588919 nova_compute[225855]: 2026-01-20 15:19:13.058 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:13.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:13.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:19:15Z|00869|binding|INFO|Releasing lport 2c0bba0e-e9b6-4ece-8349-62642b94d91d from this chassis (sb_readonly=0)
Jan 20 10:19:15 np0005588919 nova_compute[225855]: 2026-01-20 15:19:15.102 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:15.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:15.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:19:16.440 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:19:16.440 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:19:16.441 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:17 np0005588919 nova_compute[225855]: 2026-01-20 15:19:17.119 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:17.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:17.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:18 np0005588919 nova_compute[225855]: 2026-01-20 15:19:18.060 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:19:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:19:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:19.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:19.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:20 np0005588919 podman[308137]: 2026-01-20 15:19:20.050845247 +0000 UTC m=+0.085896795 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:19:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:21.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:21.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:22 np0005588919 nova_compute[225855]: 2026-01-20 15:19:22.121 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:22 np0005588919 nova_compute[225855]: 2026-01-20 15:19:22.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:23 np0005588919 nova_compute[225855]: 2026-01-20 15:19:23.062 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:23.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:23.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:25.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:25.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:26 np0005588919 nova_compute[225855]: 2026-01-20 15:19:26.484 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:27 np0005588919 nova_compute[225855]: 2026-01-20 15:19:27.122 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:27.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:27.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:28 np0005588919 nova_compute[225855]: 2026-01-20 15:19:28.063 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:29 np0005588919 nova_compute[225855]: 2026-01-20 15:19:29.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:29 np0005588919 nova_compute[225855]: 2026-01-20 15:19:29.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:19:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:29.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:29.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:31.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:31.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:32 np0005588919 nova_compute[225855]: 2026-01-20 15:19:32.124 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:33 np0005588919 nova_compute[225855]: 2026-01-20 15:19:33.105 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:33.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:33.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:33 np0005588919 systemd[1]: Starting dnf makecache...
Jan 20 10:19:34 np0005588919 podman[308221]: 2026-01-20 15:19:34.043145881 +0000 UTC m=+0.082358245 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 20 10:19:34 np0005588919 dnf[308222]: Metadata cache refreshed recently.
Jan 20 10:19:34 np0005588919 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 20 10:19:34 np0005588919 systemd[1]: Finished dnf makecache.
Jan 20 10:19:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:35.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:35.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:37 np0005588919 nova_compute[225855]: 2026-01-20 15:19:37.100 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:37 np0005588919 nova_compute[225855]: 2026-01-20 15:19:37.125 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:37.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:37.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:37 np0005588919 nova_compute[225855]: 2026-01-20 15:19:37.979 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:38 np0005588919 nova_compute[225855]: 2026-01-20 15:19:38.108 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:39 np0005588919 nova_compute[225855]: 2026-01-20 15:19:39.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:39 np0005588919 nova_compute[225855]: 2026-01-20 15:19:39.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:19:39 np0005588919 nova_compute[225855]: 2026-01-20 15:19:39.399 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:19:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:39.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:19:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:39.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:19:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:41.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:41.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:42 np0005588919 nova_compute[225855]: 2026-01-20 15:19:42.126 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:43 np0005588919 nova_compute[225855]: 2026-01-20 15:19:43.112 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:43.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:43.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:19:45.197 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:19:45 np0005588919 nova_compute[225855]: 2026-01-20 15:19:45.197 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:19:45.198 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:19:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:45.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:45.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:45 np0005588919 nova_compute[225855]: 2026-01-20 15:19:45.947 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:45 np0005588919 nova_compute[225855]: 2026-01-20 15:19:45.974 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Triggering sync for uuid 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 20 10:19:45 np0005588919 nova_compute[225855]: 2026-01-20 15:19:45.974 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:19:45 np0005588919 nova_compute[225855]: 2026-01-20 15:19:45.974 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:19:46 np0005588919 nova_compute[225855]: 2026-01-20 15:19:46.170 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:19:47 np0005588919 nova_compute[225855]: 2026-01-20 15:19:47.128 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:47.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:47.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:48 np0005588919 nova_compute[225855]: 2026-01-20 15:19:48.115 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:49.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:49.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:51 np0005588919 podman[308300]: 2026-01-20 15:19:51.039705312 +0000 UTC m=+0.083154038 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:19:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:19:51.199 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:19:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:19:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:51.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:19:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:51.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:52 np0005588919 nova_compute[225855]: 2026-01-20 15:19:52.129 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:53 np0005588919 nova_compute[225855]: 2026-01-20 15:19:53.122 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:53.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:53.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:55.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:19:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:55.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:19:57 np0005588919 nova_compute[225855]: 2026-01-20 15:19:57.131 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:57.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:57.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:58 np0005588919 nova_compute[225855]: 2026-01-20 15:19:58.125 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:19:58 np0005588919 nova_compute[225855]: 2026-01-20 15:19:58.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:19:58 np0005588919 nova_compute[225855]: 2026-01-20 15:19:58.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:19:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:19:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:19:59.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:19:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:19:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:19:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:19:59.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 10:20:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:01.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:01.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:02 np0005588919 nova_compute[225855]: 2026-01-20 15:20:02.133 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.127 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.231 225859 DEBUG oslo_concurrency.lockutils [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.232 225859 DEBUG oslo_concurrency.lockutils [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.246 225859 INFO nova.compute.manager [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Detaching volume 9f3cabb8-d51f-4db9-97b3-c5b764893ee2#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:20:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.418 225859 INFO nova.virt.block_device [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Attempting to driver detach volume 9f3cabb8-d51f-4db9-97b3-c5b764893ee2 from mountpoint /dev/vdb#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.427 225859 DEBUG nova.virt.libvirt.driver [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Attempting to detach device vdb from instance 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.427 225859 DEBUG nova.virt.libvirt.guest [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-9f3cabb8-d51f-4db9-97b3-c5b764893ee2">
Jan 20 10:20:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  <serial>9f3cabb8-d51f-4db9-97b3-c5b764893ee2</serial>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:20:03 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.434 225859 INFO nova.virt.libvirt.driver [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully detached device vdb from instance 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 from the persistent domain config.#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.435 225859 DEBUG nova.virt.libvirt.driver [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.435 225859 DEBUG nova.virt.libvirt.guest [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-9f3cabb8-d51f-4db9-97b3-c5b764893ee2">
Jan 20 10:20:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  <serial>9f3cabb8-d51f-4db9-97b3-c5b764893ee2</serial>
Jan 20 10:20:03 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:20:03 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:20:03 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:20:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:03.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.485 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768922403.4855456, 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.488 225859 DEBUG nova.virt.libvirt.driver [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.490 225859 INFO nova.virt.libvirt.driver [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully detached device vdb from instance 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 from the live domain config.#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.527 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.527 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.528 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.528 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.642 225859 DEBUG nova.objects.instance [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'flavor' on Instance uuid 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:03 np0005588919 nova_compute[225855]: 2026-01-20 15:20:03.686 225859 DEBUG oslo_concurrency.lockutils [None req-5ff466c2-d750-4ab2-aa61-5cf4fd377c32 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:03.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.680 225859 DEBUG oslo_concurrency.lockutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.681 225859 DEBUG oslo_concurrency.lockutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.681 225859 DEBUG oslo_concurrency.lockutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.682 225859 DEBUG oslo_concurrency.lockutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.682 225859 DEBUG oslo_concurrency.lockutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.683 225859 INFO nova.compute.manager [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Terminating instance#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.684 225859 DEBUG nova.compute.manager [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.727 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updating instance_info_cache with network_info: [{"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:20:04 np0005588919 kernel: tapbd99d3a5-54 (unregistering): left promiscuous mode
Jan 20 10:20:04 np0005588919 NetworkManager[49104]: <info>  [1768922404.7457] device (tapbd99d3a5-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.746 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.747 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.747 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.753 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:04Z|00870|binding|INFO|Releasing lport bd99d3a5-54e0-4e70-9a02-3543631281a6 from this chassis (sb_readonly=0)
Jan 20 10:20:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:04Z|00871|binding|INFO|Setting lport bd99d3a5-54e0-4e70-9a02-3543631281a6 down in Southbound
Jan 20 10:20:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:04Z|00872|binding|INFO|Removing iface tapbd99d3a5-54 ovn-installed in OVS
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.755 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:04.761 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:23:fd 10.100.0.13'], port_security=['fa:16:3e:44:23:fd 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '2b31f3d7-81bd-4712-bcb1-98afd2dc0f44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9156c0a9920c4721843416b9a44404f9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd945b282-623c-4da9-a940-ac04c971b57b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bfc4e2a-eeed-480e-aa18-68fc6c8f2cc2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=bd99d3a5-54e0-4e70-9a02-3543631281a6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:20:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:04.762 140354 INFO neutron.agent.ovn.metadata.agent [-] Port bd99d3a5-54e0-4e70-9a02-3543631281a6 in datapath 76c2d716-7d14-4bc1-b83b-a3290ee99d9a unbound from our chassis#033[00m
Jan 20 10:20:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:04.763 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76c2d716-7d14-4bc1-b83b-a3290ee99d9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:20:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:04.764 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a9904c-4e6c-4841-b192-6be83a8b83b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:04.764 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a namespace which is not needed anymore#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.777 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:04 np0005588919 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000c2.scope: Deactivated successfully.
Jan 20 10:20:04 np0005588919 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000c2.scope: Consumed 15.860s CPU time.
Jan 20 10:20:04 np0005588919 systemd-machined[194361]: Machine qemu-102-instance-000000c2 terminated.
Jan 20 10:20:04 np0005588919 podman[308385]: 2026-01-20 15:20:04.831701021 +0000 UTC m=+0.062384312 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 20 10:20:04 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[307752]: [NOTICE]   (307756) : haproxy version is 2.8.14-c23fe91
Jan 20 10:20:04 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[307752]: [NOTICE]   (307756) : path to executable is /usr/sbin/haproxy
Jan 20 10:20:04 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:20:04 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[307752]: [WARNING]  (307756) : Exiting Master process...
Jan 20 10:20:04 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[307752]: [WARNING]  (307756) : Exiting Master process...
Jan 20 10:20:04 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[307752]: [ALERT]    (307756) : Current worker (307758) exited with code 143 (Terminated)
Jan 20 10:20:04 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[307752]: [WARNING]  (307756) : All workers exited. Exiting... (0)
Jan 20 10:20:04 np0005588919 systemd[1]: libpod-5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7.scope: Deactivated successfully.
Jan 20 10:20:04 np0005588919 podman[308428]: 2026-01-20 15:20:04.9213396 +0000 UTC m=+0.056716101 container died 5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.923 225859 INFO nova.virt.libvirt.driver [-] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Instance destroyed successfully.#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.925 225859 DEBUG nova.objects.instance [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'resources' on Instance uuid 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.940 225859 DEBUG nova.virt.libvirt.vif [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:18:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1816292644',display_name='tempest-AttachVolumeNegativeTest-server-1816292644',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1816292644',id=194,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGw8qLSULQliDvwKbeTpbt4J6LHSzE0FcgbAkXI3mB449DNtZV5vtYZtKqW3qflHvvMvcmL7nd1rBiXHEUgRgW71fE/QnzR597lXioriSvOlFWdxXwkMYduhCAWqw/sG6A==',key_name='tempest-keypair-235858913',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:18:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9156c0a9920c4721843416b9a44404f9',ramdisk_id='',reservation_id='r-acjnzwzw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeNegativeTest-1505789262',owner_user_name='tempest-AttachVolumeNegativeTest-1505789262-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:18:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd9a8f26b71f4631a387e555e6b18428',uuid=2b31f3d7-81bd-4712-bcb1-98afd2dc0f44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.941 225859 DEBUG nova.network.os_vif_util [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converting VIF {"id": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "address": "fa:16:3e:44:23:fd", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd99d3a5-54", "ovs_interfaceid": "bd99d3a5-54e0-4e70-9a02-3543631281a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.941 225859 DEBUG nova.network.os_vif_util [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:23:fd,bridge_name='br-int',has_traffic_filtering=True,id=bd99d3a5-54e0-4e70-9a02-3543631281a6,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd99d3a5-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.942 225859 DEBUG os_vif [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:23:fd,bridge_name='br-int',has_traffic_filtering=True,id=bd99d3a5-54e0-4e70-9a02-3543631281a6,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd99d3a5-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.943 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.944 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd99d3a5-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.946 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.949 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:20:04 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7-userdata-shm.mount: Deactivated successfully.
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.956 225859 INFO os_vif [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:23:fd,bridge_name='br-int',has_traffic_filtering=True,id=bd99d3a5-54e0-4e70-9a02-3543631281a6,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd99d3a5-54')#033[00m
Jan 20 10:20:04 np0005588919 systemd[1]: var-lib-containers-storage-overlay-c7b4fcd771e27f8b153744804f05a30b7615c256cc9d63d2f67498c5cbee4f4e-merged.mount: Deactivated successfully.
Jan 20 10:20:04 np0005588919 podman[308428]: 2026-01-20 15:20:04.970750555 +0000 UTC m=+0.106127046 container cleanup 5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:20:04 np0005588919 systemd[1]: libpod-conmon-5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7.scope: Deactivated successfully.
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.991 225859 DEBUG nova.compute.manager [req-ba1685a9-2c37-4c38-99e8-04a85ac49bbb req-18920763-1f55-4f81-bd55-bef55d14de7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Received event network-vif-unplugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.993 225859 DEBUG oslo_concurrency.lockutils [req-ba1685a9-2c37-4c38-99e8-04a85ac49bbb req-18920763-1f55-4f81-bd55-bef55d14de7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.993 225859 DEBUG oslo_concurrency.lockutils [req-ba1685a9-2c37-4c38-99e8-04a85ac49bbb req-18920763-1f55-4f81-bd55-bef55d14de7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.993 225859 DEBUG oslo_concurrency.lockutils [req-ba1685a9-2c37-4c38-99e8-04a85ac49bbb req-18920763-1f55-4f81-bd55-bef55d14de7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.994 225859 DEBUG nova.compute.manager [req-ba1685a9-2c37-4c38-99e8-04a85ac49bbb req-18920763-1f55-4f81-bd55-bef55d14de7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] No waiting events found dispatching network-vif-unplugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:20:04 np0005588919 nova_compute[225855]: 2026-01-20 15:20:04.994 225859 DEBUG nova.compute.manager [req-ba1685a9-2c37-4c38-99e8-04a85ac49bbb req-18920763-1f55-4f81-bd55-bef55d14de7d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Received event network-vif-unplugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:20:05 np0005588919 podman[308483]: 2026-01-20 15:20:05.05100571 +0000 UTC m=+0.052069841 container remove 5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:20:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:05.058 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[70f8056e-6ac5-4b54-8257-b48293301d39]: (4, ('Tue Jan 20 03:20:04 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a (5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7)\n5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7\nTue Jan 20 03:20:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a (5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7)\n5e4fd4628195f875a51769944f5674e810c100ec290dc3180351b0db60def7c7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:05.061 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9b99d362-a7da-4f38-b19a-0f2dec007acc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:05.062 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76c2d716-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:05 np0005588919 nova_compute[225855]: 2026-01-20 15:20:05.063 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:05 np0005588919 kernel: tap76c2d716-70: left promiscuous mode
Jan 20 10:20:05 np0005588919 nova_compute[225855]: 2026-01-20 15:20:05.081 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:05.084 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce970db-1b12-486e-9076-f990cfaf89eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:05.103 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b147ebf6-bb11-42dd-9dbf-ab6513831d05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:05.105 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[75402418-e903-4422-9ccd-34aedf082408]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:05.126 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7bcaba9d-c8ef-4cd9-91a5-171964438c27]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736298, 'reachable_time': 24698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308501, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:05.130 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:20:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:05.131 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[49da201e-5157-4b60-91e5-9543a63d5fe5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:05 np0005588919 systemd[1]: run-netns-ovnmeta\x2d76c2d716\x2d7d14\x2d4bc1\x2db83b\x2da3290ee99d9a.mount: Deactivated successfully.
Jan 20 10:20:05 np0005588919 nova_compute[225855]: 2026-01-20 15:20:05.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:05 np0005588919 nova_compute[225855]: 2026-01-20 15:20:05.384 225859 INFO nova.virt.libvirt.driver [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Deleting instance files /var/lib/nova/instances/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_del#033[00m
Jan 20 10:20:05 np0005588919 nova_compute[225855]: 2026-01-20 15:20:05.385 225859 INFO nova.virt.libvirt.driver [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Deletion of /var/lib/nova/instances/2b31f3d7-81bd-4712-bcb1-98afd2dc0f44_del complete#033[00m
Jan 20 10:20:05 np0005588919 nova_compute[225855]: 2026-01-20 15:20:05.433 225859 INFO nova.compute.manager [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Took 0.75 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:20:05 np0005588919 nova_compute[225855]: 2026-01-20 15:20:05.434 225859 DEBUG oslo.service.loopingcall [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:20:05 np0005588919 nova_compute[225855]: 2026-01-20 15:20:05.435 225859 DEBUG nova.compute.manager [-] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:20:05 np0005588919 nova_compute[225855]: 2026-01-20 15:20:05.435 225859 DEBUG nova.network.neutron [-] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:20:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:05.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:05.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.370 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.371 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.371 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.372 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.372 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.582 225859 DEBUG nova.network.neutron [-] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.619 225859 INFO nova.compute.manager [-] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Took 1.18 seconds to deallocate network for instance.#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.671 225859 DEBUG oslo_concurrency.lockutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.672 225859 DEBUG oslo_concurrency.lockutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.735 225859 DEBUG oslo_concurrency.processutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:06 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:20:06 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/35866287' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:20:06 np0005588919 nova_compute[225855]: 2026-01-20 15:20:06.856 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.037 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.038 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4254MB free_disk=20.931278228759766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.038 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.076 225859 DEBUG nova.compute.manager [req-c8e349d5-1f0d-4dd9-9c94-2189fd37d746 req-313ac06f-a95a-42bf-8416-0fa87d3ccc00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Received event network-vif-plugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.076 225859 DEBUG oslo_concurrency.lockutils [req-c8e349d5-1f0d-4dd9-9c94-2189fd37d746 req-313ac06f-a95a-42bf-8416-0fa87d3ccc00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.077 225859 DEBUG oslo_concurrency.lockutils [req-c8e349d5-1f0d-4dd9-9c94-2189fd37d746 req-313ac06f-a95a-42bf-8416-0fa87d3ccc00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.077 225859 DEBUG oslo_concurrency.lockutils [req-c8e349d5-1f0d-4dd9-9c94-2189fd37d746 req-313ac06f-a95a-42bf-8416-0fa87d3ccc00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.077 225859 DEBUG nova.compute.manager [req-c8e349d5-1f0d-4dd9-9c94-2189fd37d746 req-313ac06f-a95a-42bf-8416-0fa87d3ccc00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] No waiting events found dispatching network-vif-plugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.077 225859 WARNING nova.compute.manager [req-c8e349d5-1f0d-4dd9-9c94-2189fd37d746 req-313ac06f-a95a-42bf-8416-0fa87d3ccc00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Received unexpected event network-vif-plugged-bd99d3a5-54e0-4e70-9a02-3543631281a6 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.078 225859 DEBUG nova.compute.manager [req-c8e349d5-1f0d-4dd9-9c94-2189fd37d746 req-313ac06f-a95a-42bf-8416-0fa87d3ccc00 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Received event network-vif-deleted-bd99d3a5-54e0-4e70-9a02-3543631281a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:20:07 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3825334878' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.230 225859 DEBUG oslo_concurrency.processutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.237 225859 DEBUG nova.compute.provider_tree [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.256 225859 DEBUG nova.scheduler.client.report [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.285 225859 DEBUG oslo_concurrency.lockutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.289 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.327 225859 INFO nova.scheduler.client.report [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Deleted allocations for instance 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.361 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.362 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.381 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.411 225859 DEBUG oslo_concurrency.lockutils [None req-77153b7a-08dd-4f41-958b-db394c309e07 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "2b31f3d7-81bd-4712-bcb1-98afd2dc0f44" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:07.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:20:07 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/703266583' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.833 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.838 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.855 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.877 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:20:07 np0005588919 nova_compute[225855]: 2026-01-20 15:20:07.878 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:07.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:08 np0005588919 nova_compute[225855]: 2026-01-20 15:20:08.173 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #142. Immutable memtables: 0.
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:08.866512) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 89] Flushing memtable with next log file: 142
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922408866643, "job": 89, "event": "flush_started", "num_memtables": 1, "num_entries": 1220, "num_deletes": 251, "total_data_size": 2591421, "memory_usage": 2620800, "flush_reason": "Manual Compaction"}
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 89] Level-0 flush table #143: started
Jan 20 10:20:08 np0005588919 nova_compute[225855]: 2026-01-20 15:20:08.876 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922408896088, "cf_name": "default", "job": 89, "event": "table_file_creation", "file_number": 143, "file_size": 1698431, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 70250, "largest_seqno": 71464, "table_properties": {"data_size": 1693229, "index_size": 2661, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11553, "raw_average_key_size": 19, "raw_value_size": 1682677, "raw_average_value_size": 2891, "num_data_blocks": 118, "num_entries": 582, "num_filter_entries": 582, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922312, "oldest_key_time": 1768922312, "file_creation_time": 1768922408, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 143, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 89] Flush lasted 29735 microseconds, and 6477 cpu microseconds.
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:08.896254) [db/flush_job.cc:967] [default] [JOB 89] Level-0 flush table #143: 1698431 bytes OK
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:08.896284) [db/memtable_list.cc:519] [default] Level-0 commit table #143 started
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:08.902794) [db/memtable_list.cc:722] [default] Level-0 commit table #143: memtable #1 done
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:08.902824) EVENT_LOG_v1 {"time_micros": 1768922408902814, "job": 89, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:08.902853) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 89] Try to delete WAL files size 2585594, prev total WAL file size 2585594, number of live WAL files 2.
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000139.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:08.904272) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036303234' seq:72057594037927935, type:22 .. '7061786F730036323736' seq:0, type:0; will stop at (end)
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 90] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 89 Base level 0, inputs: [143(1658KB)], [141(12MB)]
Jan 20 10:20:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922408904326, "job": 90, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [143], "files_L6": [141], "score": -1, "input_data_size": 15213051, "oldest_snapshot_seqno": -1}
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 90] Generated table #144: 9464 keys, 13363564 bytes, temperature: kUnknown
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922409199265, "cf_name": "default", "job": 90, "event": "table_file_creation", "file_number": 144, "file_size": 13363564, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13300247, "index_size": 38564, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23685, "raw_key_size": 249234, "raw_average_key_size": 26, "raw_value_size": 13131911, "raw_average_value_size": 1387, "num_data_blocks": 1475, "num_entries": 9464, "num_filter_entries": 9464, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768922408, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 144, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:09.199600) [db/compaction/compaction_job.cc:1663] [default] [JOB 90] Compacted 1@0 + 1@6 files to L6 => 13363564 bytes
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:09.341208) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 51.6 rd, 45.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 12.9 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(16.8) write-amplify(7.9) OK, records in: 9979, records dropped: 515 output_compression: NoCompression
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:09.341275) EVENT_LOG_v1 {"time_micros": 1768922409341249, "job": 90, "event": "compaction_finished", "compaction_time_micros": 295025, "compaction_time_cpu_micros": 33737, "output_level": 6, "num_output_files": 1, "total_output_size": 13363564, "num_input_records": 9979, "num_output_records": 9464, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000143.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922409342141, "job": 90, "event": "table_file_deletion", "file_number": 143}
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000141.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922409347473, "job": 90, "event": "table_file_deletion", "file_number": 141}
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:08.904193) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:09.347624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:09.347631) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:09.347632) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:09.347634) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:20:09.347636) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:20:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:09.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:09 np0005588919 nova_compute[225855]: 2026-01-20 15:20:09.946 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:09.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:11.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:11.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.107 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.107 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.125 225859 DEBUG nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.177 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.195 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.196 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.200 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.200 225859 INFO nova.compute.claims [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.297 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:13.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:20:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2383700053' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.729 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.736 225859 DEBUG nova.compute.provider_tree [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.755 225859 DEBUG nova.scheduler.client.report [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.778 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.779 225859 DEBUG nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.833 225859 DEBUG nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.834 225859 DEBUG nova.network.neutron [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.856 225859 INFO nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:20:13 np0005588919 nova_compute[225855]: 2026-01-20 15:20:13.883 225859 DEBUG nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:20:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:13.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.030 225859 DEBUG nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.031 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.031 225859 INFO nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Creating image(s)#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.056 225859 DEBUG nova.storage.rbd_utils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.086 225859 DEBUG nova.storage.rbd_utils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.112 225859 DEBUG nova.storage.rbd_utils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.117 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.182 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.184 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.185 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.185 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.219 225859 DEBUG nova.storage.rbd_utils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.223 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.384 225859 DEBUG nova.policy [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd9a8f26b71f4631a387e555e6b18428', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9156c0a9920c4721843416b9a44404f9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.505 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.282s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.585 225859 DEBUG nova.storage.rbd_utils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] resizing rbd image 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.685 225859 DEBUG nova.objects.instance [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'migration_context' on Instance uuid 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.703 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.704 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Ensure instance console log exists: /var/lib/nova/instances/7a53c9b1-e64b-4a31-897a-bbe7d964cf45/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.704 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.705 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.705 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:14 np0005588919 nova_compute[225855]: 2026-01-20 15:20:14.949 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:15.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:15.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:15 np0005588919 nova_compute[225855]: 2026-01-20 15:20:15.957 225859 DEBUG nova.network.neutron [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Successfully created port: ab5264b7-ec64-46dd-b30d-981799387571 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:20:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:16.441 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:16.441 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:16.442 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:17.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:17 np0005588919 nova_compute[225855]: 2026-01-20 15:20:17.740 225859 DEBUG nova.network.neutron [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Successfully updated port: ab5264b7-ec64-46dd-b30d-981799387571 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:20:17 np0005588919 nova_compute[225855]: 2026-01-20 15:20:17.755 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:20:17 np0005588919 nova_compute[225855]: 2026-01-20 15:20:17.756 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquired lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:20:17 np0005588919 nova_compute[225855]: 2026-01-20 15:20:17.756 225859 DEBUG nova.network.neutron [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:20:17 np0005588919 nova_compute[225855]: 2026-01-20 15:20:17.891 225859 DEBUG nova.compute.manager [req-5a11617e-8c11-4005-9cf8-213fb71664d2 req-0f4ffa8f-cb27-4e7c-a4af-33152b5cf64c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Received event network-changed-ab5264b7-ec64-46dd-b30d-981799387571 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:17 np0005588919 nova_compute[225855]: 2026-01-20 15:20:17.891 225859 DEBUG nova.compute.manager [req-5a11617e-8c11-4005-9cf8-213fb71664d2 req-0f4ffa8f-cb27-4e7c-a4af-33152b5cf64c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Refreshing instance network info cache due to event network-changed-ab5264b7-ec64-46dd-b30d-981799387571. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:20:17 np0005588919 nova_compute[225855]: 2026-01-20 15:20:17.891 225859 DEBUG oslo_concurrency.lockutils [req-5a11617e-8c11-4005-9cf8-213fb71664d2 req-0f4ffa8f-cb27-4e7c-a4af-33152b5cf64c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:20:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:17.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:17 np0005588919 nova_compute[225855]: 2026-01-20 15:20:17.959 225859 DEBUG nova.network.neutron [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:20:18 np0005588919 nova_compute[225855]: 2026-01-20 15:20:18.210 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:19.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 10:20:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:19 np0005588919 nova_compute[225855]: 2026-01-20 15:20:19.918 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922404.9171386, 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:20:19 np0005588919 nova_compute[225855]: 2026-01-20 15:20:19.919 225859 INFO nova.compute.manager [-] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:20:19 np0005588919 nova_compute[225855]: 2026-01-20 15:20:19.951 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:19.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:19 np0005588919 nova_compute[225855]: 2026-01-20 15:20:19.993 225859 DEBUG nova.compute.manager [None req-dc2bf381-1c1c-486d-bcc0-446eaf6c8ccc - - - - - -] [instance: 2b31f3d7-81bd-4712-bcb1-98afd2dc0f44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.217 225859 DEBUG nova.network.neutron [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Updating instance_info_cache with network_info: [{"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.239 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Releasing lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.240 225859 DEBUG nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Instance network_info: |[{"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.240 225859 DEBUG oslo_concurrency.lockutils [req-5a11617e-8c11-4005-9cf8-213fb71664d2 req-0f4ffa8f-cb27-4e7c-a4af-33152b5cf64c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.241 225859 DEBUG nova.network.neutron [req-5a11617e-8c11-4005-9cf8-213fb71664d2 req-0f4ffa8f-cb27-4e7c-a4af-33152b5cf64c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Refreshing network info cache for port ab5264b7-ec64-46dd-b30d-981799387571 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.244 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Start _get_guest_xml network_info=[{"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.247 225859 WARNING nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.252 225859 DEBUG nova.virt.libvirt.host [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.253 225859 DEBUG nova.virt.libvirt.host [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.256 225859 DEBUG nova.virt.libvirt.host [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.257 225859 DEBUG nova.virt.libvirt.host [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.258 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.258 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.258 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.259 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.259 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.259 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.259 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.259 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.260 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.260 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.260 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.260 225859 DEBUG nova.virt.hardware [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.263 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 10:20:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:20:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:20 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:20:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:20:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3496344543' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.742 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.770 225859 DEBUG nova.storage.rbd_utils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:20 np0005588919 nova_compute[225855]: 2026-01-20 15:20:20.774 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:20:21 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/243034538' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.233 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.235 225859 DEBUG nova.virt.libvirt.vif [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:20:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-512186019',display_name='tempest-AttachVolumeNegativeTest-server-512186019',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-512186019',id=197,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI8d6pfZzu0QN0+ud36hsYGEa2fue/k/EBJ/5AAbAw966Nprd6b6gecK+XPS3vJw5O7JCevyXRxpx1xed28ouQO1W8vY3Q7SPAOn3X0ewiZY79+ulj2hj305nyB4SNFMjQ==',key_name='tempest-keypair-2087250418',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9156c0a9920c4721843416b9a44404f9',ramdisk_id='',reservation_id='r-1dq7a0u0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1505789262',owner_user_name='tempest-AttachVolumeNegativeTest-1505789262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:20:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd9a8f26b71f4631a387e555e6b18428',uuid=7a53c9b1-e64b-4a31-897a-bbe7d964cf45,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.236 225859 DEBUG nova.network.os_vif_util [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converting VIF {"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.237 225859 DEBUG nova.network.os_vif_util [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:70:df,bridge_name='br-int',has_traffic_filtering=True,id=ab5264b7-ec64-46dd-b30d-981799387571,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab5264b7-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.238 225859 DEBUG nova.objects.instance [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.268 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <uuid>7a53c9b1-e64b-4a31-897a-bbe7d964cf45</uuid>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <name>instance-000000c5</name>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <nova:name>tempest-AttachVolumeNegativeTest-server-512186019</nova:name>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:20:20</nova:creationTime>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <nova:user uuid="cd9a8f26b71f4631a387e555e6b18428">tempest-AttachVolumeNegativeTest-1505789262-project-member</nova:user>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <nova:project uuid="9156c0a9920c4721843416b9a44404f9">tempest-AttachVolumeNegativeTest-1505789262</nova:project>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <nova:port uuid="ab5264b7-ec64-46dd-b30d-981799387571">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <entry name="serial">7a53c9b1-e64b-4a31-897a-bbe7d964cf45</entry>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <entry name="uuid">7a53c9b1-e64b-4a31-897a-bbe7d964cf45</entry>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk.config">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:3c:70:df"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <target dev="tapab5264b7-ec"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/7a53c9b1-e64b-4a31-897a-bbe7d964cf45/console.log" append="off"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:20:21 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:20:21 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:20:21 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:20:21 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.269 225859 DEBUG nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Preparing to wait for external event network-vif-plugged-ab5264b7-ec64-46dd-b30d-981799387571 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.270 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.270 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.270 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.271 225859 DEBUG nova.virt.libvirt.vif [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:20:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-512186019',display_name='tempest-AttachVolumeNegativeTest-server-512186019',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-512186019',id=197,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI8d6pfZzu0QN0+ud36hsYGEa2fue/k/EBJ/5AAbAw966Nprd6b6gecK+XPS3vJw5O7JCevyXRxpx1xed28ouQO1W8vY3Q7SPAOn3X0ewiZY79+ulj2hj305nyB4SNFMjQ==',key_name='tempest-keypair-2087250418',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9156c0a9920c4721843416b9a44404f9',ramdisk_id='',reservation_id='r-1dq7a0u0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1505789262',owner_user_name='tempest-AttachVolumeNegativeTest-1505789262-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:20:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd9a8f26b71f4631a387e555e6b18428',uuid=7a53c9b1-e64b-4a31-897a-bbe7d964cf45,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.271 225859 DEBUG nova.network.os_vif_util [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converting VIF {"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.272 225859 DEBUG nova.network.os_vif_util [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:70:df,bridge_name='br-int',has_traffic_filtering=True,id=ab5264b7-ec64-46dd-b30d-981799387571,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab5264b7-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.272 225859 DEBUG os_vif [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:70:df,bridge_name='br-int',has_traffic_filtering=True,id=ab5264b7-ec64-46dd-b30d-981799387571,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab5264b7-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.273 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.273 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.274 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.278 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.278 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab5264b7-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.278 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab5264b7-ec, col_values=(('external_ids', {'iface-id': 'ab5264b7-ec64-46dd-b30d-981799387571', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3c:70:df', 'vm-uuid': '7a53c9b1-e64b-4a31-897a-bbe7d964cf45'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:21 np0005588919 NetworkManager[49104]: <info>  [1768922421.2811] manager: (tapab5264b7-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/364)
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.283 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.287 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.288 225859 INFO os_vif [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:70:df,bridge_name='br-int',has_traffic_filtering=True,id=ab5264b7-ec64-46dd-b30d-981799387571,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab5264b7-ec')#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.440 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.441 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.441 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No VIF found with MAC fa:16:3e:3c:70:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.442 225859 INFO nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Using config drive#033[00m
Jan 20 10:20:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:21.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:21 np0005588919 nova_compute[225855]: 2026-01-20 15:20:21.493 225859 DEBUG nova.storage.rbd_utils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:21.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.083 225859 INFO nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Creating config drive at /var/lib/nova/instances/7a53c9b1-e64b-4a31-897a-bbe7d964cf45/disk.config#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.094 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7a53c9b1-e64b-4a31-897a-bbe7d964cf45/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0_t7c0s3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:22 np0005588919 podman[309097]: 2026-01-20 15:20:22.126775497 +0000 UTC m=+0.154069308 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.247 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7a53c9b1-e64b-4a31-897a-bbe7d964cf45/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0_t7c0s3" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.274 225859 DEBUG nova.storage.rbd_utils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] rbd image 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.278 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7a53c9b1-e64b-4a31-897a-bbe7d964cf45/disk.config 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.427 225859 DEBUG oslo_concurrency.processutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7a53c9b1-e64b-4a31-897a-bbe7d964cf45/disk.config 7a53c9b1-e64b-4a31-897a-bbe7d964cf45_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.427 225859 INFO nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Deleting local config drive /var/lib/nova/instances/7a53c9b1-e64b-4a31-897a-bbe7d964cf45/disk.config because it was imported into RBD.#033[00m
Jan 20 10:20:22 np0005588919 kernel: tapab5264b7-ec: entered promiscuous mode
Jan 20 10:20:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:22Z|00873|binding|INFO|Claiming lport ab5264b7-ec64-46dd-b30d-981799387571 for this chassis.
Jan 20 10:20:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:22Z|00874|binding|INFO|ab5264b7-ec64-46dd-b30d-981799387571: Claiming fa:16:3e:3c:70:df 10.100.0.6
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.481 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:22 np0005588919 NetworkManager[49104]: <info>  [1768922422.4826] manager: (tapab5264b7-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/365)
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.494 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:70:df 10.100.0.6'], port_security=['fa:16:3e:3c:70:df 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7a53c9b1-e64b-4a31-897a-bbe7d964cf45', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9156c0a9920c4721843416b9a44404f9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '77f773a6-dc7f-4790-9c9b-d69f30c72eb2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bfc4e2a-eeed-480e-aa18-68fc6c8f2cc2, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=ab5264b7-ec64-46dd-b30d-981799387571) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:20:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:22Z|00875|binding|INFO|Setting lport ab5264b7-ec64-46dd-b30d-981799387571 ovn-installed in OVS
Jan 20 10:20:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:22Z|00876|binding|INFO|Setting lport ab5264b7-ec64-46dd-b30d-981799387571 up in Southbound
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.495 140354 INFO neutron.agent.ovn.metadata.agent [-] Port ab5264b7-ec64-46dd-b30d-981799387571 in datapath 76c2d716-7d14-4bc1-b83b-a3290ee99d9a bound to our chassis#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.496 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 76c2d716-7d14-4bc1-b83b-a3290ee99d9a#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.496 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.499 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.507 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9f18d1c1-9f3f-48c4-8d43-aa858af68c70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.508 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap76c2d716-71 in ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.510 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap76c2d716-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.510 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f22ee9-0d74-4731-8778-f1f4114c5276]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.511 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd28157-74ac-4a48-a3ad-436b3822fa35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 systemd-udevd[309178]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:20:22 np0005588919 systemd-machined[194361]: New machine qemu-103-instance-000000c5.
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.522 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[2e99f971-bd02-4233-9190-22280d65da73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 NetworkManager[49104]: <info>  [1768922422.5269] device (tapab5264b7-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:20:22 np0005588919 NetworkManager[49104]: <info>  [1768922422.5280] device (tapab5264b7-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:20:22 np0005588919 systemd[1]: Started Virtual Machine qemu-103-instance-000000c5.
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.546 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4905c6-d8e3-48e8-96a3-0b41e671eb71]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.575 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[cdaa1cb6-ef52-45ae-970a-30fd1c7782aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 NetworkManager[49104]: <info>  [1768922422.5812] manager: (tap76c2d716-70): new Veth device (/org/freedesktop/NetworkManager/Devices/366)
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.580 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fc1e74cc-6806-4b57-ba4e-1f0b455020e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.610 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6bc9986b-3e25-4151-8c0f-af3f88629015]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.614 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[66a304e7-3cee-4330-83d8-6e70eb78a425]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 NetworkManager[49104]: <info>  [1768922422.6367] device (tap76c2d716-70): carrier: link connected
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.642 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ada54852-843a-48a0-b26a-ce3a849f12e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.660 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c4aa291b-4a90-40d8-abee-eacae454bc63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76c2d716-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:44:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 248], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746156, 'reachable_time': 15011, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309210, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.677 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6a01f89a-5c2b-4d25-b86c-ef13ce03f4d7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:44ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 746156, 'tstamp': 746156}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 309211, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.696 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c83a8de1-305f-4f57-8458-5139d042997e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76c2d716-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:44:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 248], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746156, 'reachable_time': 15011, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 309212, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.724 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[62883f93-6583-48b8-ab03-f996adaea3bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.781 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[62160ad2-ea8e-46a4-9b88-28cae7061c68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.782 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76c2d716-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.783 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.783 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap76c2d716-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.785 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:22 np0005588919 NetworkManager[49104]: <info>  [1768922422.7861] manager: (tap76c2d716-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Jan 20 10:20:22 np0005588919 kernel: tap76c2d716-70: entered promiscuous mode
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.789 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.791 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap76c2d716-70, col_values=(('external_ids', {'iface-id': '2c0bba0e-e9b6-4ece-8349-62642b94d91d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.792 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:22Z|00877|binding|INFO|Releasing lport 2c0bba0e-e9b6-4ece-8349-62642b94d91d from this chassis (sb_readonly=0)
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.792 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.793 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.794 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2b922365-37e2-4d99-8007-0d96abbf9e32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.795 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-76c2d716-7d14-4bc1-b83b-a3290ee99d9a
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.pid.haproxy
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 76c2d716-7d14-4bc1-b83b-a3290ee99d9a
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:20:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:22.795 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'env', 'PROCESS_TAG=haproxy-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/76c2d716-7d14-4bc1-b83b-a3290ee99d9a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.806 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.960 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922422.9596643, 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.961 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] VM Started (Lifecycle Event)#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.991 225859 DEBUG nova.compute.manager [req-f5ff359d-800f-4750-b361-4c2db76cfb9e req-a3f10dba-d01a-4290-a0b0-2a98eff7ad84 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Received event network-vif-plugged-ab5264b7-ec64-46dd-b30d-981799387571 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.992 225859 DEBUG oslo_concurrency.lockutils [req-f5ff359d-800f-4750-b361-4c2db76cfb9e req-a3f10dba-d01a-4290-a0b0-2a98eff7ad84 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.992 225859 DEBUG oslo_concurrency.lockutils [req-f5ff359d-800f-4750-b361-4c2db76cfb9e req-a3f10dba-d01a-4290-a0b0-2a98eff7ad84 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.992 225859 DEBUG oslo_concurrency.lockutils [req-f5ff359d-800f-4750-b361-4c2db76cfb9e req-a3f10dba-d01a-4290-a0b0-2a98eff7ad84 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.993 225859 DEBUG nova.compute.manager [req-f5ff359d-800f-4750-b361-4c2db76cfb9e req-a3f10dba-d01a-4290-a0b0-2a98eff7ad84 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Processing event network-vif-plugged-ab5264b7-ec64-46dd-b30d-981799387571 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.993 225859 DEBUG nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:20:22 np0005588919 nova_compute[225855]: 2026-01-20 15:20:22.997 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.002 225859 INFO nova.virt.libvirt.driver [-] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Instance spawned successfully.#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.002 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.006 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.009 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.039 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.041 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.041 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.042 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.043 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.043 225859 DEBUG nova.virt.libvirt.driver [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.047 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.047 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922422.9599028, 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.048 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.102 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.107 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922422.997031, 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.108 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:20:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:23.124 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.139 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.143 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.147 225859 INFO nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Took 9.12 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.147 225859 DEBUG nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.174 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.178 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.179 225859 DEBUG nova.network.neutron [req-5a11617e-8c11-4005-9cf8-213fb71664d2 req-0f4ffa8f-cb27-4e7c-a4af-33152b5cf64c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Updated VIF entry in instance network info cache for port ab5264b7-ec64-46dd-b30d-981799387571. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.180 225859 DEBUG nova.network.neutron [req-5a11617e-8c11-4005-9cf8-213fb71664d2 req-0f4ffa8f-cb27-4e7c-a4af-33152b5cf64c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Updating instance_info_cache with network_info: [{"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.199 225859 DEBUG oslo_concurrency.lockutils [req-5a11617e-8c11-4005-9cf8-213fb71664d2 req-0f4ffa8f-cb27-4e7c-a4af-33152b5cf64c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.210 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.234 225859 INFO nova.compute.manager [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Took 10.06 seconds to build instance.#033[00m
Jan 20 10:20:23 np0005588919 nova_compute[225855]: 2026-01-20 15:20:23.259 225859 DEBUG oslo_concurrency.lockutils [None req-dc9a0afd-b1a5-4c8b-b4fc-1789229ae1df cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:23 np0005588919 podman[309286]: 2026-01-20 15:20:23.262084045 +0000 UTC m=+0.058306026 container create ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:20:23 np0005588919 systemd[1]: Started libpod-conmon-ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51.scope.
Jan 20 10:20:23 np0005588919 podman[309286]: 2026-01-20 15:20:23.228681893 +0000 UTC m=+0.024903884 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:20:23 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:20:23 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98994e09dc4916b2814fcd1aedf4c05e63267fe55d9759fece6cec7531df8589/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:20:23 np0005588919 podman[309286]: 2026-01-20 15:20:23.34979318 +0000 UTC m=+0.146015171 container init ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:20:23 np0005588919 podman[309286]: 2026-01-20 15:20:23.357808766 +0000 UTC m=+0.154030747 container start ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 20 10:20:23 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[309301]: [NOTICE]   (309305) : New worker (309307) forked
Jan 20 10:20:23 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[309301]: [NOTICE]   (309305) : Loading success.
Jan 20 10:20:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:23 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:23.416 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:20:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:23.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:23.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:24 np0005588919 nova_compute[225855]: 2026-01-20 15:20:24.842 225859 DEBUG nova.compute.manager [req-d86ea6cb-fab0-41be-9624-427a3a4a8836 req-3fe23a3a-e437-4c8b-89de-e227d8d5dc1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Received event network-changed-ab5264b7-ec64-46dd-b30d-981799387571 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:24 np0005588919 nova_compute[225855]: 2026-01-20 15:20:24.842 225859 DEBUG nova.compute.manager [req-d86ea6cb-fab0-41be-9624-427a3a4a8836 req-3fe23a3a-e437-4c8b-89de-e227d8d5dc1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Refreshing instance network info cache due to event network-changed-ab5264b7-ec64-46dd-b30d-981799387571. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:20:24 np0005588919 nova_compute[225855]: 2026-01-20 15:20:24.843 225859 DEBUG oslo_concurrency.lockutils [req-d86ea6cb-fab0-41be-9624-427a3a4a8836 req-3fe23a3a-e437-4c8b-89de-e227d8d5dc1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:20:24 np0005588919 nova_compute[225855]: 2026-01-20 15:20:24.843 225859 DEBUG oslo_concurrency.lockutils [req-d86ea6cb-fab0-41be-9624-427a3a4a8836 req-3fe23a3a-e437-4c8b-89de-e227d8d5dc1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:20:24 np0005588919 nova_compute[225855]: 2026-01-20 15:20:24.844 225859 DEBUG nova.network.neutron [req-d86ea6cb-fab0-41be-9624-427a3a4a8836 req-3fe23a3a-e437-4c8b-89de-e227d8d5dc1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Refreshing network info cache for port ab5264b7-ec64-46dd-b30d-981799387571 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:20:25 np0005588919 nova_compute[225855]: 2026-01-20 15:20:25.108 225859 DEBUG nova.compute.manager [req-116f9fc1-2df7-43ea-937a-9f5c96459625 req-4ce11330-6966-4750-a9fd-7d5a0752afbb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Received event network-vif-plugged-ab5264b7-ec64-46dd-b30d-981799387571 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:20:25 np0005588919 nova_compute[225855]: 2026-01-20 15:20:25.108 225859 DEBUG oslo_concurrency.lockutils [req-116f9fc1-2df7-43ea-937a-9f5c96459625 req-4ce11330-6966-4750-a9fd-7d5a0752afbb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:25 np0005588919 nova_compute[225855]: 2026-01-20 15:20:25.109 225859 DEBUG oslo_concurrency.lockutils [req-116f9fc1-2df7-43ea-937a-9f5c96459625 req-4ce11330-6966-4750-a9fd-7d5a0752afbb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:25 np0005588919 nova_compute[225855]: 2026-01-20 15:20:25.109 225859 DEBUG oslo_concurrency.lockutils [req-116f9fc1-2df7-43ea-937a-9f5c96459625 req-4ce11330-6966-4750-a9fd-7d5a0752afbb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:25 np0005588919 nova_compute[225855]: 2026-01-20 15:20:25.109 225859 DEBUG nova.compute.manager [req-116f9fc1-2df7-43ea-937a-9f5c96459625 req-4ce11330-6966-4750-a9fd-7d5a0752afbb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] No waiting events found dispatching network-vif-plugged-ab5264b7-ec64-46dd-b30d-981799387571 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:20:25 np0005588919 nova_compute[225855]: 2026-01-20 15:20:25.109 225859 WARNING nova.compute.manager [req-116f9fc1-2df7-43ea-937a-9f5c96459625 req-4ce11330-6966-4750-a9fd-7d5a0752afbb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Received unexpected event network-vif-plugged-ab5264b7-ec64-46dd-b30d-981799387571 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:20:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:25.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:25.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:26 np0005588919 nova_compute[225855]: 2026-01-20 15:20:26.282 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:20:26 np0005588919 nova_compute[225855]: 2026-01-20 15:20:26.960 225859 DEBUG nova.network.neutron [req-d86ea6cb-fab0-41be-9624-427a3a4a8836 req-3fe23a3a-e437-4c8b-89de-e227d8d5dc1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Updated VIF entry in instance network info cache for port ab5264b7-ec64-46dd-b30d-981799387571. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:20:26 np0005588919 nova_compute[225855]: 2026-01-20 15:20:26.961 225859 DEBUG nova.network.neutron [req-d86ea6cb-fab0-41be-9624-427a3a4a8836 req-3fe23a3a-e437-4c8b-89de-e227d8d5dc1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Updating instance_info_cache with network_info: [{"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:20:26 np0005588919 nova_compute[225855]: 2026-01-20 15:20:26.986 225859 DEBUG oslo_concurrency.lockutils [req-d86ea6cb-fab0-41be-9624-427a3a4a8836 req-3fe23a3a-e437-4c8b-89de-e227d8d5dc1c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:20:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:27.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:27.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:28 np0005588919 nova_compute[225855]: 2026-01-20 15:20:28.211 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:20:29.418 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:20:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:29.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:29.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:31 np0005588919 nova_compute[225855]: 2026-01-20 15:20:31.285 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:31.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:31.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:33 np0005588919 nova_compute[225855]: 2026-01-20 15:20:33.214 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:33.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:33.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:35 np0005588919 podman[309422]: 2026-01-20 15:20:35.047774239 +0000 UTC m=+0.081182782 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 20 10:20:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:35.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:35Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3c:70:df 10.100.0.6
Jan 20 10:20:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:35Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3c:70:df 10.100.0.6
Jan 20 10:20:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:35.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:36 np0005588919 nova_compute[225855]: 2026-01-20 15:20:36.289 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:37.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:38.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:38 np0005588919 nova_compute[225855]: 2026-01-20 15:20:38.216 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:39.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:40.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:41 np0005588919 nova_compute[225855]: 2026-01-20 15:20:41.292 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:41.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:42.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:43 np0005588919 nova_compute[225855]: 2026-01-20 15:20:43.218 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:43.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:44.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:45.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:46.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:46 np0005588919 nova_compute[225855]: 2026-01-20 15:20:46.295 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:47.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:48.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:48 np0005588919 nova_compute[225855]: 2026-01-20 15:20:48.221 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:20:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:49.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:20:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:50.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:51 np0005588919 nova_compute[225855]: 2026-01-20 15:20:51.298 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:51.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:52.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:53 np0005588919 podman[309500]: 2026-01-20 15:20:53.034655219 +0000 UTC m=+0.074405240 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:20:53 np0005588919 nova_compute[225855]: 2026-01-20 15:20:53.222 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:53.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:20:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:54.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:20:55 np0005588919 ovn_controller[130490]: 2026-01-20T15:20:55Z|00878|binding|INFO|Releasing lport 2c0bba0e-e9b6-4ece-8349-62642b94d91d from this chassis (sb_readonly=0)
Jan 20 10:20:55 np0005588919 nova_compute[225855]: 2026-01-20 15:20:55.395 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:55.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:56.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:56 np0005588919 nova_compute[225855]: 2026-01-20 15:20:56.300 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:57.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:20:58.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:58 np0005588919 nova_compute[225855]: 2026-01-20 15:20:58.225 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:20:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:20:59 np0005588919 nova_compute[225855]: 2026-01-20 15:20:59.314 225859 DEBUG oslo_concurrency.lockutils [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:59 np0005588919 nova_compute[225855]: 2026-01-20 15:20:59.314 225859 DEBUG oslo_concurrency.lockutils [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:59 np0005588919 nova_compute[225855]: 2026-01-20 15:20:59.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:20:59 np0005588919 nova_compute[225855]: 2026-01-20 15:20:59.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:20:59 np0005588919 nova_compute[225855]: 2026-01-20 15:20:59.369 225859 DEBUG nova.objects.instance [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'flavor' on Instance uuid 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:20:59 np0005588919 nova_compute[225855]: 2026-01-20 15:20:59.482 225859 DEBUG oslo_concurrency.lockutils [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:20:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:20:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:20:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:20:59.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:20:59 np0005588919 nova_compute[225855]: 2026-01-20 15:20:59.892 225859 DEBUG oslo_concurrency.lockutils [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:20:59 np0005588919 nova_compute[225855]: 2026-01-20 15:20:59.893 225859 DEBUG oslo_concurrency.lockutils [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:20:59 np0005588919 nova_compute[225855]: 2026-01-20 15:20:59.893 225859 INFO nova.compute.manager [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Attaching volume f458dd1d-0a83-4853-b1f9-6b4923a44988 to /dev/vdb#033[00m
Jan 20 10:21:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:00.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.364 225859 DEBUG os_brick.utils [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.366 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.376 231081 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.376 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e70ca2-fa21-4e9d-8f1c-33102cab0b8e]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.377 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.383 231081 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.384 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[2a2951ed-ef28-432c-bc11-577ede3ded13]: (4, ('InitiatorName=iqn.1994-05.com.redhat:1821ea3dc03d', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.385 231081 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.391 231081 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.391 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[9c0d8e1c-6b42-434d-a553-b6afc9246238]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.392 231081 DEBUG oslo.privsep.daemon [-] privsep: reply[0c0fdec7-7af7-45b9-a8e2-80eb0db3d9d5]: (4, '870b1f1c-f19c-477b-b282-ee6eeba50974') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.393 225859 DEBUG oslo_concurrency.processutils [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.419 225859 DEBUG oslo_concurrency.processutils [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.422 225859 DEBUG os_brick.initiator.connectors.lightos [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.422 225859 DEBUG os_brick.initiator.connectors.lightos [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.423 225859 DEBUG os_brick.initiator.connectors.lightos [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.423 225859 DEBUG os_brick.utils [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] <== get_connector_properties: return (57ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:1821ea3dc03d', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '870b1f1c-f19c-477b-b282-ee6eeba50974', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 20 10:21:00 np0005588919 nova_compute[225855]: 2026-01-20 15:21:00.423 225859 DEBUG nova.virt.block_device [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Updating existing volume attachment record: dc7fe8a9-25f2-4eb7-8845-f59506394b39 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 20 10:21:01 np0005588919 nova_compute[225855]: 2026-01-20 15:21:01.141 225859 DEBUG nova.objects.instance [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'flavor' on Instance uuid 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:21:01 np0005588919 nova_compute[225855]: 2026-01-20 15:21:01.186 225859 DEBUG nova.virt.libvirt.driver [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Attempting to attach volume f458dd1d-0a83-4853-b1f9-6b4923a44988 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 20 10:21:01 np0005588919 nova_compute[225855]: 2026-01-20 15:21:01.191 225859 DEBUG nova.virt.libvirt.guest [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] attach device xml: <disk type="network" device="disk">
Jan 20 10:21:01 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:21:01 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-f458dd1d-0a83-4853-b1f9-6b4923a44988">
Jan 20 10:21:01 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:21:01 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:21:01 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:21:01 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:21:01 np0005588919 nova_compute[225855]:  <auth username="openstack">
Jan 20 10:21:01 np0005588919 nova_compute[225855]:    <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:21:01 np0005588919 nova_compute[225855]:  </auth>
Jan 20 10:21:01 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:21:01 np0005588919 nova_compute[225855]:  <serial>f458dd1d-0a83-4853-b1f9-6b4923a44988</serial>
Jan 20 10:21:01 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:21:01 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:21:01 np0005588919 nova_compute[225855]: 2026-01-20 15:21:01.301 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:01 np0005588919 nova_compute[225855]: 2026-01-20 15:21:01.315 225859 DEBUG nova.virt.libvirt.driver [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:21:01 np0005588919 nova_compute[225855]: 2026-01-20 15:21:01.316 225859 DEBUG nova.virt.libvirt.driver [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:21:01 np0005588919 nova_compute[225855]: 2026-01-20 15:21:01.316 225859 DEBUG nova.virt.libvirt.driver [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:21:01 np0005588919 nova_compute[225855]: 2026-01-20 15:21:01.316 225859 DEBUG nova.virt.libvirt.driver [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] No VIF found with MAC fa:16:3e:3c:70:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:21:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:01.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:01 np0005588919 nova_compute[225855]: 2026-01-20 15:21:01.543 225859 DEBUG oslo_concurrency.lockutils [None req-4ceac083-89a3-4964-9803-2c53185bb57f cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.650s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #145. Immutable memtables: 0.
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:01.953633) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 91] Flushing memtable with next log file: 145
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922461953753, "job": 91, "event": "flush_started", "num_memtables": 1, "num_entries": 812, "num_deletes": 250, "total_data_size": 1520801, "memory_usage": 1541048, "flush_reason": "Manual Compaction"}
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 91] Level-0 flush table #146: started
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922461963734, "cf_name": "default", "job": 91, "event": "table_file_creation", "file_number": 146, "file_size": 697690, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 71469, "largest_seqno": 72276, "table_properties": {"data_size": 694314, "index_size": 1219, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 9080, "raw_average_key_size": 21, "raw_value_size": 687182, "raw_average_value_size": 1590, "num_data_blocks": 52, "num_entries": 432, "num_filter_entries": 432, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922409, "oldest_key_time": 1768922409, "file_creation_time": 1768922461, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 146, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 91] Flush lasted 10148 microseconds, and 5797 cpu microseconds.
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:01.963792) [db/flush_job.cc:967] [default] [JOB 91] Level-0 flush table #146: 697690 bytes OK
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:01.963819) [db/memtable_list.cc:519] [default] Level-0 commit table #146 started
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:01.965164) [db/memtable_list.cc:722] [default] Level-0 commit table #146: memtable #1 done
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:01.965187) EVENT_LOG_v1 {"time_micros": 1768922461965179, "job": 91, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:01.965211) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 91] Try to delete WAL files size 1516571, prev total WAL file size 1516571, number of live WAL files 2.
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000142.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:01.966058) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032323537' seq:72057594037927935, type:22 .. '6D6772737461740032353038' seq:0, type:0; will stop at (end)
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 92] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 91 Base level 0, inputs: [146(681KB)], [144(12MB)]
Jan 20 10:21:01 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922461966142, "job": 92, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [146], "files_L6": [144], "score": -1, "input_data_size": 14061254, "oldest_snapshot_seqno": -1}
Jan 20 10:21:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:02.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 92] Generated table #147: 9399 keys, 10497953 bytes, temperature: kUnknown
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922462068752, "cf_name": "default", "job": 92, "event": "table_file_creation", "file_number": 147, "file_size": 10497953, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10439140, "index_size": 34188, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23557, "raw_key_size": 248070, "raw_average_key_size": 26, "raw_value_size": 10276073, "raw_average_value_size": 1093, "num_data_blocks": 1293, "num_entries": 9399, "num_filter_entries": 9399, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768922461, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 147, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:02.069057) [db/compaction/compaction_job.cc:1663] [default] [JOB 92] Compacted 1@0 + 1@6 files to L6 => 10497953 bytes
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:02.070972) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.9 rd, 102.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.7 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(35.2) write-amplify(15.0) OK, records in: 9896, records dropped: 497 output_compression: NoCompression
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:02.071031) EVENT_LOG_v1 {"time_micros": 1768922462071008, "job": 92, "event": "compaction_finished", "compaction_time_micros": 102679, "compaction_time_cpu_micros": 29955, "output_level": 6, "num_output_files": 1, "total_output_size": 10497953, "num_input_records": 9896, "num_output_records": 9399, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000146.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922462071506, "job": 92, "event": "table_file_deletion", "file_number": 146}
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000144.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922462074412, "job": 92, "event": "table_file_deletion", "file_number": 144}
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:01.965940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:02.074540) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:02.074546) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:02.074548) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:02.074550) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:02 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:21:02.074551) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.227 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.482 225859 DEBUG oslo_concurrency.lockutils [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.484 225859 DEBUG oslo_concurrency.lockutils [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.496 225859 INFO nova.compute.manager [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Detaching volume f458dd1d-0a83-4853-b1f9-6b4923a44988#033[00m
Jan 20 10:21:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:03.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.616 225859 INFO nova.virt.block_device [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Attempting to driver detach volume f458dd1d-0a83-4853-b1f9-6b4923a44988 from mountpoint /dev/vdb#033[00m
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.624 225859 DEBUG nova.virt.libvirt.driver [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Attempting to detach device vdb from instance 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.625 225859 DEBUG nova.virt.libvirt.guest [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-f458dd1d-0a83-4853-b1f9-6b4923a44988">
Jan 20 10:21:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  <serial>f458dd1d-0a83-4853-b1f9-6b4923a44988</serial>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:21:03 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.634 225859 INFO nova.virt.libvirt.driver [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully detached device vdb from instance 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 from the persistent domain config.#033[00m
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.635 225859 DEBUG nova.virt.libvirt.driver [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.635 225859 DEBUG nova.virt.libvirt.guest [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] detach device xml: <disk type="network" device="disk">
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  <source protocol="rbd" name="volumes/volume-f458dd1d-0a83-4853-b1f9-6b4923a44988">
Jan 20 10:21:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.100" port="6789"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.102" port="6789"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:    <host name="192.168.122.101" port="6789"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  </source>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  <target dev="vdb" bus="virtio"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  <serial>f458dd1d-0a83-4853-b1f9-6b4923a44988</serial>
Jan 20 10:21:03 np0005588919 nova_compute[225855]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 20 10:21:03 np0005588919 nova_compute[225855]: </disk>
Jan 20 10:21:03 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.694 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768922463.6939726, 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.696 225859 DEBUG nova.virt.libvirt.driver [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:21:03 np0005588919 nova_compute[225855]: 2026-01-20 15:21:03.699 225859 INFO nova.virt.libvirt.driver [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully detached device vdb from instance 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 from the live domain config.#033[00m
Jan 20 10:21:04 np0005588919 nova_compute[225855]: 2026-01-20 15:21:04.006 225859 DEBUG nova.objects.instance [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'flavor' on Instance uuid 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:21:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:04.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:04 np0005588919 nova_compute[225855]: 2026-01-20 15:21:04.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:04 np0005588919 nova_compute[225855]: 2026-01-20 15:21:04.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:21:04 np0005588919 nova_compute[225855]: 2026-01-20 15:21:04.342 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:21:04 np0005588919 nova_compute[225855]: 2026-01-20 15:21:04.345 225859 DEBUG oslo_concurrency.lockutils [None req-5264b3a9-63fa-429b-bb09-d354dc08e9e9 cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.862s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:04 np0005588919 nova_compute[225855]: 2026-01-20 15:21:04.835 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.095 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.096 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.096 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.096 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.446 225859 DEBUG oslo_concurrency.lockutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.446 225859 DEBUG oslo_concurrency.lockutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.447 225859 DEBUG oslo_concurrency.lockutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.447 225859 DEBUG oslo_concurrency.lockutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.447 225859 DEBUG oslo_concurrency.lockutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.448 225859 INFO nova.compute.manager [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Terminating instance#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.449 225859 DEBUG nova.compute.manager [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:21:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:05.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:05 np0005588919 kernel: tapab5264b7-ec (unregistering): left promiscuous mode
Jan 20 10:21:05 np0005588919 NetworkManager[49104]: <info>  [1768922465.6530] device (tapab5264b7-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.663 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:21:05Z|00879|binding|INFO|Releasing lport ab5264b7-ec64-46dd-b30d-981799387571 from this chassis (sb_readonly=0)
Jan 20 10:21:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:21:05Z|00880|binding|INFO|Setting lport ab5264b7-ec64-46dd-b30d-981799387571 down in Southbound
Jan 20 10:21:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:21:05Z|00881|binding|INFO|Removing iface tapab5264b7-ec ovn-installed in OVS
Jan 20 10:21:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:05.694 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:70:df 10.100.0.6'], port_security=['fa:16:3e:3c:70:df 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7a53c9b1-e64b-4a31-897a-bbe7d964cf45', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9156c0a9920c4721843416b9a44404f9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '77f773a6-dc7f-4790-9c9b-d69f30c72eb2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.203'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bfc4e2a-eeed-480e-aa18-68fc6c8f2cc2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=ab5264b7-ec64-46dd-b30d-981799387571) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:21:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:05.695 140354 INFO neutron.agent.ovn.metadata.agent [-] Port ab5264b7-ec64-46dd-b30d-981799387571 in datapath 76c2d716-7d14-4bc1-b83b-a3290ee99d9a unbound from our chassis#033[00m
Jan 20 10:21:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:05.696 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76c2d716-7d14-4bc1-b83b-a3290ee99d9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:21:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:05.698 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c72d84-6ff5-4ab4-94aa-903432808496]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:05.698 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a namespace which is not needed anymore#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.718 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:05 np0005588919 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d000000c5.scope: Deactivated successfully.
Jan 20 10:21:05 np0005588919 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d000000c5.scope: Consumed 14.588s CPU time.
Jan 20 10:21:05 np0005588919 systemd-machined[194361]: Machine qemu-103-instance-000000c5 terminated.
Jan 20 10:21:05 np0005588919 podman[309611]: 2026-01-20 15:21:05.762791099 +0000 UTC m=+0.089780555 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 20 10:21:05 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[309301]: [NOTICE]   (309305) : haproxy version is 2.8.14-c23fe91
Jan 20 10:21:05 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[309301]: [NOTICE]   (309305) : path to executable is /usr/sbin/haproxy
Jan 20 10:21:05 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[309301]: [WARNING]  (309305) : Exiting Master process...
Jan 20 10:21:05 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[309301]: [ALERT]    (309305) : Current worker (309307) exited with code 143 (Terminated)
Jan 20 10:21:05 np0005588919 neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a[309301]: [WARNING]  (309305) : All workers exited. Exiting... (0)
Jan 20 10:21:05 np0005588919 systemd[1]: libpod-ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51.scope: Deactivated successfully.
Jan 20 10:21:05 np0005588919 podman[309656]: 2026-01-20 15:21:05.843957289 +0000 UTC m=+0.048160090 container died ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:21:05 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51-userdata-shm.mount: Deactivated successfully.
Jan 20 10:21:05 np0005588919 systemd[1]: var-lib-containers-storage-overlay-98994e09dc4916b2814fcd1aedf4c05e63267fe55d9759fece6cec7531df8589-merged.mount: Deactivated successfully.
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.896 225859 INFO nova.virt.libvirt.driver [-] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Instance destroyed successfully.#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.897 225859 DEBUG nova.objects.instance [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lazy-loading 'resources' on Instance uuid 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:21:05 np0005588919 podman[309656]: 2026-01-20 15:21:05.897405977 +0000 UTC m=+0.101608728 container cleanup ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 10:21:05 np0005588919 systemd[1]: libpod-conmon-ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51.scope: Deactivated successfully.
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.970 225859 DEBUG nova.virt.libvirt.vif [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:20:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-512186019',display_name='tempest-AttachVolumeNegativeTest-server-512186019',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-512186019',id=197,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI8d6pfZzu0QN0+ud36hsYGEa2fue/k/EBJ/5AAbAw966Nprd6b6gecK+XPS3vJw5O7JCevyXRxpx1xed28ouQO1W8vY3Q7SPAOn3X0ewiZY79+ulj2hj305nyB4SNFMjQ==',key_name='tempest-keypair-2087250418',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:20:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9156c0a9920c4721843416b9a44404f9',ramdisk_id='',reservation_id='r-1dq7a0u0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeNegativeTest-1505789262',owner_user_name='tempest-AttachVolumeNegativeTest-1505789262-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:20:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cd9a8f26b71f4631a387e555e6b18428',uuid=7a53c9b1-e64b-4a31-897a-bbe7d964cf45,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.971 225859 DEBUG nova.network.os_vif_util [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converting VIF {"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.972 225859 DEBUG nova.network.os_vif_util [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3c:70:df,bridge_name='br-int',has_traffic_filtering=True,id=ab5264b7-ec64-46dd-b30d-981799387571,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab5264b7-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.973 225859 DEBUG os_vif [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3c:70:df,bridge_name='br-int',has_traffic_filtering=True,id=ab5264b7-ec64-46dd-b30d-981799387571,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab5264b7-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.975 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.975 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab5264b7-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.977 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.978 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:05 np0005588919 nova_compute[225855]: 2026-01-20 15:21:05.981 225859 INFO os_vif [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3c:70:df,bridge_name='br-int',has_traffic_filtering=True,id=ab5264b7-ec64-46dd-b30d-981799387571,network=Network(76c2d716-7d14-4bc1-b83b-a3290ee99d9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab5264b7-ec')#033[00m
Jan 20 10:21:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:06.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:06 np0005588919 podman[309702]: 2026-01-20 15:21:06.080546945 +0000 UTC m=+0.158215975 container remove ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:21:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:06.086 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d5cf35bd-3d1d-46c8-b96f-b4d396d078ad]: (4, ('Tue Jan 20 03:21:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a (ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51)\nff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51\nTue Jan 20 03:21:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a (ff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51)\nff71acd4802f5ef48f8b4d2bd8d4bad2170671407bdfd29bae692881ac810c51\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:06.089 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[04401659-6060-4205-9c42-4fd9cd2a1b62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:06.090 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76c2d716-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.092 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:06 np0005588919 kernel: tap76c2d716-70: left promiscuous mode
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.106 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:06.109 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d65e16c2-e2cf-47d7-b0bb-dd1bdcb36089]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:06.123 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a2eddc05-88d9-4c31-9d98-5d8fe501923a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:06.125 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[41345da5-a637-4a5b-a64f-8bc178087c8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:06.141 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[78cb73ca-fa75-4f97-8496-ca5956755e1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746149, 'reachable_time': 15955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309733, 'error': None, 'target': 'ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:06.145 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-76c2d716-7d14-4bc1-b83b-a3290ee99d9a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:21:06 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:06.145 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[e065bab1-4bde-4f36-a65d-4d65c73c6da3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:21:06 np0005588919 systemd[1]: run-netns-ovnmeta\x2d76c2d716\x2d7d14\x2d4bc1\x2db83b\x2da3290ee99d9a.mount: Deactivated successfully.
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.344 225859 DEBUG nova.compute.manager [req-07b90bbf-dcfd-4e60-bc52-f5a1ecb9a4cf req-6a109a83-a898-40d6-ad16-1ebf56c2c436 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Received event network-vif-unplugged-ab5264b7-ec64-46dd-b30d-981799387571 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.345 225859 DEBUG oslo_concurrency.lockutils [req-07b90bbf-dcfd-4e60-bc52-f5a1ecb9a4cf req-6a109a83-a898-40d6-ad16-1ebf56c2c436 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.345 225859 DEBUG oslo_concurrency.lockutils [req-07b90bbf-dcfd-4e60-bc52-f5a1ecb9a4cf req-6a109a83-a898-40d6-ad16-1ebf56c2c436 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.345 225859 DEBUG oslo_concurrency.lockutils [req-07b90bbf-dcfd-4e60-bc52-f5a1ecb9a4cf req-6a109a83-a898-40d6-ad16-1ebf56c2c436 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.345 225859 DEBUG nova.compute.manager [req-07b90bbf-dcfd-4e60-bc52-f5a1ecb9a4cf req-6a109a83-a898-40d6-ad16-1ebf56c2c436 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] No waiting events found dispatching network-vif-unplugged-ab5264b7-ec64-46dd-b30d-981799387571 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.345 225859 DEBUG nova.compute.manager [req-07b90bbf-dcfd-4e60-bc52-f5a1ecb9a4cf req-6a109a83-a898-40d6-ad16-1ebf56c2c436 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Received event network-vif-unplugged-ab5264b7-ec64-46dd-b30d-981799387571 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.655 225859 INFO nova.virt.libvirt.driver [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Deleting instance files /var/lib/nova/instances/7a53c9b1-e64b-4a31-897a-bbe7d964cf45_del#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.656 225859 INFO nova.virt.libvirt.driver [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Deletion of /var/lib/nova/instances/7a53c9b1-e64b-4a31-897a-bbe7d964cf45_del complete#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.976 225859 INFO nova.compute.manager [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Took 1.53 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.977 225859 DEBUG oslo.service.loopingcall [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.978 225859 DEBUG nova.compute.manager [-] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:21:06 np0005588919 nova_compute[225855]: 2026-01-20 15:21:06.978 225859 DEBUG nova.network.neutron [-] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:21:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:07.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.757 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Updating instance_info_cache with network_info: [{"id": "ab5264b7-ec64-46dd-b30d-981799387571", "address": "fa:16:3e:3c:70:df", "network": {"id": "76c2d716-7d14-4bc1-b83b-a3290ee99d9a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-782760714-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9156c0a9920c4721843416b9a44404f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab5264b7-ec", "ovs_interfaceid": "ab5264b7-ec64-46dd-b30d-981799387571", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.845 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-7a53c9b1-e64b-4a31-897a-bbe7d964cf45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.846 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.846 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.847 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.847 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.923 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.924 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.924 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.924 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:21:07 np0005588919 nova_compute[225855]: 2026-01-20 15:21:07.924 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:08.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.135 225859 DEBUG nova.network.neutron [-] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:21:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:08.142 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:21:08 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:08.144 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.145 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.231 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:21:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1320144358' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.370 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.389 225859 INFO nova.compute.manager [-] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Took 1.41 seconds to deallocate network for instance.#033[00m
Jan 20 10:21:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.449 225859 DEBUG nova.compute.manager [req-31348548-716d-4350-af2c-6d66a5381f6a req-7d1cc436-8615-4104-8cbf-89c11823449d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Received event network-vif-deleted-ab5264b7-ec64-46dd-b30d-981799387571 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.471 225859 DEBUG oslo_concurrency.lockutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.471 225859 DEBUG oslo_concurrency.lockutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.549 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.550 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4267MB free_disk=20.94268798828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.550 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.659 225859 DEBUG oslo_concurrency.processutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.800 225859 DEBUG nova.compute.manager [req-8ed1ca59-68fc-4e81-b72b-6b18b3c3876f req-49e253f9-6af9-4564-a9a4-4f987af1b460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Received event network-vif-plugged-ab5264b7-ec64-46dd-b30d-981799387571 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.801 225859 DEBUG oslo_concurrency.lockutils [req-8ed1ca59-68fc-4e81-b72b-6b18b3c3876f req-49e253f9-6af9-4564-a9a4-4f987af1b460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.801 225859 DEBUG oslo_concurrency.lockutils [req-8ed1ca59-68fc-4e81-b72b-6b18b3c3876f req-49e253f9-6af9-4564-a9a4-4f987af1b460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.802 225859 DEBUG oslo_concurrency.lockutils [req-8ed1ca59-68fc-4e81-b72b-6b18b3c3876f req-49e253f9-6af9-4564-a9a4-4f987af1b460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.802 225859 DEBUG nova.compute.manager [req-8ed1ca59-68fc-4e81-b72b-6b18b3c3876f req-49e253f9-6af9-4564-a9a4-4f987af1b460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] No waiting events found dispatching network-vif-plugged-ab5264b7-ec64-46dd-b30d-981799387571 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:21:08 np0005588919 nova_compute[225855]: 2026-01-20 15:21:08.803 225859 WARNING nova.compute.manager [req-8ed1ca59-68fc-4e81-b72b-6b18b3c3876f req-49e253f9-6af9-4564-a9a4-4f987af1b460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Received unexpected event network-vif-plugged-ab5264b7-ec64-46dd-b30d-981799387571 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:21:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:21:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2027985971' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.066 225859 DEBUG oslo_concurrency.processutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.071 225859 DEBUG nova.compute.provider_tree [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.180 225859 DEBUG nova.scheduler.client.report [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.224 225859 DEBUG oslo_concurrency.lockutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.227 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.277 225859 INFO nova.scheduler.client.report [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Deleted allocations for instance 7a53c9b1-e64b-4a31-897a-bbe7d964cf45#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.346 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.346 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.369 225859 DEBUG oslo_concurrency.lockutils [None req-8c4a9634-9fb3-4152-a31f-a9607378737e cd9a8f26b71f4631a387e555e6b18428 9156c0a9920c4721843416b9a44404f9 - - default default] Lock "7a53c9b1-e64b-4a31-897a-bbe7d964cf45" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.371 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:09.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:21:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3327566445' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.838 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.843 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.888 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.919 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:21:09 np0005588919 nova_compute[225855]: 2026-01-20 15:21:09.920 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:10.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:10 np0005588919 nova_compute[225855]: 2026-01-20 15:21:10.413 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:10 np0005588919 nova_compute[225855]: 2026-01-20 15:21:10.413 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:10 np0005588919 nova_compute[225855]: 2026-01-20 15:21:10.414 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:10 np0005588919 nova_compute[225855]: 2026-01-20 15:21:10.978 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:11 np0005588919 nova_compute[225855]: 2026-01-20 15:21:11.543 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:11.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:12.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:13 np0005588919 nova_compute[225855]: 2026-01-20 15:21:13.231 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:13 np0005588919 nova_compute[225855]: 2026-01-20 15:21:13.336 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:13.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:14.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:15.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:15 np0005588919 nova_compute[225855]: 2026-01-20 15:21:15.981 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:16.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:16.442 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:16.442 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:16.442 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:21:17.147 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:21:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:17.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:18.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:18 np0005588919 nova_compute[225855]: 2026-01-20 15:21:18.233 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:19.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:20.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:20 np0005588919 nova_compute[225855]: 2026-01-20 15:21:20.895 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922465.8945441, 7a53c9b1-e64b-4a31-897a-bbe7d964cf45 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:21:20 np0005588919 nova_compute[225855]: 2026-01-20 15:21:20.896 225859 INFO nova.compute.manager [-] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:21:20 np0005588919 nova_compute[225855]: 2026-01-20 15:21:20.924 225859 DEBUG nova.compute.manager [None req-f951f18e-5dc0-473c-b07d-bb8ad696e646 - - - - - -] [instance: 7a53c9b1-e64b-4a31-897a-bbe7d964cf45] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:21:21 np0005588919 nova_compute[225855]: 2026-01-20 15:21:21.007 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:21.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:22.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:22 np0005588919 nova_compute[225855]: 2026-01-20 15:21:22.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:23 np0005588919 nova_compute[225855]: 2026-01-20 15:21:23.235 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:23.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:24 np0005588919 podman[309812]: 2026-01-20 15:21:24.05877668 +0000 UTC m=+0.093192070 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 20 10:21:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:21:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:24.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:21:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:25.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:26 np0005588919 nova_compute[225855]: 2026-01-20 15:21:26.010 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:26.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:21:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:21:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:21:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:21:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:27.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:28.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:28 np0005588919 nova_compute[225855]: 2026-01-20 15:21:28.235 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:29.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:30.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:31 np0005588919 nova_compute[225855]: 2026-01-20 15:21:31.080 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:31.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:32.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:33 np0005588919 nova_compute[225855]: 2026-01-20 15:21:33.237 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:21:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:21:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:33.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:34.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:35.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:36 np0005588919 podman[310075]: 2026-01-20 15:21:36.001703144 +0000 UTC m=+0.051589007 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 20 10:21:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:36.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:36 np0005588919 nova_compute[225855]: 2026-01-20 15:21:36.119 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:37.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:38.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:38 np0005588919 nova_compute[225855]: 2026-01-20 15:21:38.288 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:39.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:40.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:41 np0005588919 nova_compute[225855]: 2026-01-20 15:21:41.121 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:41.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:42.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:43 np0005588919 nova_compute[225855]: 2026-01-20 15:21:43.289 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:43.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:44.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:45.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:46.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:46 np0005588919 nova_compute[225855]: 2026-01-20 15:21:46.123 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:47.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:48.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.292 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.606 225859 DEBUG nova.compute.manager [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.820 225859 DEBUG oslo_concurrency.lockutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.821 225859 DEBUG oslo_concurrency.lockutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.840 225859 DEBUG nova.objects.instance [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_requests' on Instance uuid 65aa2157-f058-4e5c-b448-64cf956310ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.854 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.854 225859 INFO nova.compute.claims [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.855 225859 DEBUG nova.objects.instance [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'resources' on Instance uuid 65aa2157-f058-4e5c-b448-64cf956310ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.863 225859 DEBUG nova.objects.instance [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_devices' on Instance uuid 65aa2157-f058-4e5c-b448-64cf956310ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.938 225859 INFO nova.compute.resource_tracker [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updating resource usage from migration 9b9a1c84-0aa6-44a5-9e94-a94a60dd6646#033[00m
Jan 20 10:21:48 np0005588919 nova_compute[225855]: 2026-01-20 15:21:48.939 225859 DEBUG nova.compute.resource_tracker [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Starting to track incoming migration 9b9a1c84-0aa6-44a5-9e94-a94a60dd6646 with flavor 30c26a27-d918-46d8-a512-4ef3b4ce5955 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 20 10:21:49 np0005588919 nova_compute[225855]: 2026-01-20 15:21:49.061 225859 DEBUG oslo_concurrency.processutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:21:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:21:49 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3610823315' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:21:49 np0005588919 nova_compute[225855]: 2026-01-20 15:21:49.511 225859 DEBUG oslo_concurrency.processutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:21:49 np0005588919 nova_compute[225855]: 2026-01-20 15:21:49.516 225859 DEBUG nova.compute.provider_tree [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:21:49 np0005588919 nova_compute[225855]: 2026-01-20 15:21:49.543 225859 DEBUG nova.scheduler.client.report [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:21:49 np0005588919 nova_compute[225855]: 2026-01-20 15:21:49.575 225859 DEBUG oslo_concurrency.lockutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:49 np0005588919 nova_compute[225855]: 2026-01-20 15:21:49.575 225859 INFO nova.compute.manager [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Migrating#033[00m
Jan 20 10:21:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:49.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:50.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:21:50 np0005588919 systemd-logind[783]: New session 72 of user nova.
Jan 20 10:21:50 np0005588919 systemd[1]: Created slice User Slice of UID 42436.
Jan 20 10:21:50 np0005588919 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 20 10:21:50 np0005588919 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 20 10:21:50 np0005588919 systemd[1]: Starting User Manager for UID 42436...
Jan 20 10:21:50 np0005588919 systemd[310179]: Queued start job for default target Main User Target.
Jan 20 10:21:51 np0005588919 systemd[310179]: Created slice User Application Slice.
Jan 20 10:21:51 np0005588919 systemd[310179]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 20 10:21:51 np0005588919 systemd[310179]: Started Daily Cleanup of User's Temporary Directories.
Jan 20 10:21:51 np0005588919 systemd[310179]: Reached target Paths.
Jan 20 10:21:51 np0005588919 systemd[310179]: Reached target Timers.
Jan 20 10:21:51 np0005588919 systemd[310179]: Starting D-Bus User Message Bus Socket...
Jan 20 10:21:51 np0005588919 systemd[310179]: Starting Create User's Volatile Files and Directories...
Jan 20 10:21:51 np0005588919 systemd[310179]: Finished Create User's Volatile Files and Directories.
Jan 20 10:21:51 np0005588919 systemd[310179]: Listening on D-Bus User Message Bus Socket.
Jan 20 10:21:51 np0005588919 systemd[310179]: Reached target Sockets.
Jan 20 10:21:51 np0005588919 systemd[310179]: Reached target Basic System.
Jan 20 10:21:51 np0005588919 systemd[310179]: Reached target Main User Target.
Jan 20 10:21:51 np0005588919 systemd[310179]: Startup finished in 131ms.
Jan 20 10:21:51 np0005588919 systemd[1]: Started User Manager for UID 42436.
Jan 20 10:21:51 np0005588919 systemd[1]: Started Session 72 of User nova.
Jan 20 10:21:51 np0005588919 systemd[1]: session-72.scope: Deactivated successfully.
Jan 20 10:21:51 np0005588919 systemd-logind[783]: Session 72 logged out. Waiting for processes to exit.
Jan 20 10:21:51 np0005588919 systemd-logind[783]: Removed session 72.
Jan 20 10:21:51 np0005588919 nova_compute[225855]: 2026-01-20 15:21:51.124 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:51 np0005588919 systemd-logind[783]: New session 74 of user nova.
Jan 20 10:21:51 np0005588919 systemd[1]: Started Session 74 of User nova.
Jan 20 10:21:51 np0005588919 systemd[1]: session-74.scope: Deactivated successfully.
Jan 20 10:21:51 np0005588919 systemd-logind[783]: Session 74 logged out. Waiting for processes to exit.
Jan 20 10:21:51 np0005588919 systemd-logind[783]: Removed session 74.
Jan 20 10:21:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:51.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:52.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:53 np0005588919 nova_compute[225855]: 2026-01-20 15:21:53.297 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:53.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:54.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:54 np0005588919 nova_compute[225855]: 2026-01-20 15:21:54.245 225859 DEBUG nova.compute.manager [req-384c106f-c8da-4c7d-9edd-b7cf32361a1d req-5b8050a1-452d-45ac-bafa-e98f6c2fa345 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received event network-vif-unplugged-2eefbfcb-7c22-4c45-bb7b-75319242796c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:21:54 np0005588919 nova_compute[225855]: 2026-01-20 15:21:54.245 225859 DEBUG oslo_concurrency.lockutils [req-384c106f-c8da-4c7d-9edd-b7cf32361a1d req-5b8050a1-452d-45ac-bafa-e98f6c2fa345 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:54 np0005588919 nova_compute[225855]: 2026-01-20 15:21:54.245 225859 DEBUG oslo_concurrency.lockutils [req-384c106f-c8da-4c7d-9edd-b7cf32361a1d req-5b8050a1-452d-45ac-bafa-e98f6c2fa345 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:54 np0005588919 nova_compute[225855]: 2026-01-20 15:21:54.246 225859 DEBUG oslo_concurrency.lockutils [req-384c106f-c8da-4c7d-9edd-b7cf32361a1d req-5b8050a1-452d-45ac-bafa-e98f6c2fa345 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:54 np0005588919 nova_compute[225855]: 2026-01-20 15:21:54.246 225859 DEBUG nova.compute.manager [req-384c106f-c8da-4c7d-9edd-b7cf32361a1d req-5b8050a1-452d-45ac-bafa-e98f6c2fa345 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] No waiting events found dispatching network-vif-unplugged-2eefbfcb-7c22-4c45-bb7b-75319242796c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:21:54 np0005588919 nova_compute[225855]: 2026-01-20 15:21:54.246 225859 WARNING nova.compute.manager [req-384c106f-c8da-4c7d-9edd-b7cf32361a1d req-5b8050a1-452d-45ac-bafa-e98f6c2fa345 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received unexpected event network-vif-unplugged-2eefbfcb-7c22-4c45-bb7b-75319242796c for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 20 10:21:55 np0005588919 podman[310203]: 2026-01-20 15:21:55.072602802 +0000 UTC m=+0.120346137 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:21:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:21:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:55.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:21:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:56.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:56 np0005588919 nova_compute[225855]: 2026-01-20 15:21:56.177 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:56 np0005588919 nova_compute[225855]: 2026-01-20 15:21:56.411 225859 DEBUG nova.compute.manager [req-c0dd9026-9437-4203-a4c8-b0b1f917be34 req-116c3512-6ea9-4060-b99e-5eeb18df2d6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received event network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:21:56 np0005588919 nova_compute[225855]: 2026-01-20 15:21:56.412 225859 DEBUG oslo_concurrency.lockutils [req-c0dd9026-9437-4203-a4c8-b0b1f917be34 req-116c3512-6ea9-4060-b99e-5eeb18df2d6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:21:56 np0005588919 nova_compute[225855]: 2026-01-20 15:21:56.413 225859 DEBUG oslo_concurrency.lockutils [req-c0dd9026-9437-4203-a4c8-b0b1f917be34 req-116c3512-6ea9-4060-b99e-5eeb18df2d6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:21:56 np0005588919 nova_compute[225855]: 2026-01-20 15:21:56.413 225859 DEBUG oslo_concurrency.lockutils [req-c0dd9026-9437-4203-a4c8-b0b1f917be34 req-116c3512-6ea9-4060-b99e-5eeb18df2d6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:21:56 np0005588919 nova_compute[225855]: 2026-01-20 15:21:56.413 225859 DEBUG nova.compute.manager [req-c0dd9026-9437-4203-a4c8-b0b1f917be34 req-116c3512-6ea9-4060-b99e-5eeb18df2d6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] No waiting events found dispatching network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:21:56 np0005588919 nova_compute[225855]: 2026-01-20 15:21:56.413 225859 WARNING nova.compute.manager [req-c0dd9026-9437-4203-a4c8-b0b1f917be34 req-116c3512-6ea9-4060-b99e-5eeb18df2d6b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received unexpected event network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 20 10:21:57 np0005588919 nova_compute[225855]: 2026-01-20 15:21:57.053 225859 INFO nova.network.neutron [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updating port 2eefbfcb-7c22-4c45-bb7b-75319242796c with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 20 10:21:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:21:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:57.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:21:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:21:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:21:58.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:21:58 np0005588919 nova_compute[225855]: 2026-01-20 15:21:58.275 225859 DEBUG oslo_concurrency.lockutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:21:58 np0005588919 nova_compute[225855]: 2026-01-20 15:21:58.275 225859 DEBUG oslo_concurrency.lockutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquired lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:21:58 np0005588919 nova_compute[225855]: 2026-01-20 15:21:58.275 225859 DEBUG nova.network.neutron [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:21:58 np0005588919 nova_compute[225855]: 2026-01-20 15:21:58.300 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:21:58 np0005588919 nova_compute[225855]: 2026-01-20 15:21:58.421 225859 DEBUG nova.compute.manager [req-77276c5d-236f-4ee6-af46-35521532af4d req-d09a6e41-cd28-413f-ae3a-b5d4adcb2c3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received event network-changed-2eefbfcb-7c22-4c45-bb7b-75319242796c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:21:58 np0005588919 nova_compute[225855]: 2026-01-20 15:21:58.421 225859 DEBUG nova.compute.manager [req-77276c5d-236f-4ee6-af46-35521532af4d req-d09a6e41-cd28-413f-ae3a-b5d4adcb2c3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Refreshing instance network info cache due to event network-changed-2eefbfcb-7c22-4c45-bb7b-75319242796c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:21:58 np0005588919 nova_compute[225855]: 2026-01-20 15:21:58.422 225859 DEBUG oslo_concurrency.lockutils [req-77276c5d-236f-4ee6-af46-35521532af4d req-d09a6e41-cd28-413f-ae3a-b5d4adcb2c3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:21:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:21:58 np0005588919 ovn_controller[130490]: 2026-01-20T15:21:58Z|00882|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 20 10:21:59 np0005588919 nova_compute[225855]: 2026-01-20 15:21:59.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:21:59 np0005588919 nova_compute[225855]: 2026-01-20 15:21:59.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:21:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:21:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:21:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:21:59.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:00.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.179 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:01 np0005588919 systemd[1]: Stopping User Manager for UID 42436...
Jan 20 10:22:01 np0005588919 systemd[310179]: Activating special unit Exit the Session...
Jan 20 10:22:01 np0005588919 systemd[310179]: Stopped target Main User Target.
Jan 20 10:22:01 np0005588919 systemd[310179]: Stopped target Basic System.
Jan 20 10:22:01 np0005588919 systemd[310179]: Stopped target Paths.
Jan 20 10:22:01 np0005588919 systemd[310179]: Stopped target Sockets.
Jan 20 10:22:01 np0005588919 systemd[310179]: Stopped target Timers.
Jan 20 10:22:01 np0005588919 systemd[310179]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 20 10:22:01 np0005588919 systemd[310179]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 20 10:22:01 np0005588919 systemd[310179]: Closed D-Bus User Message Bus Socket.
Jan 20 10:22:01 np0005588919 systemd[310179]: Stopped Create User's Volatile Files and Directories.
Jan 20 10:22:01 np0005588919 systemd[310179]: Removed slice User Application Slice.
Jan 20 10:22:01 np0005588919 systemd[310179]: Reached target Shutdown.
Jan 20 10:22:01 np0005588919 systemd[310179]: Finished Exit the Session.
Jan 20 10:22:01 np0005588919 systemd[310179]: Reached target Exit the Session.
Jan 20 10:22:01 np0005588919 systemd[1]: user@42436.service: Deactivated successfully.
Jan 20 10:22:01 np0005588919 systemd[1]: Stopped User Manager for UID 42436.
Jan 20 10:22:01 np0005588919 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 20 10:22:01 np0005588919 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 20 10:22:01 np0005588919 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 20 10:22:01 np0005588919 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 20 10:22:01 np0005588919 systemd[1]: Removed slice User Slice of UID 42436.
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.480 225859 DEBUG nova.network.neutron [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updating instance_info_cache with network_info: [{"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.517 225859 DEBUG oslo_concurrency.lockutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Releasing lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.521 225859 DEBUG oslo_concurrency.lockutils [req-77276c5d-236f-4ee6-af46-35521532af4d req-d09a6e41-cd28-413f-ae3a-b5d4adcb2c3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.521 225859 DEBUG nova.network.neutron [req-77276c5d-236f-4ee6-af46-35521532af4d req-d09a6e41-cd28-413f-ae3a-b5d4adcb2c3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Refreshing network info cache for port 2eefbfcb-7c22-4c45-bb7b-75319242796c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.658 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.660 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.660 225859 INFO nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Creating image(s)#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.695 225859 DEBUG nova.storage.rbd_utils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] creating snapshot(nova-resize) on rbd image(65aa2157-f058-4e5c-b448-64cf956310ba_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 20 10:22:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:01.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:01 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e416 e416: 3 total, 3 up, 3 in
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.855 225859 DEBUG nova.objects.instance [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'trusted_certs' on Instance uuid 65aa2157-f058-4e5c-b448-64cf956310ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.954 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.954 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Ensure instance console log exists: /var/lib/nova/instances/65aa2157-f058-4e5c-b448-64cf956310ba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.955 225859 DEBUG oslo_concurrency.lockutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.955 225859 DEBUG oslo_concurrency.lockutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.955 225859 DEBUG oslo_concurrency.lockutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.958 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Start _get_guest_xml network_info=[{"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1810657086", "vif_mac": "fa:16:3e:7a:99:f8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.963 225859 WARNING nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.971 225859 DEBUG nova.virt.libvirt.host [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.971 225859 DEBUG nova.virt.libvirt.host [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.974 225859 DEBUG nova.virt.libvirt.host [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.975 225859 DEBUG nova.virt.libvirt.host [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.976 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.976 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='30c26a27-d918-46d8-a512-4ef3b4ce5955',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.976 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.976 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.977 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.977 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.977 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.977 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.977 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.978 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.978 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.978 225859 DEBUG nova.virt.hardware [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.978 225859 DEBUG nova.objects.instance [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'vcpu_model' on Instance uuid 65aa2157-f058-4e5c-b448-64cf956310ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:22:01 np0005588919 nova_compute[225855]: 2026-01-20 15:22:01.993 225859 DEBUG oslo_concurrency.processutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:02.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:22:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1804081147' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.433 225859 DEBUG oslo_concurrency.processutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.469 225859 DEBUG oslo_concurrency.processutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.881 225859 DEBUG nova.network.neutron [req-77276c5d-236f-4ee6-af46-35521532af4d req-d09a6e41-cd28-413f-ae3a-b5d4adcb2c3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updated VIF entry in instance network info cache for port 2eefbfcb-7c22-4c45-bb7b-75319242796c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.882 225859 DEBUG nova.network.neutron [req-77276c5d-236f-4ee6-af46-35521532af4d req-d09a6e41-cd28-413f-ae3a-b5d4adcb2c3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updating instance_info_cache with network_info: [{"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.905 225859 DEBUG oslo_concurrency.lockutils [req-77276c5d-236f-4ee6-af46-35521532af4d req-d09a6e41-cd28-413f-ae3a-b5d4adcb2c3c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:22:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:22:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1822320099' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.988 225859 DEBUG oslo_concurrency.processutils [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.989 225859 DEBUG nova.virt.libvirt.vif [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:21:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-890684112',display_name='tempest-TestNetworkAdvancedServerOps-server-890684112',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-890684112',id=198,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBESV7XYmzz1neUUH7k/g2EXDk6RAN24jF19myyoRv6wDjFXd5E2VXPhzcf3Q2CFmKA+oZARXh9ZLZnZRzD1iPeEGFbgLb8nt50MGrmQlAcYMGRSCqrzrniFYSfPnybQWNg==',key_name='tempest-TestNetworkAdvancedServerOps-1160843308',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:21:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-n64n905g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:21:55Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=65aa2157-f058-4e5c-b448-64cf956310ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1810657086", "vif_mac": "fa:16:3e:7a:99:f8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.990 225859 DEBUG nova.network.os_vif_util [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1810657086", "vif_mac": "fa:16:3e:7a:99:f8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.991 225859 DEBUG nova.network.os_vif_util [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:99:f8,bridge_name='br-int',has_traffic_filtering=True,id=2eefbfcb-7c22-4c45-bb7b-75319242796c,network=Network(6eb3ab38-e480-46b8-ae2d-d286fe61de3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eefbfcb-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.994 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <uuid>65aa2157-f058-4e5c-b448-64cf956310ba</uuid>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <name>instance-000000c6</name>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <memory>196608</memory>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-890684112</nova:name>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:22:01</nova:creationTime>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.micro">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <nova:memory>192</nova:memory>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <nova:port uuid="2eefbfcb-7c22-4c45-bb7b-75319242796c">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <entry name="serial">65aa2157-f058-4e5c-b448-64cf956310ba</entry>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <entry name="uuid">65aa2157-f058-4e5c-b448-64cf956310ba</entry>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/65aa2157-f058-4e5c-b448-64cf956310ba_disk">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/65aa2157-f058-4e5c-b448-64cf956310ba_disk.config">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:7a:99:f8"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <target dev="tap2eefbfcb-7c"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/65aa2157-f058-4e5c-b448-64cf956310ba/console.log" append="off"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:22:02 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:22:02 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:22:02 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:22:02 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.995 225859 DEBUG nova.virt.libvirt.vif [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:21:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-890684112',display_name='tempest-TestNetworkAdvancedServerOps-server-890684112',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-890684112',id=198,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBESV7XYmzz1neUUH7k/g2EXDk6RAN24jF19myyoRv6wDjFXd5E2VXPhzcf3Q2CFmKA+oZARXh9ZLZnZRzD1iPeEGFbgLb8nt50MGrmQlAcYMGRSCqrzrniFYSfPnybQWNg==',key_name='tempest-TestNetworkAdvancedServerOps-1160843308',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:21:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-n64n905g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:21:55Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=65aa2157-f058-4e5c-b448-64cf956310ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1810657086", "vif_mac": "fa:16:3e:7a:99:f8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.996 225859 DEBUG nova.network.os_vif_util [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1810657086", "vif_mac": "fa:16:3e:7a:99:f8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.996 225859 DEBUG nova.network.os_vif_util [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:99:f8,bridge_name='br-int',has_traffic_filtering=True,id=2eefbfcb-7c22-4c45-bb7b-75319242796c,network=Network(6eb3ab38-e480-46b8-ae2d-d286fe61de3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eefbfcb-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.997 225859 DEBUG os_vif [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:99:f8,bridge_name='br-int',has_traffic_filtering=True,id=2eefbfcb-7c22-4c45-bb7b-75319242796c,network=Network(6eb3ab38-e480-46b8-ae2d-d286fe61de3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eefbfcb-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.997 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.998 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:02 np0005588919 nova_compute[225855]: 2026-01-20 15:22:02.998 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.001 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.001 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2eefbfcb-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.002 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2eefbfcb-7c, col_values=(('external_ids', {'iface-id': '2eefbfcb-7c22-4c45-bb7b-75319242796c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:99:f8', 'vm-uuid': '65aa2157-f058-4e5c-b448-64cf956310ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.003 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588919 NetworkManager[49104]: <info>  [1768922523.0048] manager: (tap2eefbfcb-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/368)
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.008 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.010 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.011 225859 INFO os_vif [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:99:f8,bridge_name='br-int',has_traffic_filtering=True,id=2eefbfcb-7c22-4c45-bb7b-75319242796c,network=Network(6eb3ab38-e480-46b8-ae2d-d286fe61de3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eefbfcb-7c')#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.067 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.068 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.068 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No VIF found with MAC fa:16:3e:7a:99:f8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.069 225859 INFO nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Using config drive#033[00m
Jan 20 10:22:03 np0005588919 kernel: tap2eefbfcb-7c: entered promiscuous mode
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.160 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588919 NetworkManager[49104]: <info>  [1768922523.1617] manager: (tap2eefbfcb-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/369)
Jan 20 10:22:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:22:03Z|00883|binding|INFO|Claiming lport 2eefbfcb-7c22-4c45-bb7b-75319242796c for this chassis.
Jan 20 10:22:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:22:03Z|00884|binding|INFO|2eefbfcb-7c22-4c45-bb7b-75319242796c: Claiming fa:16:3e:7a:99:f8 10.100.0.4
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.170 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:99:f8 10.100.0.4'], port_security=['fa:16:3e:7a:99:f8 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '65aa2157-f058-4e5c-b448-64cf956310ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6eb3ab38-e480-46b8-ae2d-d286fe61de3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a9217303-0a2c-4a19-a65b-396cb455c1f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93c7ac88-5c28-4609-8d16-8949ae99e457, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=2eefbfcb-7c22-4c45-bb7b-75319242796c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.171 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 2eefbfcb-7c22-4c45-bb7b-75319242796c in datapath 6eb3ab38-e480-46b8-ae2d-d286fe61de3c bound to our chassis#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.172 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6eb3ab38-e480-46b8-ae2d-d286fe61de3c#033[00m
Jan 20 10:22:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:22:03Z|00885|binding|INFO|Setting lport 2eefbfcb-7c22-4c45-bb7b-75319242796c ovn-installed in OVS
Jan 20 10:22:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:22:03Z|00886|binding|INFO|Setting lport 2eefbfcb-7c22-4c45-bb7b-75319242796c up in Southbound
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.180 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.184 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.184 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4db45d-49c6-4f88-889c-af6de445e85c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.185 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6eb3ab38-e1 in ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.187 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6eb3ab38-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.187 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c19228ae-1cd3-4fa9-86cd-bffeeae55b4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.189 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd1bf0b-c912-4ee1-9d46-8a9388981a26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 systemd-machined[194361]: New machine qemu-104-instance-000000c6.
Jan 20 10:22:03 np0005588919 systemd-udevd[310402]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.201 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[72659f65-4d5e-4940-92fb-d71b3e4ee0a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 systemd[1]: Started Virtual Machine qemu-104-instance-000000c6.
Jan 20 10:22:03 np0005588919 NetworkManager[49104]: <info>  [1768922523.2079] device (tap2eefbfcb-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:22:03 np0005588919 NetworkManager[49104]: <info>  [1768922523.2085] device (tap2eefbfcb-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.216 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3f3d24cb-28fb-44e0-9772-01f11bc9581e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.246 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ced301ae-070e-4cae-bcd9-10b855f3de33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 NetworkManager[49104]: <info>  [1768922523.2517] manager: (tap6eb3ab38-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/370)
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.251 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b06ed79d-700c-4ccc-ab94-799a34b01705]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.285 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd049be-4d21-40a7-a8df-c614f27d6a38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.288 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[abd9236a-7e8b-4fe1-9bde-af467dc6b3c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.300 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588919 NetworkManager[49104]: <info>  [1768922523.3104] device (tap6eb3ab38-e0): carrier: link connected
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.316 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4d01f879-28c5-4fad-8ba3-4a87c80cd384]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.334 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[dab38f46-c3df-4561-ac8d-6da6cde3e169]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6eb3ab38-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:42:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 251], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 756223, 'reachable_time': 15392, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310434, 'error': None, 'target': 'ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.351 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a4018002-b985-4747-af29-cb29ddc078db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:428f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 756223, 'tstamp': 756223}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 310435, 'error': None, 'target': 'ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.372 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f74302e1-f2ad-47bf-af40-44e4330e6b42]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6eb3ab38-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:42:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 251], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 756223, 'reachable_time': 15392, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 310436, 'error': None, 'target': 'ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.403 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2a6e0e21-0b59-44cd-b26e-83f1d4e9827e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.457 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bb1b0e-4e56-4d61-81a6-c07dbbb5a004]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.459 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6eb3ab38-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.459 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.459 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6eb3ab38-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:03 np0005588919 kernel: tap6eb3ab38-e0: entered promiscuous mode
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.461 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588919 NetworkManager[49104]: <info>  [1768922523.4617] manager: (tap6eb3ab38-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/371)
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.465 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6eb3ab38-e0, col_values=(('external_ids', {'iface-id': 'f6896e14-17f7-4c25-9eea-77cd7f8fe02c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.466 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:22:03Z|00887|binding|INFO|Releasing lport f6896e14-17f7-4c25-9eea-77cd7f8fe02c from this chassis (sb_readonly=0)
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.468 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6eb3ab38-e480-46b8-ae2d-d286fe61de3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6eb3ab38-e480-46b8-ae2d-d286fe61de3c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.479 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4af1baba-b958-462b-8ea5-dc64b5fbc000]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.480 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-6eb3ab38-e480-46b8-ae2d-d286fe61de3c
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/6eb3ab38-e480-46b8-ae2d-d286fe61de3c.pid.haproxy
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 6eb3ab38-e480-46b8-ae2d-d286fe61de3c
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.480 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:03.480 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c', 'env', 'PROCESS_TAG=haproxy-6eb3ab38-e480-46b8-ae2d-d286fe61de3c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6eb3ab38-e480-46b8-ae2d-d286fe61de3c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:22:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:03.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.779 225859 DEBUG nova.compute.manager [req-83a6fbbf-3571-4f8e-88b6-d8bc4a976c99 req-b7ac87d6-24d0-4a16-b5ee-50c2882cb493 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received event network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.780 225859 DEBUG oslo_concurrency.lockutils [req-83a6fbbf-3571-4f8e-88b6-d8bc4a976c99 req-b7ac87d6-24d0-4a16-b5ee-50c2882cb493 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.780 225859 DEBUG oslo_concurrency.lockutils [req-83a6fbbf-3571-4f8e-88b6-d8bc4a976c99 req-b7ac87d6-24d0-4a16-b5ee-50c2882cb493 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.780 225859 DEBUG oslo_concurrency.lockutils [req-83a6fbbf-3571-4f8e-88b6-d8bc4a976c99 req-b7ac87d6-24d0-4a16-b5ee-50c2882cb493 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.780 225859 DEBUG nova.compute.manager [req-83a6fbbf-3571-4f8e-88b6-d8bc4a976c99 req-b7ac87d6-24d0-4a16-b5ee-50c2882cb493 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] No waiting events found dispatching network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:22:03 np0005588919 nova_compute[225855]: 2026-01-20 15:22:03.781 225859 WARNING nova.compute.manager [req-83a6fbbf-3571-4f8e-88b6-d8bc4a976c99 req-b7ac87d6-24d0-4a16-b5ee-50c2882cb493 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received unexpected event network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c for instance with vm_state active and task_state resize_finish.#033[00m
Jan 20 10:22:03 np0005588919 podman[310468]: 2026-01-20 15:22:03.802645608 +0000 UTC m=+0.024427660 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:22:04 np0005588919 podman[310468]: 2026-01-20 15:22:04.043832775 +0000 UTC m=+0.265614827 container create c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:22:04 np0005588919 systemd[1]: Started libpod-conmon-c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354.scope.
Jan 20 10:22:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:04.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:04 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:22:04 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9a8ddd055a857b793a3cbb612eba64ac8fa8b721d637f2000dc6e7f650142a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:22:04 np0005588919 podman[310468]: 2026-01-20 15:22:04.170626013 +0000 UTC m=+0.392408065 container init c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 20 10:22:04 np0005588919 podman[310468]: 2026-01-20 15:22:04.177059464 +0000 UTC m=+0.398841506 container start c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:22:04 np0005588919 neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c[310482]: [NOTICE]   (310511) : New worker (310522) forked
Jan 20 10:22:04 np0005588919 neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c[310482]: [NOTICE]   (310511) : Loading success.
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.360 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922524.3597896, 65aa2157-f058-4e5c-b448-64cf956310ba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.360 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.362 225859 DEBUG nova.compute.manager [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.365 225859 INFO nova.virt.libvirt.driver [-] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Instance running successfully.#033[00m
Jan 20 10:22:04 np0005588919 virtqemud[225396]: argument unsupported: QEMU guest agent is not configured
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.368 225859 DEBUG nova.virt.libvirt.guest [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.368 225859 DEBUG nova.virt.libvirt.driver [None req-2dfa6b97-6b8c-4dc9-a6b3-7f0c65eddc84 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.375 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.375 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 65aa2157-f058-4e5c-b448-64cf956310ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.393 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.399 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.453 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.453 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922524.361839, 65aa2157-f058-4e5c-b448-64cf956310ba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.454 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] VM Started (Lifecycle Event)#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.476 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:22:04 np0005588919 nova_compute[225855]: 2026-01-20 15:22:04.480 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:22:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:05.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:06.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:06 np0005588919 nova_compute[225855]: 2026-01-20 15:22:06.142 225859 DEBUG nova.compute.manager [req-f7e34805-2e49-4fd7-98e5-6b9b544da6bd req-9a930593-e911-4620-9244-ae9145ec9e0f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received event network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:22:06 np0005588919 nova_compute[225855]: 2026-01-20 15:22:06.143 225859 DEBUG oslo_concurrency.lockutils [req-f7e34805-2e49-4fd7-98e5-6b9b544da6bd req-9a930593-e911-4620-9244-ae9145ec9e0f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:06 np0005588919 nova_compute[225855]: 2026-01-20 15:22:06.143 225859 DEBUG oslo_concurrency.lockutils [req-f7e34805-2e49-4fd7-98e5-6b9b544da6bd req-9a930593-e911-4620-9244-ae9145ec9e0f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:06 np0005588919 nova_compute[225855]: 2026-01-20 15:22:06.143 225859 DEBUG oslo_concurrency.lockutils [req-f7e34805-2e49-4fd7-98e5-6b9b544da6bd req-9a930593-e911-4620-9244-ae9145ec9e0f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:06 np0005588919 nova_compute[225855]: 2026-01-20 15:22:06.143 225859 DEBUG nova.compute.manager [req-f7e34805-2e49-4fd7-98e5-6b9b544da6bd req-9a930593-e911-4620-9244-ae9145ec9e0f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] No waiting events found dispatching network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:22:06 np0005588919 nova_compute[225855]: 2026-01-20 15:22:06.144 225859 WARNING nova.compute.manager [req-f7e34805-2e49-4fd7-98e5-6b9b544da6bd req-9a930593-e911-4620-9244-ae9145ec9e0f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received unexpected event network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c for instance with vm_state resized and task_state None.#033[00m
Jan 20 10:22:06 np0005588919 podman[310590]: 2026-01-20 15:22:06.218790961 +0000 UTC m=+0.058604094 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.363 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updating instance_info_cache with network_info: [{"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.507 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.507 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.508 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.508 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.508 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.600 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.601 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.602 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.602 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:22:07 np0005588919 nova_compute[225855]: 2026-01-20 15:22:07.603 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:07.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.004 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:22:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3733158156' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.051 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:08.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.275 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.276 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.302 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.435 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.436 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4104MB free_disk=20.897098541259766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.436 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.437 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.535 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Applying migration context for instance 65aa2157-f058-4e5c-b448-64cf956310ba as it has an incoming, in-progress migration 9b9a1c84-0aa6-44a5-9e94-a94a60dd6646. Migration status is confirming _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.536 225859 INFO nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updating resource usage from migration 9b9a1c84-0aa6-44a5-9e94-a94a60dd6646#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.573 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 65aa2157-f058-4e5c-b448-64cf956310ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.573 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.573 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:22:08 np0005588919 nova_compute[225855]: 2026-01-20 15:22:08.609 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:22:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4092149460' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:22:09 np0005588919 nova_compute[225855]: 2026-01-20 15:22:09.030 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:09 np0005588919 nova_compute[225855]: 2026-01-20 15:22:09.035 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:22:09 np0005588919 nova_compute[225855]: 2026-01-20 15:22:09.090 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:22:09 np0005588919 nova_compute[225855]: 2026-01-20 15:22:09.114 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:22:09 np0005588919 nova_compute[225855]: 2026-01-20 15:22:09.115 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:09.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:09 np0005588919 nova_compute[225855]: 2026-01-20 15:22:09.946 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:09 np0005588919 nova_compute[225855]: 2026-01-20 15:22:09.947 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:10.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:10 np0005588919 nova_compute[225855]: 2026-01-20 15:22:10.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:10.593 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:22:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:10.594 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:22:10 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:10.595 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:10 np0005588919 nova_compute[225855]: 2026-01-20 15:22:10.596 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e417 e417: 3 total, 3 up, 3 in
Jan 20 10:22:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:11.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:12.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:13 np0005588919 nova_compute[225855]: 2026-01-20 15:22:13.007 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:13 np0005588919 nova_compute[225855]: 2026-01-20 15:22:13.306 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:13 np0005588919 nova_compute[225855]: 2026-01-20 15:22:13.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:22:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:22:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1403667597' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:22:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:22:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1403667597' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:22:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:13.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:14.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:15.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:16.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:22:16 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/170002193' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:22:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:22:16 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/170002193' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:22:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:16.443 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:16.444 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:16.444 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:17.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:18 np0005588919 ovn_controller[130490]: 2026-01-20T15:22:18Z|00106|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7a:99:f8 10.100.0.4
Jan 20 10:22:18 np0005588919 nova_compute[225855]: 2026-01-20 15:22:18.009 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:18.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:18 np0005588919 nova_compute[225855]: 2026-01-20 15:22:18.305 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:19.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:20.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:21.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:22.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:22:22Z|00888|binding|INFO|Releasing lport f6896e14-17f7-4c25-9eea-77cd7f8fe02c from this chassis (sb_readonly=0)
Jan 20 10:22:22 np0005588919 nova_compute[225855]: 2026-01-20 15:22:22.434 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 e418: 3 total, 3 up, 3 in
Jan 20 10:22:23 np0005588919 nova_compute[225855]: 2026-01-20 15:22:23.012 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:23 np0005588919 nova_compute[225855]: 2026-01-20 15:22:23.307 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:23.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:24.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:24 np0005588919 nova_compute[225855]: 2026-01-20 15:22:24.282 225859 INFO nova.compute.manager [None req-502f0124-d88b-4c6c-a015-c2d226e8e549 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Get console output#033[00m
Jan 20 10:22:24 np0005588919 nova_compute[225855]: 2026-01-20 15:22:24.288 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.606 225859 DEBUG nova.compute.manager [req-8d8c6560-d5dc-4b7f-9423-393401c984e3 req-3c2bd7cd-16a2-4fd4-bed4-6df2d05542f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received event network-changed-2eefbfcb-7c22-4c45-bb7b-75319242796c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.607 225859 DEBUG nova.compute.manager [req-8d8c6560-d5dc-4b7f-9423-393401c984e3 req-3c2bd7cd-16a2-4fd4-bed4-6df2d05542f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Refreshing instance network info cache due to event network-changed-2eefbfcb-7c22-4c45-bb7b-75319242796c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.607 225859 DEBUG oslo_concurrency.lockutils [req-8d8c6560-d5dc-4b7f-9423-393401c984e3 req-3c2bd7cd-16a2-4fd4-bed4-6df2d05542f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.607 225859 DEBUG oslo_concurrency.lockutils [req-8d8c6560-d5dc-4b7f-9423-393401c984e3 req-3c2bd7cd-16a2-4fd4-bed4-6df2d05542f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.608 225859 DEBUG nova.network.neutron [req-8d8c6560-d5dc-4b7f-9423-393401c984e3 req-3c2bd7cd-16a2-4fd4-bed4-6df2d05542f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Refreshing network info cache for port 2eefbfcb-7c22-4c45-bb7b-75319242796c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.675 225859 DEBUG oslo_concurrency.lockutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "65aa2157-f058-4e5c-b448-64cf956310ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.675 225859 DEBUG oslo_concurrency.lockutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.676 225859 DEBUG oslo_concurrency.lockutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.676 225859 DEBUG oslo_concurrency.lockutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.676 225859 DEBUG oslo_concurrency.lockutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.678 225859 INFO nova.compute.manager [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Terminating instance#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.679 225859 DEBUG nova.compute.manager [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:22:25 np0005588919 kernel: tap2eefbfcb-7c (unregistering): left promiscuous mode
Jan 20 10:22:25 np0005588919 NetworkManager[49104]: <info>  [1768922545.7242] device (tap2eefbfcb-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:22:25 np0005588919 ovn_controller[130490]: 2026-01-20T15:22:25Z|00889|binding|INFO|Releasing lport 2eefbfcb-7c22-4c45-bb7b-75319242796c from this chassis (sb_readonly=0)
Jan 20 10:22:25 np0005588919 ovn_controller[130490]: 2026-01-20T15:22:25Z|00890|binding|INFO|Setting lport 2eefbfcb-7c22-4c45-bb7b-75319242796c down in Southbound
Jan 20 10:22:25 np0005588919 ovn_controller[130490]: 2026-01-20T15:22:25Z|00891|binding|INFO|Removing iface tap2eefbfcb-7c ovn-installed in OVS
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.734 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.736 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:25.740 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:99:f8 10.100.0.4'], port_security=['fa:16:3e:7a:99:f8 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '65aa2157-f058-4e5c-b448-64cf956310ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6eb3ab38-e480-46b8-ae2d-d286fe61de3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a9217303-0a2c-4a19-a65b-396cb455c1f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93c7ac88-5c28-4609-8d16-8949ae99e457, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=2eefbfcb-7c22-4c45-bb7b-75319242796c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:22:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:25.742 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 2eefbfcb-7c22-4c45-bb7b-75319242796c in datapath 6eb3ab38-e480-46b8-ae2d-d286fe61de3c unbound from our chassis#033[00m
Jan 20 10:22:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:25.742 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6eb3ab38-e480-46b8-ae2d-d286fe61de3c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:22:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:25.744 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8d350159-204a-4b6a-a521-4caed85b4913]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:25 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:25.745 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c namespace which is not needed anymore#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.755 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:25.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:25 np0005588919 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d000000c6.scope: Deactivated successfully.
Jan 20 10:22:25 np0005588919 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d000000c6.scope: Consumed 14.694s CPU time.
Jan 20 10:22:25 np0005588919 systemd-machined[194361]: Machine qemu-104-instance-000000c6 terminated.
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.905 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:25 np0005588919 neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c[310482]: [NOTICE]   (310511) : haproxy version is 2.8.14-c23fe91
Jan 20 10:22:25 np0005588919 neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c[310482]: [NOTICE]   (310511) : path to executable is /usr/sbin/haproxy
Jan 20 10:22:25 np0005588919 neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c[310482]: [WARNING]  (310511) : Exiting Master process...
Jan 20 10:22:25 np0005588919 neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c[310482]: [ALERT]    (310511) : Current worker (310522) exited with code 143 (Terminated)
Jan 20 10:22:25 np0005588919 neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c[310482]: [WARNING]  (310511) : All workers exited. Exiting... (0)
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.911 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:25 np0005588919 systemd[1]: libpod-c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354.scope: Deactivated successfully.
Jan 20 10:22:25 np0005588919 podman[310714]: 2026-01-20 15:22:25.918146927 +0000 UTC m=+0.165582709 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 20 10:22:25 np0005588919 podman[310758]: 2026-01-20 15:22:25.921550594 +0000 UTC m=+0.082466275 container died c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.922 225859 INFO nova.virt.libvirt.driver [-] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Instance destroyed successfully.#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.923 225859 DEBUG nova.objects.instance [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'resources' on Instance uuid 65aa2157-f058-4e5c-b448-64cf956310ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.945 225859 DEBUG nova.virt.libvirt.vif [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:21:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-890684112',display_name='tempest-TestNetworkAdvancedServerOps-server-890684112',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-890684112',id=198,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBESV7XYmzz1neUUH7k/g2EXDk6RAN24jF19myyoRv6wDjFXd5E2VXPhzcf3Q2CFmKA+oZARXh9ZLZnZRzD1iPeEGFbgLb8nt50MGrmQlAcYMGRSCqrzrniFYSfPnybQWNg==',key_name='tempest-TestNetworkAdvancedServerOps-1160843308',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:22:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-n64n905g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:22:11Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=65aa2157-f058-4e5c-b448-64cf956310ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.947 225859 DEBUG nova.network.os_vif_util [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.949 225859 DEBUG nova.network.os_vif_util [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7a:99:f8,bridge_name='br-int',has_traffic_filtering=True,id=2eefbfcb-7c22-4c45-bb7b-75319242796c,network=Network(6eb3ab38-e480-46b8-ae2d-d286fe61de3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eefbfcb-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.950 225859 DEBUG os_vif [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:99:f8,bridge_name='br-int',has_traffic_filtering=True,id=2eefbfcb-7c22-4c45-bb7b-75319242796c,network=Network(6eb3ab38-e480-46b8-ae2d-d286fe61de3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eefbfcb-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:22:25 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354-userdata-shm.mount: Deactivated successfully.
Jan 20 10:22:25 np0005588919 systemd[1]: var-lib-containers-storage-overlay-a9a8ddd055a857b793a3cbb612eba64ac8fa8b721d637f2000dc6e7f650142a8-merged.mount: Deactivated successfully.
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.953 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.954 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2eefbfcb-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.956 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.959 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:22:25 np0005588919 nova_compute[225855]: 2026-01-20 15:22:25.964 225859 INFO os_vif [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:99:f8,bridge_name='br-int',has_traffic_filtering=True,id=2eefbfcb-7c22-4c45-bb7b-75319242796c,network=Network(6eb3ab38-e480-46b8-ae2d-d286fe61de3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eefbfcb-7c')#033[00m
Jan 20 10:22:25 np0005588919 podman[310758]: 2026-01-20 15:22:25.966435445 +0000 UTC m=+0.127351126 container cleanup c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:22:25 np0005588919 systemd[1]: libpod-conmon-c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354.scope: Deactivated successfully.
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.019 225859 DEBUG nova.compute.manager [req-7920d1d3-9b2b-473c-9a1d-9f1d9f85e556 req-b9a4e964-25bc-46f9-9236-6d6e5ea9db20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received event network-vif-unplugged-2eefbfcb-7c22-4c45-bb7b-75319242796c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.020 225859 DEBUG oslo_concurrency.lockutils [req-7920d1d3-9b2b-473c-9a1d-9f1d9f85e556 req-b9a4e964-25bc-46f9-9236-6d6e5ea9db20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.020 225859 DEBUG oslo_concurrency.lockutils [req-7920d1d3-9b2b-473c-9a1d-9f1d9f85e556 req-b9a4e964-25bc-46f9-9236-6d6e5ea9db20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.020 225859 DEBUG oslo_concurrency.lockutils [req-7920d1d3-9b2b-473c-9a1d-9f1d9f85e556 req-b9a4e964-25bc-46f9-9236-6d6e5ea9db20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.020 225859 DEBUG nova.compute.manager [req-7920d1d3-9b2b-473c-9a1d-9f1d9f85e556 req-b9a4e964-25bc-46f9-9236-6d6e5ea9db20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] No waiting events found dispatching network-vif-unplugged-2eefbfcb-7c22-4c45-bb7b-75319242796c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.021 225859 DEBUG nova.compute.manager [req-7920d1d3-9b2b-473c-9a1d-9f1d9f85e556 req-b9a4e964-25bc-46f9-9236-6d6e5ea9db20 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received event network-vif-unplugged-2eefbfcb-7c22-4c45-bb7b-75319242796c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:22:26 np0005588919 podman[310810]: 2026-01-20 15:22:26.040037497 +0000 UTC m=+0.050422131 container remove c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:22:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:26.047 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0998a0ea-0c0f-4d0a-b08a-741e45441e92]: (4, ('Tue Jan 20 03:22:25 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c (c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354)\nc7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354\nTue Jan 20 03:22:25 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c (c7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354)\nc7fab9924b227159e9767e068e64a7972c66e5c4ddb35306b896a5152ff90354\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:26.049 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eb3ea942-8ece-4236-a5f4-db271b3f1d04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:26.051 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6eb3ab38-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.053 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:26 np0005588919 kernel: tap6eb3ab38-e0: left promiscuous mode
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.068 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:26.072 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e0f40fae-3d2c-405f-b249-4fa8653d5fd3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:26.091 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c7cfeb-dbfa-4508-afb9-4f2d040e0043]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:26.093 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7599edd6-032c-4134-b2d3-a9dc1f52b1b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:26.110 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a47af88e-731c-467c-b662-66dc0eeb74d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 756216, 'reachable_time': 19066, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310836, 'error': None, 'target': 'ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:26 np0005588919 systemd[1]: run-netns-ovnmeta\x2d6eb3ab38\x2de480\x2d46b8\x2dae2d\x2dd286fe61de3c.mount: Deactivated successfully.
Jan 20 10:22:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:26.115 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6eb3ab38-e480-46b8-ae2d-d286fe61de3c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:22:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:22:26.115 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[199eba97-a2fa-4a81-b735-92a34a2691a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:22:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:26.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.526 225859 INFO nova.virt.libvirt.driver [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Deleting instance files /var/lib/nova/instances/65aa2157-f058-4e5c-b448-64cf956310ba_del#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.527 225859 INFO nova.virt.libvirt.driver [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Deletion of /var/lib/nova/instances/65aa2157-f058-4e5c-b448-64cf956310ba_del complete#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.576 225859 INFO nova.compute.manager [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Took 0.90 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.577 225859 DEBUG oslo.service.loopingcall [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.577 225859 DEBUG nova.compute.manager [-] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.578 225859 DEBUG nova.network.neutron [-] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.856 225859 DEBUG nova.network.neutron [req-8d8c6560-d5dc-4b7f-9423-393401c984e3 req-3c2bd7cd-16a2-4fd4-bed4-6df2d05542f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updated VIF entry in instance network info cache for port 2eefbfcb-7c22-4c45-bb7b-75319242796c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.857 225859 DEBUG nova.network.neutron [req-8d8c6560-d5dc-4b7f-9423-393401c984e3 req-3c2bd7cd-16a2-4fd4-bed4-6df2d05542f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updating instance_info_cache with network_info: [{"id": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "address": "fa:16:3e:7a:99:f8", "network": {"id": "6eb3ab38-e480-46b8-ae2d-d286fe61de3c", "bridge": "br-int", "label": "tempest-network-smoke--1810657086", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eefbfcb-7c", "ovs_interfaceid": "2eefbfcb-7c22-4c45-bb7b-75319242796c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:22:26 np0005588919 nova_compute[225855]: 2026-01-20 15:22:26.890 225859 DEBUG oslo_concurrency.lockutils [req-8d8c6560-d5dc-4b7f-9423-393401c984e3 req-3c2bd7cd-16a2-4fd4-bed4-6df2d05542f1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-65aa2157-f058-4e5c-b448-64cf956310ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.078 225859 DEBUG nova.network.neutron [-] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.100 225859 INFO nova.compute.manager [-] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Took 0.52 seconds to deallocate network for instance.#033[00m
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.151 225859 DEBUG oslo_concurrency.lockutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.151 225859 DEBUG oslo_concurrency.lockutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.185 225859 DEBUG nova.compute.manager [req-7210697d-abfa-4ffe-afb5-0829aea6713c req-df827cb6-1886-450c-b69f-b92a76ed0a7f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received event network-vif-deleted-2eefbfcb-7c22-4c45-bb7b-75319242796c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.216 225859 DEBUG oslo_concurrency.processutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:22:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1554924812' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.670 225859 DEBUG oslo_concurrency.processutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.678 225859 DEBUG nova.compute.provider_tree [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.698 225859 DEBUG nova.scheduler.client.report [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.720 225859 DEBUG oslo_concurrency.lockutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.745 225859 INFO nova.scheduler.client.report [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Deleted allocations for instance 65aa2157-f058-4e5c-b448-64cf956310ba#033[00m
Jan 20 10:22:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:27.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:27 np0005588919 nova_compute[225855]: 2026-01-20 15:22:27.816 225859 DEBUG oslo_concurrency.lockutils [None req-08218b91-1d94-42a2-a507-86dc3e555f8f 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:28 np0005588919 nova_compute[225855]: 2026-01-20 15:22:28.107 225859 DEBUG nova.compute.manager [req-b5c83043-4b3c-4fff-b193-470c405dd90f req-e464383b-dda6-4085-a4e0-233ae7868e36 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received event network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:22:28 np0005588919 nova_compute[225855]: 2026-01-20 15:22:28.107 225859 DEBUG oslo_concurrency.lockutils [req-b5c83043-4b3c-4fff-b193-470c405dd90f req-e464383b-dda6-4085-a4e0-233ae7868e36 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:28 np0005588919 nova_compute[225855]: 2026-01-20 15:22:28.108 225859 DEBUG oslo_concurrency.lockutils [req-b5c83043-4b3c-4fff-b193-470c405dd90f req-e464383b-dda6-4085-a4e0-233ae7868e36 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:28 np0005588919 nova_compute[225855]: 2026-01-20 15:22:28.108 225859 DEBUG oslo_concurrency.lockutils [req-b5c83043-4b3c-4fff-b193-470c405dd90f req-e464383b-dda6-4085-a4e0-233ae7868e36 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "65aa2157-f058-4e5c-b448-64cf956310ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:28 np0005588919 nova_compute[225855]: 2026-01-20 15:22:28.108 225859 DEBUG nova.compute.manager [req-b5c83043-4b3c-4fff-b193-470c405dd90f req-e464383b-dda6-4085-a4e0-233ae7868e36 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] No waiting events found dispatching network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:22:28 np0005588919 nova_compute[225855]: 2026-01-20 15:22:28.108 225859 WARNING nova.compute.manager [req-b5c83043-4b3c-4fff-b193-470c405dd90f req-e464383b-dda6-4085-a4e0-233ae7868e36 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Received unexpected event network-vif-plugged-2eefbfcb-7c22-4c45-bb7b-75319242796c for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:22:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:28.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:28 np0005588919 nova_compute[225855]: 2026-01-20 15:22:28.309 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:29.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:30.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:30 np0005588919 nova_compute[225855]: 2026-01-20 15:22:30.957 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:31.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:32.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:32 np0005588919 nova_compute[225855]: 2026-01-20 15:22:32.187 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:32 np0005588919 nova_compute[225855]: 2026-01-20 15:22:32.266 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:33 np0005588919 nova_compute[225855]: 2026-01-20 15:22:33.310 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:33 np0005588919 podman[311040]: 2026-01-20 15:22:33.364362974 +0000 UTC m=+0.059869770 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:22:33 np0005588919 podman[311040]: 2026-01-20 15:22:33.455344712 +0000 UTC m=+0.150851528 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 10:22:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:33.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:34 np0005588919 podman[311195]: 2026-01-20 15:22:34.018136191 +0000 UTC m=+0.050450732 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:22:34 np0005588919 podman[311195]: 2026-01-20 15:22:34.023603437 +0000 UTC m=+0.055917958 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:22:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:34.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:34 np0005588919 podman[311261]: 2026-01-20 15:22:34.235467046 +0000 UTC m=+0.067205680 container exec e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.openshift.expose-services=, name=keepalived, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, distribution-scope=public, description=keepalived for Ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 20 10:22:35 np0005588919 podman[311261]: 2026-01-20 15:22:35.459468663 +0000 UTC m=+1.291207267 container exec_died e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, version=2.2.4, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, vcs-type=git, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, description=keepalived for Ceph, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public)
Jan 20 10:22:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:35.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:35 np0005588919 nova_compute[225855]: 2026-01-20 15:22:35.959 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:36.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:22:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:36 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:22:37 np0005588919 podman[311426]: 2026-01-20 15:22:37.006728458 +0000 UTC m=+0.051685216 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:22:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:37.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:38.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:38 np0005588919 nova_compute[225855]: 2026-01-20 15:22:38.313 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:39.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:40.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:40 np0005588919 nova_compute[225855]: 2026-01-20 15:22:40.920 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922545.9189079, 65aa2157-f058-4e5c-b448-64cf956310ba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:22:40 np0005588919 nova_compute[225855]: 2026-01-20 15:22:40.921 225859 INFO nova.compute.manager [-] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:22:40 np0005588919 nova_compute[225855]: 2026-01-20 15:22:40.945 225859 DEBUG nova.compute.manager [None req-ffaf5845-51ae-44d5-a996-05b0d3b8fdfe - - - - - -] [instance: 65aa2157-f058-4e5c-b448-64cf956310ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:22:40 np0005588919 nova_compute[225855]: 2026-01-20 15:22:40.961 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:41.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:42.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:22:43 np0005588919 nova_compute[225855]: 2026-01-20 15:22:43.315 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:22:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:43.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:22:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:44.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:45.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:46 np0005588919 nova_compute[225855]: 2026-01-20 15:22:46.019 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:46.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:47.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:48.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:48 np0005588919 nova_compute[225855]: 2026-01-20 15:22:48.318 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:49.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:50.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:51 np0005588919 nova_compute[225855]: 2026-01-20 15:22:51.058 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:51.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:52.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.243 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.243 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.257 225859 DEBUG nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.319 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.349 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.350 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.359 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.359 225859 INFO nova.compute.claims [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.449 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:53.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:22:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3700219704' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.905 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.911 225859 DEBUG nova.compute.provider_tree [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.928 225859 DEBUG nova.scheduler.client.report [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.952 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.953 225859 DEBUG nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.994 225859 DEBUG nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:22:53 np0005588919 nova_compute[225855]: 2026-01-20 15:22:53.995 225859 DEBUG nova.network.neutron [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.015 225859 INFO nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.031 225859 DEBUG nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.123 225859 DEBUG nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.124 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.125 225859 INFO nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Creating image(s)#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.149 225859 DEBUG nova.storage.rbd_utils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.172 225859 DEBUG nova.storage.rbd_utils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:22:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:54.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.199 225859 DEBUG nova.storage.rbd_utils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.204 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.235 225859 DEBUG nova.policy [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '442a7a5cb8ea426a82be9762b262d171', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.275 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.276 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.276 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.277 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.299 225859 DEBUG nova.storage.rbd_utils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.302 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.598 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.296s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.649 225859 DEBUG nova.storage.rbd_utils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] resizing rbd image 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.732 225859 DEBUG nova.objects.instance [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'migration_context' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.747 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.748 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Ensure instance console log exists: /var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.748 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.748 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.749 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:22:54 np0005588919 nova_compute[225855]: 2026-01-20 15:22:54.957 225859 DEBUG nova.network.neutron [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Successfully created port: 5ee850cb-507f-4288-993f-a7892e9285c9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:22:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:55.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:56 np0005588919 nova_compute[225855]: 2026-01-20 15:22:56.061 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:56.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:57 np0005588919 podman[311746]: 2026-01-20 15:22:57.033118985 +0000 UTC m=+0.079018327 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:22:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:57.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:22:57 np0005588919 nova_compute[225855]: 2026-01-20 15:22:57.832 225859 DEBUG nova.network.neutron [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Successfully updated port: 5ee850cb-507f-4288-993f-a7892e9285c9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:22:57 np0005588919 nova_compute[225855]: 2026-01-20 15:22:57.847 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:22:57 np0005588919 nova_compute[225855]: 2026-01-20 15:22:57.848 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquired lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:22:57 np0005588919 nova_compute[225855]: 2026-01-20 15:22:57.848 225859 DEBUG nova.network.neutron [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:22:57 np0005588919 nova_compute[225855]: 2026-01-20 15:22:57.946 225859 DEBUG nova.compute.manager [req-8316f45a-60ef-47bf-8a7c-ae73d249895a req-282130c6-2182-480d-ac22-06c9330ecd05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-changed-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:22:57 np0005588919 nova_compute[225855]: 2026-01-20 15:22:57.947 225859 DEBUG nova.compute.manager [req-8316f45a-60ef-47bf-8a7c-ae73d249895a req-282130c6-2182-480d-ac22-06c9330ecd05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Refreshing instance network info cache due to event network-changed-5ee850cb-507f-4288-993f-a7892e9285c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:22:57 np0005588919 nova_compute[225855]: 2026-01-20 15:22:57.947 225859 DEBUG oslo_concurrency.lockutils [req-8316f45a-60ef-47bf-8a7c-ae73d249895a req-282130c6-2182-480d-ac22-06c9330ecd05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:22:57 np0005588919 nova_compute[225855]: 2026-01-20 15:22:57.995 225859 DEBUG nova.network.neutron [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:22:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:22:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:22:58.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:22:58 np0005588919 nova_compute[225855]: 2026-01-20 15:22:58.323 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:22:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:22:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:22:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:22:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:22:59.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:00.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.208 225859 DEBUG nova.network.neutron [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Updating instance_info_cache with network_info: [{"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.815 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Releasing lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.815 225859 DEBUG nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Instance network_info: |[{"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.816 225859 DEBUG oslo_concurrency.lockutils [req-8316f45a-60ef-47bf-8a7c-ae73d249895a req-282130c6-2182-480d-ac22-06c9330ecd05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.816 225859 DEBUG nova.network.neutron [req-8316f45a-60ef-47bf-8a7c-ae73d249895a req-282130c6-2182-480d-ac22-06c9330ecd05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Refreshing network info cache for port 5ee850cb-507f-4288-993f-a7892e9285c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.819 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Start _get_guest_xml network_info=[{"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.822 225859 WARNING nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:23:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.827 225859 DEBUG nova.virt.libvirt.host [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:23:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:01.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.828 225859 DEBUG nova.virt.libvirt.host [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.833 225859 DEBUG nova.virt.libvirt.host [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.834 225859 DEBUG nova.virt.libvirt.host [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.835 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.835 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.836 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.836 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.836 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.836 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.836 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.837 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.837 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.837 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.837 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.838 225859 DEBUG nova.virt.hardware [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:23:01 np0005588919 nova_compute[225855]: 2026-01-20 15:23:01.840 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:23:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:02.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:23:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3095742531' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.287 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.309 225859 DEBUG nova.storage.rbd_utils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.313 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:23:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:23:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1782031866' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.757 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.759 225859 DEBUG nova.virt.libvirt.vif [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:22:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1485210936',display_name='tempest-TestNetworkAdvancedServerOps-server-1485210936',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1485210936',id=200,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFcP4UZM8O63AIoOVMDQxqy+5NyOff2+4sn+/fo/i9F+a5f/348af0mVz9/8dEzIolLAAsu+/bRAcsRarrSFS+xLm/g8/3nv4fTZhMf6io52heWTDMNl7eXefIdDoA+XVw==',key_name='tempest-TestNetworkAdvancedServerOps-577536853',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-y87mmd4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:22:54Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=4c926c1a-d5cf-4865-aa57-66b439d115f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.759 225859 DEBUG nova.network.os_vif_util [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.760 225859 DEBUG nova.network.os_vif_util [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.761 225859 DEBUG nova.objects.instance [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_devices' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.774 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <uuid>4c926c1a-d5cf-4865-aa57-66b439d115f8</uuid>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <name>instance-000000c8</name>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1485210936</nova:name>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:23:01</nova:creationTime>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <nova:port uuid="5ee850cb-507f-4288-993f-a7892e9285c9">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <entry name="serial">4c926c1a-d5cf-4865-aa57-66b439d115f8</entry>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <entry name="uuid">4c926c1a-d5cf-4865-aa57-66b439d115f8</entry>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/4c926c1a-d5cf-4865-aa57-66b439d115f8_disk">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/4c926c1a-d5cf-4865-aa57-66b439d115f8_disk.config">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:cc:0a:92"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <target dev="tap5ee850cb-50"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8/console.log" append="off"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:23:02 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:23:02 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:23:02 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:23:02 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.776 225859 DEBUG nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Preparing to wait for external event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.776 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.777 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.777 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.778 225859 DEBUG nova.virt.libvirt.vif [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:22:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1485210936',display_name='tempest-TestNetworkAdvancedServerOps-server-1485210936',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1485210936',id=200,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFcP4UZM8O63AIoOVMDQxqy+5NyOff2+4sn+/fo/i9F+a5f/348af0mVz9/8dEzIolLAAsu+/bRAcsRarrSFS+xLm/g8/3nv4fTZhMf6io52heWTDMNl7eXefIdDoA+XVw==',key_name='tempest-TestNetworkAdvancedServerOps-577536853',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-y87mmd4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:22:54Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=4c926c1a-d5cf-4865-aa57-66b439d115f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.778 225859 DEBUG nova.network.os_vif_util [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.779 225859 DEBUG nova.network.os_vif_util [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.779 225859 DEBUG os_vif [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.780 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.780 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.781 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.783 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.783 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5ee850cb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.784 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5ee850cb-50, col_values=(('external_ids', {'iface-id': '5ee850cb-507f-4288-993f-a7892e9285c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:0a:92', 'vm-uuid': '4c926c1a-d5cf-4865-aa57-66b439d115f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.785 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:02 np0005588919 NetworkManager[49104]: <info>  [1768922582.7871] manager: (tap5ee850cb-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.792 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.794 225859 INFO os_vif [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50')#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.843 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.844 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.844 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] No VIF found with MAC fa:16:3e:cc:0a:92, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.845 225859 INFO nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Using config drive#033[00m
Jan 20 10:23:02 np0005588919 nova_compute[225855]: 2026-01-20 15:23:02.868 225859 DEBUG nova.storage.rbd_utils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.171 225859 DEBUG nova.network.neutron [req-8316f45a-60ef-47bf-8a7c-ae73d249895a req-282130c6-2182-480d-ac22-06c9330ecd05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Updated VIF entry in instance network info cache for port 5ee850cb-507f-4288-993f-a7892e9285c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.171 225859 DEBUG nova.network.neutron [req-8316f45a-60ef-47bf-8a7c-ae73d249895a req-282130c6-2182-480d-ac22-06c9330ecd05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Updating instance_info_cache with network_info: [{"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.187 225859 DEBUG oslo_concurrency.lockutils [req-8316f45a-60ef-47bf-8a7c-ae73d249895a req-282130c6-2182-480d-ac22-06c9330ecd05 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.282 225859 INFO nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Creating config drive at /var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8/disk.config#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.287 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx91emwzl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.324 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.418 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx91emwzl" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.443 225859 DEBUG nova.storage.rbd_utils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] rbd image 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.447 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8/disk.config 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:23:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.715 225859 DEBUG oslo_concurrency.processutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8/disk.config 4c926c1a-d5cf-4865-aa57-66b439d115f8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.268s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.716 225859 INFO nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Deleting local config drive /var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8/disk.config because it was imported into RBD.#033[00m
Jan 20 10:23:03 np0005588919 kernel: tap5ee850cb-50: entered promiscuous mode
Jan 20 10:23:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:03Z|00892|binding|INFO|Claiming lport 5ee850cb-507f-4288-993f-a7892e9285c9 for this chassis.
Jan 20 10:23:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:03Z|00893|binding|INFO|5ee850cb-507f-4288-993f-a7892e9285c9: Claiming fa:16:3e:cc:0a:92 10.100.0.12
Jan 20 10:23:03 np0005588919 NetworkManager[49104]: <info>  [1768922583.7776] manager: (tap5ee850cb-50): new Tun device (/org/freedesktop/NetworkManager/Devices/373)
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.777 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.780 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.792 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:0a:92 10.100.0.12'], port_security=['fa:16:3e:cc:0a:92 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4c926c1a-d5cf-4865-aa57-66b439d115f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6f91a697-00ef-4c2f-a4cd-47aa600b459f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a867ed68-782d-4d05-8077-be0278c87405, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=5ee850cb-507f-4288-993f-a7892e9285c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.793 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 5ee850cb-507f-4288-993f-a7892e9285c9 in datapath 6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 bound to our chassis#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.794 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a8ba0d6-68f0-4a25-84ff-3685d5b259a6#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.806 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d9027045-a733-44ec-a371-45ad6863f37f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.807 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a8ba0d6-61 in ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.808 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a8ba0d6-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.809 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[daf755e3-95c8-49d2-bd8c-58b2475437b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.809 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bc526f16-e043-4112-989e-645e2a1ea0e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 systemd-udevd[311913]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:23:03 np0005588919 systemd-machined[194361]: New machine qemu-105-instance-000000c8.
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.820 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[79499408-e694-4e66-b7bf-3c77bc7095df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 NetworkManager[49104]: <info>  [1768922583.8320] device (tap5ee850cb-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:23:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:03 np0005588919 NetworkManager[49104]: <info>  [1768922583.8336] device (tap5ee850cb-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:23:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:03.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:03 np0005588919 systemd[1]: Started Virtual Machine qemu-105-instance-000000c8.
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.841 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.846 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c8bfa168-75d0-43c7-8b8c-15b2e7346823]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:03Z|00894|binding|INFO|Setting lport 5ee850cb-507f-4288-993f-a7892e9285c9 ovn-installed in OVS
Jan 20 10:23:03 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:03Z|00895|binding|INFO|Setting lport 5ee850cb-507f-4288-993f-a7892e9285c9 up in Southbound
Jan 20 10:23:03 np0005588919 nova_compute[225855]: 2026-01-20 15:23:03.848 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.879 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4e08fcae-0b4a-47c7-b555-0f4e9a1bdff9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.884 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb23797-a09b-42af-a8c3-b45617b2d655]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 systemd-udevd[311917]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:23:03 np0005588919 NetworkManager[49104]: <info>  [1768922583.8860] manager: (tap6a8ba0d6-60): new Veth device (/org/freedesktop/NetworkManager/Devices/374)
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.916 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a73669-44df-42e3-9bf1-56e9ffce9c48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.919 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[960157e7-012c-48c2-8ecb-23c3bfa2f63f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 NetworkManager[49104]: <info>  [1768922583.9420] device (tap6a8ba0d6-60): carrier: link connected
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.946 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[7e09d2ea-82f5-45a7-9b05-3220fa398e8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.965 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab25d17-fe1d-439f-9cab-2d3cdf52f383]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a8ba0d6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:82:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 254], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 762286, 'reachable_time': 35060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 311945, 'error': None, 'target': 'ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.983 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5542ba19-21e8-4b60-a03f-2316725df1ad]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:8288'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 762286, 'tstamp': 762286}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 311946, 'error': None, 'target': 'ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:03.998 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e051d4e1-11a7-49ae-860b-35fe647a7837]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a8ba0d6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:82:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 254], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 762286, 'reachable_time': 35060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 311947, 'error': None, 'target': 'ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:04.033 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5fc855b3-9296-4fa3-83e8-a9a4a009af0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:04.118 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[79ab5e1c-2d4d-47ff-9bbd-b10265ff9e27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:04.120 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a8ba0d6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:04.120 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:04.121 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a8ba0d6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:04 np0005588919 NetworkManager[49104]: <info>  [1768922584.1235] manager: (tap6a8ba0d6-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/375)
Jan 20 10:23:04 np0005588919 kernel: tap6a8ba0d6-60: entered promiscuous mode
Jan 20 10:23:04 np0005588919 nova_compute[225855]: 2026-01-20 15:23:04.123 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:04.125 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a8ba0d6-60, col_values=(('external_ids', {'iface-id': '37682451-9139-425b-b6d7-1ea83a2306c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:04 np0005588919 nova_compute[225855]: 2026-01-20 15:23:04.127 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:04 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:04Z|00896|binding|INFO|Releasing lport 37682451-9139-425b-b6d7-1ea83a2306c3 from this chassis (sb_readonly=1)
Jan 20 10:23:04 np0005588919 nova_compute[225855]: 2026-01-20 15:23:04.141 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:04.142 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a8ba0d6-68f0-4a25-84ff-3685d5b259a6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a8ba0d6-68f0-4a25-84ff-3685d5b259a6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:04.143 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b03ff4ea-b033-444b-9579-764f9a6997eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:04.144 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/6a8ba0d6-68f0-4a25-84ff-3685d5b259a6.pid.haproxy
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 6a8ba0d6-68f0-4a25-84ff-3685d5b259a6
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:23:04 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:04.144 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'env', 'PROCESS_TAG=haproxy-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a8ba0d6-68f0-4a25-84ff-3685d5b259a6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:23:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:04.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:04 np0005588919 nova_compute[225855]: 2026-01-20 15:23:04.373 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922584.373234, 4c926c1a-d5cf-4865-aa57-66b439d115f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:23:04 np0005588919 nova_compute[225855]: 2026-01-20 15:23:04.374 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] VM Started (Lifecycle Event)#033[00m
Jan 20 10:23:04 np0005588919 podman[312021]: 2026-01-20 15:23:04.508595777 +0000 UTC m=+0.045015026 container create 3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:23:04 np0005588919 systemd[1]: Started libpod-conmon-3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30.scope.
Jan 20 10:23:04 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:23:04 np0005588919 podman[312021]: 2026-01-20 15:23:04.485333123 +0000 UTC m=+0.021752402 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:23:04 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba41683a800cd9238c1b5a90bf229c56f6a98cdf524e45addc7193533096210c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:23:04 np0005588919 podman[312021]: 2026-01-20 15:23:04.595376645 +0000 UTC m=+0.131795914 container init 3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:23:04 np0005588919 podman[312021]: 2026-01-20 15:23:04.60081542 +0000 UTC m=+0.137234669 container start 3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:23:04 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312037]: [NOTICE]   (312043) : New worker (312066) forked
Jan 20 10:23:04 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312037]: [NOTICE]   (312043) : Loading success.
Jan 20 10:23:05 np0005588919 nova_compute[225855]: 2026-01-20 15:23:05.080 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:23:05 np0005588919 nova_compute[225855]: 2026-01-20 15:23:05.085 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922584.373421, 4c926c1a-d5cf-4865-aa57-66b439d115f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:23:05 np0005588919 nova_compute[225855]: 2026-01-20 15:23:05.085 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:23:05 np0005588919 nova_compute[225855]: 2026-01-20 15:23:05.681 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:23:05 np0005588919 nova_compute[225855]: 2026-01-20 15:23:05.684 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:23:05 np0005588919 nova_compute[225855]: 2026-01-20 15:23:05.704 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:23:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:05.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:06.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.270 225859 DEBUG nova.compute.manager [req-55862ba5-8d88-4a02-aac0-1afc73eb884f req-659bcf91-6e7c-4eda-8b02-15b7c8c5f50e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.270 225859 DEBUG oslo_concurrency.lockutils [req-55862ba5-8d88-4a02-aac0-1afc73eb884f req-659bcf91-6e7c-4eda-8b02-15b7c8c5f50e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.270 225859 DEBUG oslo_concurrency.lockutils [req-55862ba5-8d88-4a02-aac0-1afc73eb884f req-659bcf91-6e7c-4eda-8b02-15b7c8c5f50e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.270 225859 DEBUG oslo_concurrency.lockutils [req-55862ba5-8d88-4a02-aac0-1afc73eb884f req-659bcf91-6e7c-4eda-8b02-15b7c8c5f50e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.270 225859 DEBUG nova.compute.manager [req-55862ba5-8d88-4a02-aac0-1afc73eb884f req-659bcf91-6e7c-4eda-8b02-15b7c8c5f50e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Processing event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.271 225859 DEBUG nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.274 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922586.2745192, 4c926c1a-d5cf-4865-aa57-66b439d115f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.275 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.277 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.280 225859 INFO nova.virt.libvirt.driver [-] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Instance spawned successfully.#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.280 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.318 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.321 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.328 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.328 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.328 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.329 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.329 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.329 225859 DEBUG nova.virt.libvirt.driver [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.363 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.392 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.392 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.398 225859 INFO nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Took 12.27 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.398 225859 DEBUG nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.464 225859 INFO nova.compute.manager [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Took 13.15 seconds to build instance.#033[00m
Jan 20 10:23:06 np0005588919 nova_compute[225855]: 2026-01-20 15:23:06.480 225859 DEBUG oslo_concurrency.lockutils [None req-1b898df7-ad12-4947-8ab8-99ba6eee88f5 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:07 np0005588919 nova_compute[225855]: 2026-01-20 15:23:07.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:07 np0005588919 nova_compute[225855]: 2026-01-20 15:23:07.359 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:07 np0005588919 nova_compute[225855]: 2026-01-20 15:23:07.360 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:07 np0005588919 nova_compute[225855]: 2026-01-20 15:23:07.360 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:07 np0005588919 nova_compute[225855]: 2026-01-20 15:23:07.361 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:23:07 np0005588919 nova_compute[225855]: 2026-01-20 15:23:07.361 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:23:07 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:23:07 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3144324644' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:23:07 np0005588919 nova_compute[225855]: 2026-01-20 15:23:07.786 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:07 np0005588919 nova_compute[225855]: 2026-01-20 15:23:07.801 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:23:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:07.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:07 np0005588919 nova_compute[225855]: 2026-01-20 15:23:07.867 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000c8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:23:07 np0005588919 nova_compute[225855]: 2026-01-20 15:23:07.867 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000c8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:23:07 np0005588919 podman[312127]: 2026-01-20 15:23:07.893995095 +0000 UTC m=+0.055974629 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.018 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.019 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4113MB free_disk=20.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.019 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.019 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.082 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 4c926c1a-d5cf-4865-aa57-66b439d115f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.082 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.082 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.147 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:23:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:08.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.325 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.383 225859 DEBUG nova.compute.manager [req-7044eb28-560b-4743-855d-ca1cf2f625a1 req-411c6d42-381a-4572-afc2-19e70fa4f61e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.384 225859 DEBUG oslo_concurrency.lockutils [req-7044eb28-560b-4743-855d-ca1cf2f625a1 req-411c6d42-381a-4572-afc2-19e70fa4f61e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.384 225859 DEBUG oslo_concurrency.lockutils [req-7044eb28-560b-4743-855d-ca1cf2f625a1 req-411c6d42-381a-4572-afc2-19e70fa4f61e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.384 225859 DEBUG oslo_concurrency.lockutils [req-7044eb28-560b-4743-855d-ca1cf2f625a1 req-411c6d42-381a-4572-afc2-19e70fa4f61e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.385 225859 DEBUG nova.compute.manager [req-7044eb28-560b-4743-855d-ca1cf2f625a1 req-411c6d42-381a-4572-afc2-19e70fa4f61e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] No waiting events found dispatching network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.385 225859 WARNING nova.compute.manager [req-7044eb28-560b-4743-855d-ca1cf2f625a1 req-411c6d42-381a-4572-afc2-19e70fa4f61e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received unexpected event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:23:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:23:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2587878333' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.592 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.599 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.617 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.642 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:23:08 np0005588919 nova_compute[225855]: 2026-01-20 15:23:08.643 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:09 np0005588919 nova_compute[225855]: 2026-01-20 15:23:09.644 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:09 np0005588919 nova_compute[225855]: 2026-01-20 15:23:09.645 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:09 np0005588919 nova_compute[225855]: 2026-01-20 15:23:09.645 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:10.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:10 np0005588919 nova_compute[225855]: 2026-01-20 15:23:10.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:11 np0005588919 nova_compute[225855]: 2026-01-20 15:23:11.202 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:11.202 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:23:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:11.204 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:23:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:11.205 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:11 np0005588919 nova_compute[225855]: 2026-01-20 15:23:11.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:11 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:11Z|00897|binding|INFO|Releasing lport 37682451-9139-425b-b6d7-1ea83a2306c3 from this chassis (sb_readonly=0)
Jan 20 10:23:11 np0005588919 NetworkManager[49104]: <info>  [1768922591.3587] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/376)
Jan 20 10:23:11 np0005588919 nova_compute[225855]: 2026-01-20 15:23:11.358 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:11 np0005588919 NetworkManager[49104]: <info>  [1768922591.3598] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/377)
Jan 20 10:23:11 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:11Z|00898|binding|INFO|Releasing lport 37682451-9139-425b-b6d7-1ea83a2306c3 from this chassis (sb_readonly=0)
Jan 20 10:23:11 np0005588919 nova_compute[225855]: 2026-01-20 15:23:11.390 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:11 np0005588919 nova_compute[225855]: 2026-01-20 15:23:11.394 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:11 np0005588919 nova_compute[225855]: 2026-01-20 15:23:11.737 225859 DEBUG nova.compute.manager [req-34831086-4be9-463c-8b2c-d358eb7dfd74 req-0f080f90-3086-44d8-a754-11d3e045bf35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-changed-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:23:11 np0005588919 nova_compute[225855]: 2026-01-20 15:23:11.737 225859 DEBUG nova.compute.manager [req-34831086-4be9-463c-8b2c-d358eb7dfd74 req-0f080f90-3086-44d8-a754-11d3e045bf35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Refreshing instance network info cache due to event network-changed-5ee850cb-507f-4288-993f-a7892e9285c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:23:11 np0005588919 nova_compute[225855]: 2026-01-20 15:23:11.737 225859 DEBUG oslo_concurrency.lockutils [req-34831086-4be9-463c-8b2c-d358eb7dfd74 req-0f080f90-3086-44d8-a754-11d3e045bf35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:23:11 np0005588919 nova_compute[225855]: 2026-01-20 15:23:11.738 225859 DEBUG oslo_concurrency.lockutils [req-34831086-4be9-463c-8b2c-d358eb7dfd74 req-0f080f90-3086-44d8-a754-11d3e045bf35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:23:11 np0005588919 nova_compute[225855]: 2026-01-20 15:23:11.738 225859 DEBUG nova.network.neutron [req-34831086-4be9-463c-8b2c-d358eb7dfd74 req-0f080f90-3086-44d8-a754-11d3e045bf35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Refreshing network info cache for port 5ee850cb-507f-4288-993f-a7892e9285c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:23:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:11.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:12.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:12 np0005588919 nova_compute[225855]: 2026-01-20 15:23:12.789 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:13 np0005588919 nova_compute[225855]: 2026-01-20 15:23:13.058 225859 DEBUG nova.network.neutron [req-34831086-4be9-463c-8b2c-d358eb7dfd74 req-0f080f90-3086-44d8-a754-11d3e045bf35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Updated VIF entry in instance network info cache for port 5ee850cb-507f-4288-993f-a7892e9285c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:23:13 np0005588919 nova_compute[225855]: 2026-01-20 15:23:13.059 225859 DEBUG nova.network.neutron [req-34831086-4be9-463c-8b2c-d358eb7dfd74 req-0f080f90-3086-44d8-a754-11d3e045bf35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Updating instance_info_cache with network_info: [{"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:23:13 np0005588919 nova_compute[225855]: 2026-01-20 15:23:13.081 225859 DEBUG oslo_concurrency.lockutils [req-34831086-4be9-463c-8b2c-d358eb7dfd74 req-0f080f90-3086-44d8-a754-11d3e045bf35 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:23:13 np0005588919 nova_compute[225855]: 2026-01-20 15:23:13.328 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:23:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3891755442' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:23:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:23:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3891755442' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:23:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:13.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:14.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:14 np0005588919 nova_compute[225855]: 2026-01-20 15:23:14.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:15.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:16.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:16.444 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:16.444 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:16.445 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:17 np0005588919 nova_compute[225855]: 2026-01-20 15:23:17.790 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:17.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:18.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:18 np0005588919 nova_compute[225855]: 2026-01-20 15:23:18.329 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:19Z|00107|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cc:0a:92 10.100.0.12
Jan 20 10:23:19 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:19Z|00108|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cc:0a:92 10.100.0.12
Jan 20 10:23:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:19.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:20.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:21.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:22.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:22 np0005588919 nova_compute[225855]: 2026-01-20 15:23:22.793 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:23 np0005588919 nova_compute[225855]: 2026-01-20 15:23:23.332 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:23.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:24.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:25 np0005588919 nova_compute[225855]: 2026-01-20 15:23:25.053 225859 INFO nova.compute.manager [None req-801be12f-ea4e-427e-afe5-a7bda78ad809 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Get console output#033[00m
Jan 20 10:23:25 np0005588919 nova_compute[225855]: 2026-01-20 15:23:25.058 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:23:25 np0005588919 nova_compute[225855]: 2026-01-20 15:23:25.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:23:25 np0005588919 nova_compute[225855]: 2026-01-20 15:23:25.346 225859 DEBUG oslo_concurrency.lockutils [None req-9074b4a9-08a3-48d3-a541-d80622072f80 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:25 np0005588919 nova_compute[225855]: 2026-01-20 15:23:25.347 225859 DEBUG oslo_concurrency.lockutils [None req-9074b4a9-08a3-48d3-a541-d80622072f80 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:25 np0005588919 nova_compute[225855]: 2026-01-20 15:23:25.347 225859 DEBUG nova.compute.manager [None req-9074b4a9-08a3-48d3-a541-d80622072f80 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:23:25 np0005588919 nova_compute[225855]: 2026-01-20 15:23:25.351 225859 DEBUG nova.compute.manager [None req-9074b4a9-08a3-48d3-a541-d80622072f80 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 20 10:23:25 np0005588919 nova_compute[225855]: 2026-01-20 15:23:25.352 225859 DEBUG nova.objects.instance [None req-9074b4a9-08a3-48d3-a541-d80622072f80 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'flavor' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:23:25 np0005588919 nova_compute[225855]: 2026-01-20 15:23:25.377 225859 DEBUG nova.virt.libvirt.driver [None req-9074b4a9-08a3-48d3-a541-d80622072f80 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 20 10:23:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:25.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:26.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:27 np0005588919 nova_compute[225855]: 2026-01-20 15:23:27.794 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:27.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:27 np0005588919 kernel: tap5ee850cb-50 (unregistering): left promiscuous mode
Jan 20 10:23:27 np0005588919 NetworkManager[49104]: <info>  [1768922607.9866] device (tap5ee850cb-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:23:27 np0005588919 nova_compute[225855]: 2026-01-20 15:23:27.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:27Z|00899|binding|INFO|Releasing lport 5ee850cb-507f-4288-993f-a7892e9285c9 from this chassis (sb_readonly=0)
Jan 20 10:23:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:27Z|00900|binding|INFO|Setting lport 5ee850cb-507f-4288-993f-a7892e9285c9 down in Southbound
Jan 20 10:23:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:27Z|00901|binding|INFO|Removing iface tap5ee850cb-50 ovn-installed in OVS
Jan 20 10:23:27 np0005588919 nova_compute[225855]: 2026-01-20 15:23:27.998 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.004 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:0a:92 10.100.0.12'], port_security=['fa:16:3e:cc:0a:92 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4c926c1a-d5cf-4865-aa57-66b439d115f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6f91a697-00ef-4c2f-a4cd-47aa600b459f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a867ed68-782d-4d05-8077-be0278c87405, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=5ee850cb-507f-4288-993f-a7892e9285c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.005 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 5ee850cb-507f-4288-993f-a7892e9285c9 in datapath 6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 unbound from our chassis#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.006 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.007 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7aed325c-e852-4e2c-9a0a-c811e9241873]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.008 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 namespace which is not needed anymore#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.025 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:28 np0005588919 systemd[1]: machine-qemu\x2d105\x2dinstance\x2d000000c8.scope: Deactivated successfully.
Jan 20 10:23:28 np0005588919 systemd[1]: machine-qemu\x2d105\x2dinstance\x2d000000c8.scope: Consumed 13.599s CPU time.
Jan 20 10:23:28 np0005588919 systemd-machined[194361]: Machine qemu-105-instance-000000c8 terminated.
Jan 20 10:23:28 np0005588919 podman[312232]: 2026-01-20 15:23:28.065914497 +0000 UTC m=+0.106021498 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 20 10:23:28 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312037]: [NOTICE]   (312043) : haproxy version is 2.8.14-c23fe91
Jan 20 10:23:28 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312037]: [NOTICE]   (312043) : path to executable is /usr/sbin/haproxy
Jan 20 10:23:28 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312037]: [WARNING]  (312043) : Exiting Master process...
Jan 20 10:23:28 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312037]: [ALERT]    (312043) : Current worker (312066) exited with code 143 (Terminated)
Jan 20 10:23:28 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312037]: [WARNING]  (312043) : All workers exited. Exiting... (0)
Jan 20 10:23:28 np0005588919 systemd[1]: libpod-3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30.scope: Deactivated successfully.
Jan 20 10:23:28 np0005588919 podman[312280]: 2026-01-20 15:23:28.14515284 +0000 UTC m=+0.045850251 container died 3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:23:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:28.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.253 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.260 225859 DEBUG nova.compute.manager [req-738d96d4-787d-4743-b27e-df0ec24b392b req-5d632645-c5ee-4c0e-8bf2-313b05968bdd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-vif-unplugged-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.261 225859 DEBUG oslo_concurrency.lockutils [req-738d96d4-787d-4743-b27e-df0ec24b392b req-5d632645-c5ee-4c0e-8bf2-313b05968bdd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.261 225859 DEBUG oslo_concurrency.lockutils [req-738d96d4-787d-4743-b27e-df0ec24b392b req-5d632645-c5ee-4c0e-8bf2-313b05968bdd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.261 225859 DEBUG oslo_concurrency.lockutils [req-738d96d4-787d-4743-b27e-df0ec24b392b req-5d632645-c5ee-4c0e-8bf2-313b05968bdd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.261 225859 DEBUG nova.compute.manager [req-738d96d4-787d-4743-b27e-df0ec24b392b req-5d632645-c5ee-4c0e-8bf2-313b05968bdd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] No waiting events found dispatching network-vif-unplugged-5ee850cb-507f-4288-993f-a7892e9285c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.262 225859 WARNING nova.compute.manager [req-738d96d4-787d-4743-b27e-df0ec24b392b req-5d632645-c5ee-4c0e-8bf2-313b05968bdd 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received unexpected event network-vif-unplugged-5ee850cb-507f-4288-993f-a7892e9285c9 for instance with vm_state active and task_state powering-off.#033[00m
Jan 20 10:23:28 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30-userdata-shm.mount: Deactivated successfully.
Jan 20 10:23:28 np0005588919 systemd[1]: var-lib-containers-storage-overlay-ba41683a800cd9238c1b5a90bf229c56f6a98cdf524e45addc7193533096210c-merged.mount: Deactivated successfully.
Jan 20 10:23:28 np0005588919 podman[312280]: 2026-01-20 15:23:28.288342138 +0000 UTC m=+0.189039529 container cleanup 3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:23:28 np0005588919 systemd[1]: libpod-conmon-3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30.scope: Deactivated successfully.
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.333 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:28 np0005588919 podman[312319]: 2026-01-20 15:23:28.364519843 +0000 UTC m=+0.051734158 container remove 3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.370 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f941d24d-78e7-4b68-a4d1-db1ff88cdec4]: (4, ('Tue Jan 20 03:23:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 (3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30)\n3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30\nTue Jan 20 03:23:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 (3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30)\n3d2e532eac9473a89ae3cd09c7dc89ba78b5562e33899e920a0ad5ee290fae30\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.372 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ce7ea85a-50b4-4fe3-ad93-f279ca7fa76e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.373 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a8ba0d6-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.375 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:28 np0005588919 kernel: tap6a8ba0d6-60: left promiscuous mode
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.388 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.392 225859 INFO nova.virt.libvirt.driver [None req-9074b4a9-08a3-48d3-a541-d80622072f80 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Instance shutdown successfully after 3 seconds.#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.394 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.397 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fb966007-836b-4fb4-9394-2859080276e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.399 225859 INFO nova.virt.libvirt.driver [-] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Instance destroyed successfully.#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.399 225859 DEBUG nova.objects.instance [None req-9074b4a9-08a3-48d3-a541-d80622072f80 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'numa_topology' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.412 225859 DEBUG nova.compute.manager [None req-9074b4a9-08a3-48d3-a541-d80622072f80 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.415 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[45f61ebc-f0f5-4b90-ba76-fffc4c66e063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.417 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[246d7cab-b7f9-45df-bb5c-6270675618ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.433 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b109bb4d-292e-40a0-bc52-2bbec4ef7b65]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 762280, 'reachable_time': 28195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312339, 'error': None, 'target': 'ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.436 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:23:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:28.436 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[e457509e-fbb4-4093-9327-0eb3937ac225]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:28 np0005588919 systemd[1]: run-netns-ovnmeta\x2d6a8ba0d6\x2d68f0\x2d4a25\x2d84ff\x2d3685d5b259a6.mount: Deactivated successfully.
Jan 20 10:23:28 np0005588919 nova_compute[225855]: 2026-01-20 15:23:28.450 225859 DEBUG oslo_concurrency.lockutils [None req-9074b4a9-08a3-48d3-a541-d80622072f80 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:29.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:30.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:30 np0005588919 nova_compute[225855]: 2026-01-20 15:23:30.398 225859 DEBUG nova.compute.manager [req-54ab5347-995d-4995-8745-6c09279f64c9 req-c331f25d-bab8-45e7-bf15-4868cd7beeaa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:23:30 np0005588919 nova_compute[225855]: 2026-01-20 15:23:30.398 225859 DEBUG oslo_concurrency.lockutils [req-54ab5347-995d-4995-8745-6c09279f64c9 req-c331f25d-bab8-45e7-bf15-4868cd7beeaa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:30 np0005588919 nova_compute[225855]: 2026-01-20 15:23:30.398 225859 DEBUG oslo_concurrency.lockutils [req-54ab5347-995d-4995-8745-6c09279f64c9 req-c331f25d-bab8-45e7-bf15-4868cd7beeaa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:30 np0005588919 nova_compute[225855]: 2026-01-20 15:23:30.399 225859 DEBUG oslo_concurrency.lockutils [req-54ab5347-995d-4995-8745-6c09279f64c9 req-c331f25d-bab8-45e7-bf15-4868cd7beeaa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:30 np0005588919 nova_compute[225855]: 2026-01-20 15:23:30.399 225859 DEBUG nova.compute.manager [req-54ab5347-995d-4995-8745-6c09279f64c9 req-c331f25d-bab8-45e7-bf15-4868cd7beeaa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] No waiting events found dispatching network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:23:30 np0005588919 nova_compute[225855]: 2026-01-20 15:23:30.399 225859 WARNING nova.compute.manager [req-54ab5347-995d-4995-8745-6c09279f64c9 req-c331f25d-bab8-45e7-bf15-4868cd7beeaa 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received unexpected event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 for instance with vm_state stopped and task_state None.#033[00m
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #148. Immutable memtables: 0.
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.525782) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 93] Flushing memtable with next log file: 148
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610525923, "job": 93, "event": "flush_started", "num_memtables": 1, "num_entries": 1817, "num_deletes": 252, "total_data_size": 4124514, "memory_usage": 4181888, "flush_reason": "Manual Compaction"}
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 93] Level-0 flush table #149: started
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610556318, "cf_name": "default", "job": 93, "event": "table_file_creation", "file_number": 149, "file_size": 2698868, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 72281, "largest_seqno": 74093, "table_properties": {"data_size": 2691378, "index_size": 4368, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16277, "raw_average_key_size": 20, "raw_value_size": 2676151, "raw_average_value_size": 3361, "num_data_blocks": 190, "num_entries": 796, "num_filter_entries": 796, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922462, "oldest_key_time": 1768922462, "file_creation_time": 1768922610, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 149, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 93] Flush lasted 30572 microseconds, and 7033 cpu microseconds.
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.556365) [db/flush_job.cc:967] [default] [JOB 93] Level-0 flush table #149: 2698868 bytes OK
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.556386) [db/memtable_list.cc:519] [default] Level-0 commit table #149 started
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.558714) [db/memtable_list.cc:722] [default] Level-0 commit table #149: memtable #1 done
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.558727) EVENT_LOG_v1 {"time_micros": 1768922610558723, "job": 93, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.558747) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 93] Try to delete WAL files size 4116259, prev total WAL file size 4116259, number of live WAL files 2.
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000145.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.559946) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036323735' seq:72057594037927935, type:22 .. '7061786F730036353237' seq:0, type:0; will stop at (end)
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 94] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 93 Base level 0, inputs: [149(2635KB)], [147(10MB)]
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610560032, "job": 94, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [149], "files_L6": [147], "score": -1, "input_data_size": 13196821, "oldest_snapshot_seqno": -1}
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 94] Generated table #150: 9670 keys, 11330445 bytes, temperature: kUnknown
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610712594, "cf_name": "default", "job": 94, "event": "table_file_creation", "file_number": 150, "file_size": 11330445, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11269239, "index_size": 35941, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24197, "raw_key_size": 254467, "raw_average_key_size": 26, "raw_value_size": 11100844, "raw_average_value_size": 1147, "num_data_blocks": 1363, "num_entries": 9670, "num_filter_entries": 9670, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768922610, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 150, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.712853) [db/compaction/compaction_job.cc:1663] [default] [JOB 94] Compacted 1@0 + 1@6 files to L6 => 11330445 bytes
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.714256) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 86.5 rd, 74.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 10.0 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(9.1) write-amplify(4.2) OK, records in: 10195, records dropped: 525 output_compression: NoCompression
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.714276) EVENT_LOG_v1 {"time_micros": 1768922610714266, "job": 94, "event": "compaction_finished", "compaction_time_micros": 152648, "compaction_time_cpu_micros": 30961, "output_level": 6, "num_output_files": 1, "total_output_size": 11330445, "num_input_records": 10195, "num_output_records": 9670, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000149.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610714752, "job": 94, "event": "table_file_deletion", "file_number": 149}
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000147.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922610716453, "job": 94, "event": "table_file_deletion", "file_number": 147}
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.559761) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.716511) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.716515) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.716517) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.716519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:30 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:23:30.716520) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:23:31 np0005588919 nova_compute[225855]: 2026-01-20 15:23:31.495 225859 INFO nova.compute.manager [None req-91115158-c844-4da1-8cae-4da48e6711ee 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Get console output#033[00m
Jan 20 10:23:31 np0005588919 nova_compute[225855]: 2026-01-20 15:23:31.657 225859 DEBUG nova.objects.instance [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'flavor' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:23:31 np0005588919 nova_compute[225855]: 2026-01-20 15:23:31.678 225859 DEBUG oslo_concurrency.lockutils [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:23:31 np0005588919 nova_compute[225855]: 2026-01-20 15:23:31.679 225859 DEBUG oslo_concurrency.lockutils [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquired lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:23:31 np0005588919 nova_compute[225855]: 2026-01-20 15:23:31.679 225859 DEBUG nova.network.neutron [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:23:31 np0005588919 nova_compute[225855]: 2026-01-20 15:23:31.679 225859 DEBUG nova.objects.instance [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'info_cache' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:23:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:31.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:32.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:32 np0005588919 nova_compute[225855]: 2026-01-20 15:23:32.796 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.134 225859 DEBUG nova.network.neutron [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Updating instance_info_cache with network_info: [{"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.229 225859 DEBUG oslo_concurrency.lockutils [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Releasing lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.250 225859 INFO nova.virt.libvirt.driver [-] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Instance destroyed successfully.#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.250 225859 DEBUG nova.objects.instance [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'numa_topology' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.283 225859 DEBUG nova.objects.instance [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'resources' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.335 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.699 225859 DEBUG nova.virt.libvirt.vif [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:22:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1485210936',display_name='tempest-TestNetworkAdvancedServerOps-server-1485210936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1485210936',id=200,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFcP4UZM8O63AIoOVMDQxqy+5NyOff2+4sn+/fo/i9F+a5f/348af0mVz9/8dEzIolLAAsu+/bRAcsRarrSFS+xLm/g8/3nv4fTZhMf6io52heWTDMNl7eXefIdDoA+XVw==',key_name='tempest-TestNetworkAdvancedServerOps-577536853',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:23:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-y87mmd4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:23:28Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=4c926c1a-d5cf-4865-aa57-66b439d115f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.700 225859 DEBUG nova.network.os_vif_util [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.701 225859 DEBUG nova.network.os_vif_util [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.701 225859 DEBUG os_vif [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.702 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.703 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5ee850cb-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.704 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.706 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.709 225859 INFO os_vif [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50')#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.715 225859 DEBUG nova.virt.libvirt.driver [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Start _get_guest_xml network_info=[{"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.718 225859 WARNING nova.virt.libvirt.driver [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.726 225859 DEBUG nova.virt.libvirt.host [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.727 225859 DEBUG nova.virt.libvirt.host [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.731 225859 DEBUG nova.virt.libvirt.host [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.732 225859 DEBUG nova.virt.libvirt.host [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.733 225859 DEBUG nova.virt.libvirt.driver [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.733 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.734 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.734 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.734 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.734 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.734 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.734 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.735 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.735 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.735 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.735 225859 DEBUG nova.virt.hardware [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.735 225859 DEBUG nova.objects.instance [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:23:33 np0005588919 nova_compute[225855]: 2026-01-20 15:23:33.767 225859 DEBUG oslo_concurrency.processutils [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:23:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:33.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:23:34 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/156660179' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:23:34 np0005588919 nova_compute[225855]: 2026-01-20 15:23:34.210 225859 DEBUG oslo_concurrency.processutils [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:23:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:34.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:34 np0005588919 nova_compute[225855]: 2026-01-20 15:23:34.260 225859 DEBUG oslo_concurrency.processutils [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:23:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:23:34 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/316765263' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:23:34 np0005588919 nova_compute[225855]: 2026-01-20 15:23:34.716 225859 DEBUG oslo_concurrency.processutils [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:23:34 np0005588919 nova_compute[225855]: 2026-01-20 15:23:34.718 225859 DEBUG nova.virt.libvirt.vif [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:22:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1485210936',display_name='tempest-TestNetworkAdvancedServerOps-server-1485210936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1485210936',id=200,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFcP4UZM8O63AIoOVMDQxqy+5NyOff2+4sn+/fo/i9F+a5f/348af0mVz9/8dEzIolLAAsu+/bRAcsRarrSFS+xLm/g8/3nv4fTZhMf6io52heWTDMNl7eXefIdDoA+XVw==',key_name='tempest-TestNetworkAdvancedServerOps-577536853',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:23:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-y87mmd4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:23:28Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=4c926c1a-d5cf-4865-aa57-66b439d115f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:23:34 np0005588919 nova_compute[225855]: 2026-01-20 15:23:34.718 225859 DEBUG nova.network.os_vif_util [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:23:34 np0005588919 nova_compute[225855]: 2026-01-20 15:23:34.719 225859 DEBUG nova.network.os_vif_util [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:23:34 np0005588919 nova_compute[225855]: 2026-01-20 15:23:34.720 225859 DEBUG nova.objects.instance [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'pci_devices' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.055 225859 DEBUG nova.virt.libvirt.driver [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <uuid>4c926c1a-d5cf-4865-aa57-66b439d115f8</uuid>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <name>instance-000000c8</name>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1485210936</nova:name>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:23:33</nova:creationTime>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <nova:user uuid="442a7a5cb8ea426a82be9762b262d171">tempest-TestNetworkAdvancedServerOps-175282664-project-member</nova:user>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <nova:project uuid="1ed5feeeafe7448a8efb47ab975b0ead">tempest-TestNetworkAdvancedServerOps-175282664</nova:project>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <nova:port uuid="5ee850cb-507f-4288-993f-a7892e9285c9">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <entry name="serial">4c926c1a-d5cf-4865-aa57-66b439d115f8</entry>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <entry name="uuid">4c926c1a-d5cf-4865-aa57-66b439d115f8</entry>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/4c926c1a-d5cf-4865-aa57-66b439d115f8_disk">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/4c926c1a-d5cf-4865-aa57-66b439d115f8_disk.config">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:cc:0a:92"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <target dev="tap5ee850cb-50"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8/console.log" append="off"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <input type="keyboard" bus="usb"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:23:35 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:23:35 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:23:35 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:23:35 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.058 225859 DEBUG nova.virt.libvirt.driver [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] skipping disk for instance-000000c8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.058 225859 DEBUG nova.virt.libvirt.driver [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] skipping disk for instance-000000c8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.059 225859 DEBUG nova.virt.libvirt.vif [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:22:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1485210936',display_name='tempest-TestNetworkAdvancedServerOps-server-1485210936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1485210936',id=200,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFcP4UZM8O63AIoOVMDQxqy+5NyOff2+4sn+/fo/i9F+a5f/348af0mVz9/8dEzIolLAAsu+/bRAcsRarrSFS+xLm/g8/3nv4fTZhMf6io52heWTDMNl7eXefIdDoA+XVw==',key_name='tempest-TestNetworkAdvancedServerOps-577536853',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:23:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-y87mmd4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:23:28Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=4c926c1a-d5cf-4865-aa57-66b439d115f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.059 225859 DEBUG nova.network.os_vif_util [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.060 225859 DEBUG nova.network.os_vif_util [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.060 225859 DEBUG os_vif [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.061 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.061 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.062 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.064 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.064 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5ee850cb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.065 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5ee850cb-50, col_values=(('external_ids', {'iface-id': '5ee850cb-507f-4288-993f-a7892e9285c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:0a:92', 'vm-uuid': '4c926c1a-d5cf-4865-aa57-66b439d115f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.067 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 NetworkManager[49104]: <info>  [1768922615.0679] manager: (tap5ee850cb-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/378)
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.069 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.071 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.072 225859 INFO os_vif [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50')#033[00m
Jan 20 10:23:35 np0005588919 kernel: tap5ee850cb-50: entered promiscuous mode
Jan 20 10:23:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:35Z|00902|binding|INFO|Claiming lport 5ee850cb-507f-4288-993f-a7892e9285c9 for this chassis.
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.137 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:35Z|00903|binding|INFO|5ee850cb-507f-4288-993f-a7892e9285c9: Claiming fa:16:3e:cc:0a:92 10.100.0.12
Jan 20 10:23:35 np0005588919 NetworkManager[49104]: <info>  [1768922615.1385] manager: (tap5ee850cb-50): new Tun device (/org/freedesktop/NetworkManager/Devices/379)
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.145 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:0a:92 10.100.0.12'], port_security=['fa:16:3e:cc:0a:92 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4c926c1a-d5cf-4865-aa57-66b439d115f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6f91a697-00ef-4c2f-a4cd-47aa600b459f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a867ed68-782d-4d05-8077-be0278c87405, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=5ee850cb-507f-4288-993f-a7892e9285c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.146 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 5ee850cb-507f-4288-993f-a7892e9285c9 in datapath 6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 bound to our chassis#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.147 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a8ba0d6-68f0-4a25-84ff-3685d5b259a6#033[00m
Jan 20 10:23:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:35Z|00904|binding|INFO|Setting lport 5ee850cb-507f-4288-993f-a7892e9285c9 ovn-installed in OVS
Jan 20 10:23:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:35Z|00905|binding|INFO|Setting lport 5ee850cb-507f-4288-993f-a7892e9285c9 up in Southbound
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.157 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.157 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9ca245-6d71-4ace-9595-662246f53765]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.158 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a8ba0d6-61 in ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.160 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a8ba0d6-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.160 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[344387be-9605-43e9-a731-7320b0ab2e56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.161 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.162 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bce71c9f-00e7-412d-b583-8dae9ff91e9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 systemd-udevd[312420]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.173 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[abd9cd3c-fbc7-4615-ace5-8bee91c27537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 systemd-machined[194361]: New machine qemu-106-instance-000000c8.
Jan 20 10:23:35 np0005588919 NetworkManager[49104]: <info>  [1768922615.1823] device (tap5ee850cb-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:23:35 np0005588919 NetworkManager[49104]: <info>  [1768922615.1834] device (tap5ee850cb-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.186 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe5a3db-7da4-46b6-b9a3-a9d054cf0062]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 systemd[1]: Started Virtual Machine qemu-106-instance-000000c8.
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.215 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[53a47acd-a1c2-4bee-854c-5a16137a39ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.221 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9afe23-1dff-4e9b-9690-26c69d4a24d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 systemd-udevd[312425]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:23:35 np0005588919 NetworkManager[49104]: <info>  [1768922615.2229] manager: (tap6a8ba0d6-60): new Veth device (/org/freedesktop/NetworkManager/Devices/380)
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.251 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ec56fe73-75f5-4261-af7b-f94108af4762]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.254 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c4117e10-f131-4fb0-9d4d-6fa779169099]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 NetworkManager[49104]: <info>  [1768922615.2757] device (tap6a8ba0d6-60): carrier: link connected
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.281 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ac13f920-520c-45f8-bacd-ebaa02b91a91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.301 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9b781802-f3f1-4fb3-ae6e-ee5a3937933c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a8ba0d6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:82:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 257], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 765420, 'reachable_time': 20334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312453, 'error': None, 'target': 'ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.314 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5c481f-2939-40dc-bcbf-32f3ecbb9440]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:8288'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 765420, 'tstamp': 765420}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312454, 'error': None, 'target': 'ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.330 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcf7e0b-e39b-4f25-9e1f-555642cd7378]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a8ba0d6-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:82:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 257], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 765420, 'reachable_time': 20334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312455, 'error': None, 'target': 'ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.364 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[55a65508-7cf4-4ebe-9c2b-7756ecbf4dda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.433 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0e53202f-3b68-4a49-a81b-50d9fb73da21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.435 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a8ba0d6-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.435 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.436 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a8ba0d6-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:35 np0005588919 NetworkManager[49104]: <info>  [1768922615.4382] manager: (tap6a8ba0d6-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/381)
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.437 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 kernel: tap6a8ba0d6-60: entered promiscuous mode
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.440 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.441 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a8ba0d6-60, col_values=(('external_ids', {'iface-id': '37682451-9139-425b-b6d7-1ea83a2306c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.442 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:35Z|00906|binding|INFO|Releasing lport 37682451-9139-425b-b6d7-1ea83a2306c3 from this chassis (sb_readonly=0)
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.444 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a8ba0d6-68f0-4a25-84ff-3685d5b259a6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a8ba0d6-68f0-4a25-84ff-3685d5b259a6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.445 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[12a735f1-3714-4f7f-b232-f7acb1f82b9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.445 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/6a8ba0d6-68f0-4a25-84ff-3685d5b259a6.pid.haproxy
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 6a8ba0d6-68f0-4a25-84ff-3685d5b259a6
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:23:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:35.446 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'env', 'PROCESS_TAG=haproxy-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a8ba0d6-68f0-4a25-84ff-3685d5b259a6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.460 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.641 225859 DEBUG nova.virt.libvirt.host [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Removed pending event for 4c926c1a-d5cf-4865-aa57-66b439d115f8 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.642 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922615.640725, 4c926c1a-d5cf-4865-aa57-66b439d115f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.642 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.645 225859 DEBUG nova.compute.manager [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.649 225859 INFO nova.virt.libvirt.driver [-] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Instance rebooted successfully.#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.649 225859 DEBUG nova.compute.manager [None req-17bcfc80-7c1a-4369-9a16-6423cc2819e8 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.673 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.676 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.704 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.704 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922615.6417022, 4c926c1a-d5cf-4865-aa57-66b439d115f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.704 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] VM Started (Lifecycle Event)#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.724 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:23:35 np0005588919 nova_compute[225855]: 2026-01-20 15:23:35.728 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:23:35 np0005588919 podman[312530]: 2026-01-20 15:23:35.827382955 +0000 UTC m=+0.050105901 container create 3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 20 10:23:35 np0005588919 systemd[1]: Started libpod-conmon-3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c.scope.
Jan 20 10:23:35 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:23:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:35.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:35 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260d7576a6afe1e83d647994bef54bb9257457e8b2f09eff78519d9b0206d51a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:23:35 np0005588919 podman[312530]: 2026-01-20 15:23:35.799827919 +0000 UTC m=+0.022550885 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:23:35 np0005588919 podman[312530]: 2026-01-20 15:23:35.897567969 +0000 UTC m=+0.120290935 container init 3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 20 10:23:35 np0005588919 podman[312530]: 2026-01-20 15:23:35.902414118 +0000 UTC m=+0.125137064 container start 3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 20 10:23:35 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312545]: [NOTICE]   (312549) : New worker (312551) forked
Jan 20 10:23:35 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312545]: [NOTICE]   (312549) : Loading success.
Jan 20 10:23:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:36.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:37 np0005588919 nova_compute[225855]: 2026-01-20 15:23:37.469 225859 DEBUG nova.compute.manager [req-d4617dd3-5b54-4a10-a5cd-6fdc9988b37c req-6c9856fc-352d-4360-a66e-2ba109d7c451 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:23:37 np0005588919 nova_compute[225855]: 2026-01-20 15:23:37.470 225859 DEBUG oslo_concurrency.lockutils [req-d4617dd3-5b54-4a10-a5cd-6fdc9988b37c req-6c9856fc-352d-4360-a66e-2ba109d7c451 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:37 np0005588919 nova_compute[225855]: 2026-01-20 15:23:37.471 225859 DEBUG oslo_concurrency.lockutils [req-d4617dd3-5b54-4a10-a5cd-6fdc9988b37c req-6c9856fc-352d-4360-a66e-2ba109d7c451 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:37 np0005588919 nova_compute[225855]: 2026-01-20 15:23:37.471 225859 DEBUG oslo_concurrency.lockutils [req-d4617dd3-5b54-4a10-a5cd-6fdc9988b37c req-6c9856fc-352d-4360-a66e-2ba109d7c451 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:37 np0005588919 nova_compute[225855]: 2026-01-20 15:23:37.471 225859 DEBUG nova.compute.manager [req-d4617dd3-5b54-4a10-a5cd-6fdc9988b37c req-6c9856fc-352d-4360-a66e-2ba109d7c451 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] No waiting events found dispatching network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:23:37 np0005588919 nova_compute[225855]: 2026-01-20 15:23:37.472 225859 WARNING nova.compute.manager [req-d4617dd3-5b54-4a10-a5cd-6fdc9988b37c req-6c9856fc-352d-4360-a66e-2ba109d7c451 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received unexpected event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:23:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:37.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:38 np0005588919 podman[312561]: 2026-01-20 15:23:38.014125099 +0000 UTC m=+0.056390641 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:23:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:38.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:38 np0005588919 nova_compute[225855]: 2026-01-20 15:23:38.338 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:39 np0005588919 nova_compute[225855]: 2026-01-20 15:23:39.579 225859 DEBUG nova.compute.manager [req-8fa989ab-d9f3-4367-b0c0-a878f94dfda0 req-f98e48e4-dfd5-485c-94a7-df76fc3140b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:23:39 np0005588919 nova_compute[225855]: 2026-01-20 15:23:39.580 225859 DEBUG oslo_concurrency.lockutils [req-8fa989ab-d9f3-4367-b0c0-a878f94dfda0 req-f98e48e4-dfd5-485c-94a7-df76fc3140b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:39 np0005588919 nova_compute[225855]: 2026-01-20 15:23:39.580 225859 DEBUG oslo_concurrency.lockutils [req-8fa989ab-d9f3-4367-b0c0-a878f94dfda0 req-f98e48e4-dfd5-485c-94a7-df76fc3140b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:39 np0005588919 nova_compute[225855]: 2026-01-20 15:23:39.580 225859 DEBUG oslo_concurrency.lockutils [req-8fa989ab-d9f3-4367-b0c0-a878f94dfda0 req-f98e48e4-dfd5-485c-94a7-df76fc3140b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:39 np0005588919 nova_compute[225855]: 2026-01-20 15:23:39.581 225859 DEBUG nova.compute.manager [req-8fa989ab-d9f3-4367-b0c0-a878f94dfda0 req-f98e48e4-dfd5-485c-94a7-df76fc3140b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] No waiting events found dispatching network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:23:39 np0005588919 nova_compute[225855]: 2026-01-20 15:23:39.581 225859 WARNING nova.compute.manager [req-8fa989ab-d9f3-4367-b0c0-a878f94dfda0 req-f98e48e4-dfd5-485c-94a7-df76fc3140b0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received unexpected event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:23:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:39.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:40 np0005588919 nova_compute[225855]: 2026-01-20 15:23:40.069 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:40.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:41.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:42.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:43 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:23:43 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:23:43 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:23:43 np0005588919 nova_compute[225855]: 2026-01-20 15:23:43.340 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:43.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:44.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:45 np0005588919 nova_compute[225855]: 2026-01-20 15:23:45.071 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:45.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:46.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:47.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:48 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:48Z|00109|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cc:0a:92 10.100.0.12
Jan 20 10:23:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:48.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:48 np0005588919 nova_compute[225855]: 2026-01-20 15:23:48.342 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:23:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:49.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:23:50 np0005588919 nova_compute[225855]: 2026-01-20 15:23:50.074 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:50.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:23:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:23:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:23:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:51.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:23:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:52.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:53 np0005588919 nova_compute[225855]: 2026-01-20 15:23:53.345 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:53.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:54 np0005588919 nova_compute[225855]: 2026-01-20 15:23:54.061 225859 INFO nova.compute.manager [None req-c9455216-e9f0-46e5-bfc1-801dfa3d54cd 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Get console output#033[00m
Jan 20 10:23:54 np0005588919 nova_compute[225855]: 2026-01-20 15:23:54.069 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:23:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:54.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:55 np0005588919 nova_compute[225855]: 2026-01-20 15:23:55.078 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:55.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.155 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:56.155 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:23:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:56.157 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:23:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:56.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.508 225859 DEBUG nova.compute.manager [req-07c3a016-d282-40fa-a837-6e129ccf6ed6 req-44ba922c-40f4-45c1-ac90-364d73b9a8f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-changed-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.508 225859 DEBUG nova.compute.manager [req-07c3a016-d282-40fa-a837-6e129ccf6ed6 req-44ba922c-40f4-45c1-ac90-364d73b9a8f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Refreshing instance network info cache due to event network-changed-5ee850cb-507f-4288-993f-a7892e9285c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.509 225859 DEBUG oslo_concurrency.lockutils [req-07c3a016-d282-40fa-a837-6e129ccf6ed6 req-44ba922c-40f4-45c1-ac90-364d73b9a8f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.509 225859 DEBUG oslo_concurrency.lockutils [req-07c3a016-d282-40fa-a837-6e129ccf6ed6 req-44ba922c-40f4-45c1-ac90-364d73b9a8f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.509 225859 DEBUG nova.network.neutron [req-07c3a016-d282-40fa-a837-6e129ccf6ed6 req-44ba922c-40f4-45c1-ac90-364d73b9a8f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Refreshing network info cache for port 5ee850cb-507f-4288-993f-a7892e9285c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.642 225859 DEBUG oslo_concurrency.lockutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.643 225859 DEBUG oslo_concurrency.lockutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.643 225859 DEBUG oslo_concurrency.lockutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.644 225859 DEBUG oslo_concurrency.lockutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.644 225859 DEBUG oslo_concurrency.lockutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.645 225859 INFO nova.compute.manager [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Terminating instance#033[00m
Jan 20 10:23:56 np0005588919 nova_compute[225855]: 2026-01-20 15:23:56.647 225859 DEBUG nova.compute.manager [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:23:56 np0005588919 kernel: tap5ee850cb-50 (unregistering): left promiscuous mode
Jan 20 10:23:56 np0005588919 NetworkManager[49104]: <info>  [1768922636.9752] device (tap5ee850cb-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.030 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:57 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:57Z|00907|binding|INFO|Releasing lport 5ee850cb-507f-4288-993f-a7892e9285c9 from this chassis (sb_readonly=0)
Jan 20 10:23:57 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:57Z|00908|binding|INFO|Setting lport 5ee850cb-507f-4288-993f-a7892e9285c9 down in Southbound
Jan 20 10:23:57 np0005588919 ovn_controller[130490]: 2026-01-20T15:23:57Z|00909|binding|INFO|Removing iface tap5ee850cb-50 ovn-installed in OVS
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.032 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.037 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:0a:92 10.100.0.12'], port_security=['fa:16:3e:cc:0a:92 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4c926c1a-d5cf-4865-aa57-66b439d115f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ed5feeeafe7448a8efb47ab975b0ead', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6f91a697-00ef-4c2f-a4cd-47aa600b459f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a867ed68-782d-4d05-8077-be0278c87405, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=5ee850cb-507f-4288-993f-a7892e9285c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.038 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 5ee850cb-507f-4288-993f-a7892e9285c9 in datapath 6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 unbound from our chassis#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.040 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.041 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ab9b6210-77c0-4672-8127-1f56bf31bf33]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.042 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 namespace which is not needed anymore#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.046 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:57 np0005588919 systemd[1]: machine-qemu\x2d106\x2dinstance\x2d000000c8.scope: Deactivated successfully.
Jan 20 10:23:57 np0005588919 systemd[1]: machine-qemu\x2d106\x2dinstance\x2d000000c8.scope: Consumed 13.566s CPU time.
Jan 20 10:23:57 np0005588919 systemd-machined[194361]: Machine qemu-106-instance-000000c8 terminated.
Jan 20 10:23:57 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312545]: [NOTICE]   (312549) : haproxy version is 2.8.14-c23fe91
Jan 20 10:23:57 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312545]: [NOTICE]   (312549) : path to executable is /usr/sbin/haproxy
Jan 20 10:23:57 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312545]: [WARNING]  (312549) : Exiting Master process...
Jan 20 10:23:57 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312545]: [ALERT]    (312549) : Current worker (312551) exited with code 143 (Terminated)
Jan 20 10:23:57 np0005588919 neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6[312545]: [WARNING]  (312549) : All workers exited. Exiting... (0)
Jan 20 10:23:57 np0005588919 systemd[1]: libpod-3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c.scope: Deactivated successfully.
Jan 20 10:23:57 np0005588919 podman[312845]: 2026-01-20 15:23:57.206068342 +0000 UTC m=+0.047138597 container died 3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:23:57 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c-userdata-shm.mount: Deactivated successfully.
Jan 20 10:23:57 np0005588919 systemd[1]: var-lib-containers-storage-overlay-260d7576a6afe1e83d647994bef54bb9257457e8b2f09eff78519d9b0206d51a-merged.mount: Deactivated successfully.
Jan 20 10:23:57 np0005588919 podman[312845]: 2026-01-20 15:23:57.246311471 +0000 UTC m=+0.087381706 container cleanup 3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:23:57 np0005588919 systemd[1]: libpod-conmon-3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c.scope: Deactivated successfully.
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.283 225859 INFO nova.virt.libvirt.driver [-] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Instance destroyed successfully.#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.284 225859 DEBUG nova.objects.instance [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lazy-loading 'resources' on Instance uuid 4c926c1a-d5cf-4865-aa57-66b439d115f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.319 225859 DEBUG nova.virt.libvirt.vif [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:22:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1485210936',display_name='tempest-TestNetworkAdvancedServerOps-server-1485210936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1485210936',id=200,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFcP4UZM8O63AIoOVMDQxqy+5NyOff2+4sn+/fo/i9F+a5f/348af0mVz9/8dEzIolLAAsu+/bRAcsRarrSFS+xLm/g8/3nv4fTZhMf6io52heWTDMNl7eXefIdDoA+XVw==',key_name='tempest-TestNetworkAdvancedServerOps-577536853',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:23:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ed5feeeafe7448a8efb47ab975b0ead',ramdisk_id='',reservation_id='r-y87mmd4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-175282664',owner_user_name='tempest-TestNetworkAdvancedServerOps-175282664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:23:35Z,user_data=None,user_id='442a7a5cb8ea426a82be9762b262d171',uuid=4c926c1a-d5cf-4865-aa57-66b439d115f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.320 225859 DEBUG nova.network.os_vif_util [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converting VIF {"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.321 225859 DEBUG nova.network.os_vif_util [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.322 225859 DEBUG os_vif [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.324 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.324 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5ee850cb-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.326 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.328 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.331 225859 INFO os_vif [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:0a:92,bridge_name='br-int',has_traffic_filtering=True,id=5ee850cb-507f-4288-993f-a7892e9285c9,network=Network(6a8ba0d6-68f0-4a25-84ff-3685d5b259a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ee850cb-50')#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.408 225859 DEBUG nova.compute.manager [req-4224c930-8fbc-41be-b7ce-efe96bba867f req-c2b2ea51-9e40-41ba-a605-a47c2fe1c36e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-vif-unplugged-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.408 225859 DEBUG oslo_concurrency.lockutils [req-4224c930-8fbc-41be-b7ce-efe96bba867f req-c2b2ea51-9e40-41ba-a605-a47c2fe1c36e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.409 225859 DEBUG oslo_concurrency.lockutils [req-4224c930-8fbc-41be-b7ce-efe96bba867f req-c2b2ea51-9e40-41ba-a605-a47c2fe1c36e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.409 225859 DEBUG oslo_concurrency.lockutils [req-4224c930-8fbc-41be-b7ce-efe96bba867f req-c2b2ea51-9e40-41ba-a605-a47c2fe1c36e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.410 225859 DEBUG nova.compute.manager [req-4224c930-8fbc-41be-b7ce-efe96bba867f req-c2b2ea51-9e40-41ba-a605-a47c2fe1c36e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] No waiting events found dispatching network-vif-unplugged-5ee850cb-507f-4288-993f-a7892e9285c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.410 225859 DEBUG nova.compute.manager [req-4224c930-8fbc-41be-b7ce-efe96bba867f req-c2b2ea51-9e40-41ba-a605-a47c2fe1c36e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-vif-unplugged-5ee850cb-507f-4288-993f-a7892e9285c9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:23:57 np0005588919 podman[312875]: 2026-01-20 15:23:57.795003617 +0000 UTC m=+0.522480789 container remove 3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.802 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[62470e61-6a4c-4769-be37-3896431c6c4d]: (4, ('Tue Jan 20 03:23:57 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 (3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c)\n3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c\nTue Jan 20 03:23:57 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 (3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c)\n3e180e523263b2efe6b366dda5dbd7316202839bfa973cebc4d886e9b759bb3c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.804 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fce504d5-9841-4511-9a5a-216c8404978f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.805 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a8ba0d6-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.807 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:57 np0005588919 kernel: tap6a8ba0d6-60: left promiscuous mode
Jan 20 10:23:57 np0005588919 nova_compute[225855]: 2026-01-20 15:23:57.823 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.827 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f2ecdf-82c2-4705-9029-59acedd65da1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.844 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[63af6880-c62a-4fe7-91a9-f7c037ada8aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.845 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a5918890-bb72-440a-8ae8-c6462a49af31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.860 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ab5146-c2e0-4bb1-bf9e-0cbb611bec92]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 765413, 'reachable_time': 26586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312920, 'error': None, 'target': 'ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:57 np0005588919 systemd[1]: run-netns-ovnmeta\x2d6a8ba0d6\x2d68f0\x2d4a25\x2d84ff\x2d3685d5b259a6.mount: Deactivated successfully.
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.863 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a8ba0d6-68f0-4a25-84ff-3685d5b259a6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:23:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:23:57.864 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[d096177c-bfbe-4ca4-8a36-f6a190c9ff6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:23:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:57.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:23:58.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:23:58 np0005588919 nova_compute[225855]: 2026-01-20 15:23:58.345 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:23:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:23:59 np0005588919 podman[312921]: 2026-01-20 15:23:59.028336079 +0000 UTC m=+0.074652062 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 20 10:23:59 np0005588919 nova_compute[225855]: 2026-01-20 15:23:59.549 225859 DEBUG nova.compute.manager [req-47eb6ccc-e7ad-4fb4-bdc1-b25c3095929e req-cfc6fe84-6dd7-4d28-8ac1-b4053358ddbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:23:59 np0005588919 nova_compute[225855]: 2026-01-20 15:23:59.549 225859 DEBUG oslo_concurrency.lockutils [req-47eb6ccc-e7ad-4fb4-bdc1-b25c3095929e req-cfc6fe84-6dd7-4d28-8ac1-b4053358ddbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:23:59 np0005588919 nova_compute[225855]: 2026-01-20 15:23:59.550 225859 DEBUG oslo_concurrency.lockutils [req-47eb6ccc-e7ad-4fb4-bdc1-b25c3095929e req-cfc6fe84-6dd7-4d28-8ac1-b4053358ddbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:23:59 np0005588919 nova_compute[225855]: 2026-01-20 15:23:59.550 225859 DEBUG oslo_concurrency.lockutils [req-47eb6ccc-e7ad-4fb4-bdc1-b25c3095929e req-cfc6fe84-6dd7-4d28-8ac1-b4053358ddbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:23:59 np0005588919 nova_compute[225855]: 2026-01-20 15:23:59.550 225859 DEBUG nova.compute.manager [req-47eb6ccc-e7ad-4fb4-bdc1-b25c3095929e req-cfc6fe84-6dd7-4d28-8ac1-b4053358ddbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] No waiting events found dispatching network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:23:59 np0005588919 nova_compute[225855]: 2026-01-20 15:23:59.550 225859 WARNING nova.compute.manager [req-47eb6ccc-e7ad-4fb4-bdc1-b25c3095929e req-cfc6fe84-6dd7-4d28-8ac1-b4053358ddbe 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received unexpected event network-vif-plugged-5ee850cb-507f-4288-993f-a7892e9285c9 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:23:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:23:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:23:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:23:59.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:00 np0005588919 nova_compute[225855]: 2026-01-20 15:24:00.074 225859 DEBUG nova.network.neutron [req-07c3a016-d282-40fa-a837-6e129ccf6ed6 req-44ba922c-40f4-45c1-ac90-364d73b9a8f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Updated VIF entry in instance network info cache for port 5ee850cb-507f-4288-993f-a7892e9285c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:24:00 np0005588919 nova_compute[225855]: 2026-01-20 15:24:00.075 225859 DEBUG nova.network.neutron [req-07c3a016-d282-40fa-a837-6e129ccf6ed6 req-44ba922c-40f4-45c1-ac90-364d73b9a8f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Updating instance_info_cache with network_info: [{"id": "5ee850cb-507f-4288-993f-a7892e9285c9", "address": "fa:16:3e:cc:0a:92", "network": {"id": "6a8ba0d6-68f0-4a25-84ff-3685d5b259a6", "bridge": "br-int", "label": "tempest-network-smoke--805773012", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ed5feeeafe7448a8efb47ab975b0ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ee850cb-50", "ovs_interfaceid": "5ee850cb-507f-4288-993f-a7892e9285c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:24:00 np0005588919 nova_compute[225855]: 2026-01-20 15:24:00.123 225859 INFO nova.virt.libvirt.driver [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Deleting instance files /var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8_del#033[00m
Jan 20 10:24:00 np0005588919 nova_compute[225855]: 2026-01-20 15:24:00.124 225859 INFO nova.virt.libvirt.driver [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Deletion of /var/lib/nova/instances/4c926c1a-d5cf-4865-aa57-66b439d115f8_del complete#033[00m
Jan 20 10:24:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:00.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:00 np0005588919 nova_compute[225855]: 2026-01-20 15:24:00.278 225859 DEBUG oslo_concurrency.lockutils [req-07c3a016-d282-40fa-a837-6e129ccf6ed6 req-44ba922c-40f4-45c1-ac90-364d73b9a8f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-4c926c1a-d5cf-4865-aa57-66b439d115f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:24:00 np0005588919 nova_compute[225855]: 2026-01-20 15:24:00.287 225859 INFO nova.compute.manager [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Took 3.64 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:24:00 np0005588919 nova_compute[225855]: 2026-01-20 15:24:00.287 225859 DEBUG oslo.service.loopingcall [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:24:00 np0005588919 nova_compute[225855]: 2026-01-20 15:24:00.288 225859 DEBUG nova.compute.manager [-] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:24:00 np0005588919 nova_compute[225855]: 2026-01-20 15:24:00.288 225859 DEBUG nova.network.neutron [-] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:24:01 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:24:01.158 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:24:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:01.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:02.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.380 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.420 225859 DEBUG nova.network.neutron [-] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.435 225859 INFO nova.compute.manager [-] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Took 2.15 seconds to deallocate network for instance.#033[00m
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.478 225859 DEBUG oslo_concurrency.lockutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.478 225859 DEBUG oslo_concurrency.lockutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.504 225859 DEBUG nova.compute.manager [req-c6ce2d9a-27e3-480a-868d-fbcdd2877fe2 req-b0327b2a-433b-4fde-afad-5a2318e8125c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Received event network-vif-deleted-5ee850cb-507f-4288-993f-a7892e9285c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.535 225859 DEBUG oslo_concurrency.processutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:24:02 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:24:02 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2239924417' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.966 225859 DEBUG oslo_concurrency.processutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:24:02 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.971 225859 DEBUG nova.compute.provider_tree [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:24:03 np0005588919 nova_compute[225855]: 2026-01-20 15:24:02.999 225859 DEBUG nova.scheduler.client.report [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:24:03 np0005588919 nova_compute[225855]: 2026-01-20 15:24:03.042 225859 DEBUG oslo_concurrency.lockutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:24:03 np0005588919 nova_compute[225855]: 2026-01-20 15:24:03.075 225859 INFO nova.scheduler.client.report [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Deleted allocations for instance 4c926c1a-d5cf-4865-aa57-66b439d115f8#033[00m
Jan 20 10:24:03 np0005588919 nova_compute[225855]: 2026-01-20 15:24:03.157 225859 DEBUG oslo_concurrency.lockutils [None req-397d1304-7c92-4fb0-b90e-607b141a4cf2 442a7a5cb8ea426a82be9762b262d171 1ed5feeeafe7448a8efb47ab975b0ead - - default default] Lock "4c926c1a-d5cf-4865-aa57-66b439d115f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.514s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:24:03 np0005588919 nova_compute[225855]: 2026-01-20 15:24:03.347 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:03.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:04.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:05 np0005588919 nova_compute[225855]: 2026-01-20 15:24:05.769 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:05 np0005588919 nova_compute[225855]: 2026-01-20 15:24:05.886 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:05.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:06.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:06 np0005588919 nova_compute[225855]: 2026-01-20 15:24:06.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:06 np0005588919 nova_compute[225855]: 2026-01-20 15:24:06.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:24:06 np0005588919 nova_compute[225855]: 2026-01-20 15:24:06.342 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:24:06 np0005588919 nova_compute[225855]: 2026-01-20 15:24:06.427 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:24:07 np0005588919 nova_compute[225855]: 2026-01-20 15:24:07.430 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:07.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:08.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:08 np0005588919 nova_compute[225855]: 2026-01-20 15:24:08.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:08 np0005588919 nova_compute[225855]: 2026-01-20 15:24:08.349 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:09 np0005588919 podman[313028]: 2026-01-20 15:24:09.008581746 +0000 UTC m=+0.055075553 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.376 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.377 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:24:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:24:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/960585230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.806 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:24:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:09.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.948 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.949 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4250MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.949 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:24:09 np0005588919 nova_compute[225855]: 2026-01-20 15:24:09.950 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.017 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.018 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.033 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.050 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.051 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.078 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.103 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.121 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:24:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:10.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:10 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:24:10 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/110819138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.567 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.572 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.597 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.617 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:24:10 np0005588919 nova_compute[225855]: 2026-01-20 15:24:10.618 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:24:11 np0005588919 nova_compute[225855]: 2026-01-20 15:24:11.618 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:11 np0005588919 nova_compute[225855]: 2026-01-20 15:24:11.619 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:11.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:12.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:12 np0005588919 nova_compute[225855]: 2026-01-20 15:24:12.282 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922637.2804992, 4c926c1a-d5cf-4865-aa57-66b439d115f8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:24:12 np0005588919 nova_compute[225855]: 2026-01-20 15:24:12.282 225859 INFO nova.compute.manager [-] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:24:12 np0005588919 nova_compute[225855]: 2026-01-20 15:24:12.328 225859 DEBUG nova.compute.manager [None req-fdbb8f95-eee1-4bb6-9e85-e0150eda8ccd - - - - - -] [instance: 4c926c1a-d5cf-4865-aa57-66b439d115f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:24:12 np0005588919 nova_compute[225855]: 2026-01-20 15:24:12.435 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:13 np0005588919 nova_compute[225855]: 2026-01-20 15:24:13.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:13 np0005588919 nova_compute[225855]: 2026-01-20 15:24:13.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:13 np0005588919 nova_compute[225855]: 2026-01-20 15:24:13.352 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:13.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:14.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:14 np0005588919 nova_compute[225855]: 2026-01-20 15:24:14.349 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:24:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:15.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:16.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:24:16.445 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:24:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:24:16.445 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:24:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:24:16.445 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:24:17 np0005588919 nova_compute[225855]: 2026-01-20 15:24:17.439 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:17.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:18.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:18 np0005588919 nova_compute[225855]: 2026-01-20 15:24:18.353 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:19.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:20.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:21.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:22.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:22 np0005588919 nova_compute[225855]: 2026-01-20 15:24:22.494 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:23 np0005588919 nova_compute[225855]: 2026-01-20 15:24:23.355 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:23.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:24.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:25.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:26.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:27 np0005588919 nova_compute[225855]: 2026-01-20 15:24:27.499 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:27.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:28.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:28 np0005588919 nova_compute[225855]: 2026-01-20 15:24:28.357 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:29.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:30 np0005588919 podman[313153]: 2026-01-20 15:24:30.036832188 +0000 UTC m=+0.087874340 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:24:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:30.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:30 np0005588919 nova_compute[225855]: 2026-01-20 15:24:30.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:30 np0005588919 nova_compute[225855]: 2026-01-20 15:24:30.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:24:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:31.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:32.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:32 np0005588919 nova_compute[225855]: 2026-01-20 15:24:32.501 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:33 np0005588919 nova_compute[225855]: 2026-01-20 15:24:33.360 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:33.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:34.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:35.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:36.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:37 np0005588919 nova_compute[225855]: 2026-01-20 15:24:37.505 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:37.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:38.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:38 np0005588919 nova_compute[225855]: 2026-01-20 15:24:38.362 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:39.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:40 np0005588919 podman[313184]: 2026-01-20 15:24:40.012695001 +0000 UTC m=+0.056291929 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 10:24:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:40.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:41.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:42.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:42 np0005588919 nova_compute[225855]: 2026-01-20 15:24:42.352 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:24:42 np0005588919 nova_compute[225855]: 2026-01-20 15:24:42.352 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:24:42 np0005588919 nova_compute[225855]: 2026-01-20 15:24:42.369 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:24:42 np0005588919 nova_compute[225855]: 2026-01-20 15:24:42.508 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:43 np0005588919 nova_compute[225855]: 2026-01-20 15:24:43.364 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:43.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:44.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:46.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:46.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:46 np0005588919 ovn_controller[130490]: 2026-01-20T15:24:46Z|00910|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Jan 20 10:24:47 np0005588919 nova_compute[225855]: 2026-01-20 15:24:47.512 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:48.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:48.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:48 np0005588919 nova_compute[225855]: 2026-01-20 15:24:48.365 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:24:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:50.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:24:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:50.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:24:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:24:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:24:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:52.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:52.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:52 np0005588919 nova_compute[225855]: 2026-01-20 15:24:52.516 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:53 np0005588919 nova_compute[225855]: 2026-01-20 15:24:53.367 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:24:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:54.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:54.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:24:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:56.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:24:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:56.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:57 np0005588919 nova_compute[225855]: 2026-01-20 15:24:57.519 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:24:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:24:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:24:58.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:24:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:24:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:24:58.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:24:58 np0005588919 nova_compute[225855]: 2026-01-20 15:24:58.368 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:24:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:00.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:00.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:01 np0005588919 podman[313446]: 2026-01-20 15:25:01.066573512 +0000 UTC m=+0.116492036 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:25:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:02.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:02.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:02 np0005588919 nova_compute[225855]: 2026-01-20 15:25:02.521 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:03 np0005588919 nova_compute[225855]: 2026-01-20 15:25:03.357 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:03 np0005588919 nova_compute[225855]: 2026-01-20 15:25:03.357 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:25:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:03.390 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:25:03 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:03.391 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:25:03 np0005588919 nova_compute[225855]: 2026-01-20 15:25:03.414 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:04.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:04.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:06.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:06.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:07 np0005588919 nova_compute[225855]: 2026-01-20 15:25:07.525 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:08.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:08.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:08 np0005588919 nova_compute[225855]: 2026-01-20 15:25:08.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:08 np0005588919 nova_compute[225855]: 2026-01-20 15:25:08.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:25:08 np0005588919 nova_compute[225855]: 2026-01-20 15:25:08.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:25:08 np0005588919 nova_compute[225855]: 2026-01-20 15:25:08.416 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.343 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.343 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.343 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.344 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.369 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.369 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.369 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.370 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.370 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:25:09 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4144926388' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.819 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.964 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.966 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4274MB free_disk=20.977684020996094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.966 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:09 np0005588919 nova_compute[225855]: 2026-01-20 15:25:09.966 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:10.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:10.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:10 np0005588919 podman[313550]: 2026-01-20 15:25:10.996379378 +0000 UTC m=+0.045110459 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:25:11 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:11.392 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:25:11 np0005588919 nova_compute[225855]: 2026-01-20 15:25:11.576 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:25:11 np0005588919 nova_compute[225855]: 2026-01-20 15:25:11.576 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:25:11 np0005588919 nova_compute[225855]: 2026-01-20 15:25:11.620 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:25:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:12.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:25:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:25:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1344433681' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:25:12 np0005588919 nova_compute[225855]: 2026-01-20 15:25:12.066 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:12 np0005588919 nova_compute[225855]: 2026-01-20 15:25:12.072 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:25:12 np0005588919 nova_compute[225855]: 2026-01-20 15:25:12.119 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:25:12 np0005588919 nova_compute[225855]: 2026-01-20 15:25:12.121 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:25:12 np0005588919 nova_compute[225855]: 2026-01-20 15:25:12.121 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:12.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:12 np0005588919 nova_compute[225855]: 2026-01-20 15:25:12.528 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:12 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #151. Immutable memtables: 0.
Jan 20 10:25:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:12.992007) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:25:12 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 95] Flushing memtable with next log file: 151
Jan 20 10:25:12 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922712992077, "job": 95, "event": "flush_started", "num_memtables": 1, "num_entries": 1228, "num_deletes": 256, "total_data_size": 2642387, "memory_usage": 2679248, "flush_reason": "Manual Compaction"}
Jan 20 10:25:12 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 95] Level-0 flush table #152: started
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922713007486, "cf_name": "default", "job": 95, "event": "table_file_creation", "file_number": 152, "file_size": 1732615, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 74098, "largest_seqno": 75321, "table_properties": {"data_size": 1727336, "index_size": 2738, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11445, "raw_average_key_size": 19, "raw_value_size": 1716618, "raw_average_value_size": 2919, "num_data_blocks": 122, "num_entries": 588, "num_filter_entries": 588, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922611, "oldest_key_time": 1768922611, "file_creation_time": 1768922712, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 152, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 95] Flush lasted 15564 microseconds, and 4276 cpu microseconds.
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.007571) [db/flush_job.cc:967] [default] [JOB 95] Level-0 flush table #152: 1732615 bytes OK
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.007588) [db/memtable_list.cc:519] [default] Level-0 commit table #152 started
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.010502) [db/memtable_list.cc:722] [default] Level-0 commit table #152: memtable #1 done
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.010555) EVENT_LOG_v1 {"time_micros": 1768922713010544, "job": 95, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.010588) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 95] Try to delete WAL files size 2636499, prev total WAL file size 2653148, number of live WAL files 2.
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000148.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.011589) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032373731' seq:72057594037927935, type:22 .. '6C6F676D0033303233' seq:0, type:0; will stop at (end)
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 96] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 95 Base level 0, inputs: [152(1692KB)], [150(10MB)]
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922713011634, "job": 96, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [152], "files_L6": [150], "score": -1, "input_data_size": 13063060, "oldest_snapshot_seqno": -1}
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 96] Generated table #153: 9733 keys, 12927504 bytes, temperature: kUnknown
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922713160188, "cf_name": "default", "job": 96, "event": "table_file_creation", "file_number": 153, "file_size": 12927504, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12863962, "index_size": 38085, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24389, "raw_key_size": 256733, "raw_average_key_size": 26, "raw_value_size": 12692578, "raw_average_value_size": 1304, "num_data_blocks": 1454, "num_entries": 9733, "num_filter_entries": 9733, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768922713, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 153, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.160782) [db/compaction/compaction_job.cc:1663] [default] [JOB 96] Compacted 1@0 + 1@6 files to L6 => 12927504 bytes
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.163287) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 87.8 rd, 86.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 10.8 +0.0 blob) out(12.3 +0.0 blob), read-write-amplify(15.0) write-amplify(7.5) OK, records in: 10258, records dropped: 525 output_compression: NoCompression
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.163322) EVENT_LOG_v1 {"time_micros": 1768922713163306, "job": 96, "event": "compaction_finished", "compaction_time_micros": 148830, "compaction_time_cpu_micros": 34784, "output_level": 6, "num_output_files": 1, "total_output_size": 12927504, "num_input_records": 10258, "num_output_records": 9733, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000152.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922713164169, "job": 96, "event": "table_file_deletion", "file_number": 152}
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000150.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922713170562, "job": 96, "event": "table_file_deletion", "file_number": 150}
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.011493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.170644) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.170654) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.170659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.170664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:25:13.170668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:25:13 np0005588919 nova_compute[225855]: 2026-01-20 15:25:13.419 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:14.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:14 np0005588919 nova_compute[225855]: 2026-01-20 15:25:14.117 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:14 np0005588919 nova_compute[225855]: 2026-01-20 15:25:14.117 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:14 np0005588919 nova_compute[225855]: 2026-01-20 15:25:14.117 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:14.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:16.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:16.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:16 np0005588919 nova_compute[225855]: 2026-01-20 15:25:16.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:16.445 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:16.446 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:16.446 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:17 np0005588919 nova_compute[225855]: 2026-01-20 15:25:17.577 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:18.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:18.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:18 np0005588919 nova_compute[225855]: 2026-01-20 15:25:18.420 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:20.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:20.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:22.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:22.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:22 np0005588919 nova_compute[225855]: 2026-01-20 15:25:22.579 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:23 np0005588919 nova_compute[225855]: 2026-01-20 15:25:23.422 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:25:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:24.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:25:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:24.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:26.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:26.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:27 np0005588919 nova_compute[225855]: 2026-01-20 15:25:27.582 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:28.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:28.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:28 np0005588919 nova_compute[225855]: 2026-01-20 15:25:28.423 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:30.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:30 np0005588919 nova_compute[225855]: 2026-01-20 15:25:30.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:25:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:30.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:32 np0005588919 podman[313655]: 2026-01-20 15:25:32.01734076 +0000 UTC m=+0.064767920 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 20 10:25:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:25:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:32.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:25:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:32.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:32 np0005588919 nova_compute[225855]: 2026-01-20 15:25:32.583 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:33 np0005588919 nova_compute[225855]: 2026-01-20 15:25:33.424 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:34.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:34.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:36.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:36.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:37 np0005588919 nova_compute[225855]: 2026-01-20 15:25:37.587 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:38.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:38.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:38 np0005588919 nova_compute[225855]: 2026-01-20 15:25:38.427 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:40.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:40.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:42 np0005588919 podman[313686]: 2026-01-20 15:25:42.003632012 +0000 UTC m=+0.045190282 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:25:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:42.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:42.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:42 np0005588919 nova_compute[225855]: 2026-01-20 15:25:42.589 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:43 np0005588919 nova_compute[225855]: 2026-01-20 15:25:43.428 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:44.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:44.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:45 np0005588919 nova_compute[225855]: 2026-01-20 15:25:45.859 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "eb6ef384-2891-42d0-9059-42b89009b14c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:45 np0005588919 nova_compute[225855]: 2026-01-20 15:25:45.860 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:45 np0005588919 nova_compute[225855]: 2026-01-20 15:25:45.874 225859 DEBUG nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:25:45 np0005588919 nova_compute[225855]: 2026-01-20 15:25:45.957 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:45 np0005588919 nova_compute[225855]: 2026-01-20 15:25:45.958 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:45 np0005588919 nova_compute[225855]: 2026-01-20 15:25:45.967 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:25:45 np0005588919 nova_compute[225855]: 2026-01-20 15:25:45.968 225859 INFO nova.compute.claims [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.063 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:46.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:46.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:46 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:25:46 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3360934547' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.512 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.518 225859 DEBUG nova.compute.provider_tree [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.611 225859 DEBUG nova.scheduler.client.report [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.635 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.636 225859 DEBUG nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.680 225859 DEBUG nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.681 225859 DEBUG nova.network.neutron [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.703 225859 INFO nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.723 225859 DEBUG nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.808 225859 DEBUG nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.810 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.811 225859 INFO nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Creating image(s)#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.845 225859 DEBUG nova.storage.rbd_utils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image eb6ef384-2891-42d0-9059-42b89009b14c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.878 225859 DEBUG nova.storage.rbd_utils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image eb6ef384-2891-42d0-9059-42b89009b14c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.904 225859 DEBUG nova.storage.rbd_utils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image eb6ef384-2891-42d0-9059-42b89009b14c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.907 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.985 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.986 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.986 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:46 np0005588919 nova_compute[225855]: 2026-01-20 15:25:46.986 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.010 225859 DEBUG nova.storage.rbd_utils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image eb6ef384-2891-42d0-9059-42b89009b14c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.014 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 eb6ef384-2891-42d0-9059-42b89009b14c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.287 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 eb6ef384-2891-42d0-9059-42b89009b14c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.273s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.321 225859 DEBUG nova.policy [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.359 225859 DEBUG nova.storage.rbd_utils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] resizing rbd image eb6ef384-2891-42d0-9059-42b89009b14c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.463 225859 DEBUG nova.objects.instance [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'migration_context' on Instance uuid eb6ef384-2891-42d0-9059-42b89009b14c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.495 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.495 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Ensure instance console log exists: /var/lib/nova/instances/eb6ef384-2891-42d0-9059-42b89009b14c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.496 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.496 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.496 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:47 np0005588919 nova_compute[225855]: 2026-01-20 15:25:47.592 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:48.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:48.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:48 np0005588919 nova_compute[225855]: 2026-01-20 15:25:48.471 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:48 np0005588919 nova_compute[225855]: 2026-01-20 15:25:48.611 225859 DEBUG nova.network.neutron [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Successfully created port: 423d10be-bf78-43ff-8ae2-812d375ccef8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:25:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:25:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:50.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:25:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:50.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:50 np0005588919 nova_compute[225855]: 2026-01-20 15:25:50.356 225859 DEBUG nova.network.neutron [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Successfully updated port: 423d10be-bf78-43ff-8ae2-812d375ccef8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:25:50 np0005588919 nova_compute[225855]: 2026-01-20 15:25:50.382 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:25:50 np0005588919 nova_compute[225855]: 2026-01-20 15:25:50.383 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:25:50 np0005588919 nova_compute[225855]: 2026-01-20 15:25:50.383 225859 DEBUG nova.network.neutron [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:25:50 np0005588919 nova_compute[225855]: 2026-01-20 15:25:50.489 225859 DEBUG nova.compute.manager [req-9c585c66-3c7a-41a4-b473-d96ab5dfb75f req-7d510cfd-28ba-4e9a-bdd8-45782f43353c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received event network-changed-423d10be-bf78-43ff-8ae2-812d375ccef8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:25:50 np0005588919 nova_compute[225855]: 2026-01-20 15:25:50.489 225859 DEBUG nova.compute.manager [req-9c585c66-3c7a-41a4-b473-d96ab5dfb75f req-7d510cfd-28ba-4e9a-bdd8-45782f43353c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Refreshing instance network info cache due to event network-changed-423d10be-bf78-43ff-8ae2-812d375ccef8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:25:50 np0005588919 nova_compute[225855]: 2026-01-20 15:25:50.490 225859 DEBUG oslo_concurrency.lockutils [req-9c585c66-3c7a-41a4-b473-d96ab5dfb75f req-7d510cfd-28ba-4e9a-bdd8-45782f43353c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:25:50 np0005588919 nova_compute[225855]: 2026-01-20 15:25:50.584 225859 DEBUG nova.network.neutron [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:25:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:52.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.231 225859 DEBUG nova.network.neutron [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updating instance_info_cache with network_info: [{"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.262 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.263 225859 DEBUG nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Instance network_info: |[{"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.263 225859 DEBUG oslo_concurrency.lockutils [req-9c585c66-3c7a-41a4-b473-d96ab5dfb75f req-7d510cfd-28ba-4e9a-bdd8-45782f43353c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.263 225859 DEBUG nova.network.neutron [req-9c585c66-3c7a-41a4-b473-d96ab5dfb75f req-7d510cfd-28ba-4e9a-bdd8-45782f43353c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Refreshing network info cache for port 423d10be-bf78-43ff-8ae2-812d375ccef8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.266 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Start _get_guest_xml network_info=[{"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.272 225859 WARNING nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.278 225859 DEBUG nova.virt.libvirt.host [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.278 225859 DEBUG nova.virt.libvirt.host [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.281 225859 DEBUG nova.virt.libvirt.host [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.281 225859 DEBUG nova.virt.libvirt.host [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.283 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.283 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.284 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.284 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.284 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.284 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.284 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.285 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.285 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.285 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.285 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.286 225859 DEBUG nova.virt.hardware [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.289 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:52.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.596 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:25:52 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3613554405' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.750 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.775 225859 DEBUG nova.storage.rbd_utils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image eb6ef384-2891-42d0-9059-42b89009b14c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:25:52 np0005588919 nova_compute[225855]: 2026-01-20 15:25:52.779 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:25:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4237637953' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.271 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.274 225859 DEBUG nova.virt.libvirt.vif [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1635932386',display_name='tempest-TestNetworkBasicOps-server-1635932386',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1635932386',id=202,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOxLu/uVu4uuXkZBckB0Jue8mA2XpnPI63IpB2BGooiySZuLgddUiCiwQ3/YqBeUzNGbEuiI4/oWiiYa4zQrQHAa9idheznhVw0kdlFQsBm1hL1vB4bH09utur5br8iaiQ==',key_name='tempest-TestNetworkBasicOps-305263708',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-1wrkal35',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:25:46Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=eb6ef384-2891-42d0-9059-42b89009b14c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.275 225859 DEBUG nova.network.os_vif_util [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.276 225859 DEBUG nova.network.os_vif_util [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:58:d6,bridge_name='br-int',has_traffic_filtering=True,id=423d10be-bf78-43ff-8ae2-812d375ccef8,network=Network(3b3aa186-38a4-4cc6-9399-f535503e9791),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap423d10be-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.278 225859 DEBUG nova.objects.instance [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_devices' on Instance uuid eb6ef384-2891-42d0-9059-42b89009b14c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.295 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <uuid>eb6ef384-2891-42d0-9059-42b89009b14c</uuid>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <name>instance-000000ca</name>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkBasicOps-server-1635932386</nova:name>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:25:52</nova:creationTime>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <nova:port uuid="423d10be-bf78-43ff-8ae2-812d375ccef8">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <entry name="serial">eb6ef384-2891-42d0-9059-42b89009b14c</entry>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <entry name="uuid">eb6ef384-2891-42d0-9059-42b89009b14c</entry>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/eb6ef384-2891-42d0-9059-42b89009b14c_disk">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/eb6ef384-2891-42d0-9059-42b89009b14c_disk.config">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:56:58:d6"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <target dev="tap423d10be-bf"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/eb6ef384-2891-42d0-9059-42b89009b14c/console.log" append="off"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:25:53 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:25:53 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:25:53 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:25:53 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.297 225859 DEBUG nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Preparing to wait for external event network-vif-plugged-423d10be-bf78-43ff-8ae2-812d375ccef8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.297 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.297 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.298 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.298 225859 DEBUG nova.virt.libvirt.vif [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1635932386',display_name='tempest-TestNetworkBasicOps-server-1635932386',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1635932386',id=202,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOxLu/uVu4uuXkZBckB0Jue8mA2XpnPI63IpB2BGooiySZuLgddUiCiwQ3/YqBeUzNGbEuiI4/oWiiYa4zQrQHAa9idheznhVw0kdlFQsBm1hL1vB4bH09utur5br8iaiQ==',key_name='tempest-TestNetworkBasicOps-305263708',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-1wrkal35',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:25:46Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=eb6ef384-2891-42d0-9059-42b89009b14c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.299 225859 DEBUG nova.network.os_vif_util [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.300 225859 DEBUG nova.network.os_vif_util [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:58:d6,bridge_name='br-int',has_traffic_filtering=True,id=423d10be-bf78-43ff-8ae2-812d375ccef8,network=Network(3b3aa186-38a4-4cc6-9399-f535503e9791),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap423d10be-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.300 225859 DEBUG os_vif [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:58:d6,bridge_name='br-int',has_traffic_filtering=True,id=423d10be-bf78-43ff-8ae2-812d375ccef8,network=Network(3b3aa186-38a4-4cc6-9399-f535503e9791),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap423d10be-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.301 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.301 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.302 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.306 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.306 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap423d10be-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.307 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap423d10be-bf, col_values=(('external_ids', {'iface-id': '423d10be-bf78-43ff-8ae2-812d375ccef8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:58:d6', 'vm-uuid': 'eb6ef384-2891-42d0-9059-42b89009b14c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.308 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:53 np0005588919 NetworkManager[49104]: <info>  [1768922753.3099] manager: (tap423d10be-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/382)
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.310 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.315 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.316 225859 INFO os_vif [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:58:d6,bridge_name='br-int',has_traffic_filtering=True,id=423d10be-bf78-43ff-8ae2-812d375ccef8,network=Network(3b3aa186-38a4-4cc6-9399-f535503e9791),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap423d10be-bf')#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.366 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.366 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.367 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:56:58:d6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.367 225859 INFO nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Using config drive#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.391 225859 DEBUG nova.storage.rbd_utils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image eb6ef384-2891-42d0-9059-42b89009b14c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.473 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.699 225859 INFO nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Creating config drive at /var/lib/nova/instances/eb6ef384-2891-42d0-9059-42b89009b14c/disk.config#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.704 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eb6ef384-2891-42d0-9059-42b89009b14c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5mk0q491 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.779 225859 DEBUG nova.network.neutron [req-9c585c66-3c7a-41a4-b473-d96ab5dfb75f req-7d510cfd-28ba-4e9a-bdd8-45782f43353c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updated VIF entry in instance network info cache for port 423d10be-bf78-43ff-8ae2-812d375ccef8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.780 225859 DEBUG nova.network.neutron [req-9c585c66-3c7a-41a4-b473-d96ab5dfb75f req-7d510cfd-28ba-4e9a-bdd8-45782f43353c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updating instance_info_cache with network_info: [{"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.795 225859 DEBUG oslo_concurrency.lockutils [req-9c585c66-3c7a-41a4-b473-d96ab5dfb75f req-7d510cfd-28ba-4e9a-bdd8-45782f43353c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.848 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eb6ef384-2891-42d0-9059-42b89009b14c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5mk0q491" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.877 225859 DEBUG nova.storage.rbd_utils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image eb6ef384-2891-42d0-9059-42b89009b14c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:25:53 np0005588919 nova_compute[225855]: 2026-01-20 15:25:53.881 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/eb6ef384-2891-42d0-9059-42b89009b14c/disk.config eb6ef384-2891-42d0-9059-42b89009b14c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.062 225859 DEBUG oslo_concurrency.processutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/eb6ef384-2891-42d0-9059-42b89009b14c/disk.config eb6ef384-2891-42d0-9059-42b89009b14c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.063 225859 INFO nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Deleting local config drive /var/lib/nova/instances/eb6ef384-2891-42d0-9059-42b89009b14c/disk.config because it was imported into RBD.#033[00m
Jan 20 10:25:54 np0005588919 NetworkManager[49104]: <info>  [1768922754.1126] manager: (tap423d10be-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/383)
Jan 20 10:25:54 np0005588919 kernel: tap423d10be-bf: entered promiscuous mode
Jan 20 10:25:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:54.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.114 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:54 np0005588919 ovn_controller[130490]: 2026-01-20T15:25:54Z|00911|binding|INFO|Claiming lport 423d10be-bf78-43ff-8ae2-812d375ccef8 for this chassis.
Jan 20 10:25:54 np0005588919 ovn_controller[130490]: 2026-01-20T15:25:54Z|00912|binding|INFO|423d10be-bf78-43ff-8ae2-812d375ccef8: Claiming fa:16:3e:56:58:d6 10.100.0.6
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.120 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.134 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:58:d6 10.100.0.6'], port_security=['fa:16:3e:56:58:d6 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eb6ef384-2891-42d0-9059-42b89009b14c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b3aa186-38a4-4cc6-9399-f535503e9791', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '046cc664-f8d9-4379-b46e-95218c363faa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbfdde0f-6f5b-476d-8d21-7557a37ecad6, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=423d10be-bf78-43ff-8ae2-812d375ccef8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.135 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 423d10be-bf78-43ff-8ae2-812d375ccef8 in datapath 3b3aa186-38a4-4cc6-9399-f535503e9791 bound to our chassis#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.136 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3b3aa186-38a4-4cc6-9399-f535503e9791#033[00m
Jan 20 10:25:54 np0005588919 systemd-machined[194361]: New machine qemu-107-instance-000000ca.
Jan 20 10:25:54 np0005588919 systemd-udevd[314084]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.149 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[14decc33-432c-4697-9001-900a7950ec61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.150 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3b3aa186-31 in ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.152 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3b3aa186-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.152 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[18893a02-d7fe-4f2d-a1f9-53c5b99babf9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.153 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[34794a76-7224-4979-b49e-c2fcc7fd7235]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 NetworkManager[49104]: <info>  [1768922754.1648] device (tap423d10be-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:25:54 np0005588919 NetworkManager[49104]: <info>  [1768922754.1654] device (tap423d10be-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.165 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[b9af6e6c-873b-4ea8-b5be-22e06abb9940]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.177 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:54 np0005588919 systemd[1]: Started Virtual Machine qemu-107-instance-000000ca.
Jan 20 10:25:54 np0005588919 ovn_controller[130490]: 2026-01-20T15:25:54Z|00913|binding|INFO|Setting lport 423d10be-bf78-43ff-8ae2-812d375ccef8 ovn-installed in OVS
Jan 20 10:25:54 np0005588919 ovn_controller[130490]: 2026-01-20T15:25:54Z|00914|binding|INFO|Setting lport 423d10be-bf78-43ff-8ae2-812d375ccef8 up in Southbound
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.185 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.188 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[40187f24-2d1b-44fd-ad30-be25da512eaf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.215 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[db80be7c-b7d4-4990-9014-e67db0136e67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.220 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb4839f-3ddd-45f9-be87-bf2748c80a3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 NetworkManager[49104]: <info>  [1768922754.2211] manager: (tap3b3aa186-30): new Veth device (/org/freedesktop/NetworkManager/Devices/384)
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.252 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[98e5a7fa-2183-45aa-93cf-11cd1df6a4ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.254 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae56565-5fc7-4add-a5ad-b362895a3415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 NetworkManager[49104]: <info>  [1768922754.2760] device (tap3b3aa186-30): carrier: link connected
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.282 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4e2ba4-5f7a-4c71-97ae-17c02b88e442]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.299 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[36ab1ed8-96a2-4c41-841e-60275277f987]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3b3aa186-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:01:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 260], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 779320, 'reachable_time': 26479, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314116, 'error': None, 'target': 'ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.313 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ec2ef450-82ef-42e9-8cdf-eb74f94ddd11]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:19f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 779320, 'tstamp': 779320}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 314117, 'error': None, 'target': 'ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.327 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e9889c67-73ed-477e-a3e1-5eaf9eeaa47c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3b3aa186-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:01:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 260], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 779320, 'reachable_time': 26479, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 314118, 'error': None, 'target': 'ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.356 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[800826ce-ef11-481c-8de1-b4349fe53480]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:25:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:54.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.411 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2e433b82-293d-4d35-bbe0-f6c27c4176a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.412 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b3aa186-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.413 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:25:54 np0005588919 kernel: tap3b3aa186-30: entered promiscuous mode
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.413 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b3aa186-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:25:54 np0005588919 NetworkManager[49104]: <info>  [1768922754.4167] manager: (tap3b3aa186-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/385)
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.417 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3b3aa186-30, col_values=(('external_ids', {'iface-id': 'f92b8596-5a2a-495f-b715-08c8aa7181a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:25:54 np0005588919 ovn_controller[130490]: 2026-01-20T15:25:54Z|00915|binding|INFO|Releasing lport f92b8596-5a2a-495f-b715-08c8aa7181a3 from this chassis (sb_readonly=0)
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.419 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3b3aa186-38a4-4cc6-9399-f535503e9791.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3b3aa186-38a4-4cc6-9399-f535503e9791.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.420 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7f490e6e-1fcc-4535-a207-a05020e25e19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.420 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-3b3aa186-38a4-4cc6-9399-f535503e9791
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/3b3aa186-38a4-4cc6-9399-f535503e9791.pid.haproxy
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 3b3aa186-38a4-4cc6-9399-f535503e9791
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:25:54 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:25:54.421 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791', 'env', 'PROCESS_TAG=haproxy-3b3aa186-38a4-4cc6-9399-f535503e9791', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3b3aa186-38a4-4cc6-9399-f535503e9791.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.431 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.724 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922754.7236478, eb6ef384-2891-42d0-9059-42b89009b14c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.724 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] VM Started (Lifecycle Event)#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.756 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.760 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922754.7238882, eb6ef384-2891-42d0-9059-42b89009b14c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.760 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.777 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.780 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:25:54 np0005588919 nova_compute[225855]: 2026-01-20 15:25:54.796 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:25:54 np0005588919 podman[314191]: 2026-01-20 15:25:54.832127981 +0000 UTC m=+0.044839922 container create c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:25:54 np0005588919 systemd[1]: Started libpod-conmon-c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1.scope.
Jan 20 10:25:54 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:25:54 np0005588919 podman[314191]: 2026-01-20 15:25:54.80865495 +0000 UTC m=+0.021366891 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:25:54 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42f6bc1236f352acdb0936adef4a9dad7a0a958fe3cb77609aa75d071a1c9c47/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:25:54 np0005588919 podman[314191]: 2026-01-20 15:25:54.915088999 +0000 UTC m=+0.127800950 container init c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:25:54 np0005588919 podman[314191]: 2026-01-20 15:25:54.920442022 +0000 UTC m=+0.133153963 container start c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 10:25:54 np0005588919 neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791[314207]: [NOTICE]   (314211) : New worker (314213) forked
Jan 20 10:25:54 np0005588919 neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791[314207]: [NOTICE]   (314211) : Loading success.
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.394 225859 DEBUG nova.compute.manager [req-3737f890-f120-404f-add9-c00713588786 req-1f8d0ccc-c8c4-472f-9b09-a8a9f181c1d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received event network-vif-plugged-423d10be-bf78-43ff-8ae2-812d375ccef8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.395 225859 DEBUG oslo_concurrency.lockutils [req-3737f890-f120-404f-add9-c00713588786 req-1f8d0ccc-c8c4-472f-9b09-a8a9f181c1d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.395 225859 DEBUG oslo_concurrency.lockutils [req-3737f890-f120-404f-add9-c00713588786 req-1f8d0ccc-c8c4-472f-9b09-a8a9f181c1d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.396 225859 DEBUG oslo_concurrency.lockutils [req-3737f890-f120-404f-add9-c00713588786 req-1f8d0ccc-c8c4-472f-9b09-a8a9f181c1d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.396 225859 DEBUG nova.compute.manager [req-3737f890-f120-404f-add9-c00713588786 req-1f8d0ccc-c8c4-472f-9b09-a8a9f181c1d7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Processing event network-vif-plugged-423d10be-bf78-43ff-8ae2-812d375ccef8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.397 225859 DEBUG nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.400 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922755.4000478, eb6ef384-2891-42d0-9059-42b89009b14c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.400 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.401 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.405 225859 INFO nova.virt.libvirt.driver [-] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Instance spawned successfully.#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.405 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.419 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.424 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.427 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.427 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.428 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.428 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.429 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.429 225859 DEBUG nova.virt.libvirt.driver [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.451 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.492 225859 INFO nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Took 8.68 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.492 225859 DEBUG nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.563 225859 INFO nova.compute.manager [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Took 9.65 seconds to build instance.#033[00m
Jan 20 10:25:55 np0005588919 nova_compute[225855]: 2026-01-20 15:25:55.583 225859 DEBUG oslo_concurrency.lockutils [None req-431cd767-f2b3-4d69-b7d3-f2145bbc2389 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:56.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:56.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:57 np0005588919 nova_compute[225855]: 2026-01-20 15:25:57.499 225859 DEBUG nova.compute.manager [req-13dcadbb-f48e-4007-a694-fb33422f2209 req-922e1ce6-2666-4db9-901e-ce6c8b7cb1d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received event network-vif-plugged-423d10be-bf78-43ff-8ae2-812d375ccef8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:25:57 np0005588919 nova_compute[225855]: 2026-01-20 15:25:57.501 225859 DEBUG oslo_concurrency.lockutils [req-13dcadbb-f48e-4007-a694-fb33422f2209 req-922e1ce6-2666-4db9-901e-ce6c8b7cb1d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:25:57 np0005588919 nova_compute[225855]: 2026-01-20 15:25:57.502 225859 DEBUG oslo_concurrency.lockutils [req-13dcadbb-f48e-4007-a694-fb33422f2209 req-922e1ce6-2666-4db9-901e-ce6c8b7cb1d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:25:57 np0005588919 nova_compute[225855]: 2026-01-20 15:25:57.503 225859 DEBUG oslo_concurrency.lockutils [req-13dcadbb-f48e-4007-a694-fb33422f2209 req-922e1ce6-2666-4db9-901e-ce6c8b7cb1d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:25:57 np0005588919 nova_compute[225855]: 2026-01-20 15:25:57.503 225859 DEBUG nova.compute.manager [req-13dcadbb-f48e-4007-a694-fb33422f2209 req-922e1ce6-2666-4db9-901e-ce6c8b7cb1d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] No waiting events found dispatching network-vif-plugged-423d10be-bf78-43ff-8ae2-812d375ccef8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:25:57 np0005588919 nova_compute[225855]: 2026-01-20 15:25:57.503 225859 WARNING nova.compute.manager [req-13dcadbb-f48e-4007-a694-fb33422f2209 req-922e1ce6-2666-4db9-901e-ce6c8b7cb1d2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received unexpected event network-vif-plugged-423d10be-bf78-43ff-8ae2-812d375ccef8 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:25:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:25:58.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:58 np0005588919 nova_compute[225855]: 2026-01-20 15:25:58.314 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:25:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:25:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:25:58.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:25:58 np0005588919 nova_compute[225855]: 2026-01-20 15:25:58.476 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:25:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:25:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:25:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:25:59 np0005588919 ovn_controller[130490]: 2026-01-20T15:25:59Z|00916|binding|INFO|Releasing lport f92b8596-5a2a-495f-b715-08c8aa7181a3 from this chassis (sb_readonly=0)
Jan 20 10:25:59 np0005588919 NetworkManager[49104]: <info>  [1768922759.5439] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/386)
Jan 20 10:25:59 np0005588919 nova_compute[225855]: 2026-01-20 15:25:59.544 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:59 np0005588919 NetworkManager[49104]: <info>  [1768922759.5447] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/387)
Jan 20 10:25:59 np0005588919 ovn_controller[130490]: 2026-01-20T15:25:59Z|00917|binding|INFO|Releasing lport f92b8596-5a2a-495f-b715-08c8aa7181a3 from this chassis (sb_readonly=0)
Jan 20 10:25:59 np0005588919 nova_compute[225855]: 2026-01-20 15:25:59.576 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:59 np0005588919 nova_compute[225855]: 2026-01-20 15:25:59.581 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:25:59 np0005588919 nova_compute[225855]: 2026-01-20 15:25:59.946 225859 DEBUG nova.compute.manager [req-3f03e048-0bb2-4dae-969e-a466a179cce7 req-8308b5cb-762c-4cf1-a507-a7c0659e9adc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received event network-changed-423d10be-bf78-43ff-8ae2-812d375ccef8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:25:59 np0005588919 nova_compute[225855]: 2026-01-20 15:25:59.947 225859 DEBUG nova.compute.manager [req-3f03e048-0bb2-4dae-969e-a466a179cce7 req-8308b5cb-762c-4cf1-a507-a7c0659e9adc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Refreshing instance network info cache due to event network-changed-423d10be-bf78-43ff-8ae2-812d375ccef8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:25:59 np0005588919 nova_compute[225855]: 2026-01-20 15:25:59.947 225859 DEBUG oslo_concurrency.lockutils [req-3f03e048-0bb2-4dae-969e-a466a179cce7 req-8308b5cb-762c-4cf1-a507-a7c0659e9adc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:25:59 np0005588919 nova_compute[225855]: 2026-01-20 15:25:59.947 225859 DEBUG oslo_concurrency.lockutils [req-3f03e048-0bb2-4dae-969e-a466a179cce7 req-8308b5cb-762c-4cf1-a507-a7c0659e9adc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:25:59 np0005588919 nova_compute[225855]: 2026-01-20 15:25:59.947 225859 DEBUG nova.network.neutron [req-3f03e048-0bb2-4dae-969e-a466a179cce7 req-8308b5cb-762c-4cf1-a507-a7c0659e9adc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Refreshing network info cache for port 423d10be-bf78-43ff-8ae2-812d375ccef8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:26:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:00.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:00.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:01 np0005588919 nova_compute[225855]: 2026-01-20 15:26:01.223 225859 DEBUG nova.network.neutron [req-3f03e048-0bb2-4dae-969e-a466a179cce7 req-8308b5cb-762c-4cf1-a507-a7c0659e9adc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updated VIF entry in instance network info cache for port 423d10be-bf78-43ff-8ae2-812d375ccef8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:26:01 np0005588919 nova_compute[225855]: 2026-01-20 15:26:01.224 225859 DEBUG nova.network.neutron [req-3f03e048-0bb2-4dae-969e-a466a179cce7 req-8308b5cb-762c-4cf1-a507-a7c0659e9adc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updating instance_info_cache with network_info: [{"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:26:01 np0005588919 nova_compute[225855]: 2026-01-20 15:26:01.271 225859 DEBUG oslo_concurrency.lockutils [req-3f03e048-0bb2-4dae-969e-a466a179cce7 req-8308b5cb-762c-4cf1-a507-a7c0659e9adc 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:26:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:02.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:02.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:03 np0005588919 podman[314358]: 2026-01-20 15:26:03.070763264 +0000 UTC m=+0.111240497 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 20 10:26:03 np0005588919 nova_compute[225855]: 2026-01-20 15:26:03.316 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:03 np0005588919 nova_compute[225855]: 2026-01-20 15:26:03.476 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:04.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:04.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:26:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:26:05 np0005588919 nova_compute[225855]: 2026-01-20 15:26:05.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:05 np0005588919 nova_compute[225855]: 2026-01-20 15:26:05.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:26:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:06.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:06.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:07 np0005588919 ovn_controller[130490]: 2026-01-20T15:26:07Z|00110|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:56:58:d6 10.100.0.6
Jan 20 10:26:07 np0005588919 ovn_controller[130490]: 2026-01-20T15:26:07Z|00111|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:56:58:d6 10.100.0.6
Jan 20 10:26:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:08.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:08 np0005588919 nova_compute[225855]: 2026-01-20 15:26:08.318 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:08.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:08 np0005588919 nova_compute[225855]: 2026-01-20 15:26:08.478 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:09 np0005588919 nova_compute[225855]: 2026-01-20 15:26:09.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:09 np0005588919 nova_compute[225855]: 2026-01-20 15:26:09.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:26:09 np0005588919 nova_compute[225855]: 2026-01-20 15:26:09.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:26:10 np0005588919 nova_compute[225855]: 2026-01-20 15:26:10.091 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:26:10 np0005588919 nova_compute[225855]: 2026-01-20 15:26:10.091 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:26:10 np0005588919 nova_compute[225855]: 2026-01-20 15:26:10.092 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:26:10 np0005588919 nova_compute[225855]: 2026-01-20 15:26:10.092 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid eb6ef384-2891-42d0-9059-42b89009b14c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:26:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:10.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:10.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.892 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updating instance_info_cache with network_info: [{"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.909 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.910 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.910 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.910 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.911 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.911 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.930 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.930 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.931 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.931 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:26:11 np0005588919 nova_compute[225855]: 2026-01-20 15:26:11.931 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:12.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:26:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1555682087' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:26:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:12.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.384 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.461 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000ca as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.462 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000ca as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:26:12 np0005588919 podman[314510]: 2026-01-20 15:26:12.537836701 +0000 UTC m=+0.093130130 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.600 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.601 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4062MB free_disk=20.95288848876953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.602 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.602 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.671 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance eb6ef384-2891-42d0-9059-42b89009b14c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.671 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.672 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:26:12 np0005588919 nova_compute[225855]: 2026-01-20 15:26:12.731 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:26:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:26:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1246091011' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:26:13 np0005588919 nova_compute[225855]: 2026-01-20 15:26:13.156 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:26:13 np0005588919 nova_compute[225855]: 2026-01-20 15:26:13.162 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:26:13 np0005588919 nova_compute[225855]: 2026-01-20 15:26:13.184 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:26:13 np0005588919 nova_compute[225855]: 2026-01-20 15:26:13.209 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:26:13 np0005588919 nova_compute[225855]: 2026-01-20 15:26:13.210 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:13 np0005588919 nova_compute[225855]: 2026-01-20 15:26:13.320 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:13 np0005588919 nova_compute[225855]: 2026-01-20 15:26:13.480 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:13 np0005588919 nova_compute[225855]: 2026-01-20 15:26:13.640 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:14.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:14 np0005588919 nova_compute[225855]: 2026-01-20 15:26:14.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:26:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:14.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:26:15 np0005588919 nova_compute[225855]: 2026-01-20 15:26:15.176 225859 INFO nova.compute.manager [None req-67609110-5db5-4752-b445-88e91749cfc7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Get console output#033[00m
Jan 20 10:26:15 np0005588919 nova_compute[225855]: 2026-01-20 15:26:15.182 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:26:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:16.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:16.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:26:16.446 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:26:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:26:16.447 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:26:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:26:16.447 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:26:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:18.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:26:18.275 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:26:18 np0005588919 nova_compute[225855]: 2026-01-20 15:26:18.276 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:26:18.277 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:26:18 np0005588919 nova_compute[225855]: 2026-01-20 15:26:18.322 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:18 np0005588919 nova_compute[225855]: 2026-01-20 15:26:18.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:26:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:18.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:18 np0005588919 nova_compute[225855]: 2026-01-20 15:26:18.483 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:20.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:20.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:22.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:22.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:23 np0005588919 nova_compute[225855]: 2026-01-20 15:26:23.324 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:23 np0005588919 nova_compute[225855]: 2026-01-20 15:26:23.485 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:24.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:24 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:26:24.278 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:26:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:24.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:26.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:26.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:28.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:28 np0005588919 nova_compute[225855]: 2026-01-20 15:26:28.326 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:28.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:28 np0005588919 nova_compute[225855]: 2026-01-20 15:26:28.487 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:30.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:30.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:32.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:32.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:33 np0005588919 nova_compute[225855]: 2026-01-20 15:26:33.328 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:33 np0005588919 nova_compute[225855]: 2026-01-20 15:26:33.490 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:34 np0005588919 podman[314612]: 2026-01-20 15:26:34.022676279 +0000 UTC m=+0.069516426 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 20 10:26:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:34.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:26:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:34.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:26:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:36.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:36.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:38.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:38 np0005588919 nova_compute[225855]: 2026-01-20 15:26:38.331 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:38.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:38 np0005588919 nova_compute[225855]: 2026-01-20 15:26:38.491 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:40.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:40.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:42.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:42.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:42 np0005588919 podman[314645]: 2026-01-20 15:26:42.997716586 +0000 UTC m=+0.049629628 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:26:43 np0005588919 nova_compute[225855]: 2026-01-20 15:26:43.333 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:43 np0005588919 nova_compute[225855]: 2026-01-20 15:26:43.494 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:26:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:44.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:26:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:26:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:44.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:26:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:26:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:46.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:26:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:46.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:47 np0005588919 ovn_controller[130490]: 2026-01-20T15:26:47Z|00918|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Jan 20 10:26:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:26:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:48.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:26:48 np0005588919 nova_compute[225855]: 2026-01-20 15:26:48.334 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:48.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:48 np0005588919 nova_compute[225855]: 2026-01-20 15:26:48.496 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:50.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:50.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:26:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:52.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:26:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:52.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:53 np0005588919 nova_compute[225855]: 2026-01-20 15:26:53.334 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:53 np0005588919 nova_compute[225855]: 2026-01-20 15:26:53.497 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:54.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:54.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:56.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:56.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:26:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:26:58.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:26:58 np0005588919 nova_compute[225855]: 2026-01-20 15:26:58.337 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:26:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:26:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:26:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:26:58.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:26:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:26:58 np0005588919 nova_compute[225855]: 2026-01-20 15:26:58.499 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:00.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:00.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:02.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:02.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:03 np0005588919 nova_compute[225855]: 2026-01-20 15:27:03.338 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:03 np0005588919 nova_compute[225855]: 2026-01-20 15:27:03.501 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:04 np0005588919 podman[314825]: 2026-01-20 15:27:04.127143056 +0000 UTC m=+0.072959984 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:27:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:04.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:04.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:05 np0005588919 nova_compute[225855]: 2026-01-20 15:27:05.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:05 np0005588919 nova_compute[225855]: 2026-01-20 15:27:05.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:27:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:27:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:27:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:27:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:06.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:06.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:07 np0005588919 nova_compute[225855]: 2026-01-20 15:27:07.829 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:08.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:08 np0005588919 nova_compute[225855]: 2026-01-20 15:27:08.340 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:08.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:08 np0005588919 nova_compute[225855]: 2026-01-20 15:27:08.503 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:27:09 np0005588919 nova_compute[225855]: 2026-01-20 15:27:09.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:27:09 np0005588919 nova_compute[225855]: 2026-01-20 15:27:09.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:27:09 np0005588919 nova_compute[225855]: 2026-01-20 15:27:09.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:27:09 np0005588919 nova_compute[225855]: 2026-01-20 15:27:09.638 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:27:09 np0005588919 nova_compute[225855]: 2026-01-20 15:27:09.639 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:27:09 np0005588919 nova_compute[225855]: 2026-01-20 15:27:09.639 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 20 10:27:09 np0005588919 nova_compute[225855]: 2026-01-20 15:27:09.639 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid eb6ef384-2891-42d0-9059-42b89009b14c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:27:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:10.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:10.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:27:10 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:27:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:12.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:12.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #154. Immutable memtables: 0.
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:12.906455) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 97] Flushing memtable with next log file: 154
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922832906550, "job": 97, "event": "flush_started", "num_memtables": 1, "num_entries": 1383, "num_deletes": 251, "total_data_size": 3165251, "memory_usage": 3223488, "flush_reason": "Manual Compaction"}
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 97] Level-0 flush table #155: started
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922832921506, "cf_name": "default", "job": 97, "event": "table_file_creation", "file_number": 155, "file_size": 2078124, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 75326, "largest_seqno": 76704, "table_properties": {"data_size": 2072208, "index_size": 3246, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12612, "raw_average_key_size": 19, "raw_value_size": 2060353, "raw_average_value_size": 3254, "num_data_blocks": 145, "num_entries": 633, "num_filter_entries": 633, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922713, "oldest_key_time": 1768922713, "file_creation_time": 1768922832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 155, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 97] Flush lasted 15094 microseconds, and 5423 cpu microseconds.
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:12.921545) [db/flush_job.cc:967] [default] [JOB 97] Level-0 flush table #155: 2078124 bytes OK
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:12.921564) [db/memtable_list.cc:519] [default] Level-0 commit table #155 started
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:12.923371) [db/memtable_list.cc:722] [default] Level-0 commit table #155: memtable #1 done
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:12.923383) EVENT_LOG_v1 {"time_micros": 1768922832923379, "job": 97, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:12.923402) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 97] Try to delete WAL files size 3158775, prev total WAL file size 3158775, number of live WAL files 2.
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000151.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:12.924053) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036353236' seq:72057594037927935, type:22 .. '7061786F730036373738' seq:0, type:0; will stop at (end)
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 98] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 97 Base level 0, inputs: [155(2029KB)], [153(12MB)]
Jan 20 10:27:12 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922832924076, "job": 98, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [155], "files_L6": [153], "score": -1, "input_data_size": 15005628, "oldest_snapshot_seqno": -1}
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 98] Generated table #156: 9851 keys, 13117279 bytes, temperature: kUnknown
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922833067517, "cf_name": "default", "job": 98, "event": "table_file_creation", "file_number": 156, "file_size": 13117279, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13052724, "index_size": 38842, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24645, "raw_key_size": 259883, "raw_average_key_size": 26, "raw_value_size": 12879052, "raw_average_value_size": 1307, "num_data_blocks": 1480, "num_entries": 9851, "num_filter_entries": 9851, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768922832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 156, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:13.067942) [db/compaction/compaction_job.cc:1663] [default] [JOB 98] Compacted 1@0 + 1@6 files to L6 => 13117279 bytes
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:13.070287) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 104.5 rd, 91.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 12.3 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(13.5) write-amplify(6.3) OK, records in: 10366, records dropped: 515 output_compression: NoCompression
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:13.070307) EVENT_LOG_v1 {"time_micros": 1768922833070299, "job": 98, "event": "compaction_finished", "compaction_time_micros": 143581, "compaction_time_cpu_micros": 31236, "output_level": 6, "num_output_files": 1, "total_output_size": 13117279, "num_input_records": 10366, "num_output_records": 9851, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000155.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922833071030, "job": 98, "event": "table_file_deletion", "file_number": 155}
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000153.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922833073098, "job": 98, "event": "table_file_deletion", "file_number": 153}
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:12.924014) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:13.073190) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:13.073197) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:13.073199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:13.073201) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:27:13.073204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.114 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updating instance_info_cache with network_info: [{"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.143 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.143 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.144 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.144 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.144 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.172 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.172 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.173 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.173 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.173 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.342 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.504 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:27:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2380053491' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.647 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.711 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000ca as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.711 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000ca as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.858 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.859 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4075MB free_disk=20.91363525390625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.860 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:27:13 np0005588919 nova_compute[225855]: 2026-01-20 15:27:13.860 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:27:14 np0005588919 podman[315011]: 2026-01-20 15:27:14.00675643 +0000 UTC m=+0.048463764 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.037 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance eb6ef384-2891-42d0-9059-42b89009b14c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.038 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.038 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 20 10:27:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:14.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.253 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:27:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:14.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:27:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1376628842' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.702 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.708 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.740 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.742 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.743 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.882s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.941 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:27:14 np0005588919 nova_compute[225855]: 2026-01-20 15:27:14.941 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:27:15 np0005588919 nova_compute[225855]: 2026-01-20 15:27:15.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:27:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:16.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:16.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:16.447 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:16.448 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:16.449 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:18.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:18 np0005588919 nova_compute[225855]: 2026-01-20 15:27:18.359 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:18.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:18 np0005588919 nova_compute[225855]: 2026-01-20 15:27:18.507 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:19 np0005588919 nova_compute[225855]: 2026-01-20 15:27:19.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:20.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:20.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:22.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:27:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 64K writes, 249K keys, 64K commit groups, 1.0 writes per commit group, ingest: 0.24 GB, 0.04 MB/s#012Cumulative WAL: 64K writes, 24K syncs, 2.64 writes per sync, written: 0.24 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4629 writes, 15K keys, 4629 commit groups, 1.0 writes per commit group, ingest: 15.94 MB, 0.03 MB/s#012Interval WAL: 4629 writes, 1847 syncs, 2.51 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:27:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:22.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:23 np0005588919 nova_compute[225855]: 2026-01-20 15:27:23.360 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:23 np0005588919 nova_compute[225855]: 2026-01-20 15:27:23.510 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:24.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:24.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:26.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:26.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:28.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:28 np0005588919 nova_compute[225855]: 2026-01-20 15:27:28.362 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:28.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:28 np0005588919 nova_compute[225855]: 2026-01-20 15:27:28.512 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:30.090 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:27:30 np0005588919 nova_compute[225855]: 2026-01-20 15:27:30.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:30.091 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:27:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:30.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:30.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:32.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:32.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:33 np0005588919 nova_compute[225855]: 2026-01-20 15:27:33.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:27:33 np0005588919 ovn_controller[130490]: 2026-01-20T15:27:33Z|00919|binding|INFO|Releasing lport f92b8596-5a2a-495f-b715-08c8aa7181a3 from this chassis (sb_readonly=0)
Jan 20 10:27:33 np0005588919 nova_compute[225855]: 2026-01-20 15:27:33.364 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:33 np0005588919 nova_compute[225855]: 2026-01-20 15:27:33.376 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:33 np0005588919 nova_compute[225855]: 2026-01-20 15:27:33.512 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:34.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:34.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.482 225859 DEBUG nova.compute.manager [req-0ee3aff6-fcab-40bb-ad29-b3eb5bffb864 req-0521a83f-5d85-4b6a-8a7b-aa7b4aa2d7c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received event network-changed-423d10be-bf78-43ff-8ae2-812d375ccef8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.482 225859 DEBUG nova.compute.manager [req-0ee3aff6-fcab-40bb-ad29-b3eb5bffb864 req-0521a83f-5d85-4b6a-8a7b-aa7b4aa2d7c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Refreshing instance network info cache due to event network-changed-423d10be-bf78-43ff-8ae2-812d375ccef8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.483 225859 DEBUG oslo_concurrency.lockutils [req-0ee3aff6-fcab-40bb-ad29-b3eb5bffb864 req-0521a83f-5d85-4b6a-8a7b-aa7b4aa2d7c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.483 225859 DEBUG oslo_concurrency.lockutils [req-0ee3aff6-fcab-40bb-ad29-b3eb5bffb864 req-0521a83f-5d85-4b6a-8a7b-aa7b4aa2d7c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.483 225859 DEBUG nova.network.neutron [req-0ee3aff6-fcab-40bb-ad29-b3eb5bffb864 req-0521a83f-5d85-4b6a-8a7b-aa7b4aa2d7c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Refreshing network info cache for port 423d10be-bf78-43ff-8ae2-812d375ccef8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.547 225859 DEBUG oslo_concurrency.lockutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "eb6ef384-2891-42d0-9059-42b89009b14c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.548 225859 DEBUG oslo_concurrency.lockutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.548 225859 DEBUG oslo_concurrency.lockutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.548 225859 DEBUG oslo_concurrency.lockutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.548 225859 DEBUG oslo_concurrency.lockutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.550 225859 INFO nova.compute.manager [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Terminating instance#033[00m
Jan 20 10:27:34 np0005588919 nova_compute[225855]: 2026-01-20 15:27:34.551 225859 DEBUG nova.compute.manager [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:27:35 np0005588919 podman[315113]: 2026-01-20 15:27:35.036570079 +0000 UTC m=+0.080114518 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:27:35 np0005588919 kernel: tap423d10be-bf (unregistering): left promiscuous mode
Jan 20 10:27:35 np0005588919 NetworkManager[49104]: <info>  [1768922855.3631] device (tap423d10be-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.371 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:27:35Z|00920|binding|INFO|Releasing lport 423d10be-bf78-43ff-8ae2-812d375ccef8 from this chassis (sb_readonly=0)
Jan 20 10:27:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:27:35Z|00921|binding|INFO|Setting lport 423d10be-bf78-43ff-8ae2-812d375ccef8 down in Southbound
Jan 20 10:27:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:27:35Z|00922|binding|INFO|Removing iface tap423d10be-bf ovn-installed in OVS
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.373 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.380 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:58:d6 10.100.0.6'], port_security=['fa:16:3e:56:58:d6 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eb6ef384-2891-42d0-9059-42b89009b14c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b3aa186-38a4-4cc6-9399-f535503e9791', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '046cc664-f8d9-4379-b46e-95218c363faa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbfdde0f-6f5b-476d-8d21-7557a37ecad6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=423d10be-bf78-43ff-8ae2-812d375ccef8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.381 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 423d10be-bf78-43ff-8ae2-812d375ccef8 in datapath 3b3aa186-38a4-4cc6-9399-f535503e9791 unbound from our chassis#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.382 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3b3aa186-38a4-4cc6-9399-f535503e9791, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.384 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[83682e27-6ea2-43d5-8ab9-bde3b88cb6a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.385 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791 namespace which is not needed anymore#033[00m
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.391 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:35 np0005588919 systemd[1]: machine-qemu\x2d107\x2dinstance\x2d000000ca.scope: Deactivated successfully.
Jan 20 10:27:35 np0005588919 systemd[1]: machine-qemu\x2d107\x2dinstance\x2d000000ca.scope: Consumed 16.773s CPU time.
Jan 20 10:27:35 np0005588919 systemd-machined[194361]: Machine qemu-107-instance-000000ca terminated.
Jan 20 10:27:35 np0005588919 neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791[314207]: [NOTICE]   (314211) : haproxy version is 2.8.14-c23fe91
Jan 20 10:27:35 np0005588919 neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791[314207]: [NOTICE]   (314211) : path to executable is /usr/sbin/haproxy
Jan 20 10:27:35 np0005588919 neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791[314207]: [WARNING]  (314211) : Exiting Master process...
Jan 20 10:27:35 np0005588919 neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791[314207]: [WARNING]  (314211) : Exiting Master process...
Jan 20 10:27:35 np0005588919 neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791[314207]: [ALERT]    (314211) : Current worker (314213) exited with code 143 (Terminated)
Jan 20 10:27:35 np0005588919 neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791[314207]: [WARNING]  (314211) : All workers exited. Exiting... (0)
Jan 20 10:27:35 np0005588919 systemd[1]: libpod-c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1.scope: Deactivated successfully.
Jan 20 10:27:35 np0005588919 podman[315165]: 2026-01-20 15:27:35.51330587 +0000 UTC m=+0.043912564 container died c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 10:27:35 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1-userdata-shm.mount: Deactivated successfully.
Jan 20 10:27:35 np0005588919 systemd[1]: var-lib-containers-storage-overlay-42f6bc1236f352acdb0936adef4a9dad7a0a958fe3cb77609aa75d071a1c9c47-merged.mount: Deactivated successfully.
Jan 20 10:27:35 np0005588919 podman[315165]: 2026-01-20 15:27:35.550300647 +0000 UTC m=+0.080907341 container cleanup c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:27:35 np0005588919 systemd[1]: libpod-conmon-c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1.scope: Deactivated successfully.
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.590 225859 INFO nova.virt.libvirt.driver [-] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Instance destroyed successfully.#033[00m
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.591 225859 DEBUG nova.objects.instance [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'resources' on Instance uuid eb6ef384-2891-42d0-9059-42b89009b14c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.611 225859 DEBUG nova.virt.libvirt.vif [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1635932386',display_name='tempest-TestNetworkBasicOps-server-1635932386',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1635932386',id=202,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOxLu/uVu4uuXkZBckB0Jue8mA2XpnPI63IpB2BGooiySZuLgddUiCiwQ3/YqBeUzNGbEuiI4/oWiiYa4zQrQHAa9idheznhVw0kdlFQsBm1hL1vB4bH09utur5br8iaiQ==',key_name='tempest-TestNetworkBasicOps-305263708',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-1wrkal35',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:25:55Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=eb6ef384-2891-42d0-9059-42b89009b14c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.612 225859 DEBUG nova.network.os_vif_util [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.613 225859 DEBUG nova.network.os_vif_util [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:56:58:d6,bridge_name='br-int',has_traffic_filtering=True,id=423d10be-bf78-43ff-8ae2-812d375ccef8,network=Network(3b3aa186-38a4-4cc6-9399-f535503e9791),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap423d10be-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.613 225859 DEBUG os_vif [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:58:d6,bridge_name='br-int',has_traffic_filtering=True,id=423d10be-bf78-43ff-8ae2-812d375ccef8,network=Network(3b3aa186-38a4-4cc6-9399-f535503e9791),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap423d10be-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.615 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.617 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap423d10be-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:27:35 np0005588919 podman[315194]: 2026-01-20 15:27:35.619559254 +0000 UTC m=+0.049935317 container remove c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.622 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.625 225859 INFO os_vif [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:58:d6,bridge_name='br-int',has_traffic_filtering=True,id=423d10be-bf78-43ff-8ae2-812d375ccef8,network=Network(3b3aa186-38a4-4cc6-9399-f535503e9791),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap423d10be-bf')#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.627 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e478625b-a3cc-46ab-9395-982493ef048c]: (4, ('Tue Jan 20 03:27:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791 (c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1)\nc23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1\nTue Jan 20 03:27:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791 (c23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1)\nc23b6ee47ebd8837b0723000e8dd8b5f34925d463ae436fc56e1762a0bc434e1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.629 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[23392637-bd9b-4b91-b859-60ab8962d589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.630 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b3aa186-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:27:35 np0005588919 kernel: tap3b3aa186-30: left promiscuous mode
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.643 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:35 np0005588919 nova_compute[225855]: 2026-01-20 15:27:35.646 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.649 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8ee5ee-5ee7-4d1d-9682-0cc850c73d33]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.662 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b25f3c-8499-4d5a-9272-bb0e974a3f44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.664 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bf77e96b-851f-4d08-b1d2-57ac22fc8200]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.683 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[eb14d818-ce6f-4ee2-92e2-1722af9cd7f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 779313, 'reachable_time': 25571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315238, 'error': None, 'target': 'ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:35 np0005588919 systemd[1]: run-netns-ovnmeta\x2d3b3aa186\x2d38a4\x2d4cc6\x2d9399\x2df535503e9791.mount: Deactivated successfully.
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.687 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3b3aa186-38a4-4cc6-9399-f535503e9791 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:27:35 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:35.687 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[d706d4cc-6f6f-49b1-a843-07e4f6237cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.040 225859 INFO nova.virt.libvirt.driver [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Deleting instance files /var/lib/nova/instances/eb6ef384-2891-42d0-9059-42b89009b14c_del#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.041 225859 INFO nova.virt.libvirt.driver [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Deletion of /var/lib/nova/instances/eb6ef384-2891-42d0-9059-42b89009b14c_del complete#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.114 225859 INFO nova.compute.manager [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Took 1.56 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.115 225859 DEBUG oslo.service.loopingcall [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.115 225859 DEBUG nova.compute.manager [-] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.116 225859 DEBUG nova.network.neutron [-] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:27:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:36.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:36.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.573 225859 DEBUG nova.compute.manager [req-f0ca6ea0-ec60-4d75-8cdf-217a7948a4dd req-b6d3295f-46ce-49bd-9588-8bab400b4594 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received event network-vif-unplugged-423d10be-bf78-43ff-8ae2-812d375ccef8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.573 225859 DEBUG oslo_concurrency.lockutils [req-f0ca6ea0-ec60-4d75-8cdf-217a7948a4dd req-b6d3295f-46ce-49bd-9588-8bab400b4594 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.574 225859 DEBUG oslo_concurrency.lockutils [req-f0ca6ea0-ec60-4d75-8cdf-217a7948a4dd req-b6d3295f-46ce-49bd-9588-8bab400b4594 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.574 225859 DEBUG oslo_concurrency.lockutils [req-f0ca6ea0-ec60-4d75-8cdf-217a7948a4dd req-b6d3295f-46ce-49bd-9588-8bab400b4594 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.574 225859 DEBUG nova.compute.manager [req-f0ca6ea0-ec60-4d75-8cdf-217a7948a4dd req-b6d3295f-46ce-49bd-9588-8bab400b4594 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] No waiting events found dispatching network-vif-unplugged-423d10be-bf78-43ff-8ae2-812d375ccef8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:27:36 np0005588919 nova_compute[225855]: 2026-01-20 15:27:36.574 225859 DEBUG nova.compute.manager [req-f0ca6ea0-ec60-4d75-8cdf-217a7948a4dd req-b6d3295f-46ce-49bd-9588-8bab400b4594 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received event network-vif-unplugged-423d10be-bf78-43ff-8ae2-812d375ccef8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:27:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:27:37.092 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:27:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:38.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.347 225859 DEBUG nova.network.neutron [-] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.369 225859 INFO nova.compute.manager [-] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Took 2.25 seconds to deallocate network for instance.#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.414 225859 DEBUG oslo_concurrency.lockutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.415 225859 DEBUG oslo_concurrency.lockutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.449 225859 DEBUG nova.compute.manager [req-2dbe7723-39ae-477c-9557-c468cd59fe5a req-7b66b752-0f8d-40b8-b903-c1077d1960fb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received event network-vif-deleted-423d10be-bf78-43ff-8ae2-812d375ccef8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:27:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:38.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.466 225859 DEBUG oslo_concurrency.processutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:27:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.514 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.686 225859 DEBUG nova.compute.manager [req-10b9b0d8-19d5-4431-aed8-6dff23102258 req-e25f0678-7139-49fa-b647-f6c386ff8cc6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received event network-vif-plugged-423d10be-bf78-43ff-8ae2-812d375ccef8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.687 225859 DEBUG oslo_concurrency.lockutils [req-10b9b0d8-19d5-4431-aed8-6dff23102258 req-e25f0678-7139-49fa-b647-f6c386ff8cc6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.687 225859 DEBUG oslo_concurrency.lockutils [req-10b9b0d8-19d5-4431-aed8-6dff23102258 req-e25f0678-7139-49fa-b647-f6c386ff8cc6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.687 225859 DEBUG oslo_concurrency.lockutils [req-10b9b0d8-19d5-4431-aed8-6dff23102258 req-e25f0678-7139-49fa-b647-f6c386ff8cc6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.687 225859 DEBUG nova.compute.manager [req-10b9b0d8-19d5-4431-aed8-6dff23102258 req-e25f0678-7139-49fa-b647-f6c386ff8cc6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] No waiting events found dispatching network-vif-plugged-423d10be-bf78-43ff-8ae2-812d375ccef8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.688 225859 WARNING nova.compute.manager [req-10b9b0d8-19d5-4431-aed8-6dff23102258 req-e25f0678-7139-49fa-b647-f6c386ff8cc6 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Received unexpected event network-vif-plugged-423d10be-bf78-43ff-8ae2-812d375ccef8 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:27:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:27:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2202569858' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.920 225859 DEBUG nova.network.neutron [req-0ee3aff6-fcab-40bb-ad29-b3eb5bffb864 req-0521a83f-5d85-4b6a-8a7b-aa7b4aa2d7c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updated VIF entry in instance network info cache for port 423d10be-bf78-43ff-8ae2-812d375ccef8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.921 225859 DEBUG nova.network.neutron [req-0ee3aff6-fcab-40bb-ad29-b3eb5bffb864 req-0521a83f-5d85-4b6a-8a7b-aa7b4aa2d7c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Updating instance_info_cache with network_info: [{"id": "423d10be-bf78-43ff-8ae2-812d375ccef8", "address": "fa:16:3e:56:58:d6", "network": {"id": "3b3aa186-38a4-4cc6-9399-f535503e9791", "bridge": "br-int", "label": "tempest-network-smoke--458751349", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap423d10be-bf", "ovs_interfaceid": "423d10be-bf78-43ff-8ae2-812d375ccef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.923 225859 DEBUG oslo_concurrency.processutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.929 225859 DEBUG nova.compute.provider_tree [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.945 225859 DEBUG oslo_concurrency.lockutils [req-0ee3aff6-fcab-40bb-ad29-b3eb5bffb864 req-0521a83f-5d85-4b6a-8a7b-aa7b4aa2d7c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-eb6ef384-2891-42d0-9059-42b89009b14c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.948 225859 DEBUG nova.scheduler.client.report [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:27:38 np0005588919 nova_compute[225855]: 2026-01-20 15:27:38.969 225859 DEBUG oslo_concurrency.lockutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:39 np0005588919 nova_compute[225855]: 2026-01-20 15:27:39.001 225859 INFO nova.scheduler.client.report [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Deleted allocations for instance eb6ef384-2891-42d0-9059-42b89009b14c#033[00m
Jan 20 10:27:39 np0005588919 nova_compute[225855]: 2026-01-20 15:27:39.064 225859 DEBUG oslo_concurrency.lockutils [None req-a7e80228-0114-4046-9376-2462bdc6520d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "eb6ef384-2891-42d0-9059-42b89009b14c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.516s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:27:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:40.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:40.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:40 np0005588919 nova_compute[225855]: 2026-01-20 15:27:40.620 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:42.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:42.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:43 np0005588919 nova_compute[225855]: 2026-01-20 15:27:43.515 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:44.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:44.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:44 np0005588919 nova_compute[225855]: 2026-01-20 15:27:44.467 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:44 np0005588919 nova_compute[225855]: 2026-01-20 15:27:44.554 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:44 np0005588919 podman[315268]: 2026-01-20 15:27:44.999227774 +0000 UTC m=+0.047074505 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:27:45 np0005588919 nova_compute[225855]: 2026-01-20 15:27:45.622 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:46.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:46.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:27:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:48.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:27:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:48.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:48 np0005588919 nova_compute[225855]: 2026-01-20 15:27:48.517 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:50.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:50.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:50 np0005588919 nova_compute[225855]: 2026-01-20 15:27:50.588 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922855.5870845, eb6ef384-2891-42d0-9059-42b89009b14c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:27:50 np0005588919 nova_compute[225855]: 2026-01-20 15:27:50.588 225859 INFO nova.compute.manager [-] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:27:50 np0005588919 nova_compute[225855]: 2026-01-20 15:27:50.616 225859 DEBUG nova.compute.manager [None req-e6cf17ca-7728-4417-8402-3f280ac4bb16 - - - - - -] [instance: eb6ef384-2891-42d0-9059-42b89009b14c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:27:50 np0005588919 nova_compute[225855]: 2026-01-20 15:27:50.624 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:52.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:52.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:53 np0005588919 nova_compute[225855]: 2026-01-20 15:27:53.518 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:54.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:54.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:27:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.0 total, 600.0 interval#012Cumulative writes: 15K writes, 77K keys, 15K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.03 MB/s#012Cumulative WAL: 15K writes, 15K syncs, 1.00 writes per sync, written: 0.15 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1509 writes, 7469 keys, 1509 commit groups, 1.0 writes per commit group, ingest: 15.29 MB, 0.03 MB/s#012Interval WAL: 1509 writes, 1509 syncs, 1.00 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     80.6      1.16              0.32        49    0.024       0      0       0.0       0.0#012  L6      1/0   12.51 MB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   5.1    101.5     86.8      5.56              1.53        48    0.116    353K    26K       0.0       0.0#012 Sum      1/0   12.51 MB   0.0      0.6     0.1      0.5       0.6      0.1       0.0   6.1     83.9     85.8      6.72              1.85        97    0.069    353K    26K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9     78.2     79.6      1.03              0.23        12    0.086     60K   3110       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   0.0    101.5     86.8      5.56              1.53        48    0.116    353K    26K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     80.8      1.16              0.32        48    0.024       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 5400.0 total, 600.0 interval#012Flush(GB): cumulative 0.092, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.56 GB write, 0.11 MB/s write, 0.55 GB read, 0.10 MB/s read, 6.7 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.08 GB read, 0.13 MB/s read, 1.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 304.00 MB usage: 62.47 MB table_size: 0 occupancy: 18446744073709551615 collections: 10 last_copies: 0 last_secs: 0.000375 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3577,59.91 MB,19.7072%) FilterBlock(97,977.48 KB,0.314005%) IndexBlock(97,1.60 MB,0.527126%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 10:27:55 np0005588919 nova_compute[225855]: 2026-01-20 15:27:55.627 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:27:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:56.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:56.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:27:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:27:58.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:27:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:27:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:27:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:27:58.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:27:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:27:58 np0005588919 nova_compute[225855]: 2026-01-20 15:27:58.520 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:00.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:00.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:00 np0005588919 nova_compute[225855]: 2026-01-20 15:28:00.631 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:02.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:02.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:03 np0005588919 nova_compute[225855]: 2026-01-20 15:28:03.522 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:04.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:04.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:05 np0005588919 nova_compute[225855]: 2026-01-20 15:28:05.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:28:05 np0005588919 nova_compute[225855]: 2026-01-20 15:28:05.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:28:05 np0005588919 nova_compute[225855]: 2026-01-20 15:28:05.633 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:06 np0005588919 podman[315351]: 2026-01-20 15:28:06.028958078 +0000 UTC m=+0.076063223 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:28:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:06.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:06.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.066 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.066 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.091 225859 DEBUG nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.219 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.220 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.231 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.231 225859 INFO nova.compute.claims [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.338 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.809 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.817 225859 DEBUG nova.compute.provider_tree [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.839 225859 DEBUG nova.scheduler.client.report [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.873 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.874 225859 DEBUG nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.918 225859 DEBUG nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.919 225859 DEBUG nova.network.neutron [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.939 225859 INFO nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:28:07 np0005588919 nova_compute[225855]: 2026-01-20 15:28:07.957 225859 DEBUG nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.071 225859 DEBUG nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.073 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.073 225859 INFO nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Creating image(s)
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.095 225859 DEBUG nova.storage.rbd_utils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 770605b0-4686-4d97-9f82-7ed299482f50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.121 225859 DEBUG nova.storage.rbd_utils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 770605b0-4686-4d97-9f82-7ed299482f50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.144 225859 DEBUG nova.storage.rbd_utils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 770605b0-4686-4d97-9f82-7ed299482f50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.148 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.215 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.216 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.217 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.217 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.242 225859 DEBUG nova.storage.rbd_utils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 770605b0-4686-4d97-9f82-7ed299482f50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.247 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 770605b0-4686-4d97-9f82-7ed299482f50_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:28:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:08.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.339 225859 DEBUG nova.policy [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 10:28:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:08.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.503 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 770605b0-4686-4d97-9f82-7ed299482f50_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.256s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:28:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.534 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.570 225859 DEBUG nova.storage.rbd_utils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] resizing rbd image 770605b0-4686-4d97-9f82-7ed299482f50_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.657 225859 DEBUG nova.objects.instance [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'migration_context' on Instance uuid 770605b0-4686-4d97-9f82-7ed299482f50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.673 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.673 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Ensure instance console log exists: /var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.674 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.674 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:28:08 np0005588919 nova_compute[225855]: 2026-01-20 15:28:08.674 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:28:09 np0005588919 nova_compute[225855]: 2026-01-20 15:28:09.127 225859 DEBUG nova.network.neutron [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Successfully created port: 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 10:28:09 np0005588919 nova_compute[225855]: 2026-01-20 15:28:09.934 225859 DEBUG nova.network.neutron [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Successfully updated port: 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 10:28:09 np0005588919 nova_compute[225855]: 2026-01-20 15:28:09.953 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:28:09 np0005588919 nova_compute[225855]: 2026-01-20 15:28:09.953 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:28:09 np0005588919 nova_compute[225855]: 2026-01-20 15:28:09.953 225859 DEBUG nova.network.neutron [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 10:28:10 np0005588919 nova_compute[225855]: 2026-01-20 15:28:10.025 225859 DEBUG nova.compute.manager [req-496fb9cd-060b-41a6-a2d4-4d8e4d0a388c req-0056ab06-d98d-423c-ae8c-39f40b4d0bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-changed-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:28:10 np0005588919 nova_compute[225855]: 2026-01-20 15:28:10.026 225859 DEBUG nova.compute.manager [req-496fb9cd-060b-41a6-a2d4-4d8e4d0a388c req-0056ab06-d98d-423c-ae8c-39f40b4d0bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Refreshing instance network info cache due to event network-changed-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 10:28:10 np0005588919 nova_compute[225855]: 2026-01-20 15:28:10.026 225859 DEBUG oslo_concurrency.lockutils [req-496fb9cd-060b-41a6-a2d4-4d8e4d0a388c req-0056ab06-d98d-423c-ae8c-39f40b4d0bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:28:10 np0005588919 nova_compute[225855]: 2026-01-20 15:28:10.317 225859 DEBUG nova.network.neutron [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 10:28:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:10.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:10.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:10 np0005588919 nova_compute[225855]: 2026-01-20 15:28:10.636 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.095 225859 DEBUG nova.network.neutron [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updating instance_info_cache with network_info: [{"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.127 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.128 225859 DEBUG nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Instance network_info: |[{"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.128 225859 DEBUG oslo_concurrency.lockutils [req-496fb9cd-060b-41a6-a2d4-4d8e4d0a388c req-0056ab06-d98d-423c-ae8c-39f40b4d0bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.129 225859 DEBUG nova.network.neutron [req-496fb9cd-060b-41a6-a2d4-4d8e4d0a388c req-0056ab06-d98d-423c-ae8c-39f40b4d0bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Refreshing network info cache for port 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.132 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Start _get_guest_xml network_info=[{"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.136 225859 WARNING nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.149 225859 DEBUG nova.virt.libvirt.host [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.151 225859 DEBUG nova.virt.libvirt.host [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.156 225859 DEBUG nova.virt.libvirt.host [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.156 225859 DEBUG nova.virt.libvirt.host [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.157 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.157 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.158 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.158 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.158 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.159 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.159 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.159 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.159 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.159 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.160 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.160 225859 DEBUG nova.virt.hardware [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.163 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.358 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.359 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.360 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.379 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.379 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.380 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.380 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.380 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:28:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:28:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/864390893' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.619 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.642 225859 DEBUG nova.storage.rbd_utils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 770605b0-4686-4d97-9f82-7ed299482f50_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.648 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:28:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:28:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3409456068' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:28:11 np0005588919 nova_compute[225855]: 2026-01-20 15:28:11.880 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.067 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.069 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4232MB free_disk=20.97887420654297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.069 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.070 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.160 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 770605b0-4686-4d97-9f82-7ed299482f50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.161 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.161 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1982473906' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.206 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.558s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.209 225859 DEBUG nova.virt.libvirt.vif [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-774862138',display_name='tempest-TestNetworkBasicOps-server-774862138',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-774862138',id=204,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHV4gRYgjRGDnaHxkSMJQMAVAwkk8jLZAdWo/fxENkiAFcbr1CXMhNYzs2hTTM5NoLjR4u2qMt5dKH7C9b4LHMK/3O49DJNSLMAUfk3HkTe/ulPxPFwHn3vfQCPGwBrM5A==',key_name='tempest-TestNetworkBasicOps-1250991011',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-u6z01i8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:28:08Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=770605b0-4686-4d97-9f82-7ed299482f50,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.209 225859 DEBUG nova.network.os_vif_util [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.210 225859 DEBUG nova.network.os_vif_util [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:08:60,bridge_name='br-int',has_traffic_filtering=True,id=52fb2315-9ec5-47a4-af4a-e0ed5e4caf21,network=Network(f19fb67c-6bab-4253-851e-ede5bb26f589),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52fb2315-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.212 225859 DEBUG nova.objects.instance [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_devices' on Instance uuid 770605b0-4686-4d97-9f82-7ed299482f50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.214 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.259 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <uuid>770605b0-4686-4d97-9f82-7ed299482f50</uuid>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <name>instance-000000cc</name>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkBasicOps-server-774862138</nova:name>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:28:11</nova:creationTime>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <nova:port uuid="52fb2315-9ec5-47a4-af4a-e0ed5e4caf21">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <entry name="serial">770605b0-4686-4d97-9f82-7ed299482f50</entry>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <entry name="uuid">770605b0-4686-4d97-9f82-7ed299482f50</entry>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/770605b0-4686-4d97-9f82-7ed299482f50_disk">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/770605b0-4686-4d97-9f82-7ed299482f50_disk.config">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:f9:08:60"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <target dev="tap52fb2315-9e"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/console.log" append="off"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:28:12 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:28:12 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:28:12 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:28:12 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.261 225859 DEBUG nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Preparing to wait for external event network-vif-plugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.262 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.262 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.262 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.263 225859 DEBUG nova.virt.libvirt.vif [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-774862138',display_name='tempest-TestNetworkBasicOps-server-774862138',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-774862138',id=204,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHV4gRYgjRGDnaHxkSMJQMAVAwkk8jLZAdWo/fxENkiAFcbr1CXMhNYzs2hTTM5NoLjR4u2qMt5dKH7C9b4LHMK/3O49DJNSLMAUfk3HkTe/ulPxPFwHn3vfQCPGwBrM5A==',key_name='tempest-TestNetworkBasicOps-1250991011',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-u6z01i8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:28:08Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=770605b0-4686-4d97-9f82-7ed299482f50,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.264 225859 DEBUG nova.network.os_vif_util [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.265 225859 DEBUG nova.network.os_vif_util [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:08:60,bridge_name='br-int',has_traffic_filtering=True,id=52fb2315-9ec5-47a4-af4a-e0ed5e4caf21,network=Network(f19fb67c-6bab-4253-851e-ede5bb26f589),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52fb2315-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.265 225859 DEBUG os_vif [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:08:60,bridge_name='br-int',has_traffic_filtering=True,id=52fb2315-9ec5-47a4-af4a-e0ed5e4caf21,network=Network(f19fb67c-6bab-4253-851e-ede5bb26f589),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52fb2315-9e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.266 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.266 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.267 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.271 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.271 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52fb2315-9e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.272 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap52fb2315-9e, col_values=(('external_ids', {'iface-id': '52fb2315-9ec5-47a4-af4a-e0ed5e4caf21', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:08:60', 'vm-uuid': '770605b0-4686-4d97-9f82-7ed299482f50'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.273 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:12 np0005588919 NetworkManager[49104]: <info>  [1768922892.2747] manager: (tap52fb2315-9e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/388)
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.276 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.280 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.281 225859 INFO os_vif [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:08:60,bridge_name='br-int',has_traffic_filtering=True,id=52fb2315-9ec5-47a4-af4a-e0ed5e4caf21,network=Network(f19fb67c-6bab-4253-851e-ede5bb26f589),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52fb2315-9e')#033[00m
Jan 20 10:28:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:12.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.325 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.325 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.326 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:f9:08:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.326 225859 INFO nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Using config drive#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.355 225859 DEBUG nova.storage.rbd_utils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 770605b0-4686-4d97-9f82-7ed299482f50_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:28:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:12.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.551 225859 DEBUG nova.network.neutron [req-496fb9cd-060b-41a6-a2d4-4d8e4d0a388c req-0056ab06-d98d-423c-ae8c-39f40b4d0bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updated VIF entry in instance network info cache for port 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.551 225859 DEBUG nova.network.neutron [req-496fb9cd-060b-41a6-a2d4-4d8e4d0a388c req-0056ab06-d98d-423c-ae8c-39f40b4d0bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updating instance_info_cache with network_info: [{"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.573 225859 DEBUG oslo_concurrency.lockutils [req-496fb9cd-060b-41a6-a2d4-4d8e4d0a388c req-0056ab06-d98d-423c-ae8c-39f40b4d0bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1083406231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.681 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.686 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.698 225859 INFO nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Creating config drive at /var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/disk.config#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.703 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4twi5ho1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.729 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.765 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.766 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.834 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4twi5ho1" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.861 225859 DEBUG nova.storage.rbd_utils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 770605b0-4686-4d97-9f82-7ed299482f50_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:28:12 np0005588919 nova_compute[225855]: 2026-01-20 15:28:12.864 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/disk.config 770605b0-4686-4d97-9f82-7ed299482f50_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.032 225859 DEBUG oslo_concurrency.processutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/disk.config 770605b0-4686-4d97-9f82-7ed299482f50_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.033 225859 INFO nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Deleting local config drive /var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/disk.config because it was imported into RBD.#033[00m
Jan 20 10:28:13 np0005588919 kernel: tap52fb2315-9e: entered promiscuous mode
Jan 20 10:28:13 np0005588919 NetworkManager[49104]: <info>  [1768922893.0887] manager: (tap52fb2315-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/389)
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.087 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:13Z|00923|binding|INFO|Claiming lport 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 for this chassis.
Jan 20 10:28:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:13Z|00924|binding|INFO|52fb2315-9ec5-47a4-af4a-e0ed5e4caf21: Claiming fa:16:3e:f9:08:60 10.100.0.14
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.093 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.102 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:08:60 10.100.0.14'], port_security=['fa:16:3e:f9:08:60 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '770605b0-4686-4d97-9f82-7ed299482f50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f19fb67c-6bab-4253-851e-ede5bb26f589', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b57a9b16-bf8b-47b4-a097-c9a9044c2225', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c4c5a39-3036-4b6a-873b-b8673f881902, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=52fb2315-9ec5-47a4-af4a-e0ed5e4caf21) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.103 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 in datapath f19fb67c-6bab-4253-851e-ede5bb26f589 bound to our chassis#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.104 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f19fb67c-6bab-4253-851e-ede5bb26f589#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.120 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[72025e0a-2d8e-4106-8603-ad594130e4e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.121 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf19fb67c-61 in ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.124 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf19fb67c-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.124 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e5fa97-ed40-4b7f-8ebf-b3112dcc34bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.125 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c21cb94b-ec17-49c2-828f-870cea99727d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 systemd-udevd[316048]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:28:13 np0005588919 systemd-machined[194361]: New machine qemu-108-instance-000000cc.
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.145 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[ab72129c-a5f2-4cb2-80b0-b6e5fb1527f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 NetworkManager[49104]: <info>  [1768922893.1493] device (tap52fb2315-9e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:28:13 np0005588919 NetworkManager[49104]: <info>  [1768922893.1500] device (tap52fb2315-9e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.152 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:13 np0005588919 systemd[1]: Started Virtual Machine qemu-108-instance-000000cc.
Jan 20 10:28:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:13Z|00925|binding|INFO|Setting lport 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 ovn-installed in OVS
Jan 20 10:28:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:13Z|00926|binding|INFO|Setting lport 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 up in Southbound
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.160 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.160 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7664a407-5836-473a-94a6-5daa99433b70]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.192 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4083ea9e-8ae7-4350-a40e-4353b71863d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.198 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7f896c0f-9970-4131-813c-dc1900e68660]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 NetworkManager[49104]: <info>  [1768922893.1992] manager: (tapf19fb67c-60): new Veth device (/org/freedesktop/NetworkManager/Devices/390)
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.236 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e20d5f3d-cffe-4160-8139-f9df010718da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.240 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[9e88f065-cfbb-4728-a6f3-e1a68d636850]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 NetworkManager[49104]: <info>  [1768922893.2649] device (tapf19fb67c-60): carrier: link connected
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.269 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8a39ac85-17d3-4a0b-8f9f-222b34a96edd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.286 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c010b914-a23c-43fa-b562-70a25225290c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf19fb67c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:c0:c7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 263], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 793219, 'reachable_time': 44098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316081, 'error': None, 'target': 'ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.302 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[45cfcf99-6480-49a3-9586-d3c728795db3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:c0c7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 793219, 'tstamp': 793219}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 316082, 'error': None, 'target': 'ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.322 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f37416-d0da-4381-ae82-10d35f7f02b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf19fb67c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:c0:c7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 263], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 793219, 'reachable_time': 44098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 316083, 'error': None, 'target': 'ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.358 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[620d2215-ec19-4868-9eaa-5d46c9e42a9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.417 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[36bf33b2-52b9-4489-9a81-01d5b9b14dcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.419 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf19fb67c-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.419 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.420 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf19fb67c-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.422 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:13 np0005588919 kernel: tapf19fb67c-60: entered promiscuous mode
Jan 20 10:28:13 np0005588919 NetworkManager[49104]: <info>  [1768922893.4227] manager: (tapf19fb67c-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/391)
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.424 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf19fb67c-60, col_values=(('external_ids', {'iface-id': 'd4b30c77-23b1-48b0-a6c8-4cf53f9840de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:13 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:13Z|00927|binding|INFO|Releasing lport d4b30c77-23b1-48b0-a6c8-4cf53f9840de from this chassis (sb_readonly=0)
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.426 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.438 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.440 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f19fb67c-6bab-4253-851e-ede5bb26f589.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f19fb67c-6bab-4253-851e-ede5bb26f589.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.441 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f20e27c5-435d-4e1f-83dd-387ffd5e80b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.441 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-f19fb67c-6bab-4253-851e-ede5bb26f589
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/f19fb67c-6bab-4253-851e-ede5bb26f589.pid.haproxy
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID f19fb67c-6bab-4253-851e-ede5bb26f589
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:28:13 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:13.442 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589', 'env', 'PROCESS_TAG=haproxy-f19fb67c-6bab-4253-851e-ede5bb26f589', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f19fb67c-6bab-4253-851e-ede5bb26f589.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.457 225859 DEBUG nova.compute.manager [req-d7cbdf50-7af4-4d46-89ee-4a30bdc27cb0 req-0327bc7f-2d0e-4135-9d93-7016cede58c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-vif-plugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.458 225859 DEBUG oslo_concurrency.lockutils [req-d7cbdf50-7af4-4d46-89ee-4a30bdc27cb0 req-0327bc7f-2d0e-4135-9d93-7016cede58c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.458 225859 DEBUG oslo_concurrency.lockutils [req-d7cbdf50-7af4-4d46-89ee-4a30bdc27cb0 req-0327bc7f-2d0e-4135-9d93-7016cede58c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.458 225859 DEBUG oslo_concurrency.lockutils [req-d7cbdf50-7af4-4d46-89ee-4a30bdc27cb0 req-0327bc7f-2d0e-4135-9d93-7016cede58c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.459 225859 DEBUG nova.compute.manager [req-d7cbdf50-7af4-4d46-89ee-4a30bdc27cb0 req-0327bc7f-2d0e-4135-9d93-7016cede58c5 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Processing event network-vif-plugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:28:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.565 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.589 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922893.5886781, 770605b0-4686-4d97-9f82-7ed299482f50 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.590 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] VM Started (Lifecycle Event)#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.593 225859 DEBUG nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.597 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.600 225859 INFO nova.virt.libvirt.driver [-] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Instance spawned successfully.#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.601 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.615 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:28:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:28:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3549410671' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:28:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:28:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3549410671' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.621 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.624 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.624 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.625 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.625 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.625 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.626 225859 DEBUG nova.virt.libvirt.driver [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.656 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.657 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922893.5890388, 770605b0-4686-4d97-9f82-7ed299482f50 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.657 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.690 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.697 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922893.5960007, 770605b0-4686-4d97-9f82-7ed299482f50 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.698 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.701 225859 INFO nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Took 5.63 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.702 225859 DEBUG nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.712 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.717 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.737 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.745 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.746 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.750 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.750 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.760 225859 INFO nova.compute.manager [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Took 6.58 seconds to build instance.#033[00m
Jan 20 10:28:13 np0005588919 nova_compute[225855]: 2026-01-20 15:28:13.775 225859 DEBUG oslo_concurrency.lockutils [None req-363dc4c9-30e3-49d5-9b76-6537f931887c 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:13 np0005588919 podman[316157]: 2026-01-20 15:28:13.828944475 +0000 UTC m=+0.049165044 container create e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:28:13 np0005588919 systemd[1]: Started libpod-conmon-e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f.scope.
Jan 20 10:28:13 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:28:13 np0005588919 podman[316157]: 2026-01-20 15:28:13.803398536 +0000 UTC m=+0.023619125 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:28:13 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/987c681a0ffce47fad96bd241a3b13e2bd7d14a562c7f3c13eeeccc3550197bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:28:13 np0005588919 podman[316157]: 2026-01-20 15:28:13.913841889 +0000 UTC m=+0.134062488 container init e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 20 10:28:13 np0005588919 podman[316157]: 2026-01-20 15:28:13.919032778 +0000 UTC m=+0.139253337 container start e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:28:13 np0005588919 neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589[316172]: [NOTICE]   (316176) : New worker (316178) forked
Jan 20 10:28:13 np0005588919 neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589[316172]: [NOTICE]   (316176) : Loading success.
Jan 20 10:28:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:14.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:14.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:15 np0005588919 nova_compute[225855]: 2026-01-20 15:28:15.543 225859 DEBUG nova.compute.manager [req-2993ec33-9fad-4747-824b-99d38fac6c43 req-e488644d-2645-43d9-9564-8e9afdbb106f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-vif-plugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:15 np0005588919 nova_compute[225855]: 2026-01-20 15:28:15.544 225859 DEBUG oslo_concurrency.lockutils [req-2993ec33-9fad-4747-824b-99d38fac6c43 req-e488644d-2645-43d9-9564-8e9afdbb106f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:15 np0005588919 nova_compute[225855]: 2026-01-20 15:28:15.544 225859 DEBUG oslo_concurrency.lockutils [req-2993ec33-9fad-4747-824b-99d38fac6c43 req-e488644d-2645-43d9-9564-8e9afdbb106f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:15 np0005588919 nova_compute[225855]: 2026-01-20 15:28:15.544 225859 DEBUG oslo_concurrency.lockutils [req-2993ec33-9fad-4747-824b-99d38fac6c43 req-e488644d-2645-43d9-9564-8e9afdbb106f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:15 np0005588919 nova_compute[225855]: 2026-01-20 15:28:15.544 225859 DEBUG nova.compute.manager [req-2993ec33-9fad-4747-824b-99d38fac6c43 req-e488644d-2645-43d9-9564-8e9afdbb106f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] No waiting events found dispatching network-vif-plugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:28:15 np0005588919 nova_compute[225855]: 2026-01-20 15:28:15.545 225859 WARNING nova.compute.manager [req-2993ec33-9fad-4747-824b-99d38fac6c43 req-e488644d-2645-43d9-9564-8e9afdbb106f 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received unexpected event network-vif-plugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:28:16 np0005588919 podman[316188]: 2026-01-20 15:28:16.009898484 +0000 UTC m=+0.054780545 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 20 10:28:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:16.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:16 np0005588919 nova_compute[225855]: 2026-01-20 15:28:16.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:28:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:16.449 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:16.449 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:16.450 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:16.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:17 np0005588919 nova_compute[225855]: 2026-01-20 15:28:17.275 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:17 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:17Z|00928|binding|INFO|Releasing lport d4b30c77-23b1-48b0-a6c8-4cf53f9840de from this chassis (sb_readonly=0)
Jan 20 10:28:17 np0005588919 nova_compute[225855]: 2026-01-20 15:28:17.455 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:17 np0005588919 NetworkManager[49104]: <info>  [1768922897.4572] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/392)
Jan 20 10:28:17 np0005588919 NetworkManager[49104]: <info>  [1768922897.4583] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/393)
Jan 20 10:28:17 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:17Z|00929|binding|INFO|Releasing lport d4b30c77-23b1-48b0-a6c8-4cf53f9840de from this chassis (sb_readonly=0)
Jan 20 10:28:17 np0005588919 nova_compute[225855]: 2026-01-20 15:28:17.491 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:17 np0005588919 nova_compute[225855]: 2026-01-20 15:28:17.496 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:18.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:18.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:28:18 np0005588919 nova_compute[225855]: 2026-01-20 15:28:18.566 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:18 np0005588919 nova_compute[225855]: 2026-01-20 15:28:18.596 225859 DEBUG nova.compute.manager [req-ad52a640-ea27-483d-86f1-e208ba45eedc req-90001ff5-4262-4c4d-9aae-f2132a73eb32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-changed-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:18 np0005588919 nova_compute[225855]: 2026-01-20 15:28:18.596 225859 DEBUG nova.compute.manager [req-ad52a640-ea27-483d-86f1-e208ba45eedc req-90001ff5-4262-4c4d-9aae-f2132a73eb32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Refreshing instance network info cache due to event network-changed-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:28:18 np0005588919 nova_compute[225855]: 2026-01-20 15:28:18.597 225859 DEBUG oslo_concurrency.lockutils [req-ad52a640-ea27-483d-86f1-e208ba45eedc req-90001ff5-4262-4c4d-9aae-f2132a73eb32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:28:18 np0005588919 nova_compute[225855]: 2026-01-20 15:28:18.597 225859 DEBUG oslo_concurrency.lockutils [req-ad52a640-ea27-483d-86f1-e208ba45eedc req-90001ff5-4262-4c4d-9aae-f2132a73eb32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:28:18 np0005588919 nova_compute[225855]: 2026-01-20 15:28:18.597 225859 DEBUG nova.network.neutron [req-ad52a640-ea27-483d-86f1-e208ba45eedc req-90001ff5-4262-4c4d-9aae-f2132a73eb32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Refreshing network info cache for port 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:28:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:20.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:20 np0005588919 nova_compute[225855]: 2026-01-20 15:28:20.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:28:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:20.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:21 np0005588919 nova_compute[225855]: 2026-01-20 15:28:21.560 225859 DEBUG nova.network.neutron [req-ad52a640-ea27-483d-86f1-e208ba45eedc req-90001ff5-4262-4c4d-9aae-f2132a73eb32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updated VIF entry in instance network info cache for port 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:28:21 np0005588919 nova_compute[225855]: 2026-01-20 15:28:21.561 225859 DEBUG nova.network.neutron [req-ad52a640-ea27-483d-86f1-e208ba45eedc req-90001ff5-4262-4c4d-9aae-f2132a73eb32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updating instance_info_cache with network_info: [{"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:28:21 np0005588919 nova_compute[225855]: 2026-01-20 15:28:21.585 225859 DEBUG oslo_concurrency.lockutils [req-ad52a640-ea27-483d-86f1-e208ba45eedc req-90001ff5-4262-4c4d-9aae-f2132a73eb32 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:28:22 np0005588919 nova_compute[225855]: 2026-01-20 15:28:22.278 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:22.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:28:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:22.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:28:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:23 np0005588919 nova_compute[225855]: 2026-01-20 15:28:23.633 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:24.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:24.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:26 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:26Z|00112|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f9:08:60 10.100.0.14
Jan 20 10:28:26 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:26Z|00113|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f9:08:60 10.100.0.14
Jan 20 10:28:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:26.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:26.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:27 np0005588919 nova_compute[225855]: 2026-01-20 15:28:27.281 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:28.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:28.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:28 np0005588919 nova_compute[225855]: 2026-01-20 15:28:28.636 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:30.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:30.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:31 np0005588919 nova_compute[225855]: 2026-01-20 15:28:31.895 225859 INFO nova.compute.manager [None req-4885eae6-f1d4-421f-8053-7401edab03ac 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Get console output#033[00m
Jan 20 10:28:31 np0005588919 nova_compute[225855]: 2026-01-20 15:28:31.903 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:28:32 np0005588919 nova_compute[225855]: 2026-01-20 15:28:32.283 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:32.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:32.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:32.527 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:28:32 np0005588919 nova_compute[225855]: 2026-01-20 15:28:32.527 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:32.529 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:28:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:33 np0005588919 nova_compute[225855]: 2026-01-20 15:28:33.638 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:34.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:34.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:34.531 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:34 np0005588919 nova_compute[225855]: 2026-01-20 15:28:34.951 225859 DEBUG oslo_concurrency.lockutils [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "interface-770605b0-4686-4d97-9f82-7ed299482f50-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:34 np0005588919 nova_compute[225855]: 2026-01-20 15:28:34.952 225859 DEBUG oslo_concurrency.lockutils [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "interface-770605b0-4686-4d97-9f82-7ed299482f50-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:34 np0005588919 nova_compute[225855]: 2026-01-20 15:28:34.952 225859 DEBUG nova.objects.instance [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'flavor' on Instance uuid 770605b0-4686-4d97-9f82-7ed299482f50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:28:35 np0005588919 nova_compute[225855]: 2026-01-20 15:28:35.505 225859 DEBUG nova.objects.instance [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_requests' on Instance uuid 770605b0-4686-4d97-9f82-7ed299482f50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:28:35 np0005588919 nova_compute[225855]: 2026-01-20 15:28:35.519 225859 DEBUG nova.network.neutron [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:28:35 np0005588919 nova_compute[225855]: 2026-01-20 15:28:35.722 225859 DEBUG nova.policy [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:28:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:36.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:36 np0005588919 nova_compute[225855]: 2026-01-20 15:28:36.443 225859 DEBUG nova.network.neutron [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Successfully created port: 77056a83-f3ee-44a1-8cd0-fac2b5327a1e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:28:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:36.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:37 np0005588919 podman[316318]: 2026-01-20 15:28:37.056322516 +0000 UTC m=+0.087697344 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 20 10:28:37 np0005588919 nova_compute[225855]: 2026-01-20 15:28:37.062 225859 DEBUG nova.network.neutron [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Successfully updated port: 77056a83-f3ee-44a1-8cd0-fac2b5327a1e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:28:37 np0005588919 nova_compute[225855]: 2026-01-20 15:28:37.084 225859 DEBUG oslo_concurrency.lockutils [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:28:37 np0005588919 nova_compute[225855]: 2026-01-20 15:28:37.084 225859 DEBUG oslo_concurrency.lockutils [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:28:37 np0005588919 nova_compute[225855]: 2026-01-20 15:28:37.084 225859 DEBUG nova.network.neutron [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:28:37 np0005588919 nova_compute[225855]: 2026-01-20 15:28:37.221 225859 DEBUG nova.compute.manager [req-6a9af8bc-ec26-41c4-ad07-087e9e24e499 req-337a11f2-6747-4b8d-b5d3-3ec4cdbe76ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-changed-77056a83-f3ee-44a1-8cd0-fac2b5327a1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:37 np0005588919 nova_compute[225855]: 2026-01-20 15:28:37.222 225859 DEBUG nova.compute.manager [req-6a9af8bc-ec26-41c4-ad07-087e9e24e499 req-337a11f2-6747-4b8d-b5d3-3ec4cdbe76ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Refreshing instance network info cache due to event network-changed-77056a83-f3ee-44a1-8cd0-fac2b5327a1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:28:37 np0005588919 nova_compute[225855]: 2026-01-20 15:28:37.222 225859 DEBUG oslo_concurrency.lockutils [req-6a9af8bc-ec26-41c4-ad07-087e9e24e499 req-337a11f2-6747-4b8d-b5d3-3ec4cdbe76ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:28:37 np0005588919 nova_compute[225855]: 2026-01-20 15:28:37.285 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:38.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:38.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:38 np0005588919 nova_compute[225855]: 2026-01-20 15:28:38.640 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:40.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.382 225859 DEBUG nova.network.neutron [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updating instance_info_cache with network_info: [{"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.406 225859 DEBUG oslo_concurrency.lockutils [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.407 225859 DEBUG oslo_concurrency.lockutils [req-6a9af8bc-ec26-41c4-ad07-087e9e24e499 req-337a11f2-6747-4b8d-b5d3-3ec4cdbe76ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.408 225859 DEBUG nova.network.neutron [req-6a9af8bc-ec26-41c4-ad07-087e9e24e499 req-337a11f2-6747-4b8d-b5d3-3ec4cdbe76ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Refreshing network info cache for port 77056a83-f3ee-44a1-8cd0-fac2b5327a1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.410 225859 DEBUG nova.virt.libvirt.vif [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-774862138',display_name='tempest-TestNetworkBasicOps-server-774862138',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-774862138',id=204,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHV4gRYgjRGDnaHxkSMJQMAVAwkk8jLZAdWo/fxENkiAFcbr1CXMhNYzs2hTTM5NoLjR4u2qMt5dKH7C9b4LHMK/3O49DJNSLMAUfk3HkTe/ulPxPFwHn3vfQCPGwBrM5A==',key_name='tempest-TestNetworkBasicOps-1250991011',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:28:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-u6z01i8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:28:13Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=770605b0-4686-4d97-9f82-7ed299482f50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.411 225859 DEBUG nova.network.os_vif_util [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.411 225859 DEBUG nova.network.os_vif_util [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.412 225859 DEBUG os_vif [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.412 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.413 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.413 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.416 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.416 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77056a83-f3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.416 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap77056a83-f3, col_values=(('external_ids', {'iface-id': '77056a83-f3ee-44a1-8cd0-fac2b5327a1e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:e5:d8', 'vm-uuid': '770605b0-4686-4d97-9f82-7ed299482f50'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.417 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 NetworkManager[49104]: <info>  [1768922920.4187] manager: (tap77056a83-f3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/394)
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.419 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.426 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.427 225859 INFO os_vif [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3')#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.428 225859 DEBUG nova.virt.libvirt.vif [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-774862138',display_name='tempest-TestNetworkBasicOps-server-774862138',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-774862138',id=204,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHV4gRYgjRGDnaHxkSMJQMAVAwkk8jLZAdWo/fxENkiAFcbr1CXMhNYzs2hTTM5NoLjR4u2qMt5dKH7C9b4LHMK/3O49DJNSLMAUfk3HkTe/ulPxPFwHn3vfQCPGwBrM5A==',key_name='tempest-TestNetworkBasicOps-1250991011',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:28:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-u6z01i8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:28:13Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=770605b0-4686-4d97-9f82-7ed299482f50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.429 225859 DEBUG nova.network.os_vif_util [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.430 225859 DEBUG nova.network.os_vif_util [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.434 225859 DEBUG nova.virt.libvirt.guest [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] attach device xml: <interface type="ethernet">
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:1e:e5:d8"/>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <target dev="tap77056a83-f3"/>
Jan 20 10:28:40 np0005588919 nova_compute[225855]: </interface>
Jan 20 10:28:40 np0005588919 nova_compute[225855]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 20 10:28:40 np0005588919 kernel: tap77056a83-f3: entered promiscuous mode
Jan 20 10:28:40 np0005588919 NetworkManager[49104]: <info>  [1768922920.4491] manager: (tap77056a83-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/395)
Jan 20 10:28:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:40Z|00930|binding|INFO|Claiming lport 77056a83-f3ee-44a1-8cd0-fac2b5327a1e for this chassis.
Jan 20 10:28:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:40Z|00931|binding|INFO|77056a83-f3ee-44a1-8cd0-fac2b5327a1e: Claiming fa:16:3e:1e:e5:d8 10.100.0.28
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.452 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.460 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:e5:d8 10.100.0.28'], port_security=['fa:16:3e:1e:e5:d8 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': '770605b0-4686-4d97-9f82-7ed299482f50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8ab95ce-159e-451b-baf0-5271f6a3160b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'aaa69ba6-9a27-441e-877e-2cd188322a42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ab7f4cc-800b-4c21-91a0-e2fd29d04e91, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=77056a83-f3ee-44a1-8cd0-fac2b5327a1e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.461 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 77056a83-f3ee-44a1-8cd0-fac2b5327a1e in datapath d8ab95ce-159e-451b-baf0-5271f6a3160b bound to our chassis#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.462 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d8ab95ce-159e-451b-baf0-5271f6a3160b#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.476 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5094a5-d4d5-433e-9f4f-ca1b04a7164a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.477 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd8ab95ce-11 in ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.478 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd8ab95ce-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.478 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5616fd40-e93f-496e-a033-20c2a728aba6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 systemd-udevd[316354]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.479 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bab29d36-d4e2-4db3-9778-5e54bf2562fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 NetworkManager[49104]: <info>  [1768922920.4923] device (tap77056a83-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:28:40 np0005588919 NetworkManager[49104]: <info>  [1768922920.4928] device (tap77056a83-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.494 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:40Z|00932|binding|INFO|Setting lport 77056a83-f3ee-44a1-8cd0-fac2b5327a1e ovn-installed in OVS
Jan 20 10:28:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:40Z|00933|binding|INFO|Setting lport 77056a83-f3ee-44a1-8cd0-fac2b5327a1e up in Southbound
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.496 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.495 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[0edead4d-1217-4562-b767-1a62f4a00872]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:40.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.519 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4db80bc0-4f7b-40ea-918f-119d6163f8f8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.545 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[63a25ad9-cb73-4794-8af5-2155bc4d67ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.545 225859 DEBUG nova.virt.libvirt.driver [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.546 225859 DEBUG nova.virt.libvirt.driver [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.546 225859 DEBUG nova.virt.libvirt.driver [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:f9:08:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.547 225859 DEBUG nova.virt.libvirt.driver [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:1e:e5:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.550 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8f74f23f-ebde-4f98-bd9b-0d1c7858f8ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 NetworkManager[49104]: <info>  [1768922920.5516] manager: (tapd8ab95ce-10): new Veth device (/org/freedesktop/NetworkManager/Devices/396)
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.581 225859 DEBUG nova.virt.libvirt.guest [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <nova:name>tempest-TestNetworkBasicOps-server-774862138</nova:name>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 15:28:40</nova:creationTime>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    <nova:port uuid="52fb2315-9ec5-47a4-af4a-e0ed5e4caf21">
Jan 20 10:28:40 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    <nova:port uuid="77056a83-f3ee-44a1-8cd0-fac2b5327a1e">
Jan 20 10:28:40 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 10:28:40 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 10:28:40 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 10:28:40 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.583 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e062ba43-8d76-4c35-a16e-736fedf5688f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.586 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[de11bd56-a477-4942-9f40-768361154258]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 NetworkManager[49104]: <info>  [1768922920.6097] device (tapd8ab95ce-10): carrier: link connected
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.611 225859 DEBUG oslo_concurrency.lockutils [None req-8e31895a-3545-448a-a53c-ead2de407a86 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "interface-770605b0-4686-4d97-9f82-7ed299482f50-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.614 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[9ad96f8f-3c2f-4ab2-94bb-1bee3f1509dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.629 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc2c0e2-8713-4bb6-a01d-7a63b1aafa57]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8ab95ce-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:e9:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 265], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 795953, 'reachable_time': 34661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316380, 'error': None, 'target': 'ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.646 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[15bbf5d4-1ec5-4976-9f23-df3fbecd77f2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe82:e946'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 795953, 'tstamp': 795953}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 316381, 'error': None, 'target': 'ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.662 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d53efd-2eae-4666-bb20-2fc67eec268b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8ab95ce-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:e9:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 265], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 795953, 'reachable_time': 34661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 316382, 'error': None, 'target': 'ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.694 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae5c2d4-8321-47c2-9698-3d0ff957f13b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.745 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d07e3037-de68-42c5-954a-62c63eba4db7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.747 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8ab95ce-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.747 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.748 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8ab95ce-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:40 np0005588919 kernel: tapd8ab95ce-10: entered promiscuous mode
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.750 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 NetworkManager[49104]: <info>  [1768922920.7506] manager: (tapd8ab95ce-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/397)
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.752 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd8ab95ce-10, col_values=(('external_ids', {'iface-id': '09021fcc-5f8e-43f5-85a0-9ce682d692a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.753 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:40Z|00934|binding|INFO|Releasing lport 09021fcc-5f8e-43f5-85a0-9ce682d692a0 from this chassis (sb_readonly=0)
Jan 20 10:28:40 np0005588919 nova_compute[225855]: 2026-01-20 15:28:40.765 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.766 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d8ab95ce-159e-451b-baf0-5271f6a3160b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d8ab95ce-159e-451b-baf0-5271f6a3160b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.767 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[393598a7-f93a-443d-8372-5745e3431acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.768 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-d8ab95ce-159e-451b-baf0-5271f6a3160b
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/d8ab95ce-159e-451b-baf0-5271f6a3160b.pid.haproxy
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID d8ab95ce-159e-451b-baf0-5271f6a3160b
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:28:40 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:40.768 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b', 'env', 'PROCESS_TAG=haproxy-d8ab95ce-159e-451b-baf0-5271f6a3160b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d8ab95ce-159e-451b-baf0-5271f6a3160b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:28:41 np0005588919 podman[316414]: 2026-01-20 15:28:41.122345996 +0000 UTC m=+0.048758173 container create 62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:28:41 np0005588919 systemd[1]: Started libpod-conmon-62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc.scope.
Jan 20 10:28:41 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:28:41 np0005588919 podman[316414]: 2026-01-20 15:28:41.096726324 +0000 UTC m=+0.023138521 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:28:41 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec07ec1121543b297cde9c315ec4473068cc96eb4066eb377bd1883358661b91/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:28:41 np0005588919 podman[316414]: 2026-01-20 15:28:41.204992156 +0000 UTC m=+0.131404353 container init 62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:28:41 np0005588919 podman[316414]: 2026-01-20 15:28:41.212602403 +0000 UTC m=+0.139014580 container start 62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 10:28:41 np0005588919 neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b[316429]: [NOTICE]   (316433) : New worker (316435) forked
Jan 20 10:28:41 np0005588919 neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b[316429]: [NOTICE]   (316433) : Loading success.
Jan 20 10:28:41 np0005588919 nova_compute[225855]: 2026-01-20 15:28:41.588 225859 DEBUG nova.compute.manager [req-d3dc5542-df92-4a45-81cf-a358e92eb7b8 req-576f3a99-7f1c-48d1-b47f-7e8927686517 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-vif-plugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:41 np0005588919 nova_compute[225855]: 2026-01-20 15:28:41.588 225859 DEBUG oslo_concurrency.lockutils [req-d3dc5542-df92-4a45-81cf-a358e92eb7b8 req-576f3a99-7f1c-48d1-b47f-7e8927686517 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:41 np0005588919 nova_compute[225855]: 2026-01-20 15:28:41.589 225859 DEBUG oslo_concurrency.lockutils [req-d3dc5542-df92-4a45-81cf-a358e92eb7b8 req-576f3a99-7f1c-48d1-b47f-7e8927686517 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:41 np0005588919 nova_compute[225855]: 2026-01-20 15:28:41.589 225859 DEBUG oslo_concurrency.lockutils [req-d3dc5542-df92-4a45-81cf-a358e92eb7b8 req-576f3a99-7f1c-48d1-b47f-7e8927686517 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:41 np0005588919 nova_compute[225855]: 2026-01-20 15:28:41.589 225859 DEBUG nova.compute.manager [req-d3dc5542-df92-4a45-81cf-a358e92eb7b8 req-576f3a99-7f1c-48d1-b47f-7e8927686517 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] No waiting events found dispatching network-vif-plugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:28:41 np0005588919 nova_compute[225855]: 2026-01-20 15:28:41.589 225859 WARNING nova.compute.manager [req-d3dc5542-df92-4a45-81cf-a358e92eb7b8 req-576f3a99-7f1c-48d1-b47f-7e8927686517 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received unexpected event network-vif-plugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e for instance with vm_state active and task_state None.#033[00m
Jan 20 10:28:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:42.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:28:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:42.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.336 225859 DEBUG oslo_concurrency.lockutils [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "interface-770605b0-4686-4d97-9f82-7ed299482f50-77056a83-f3ee-44a1-8cd0-fac2b5327a1e" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.337 225859 DEBUG oslo_concurrency.lockutils [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "interface-770605b0-4686-4d97-9f82-7ed299482f50-77056a83-f3ee-44a1-8cd0-fac2b5327a1e" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.358 225859 DEBUG nova.objects.instance [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'flavor' on Instance uuid 770605b0-4686-4d97-9f82-7ed299482f50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.381 225859 DEBUG nova.virt.libvirt.vif [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-774862138',display_name='tempest-TestNetworkBasicOps-server-774862138',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-774862138',id=204,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHV4gRYgjRGDnaHxkSMJQMAVAwkk8jLZAdWo/fxENkiAFcbr1CXMhNYzs2hTTM5NoLjR4u2qMt5dKH7C9b4LHMK/3O49DJNSLMAUfk3HkTe/ulPxPFwHn3vfQCPGwBrM5A==',key_name='tempest-TestNetworkBasicOps-1250991011',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:28:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-u6z01i8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:28:13Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=770605b0-4686-4d97-9f82-7ed299482f50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.382 225859 DEBUG nova.network.os_vif_util [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.383 225859 DEBUG nova.network.os_vif_util [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.386 225859 DEBUG nova.virt.libvirt.guest [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:e5:d8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77056a83-f3"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.389 225859 DEBUG nova.virt.libvirt.guest [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:e5:d8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77056a83-f3"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.391 225859 DEBUG nova.virt.libvirt.driver [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Attempting to detach device tap77056a83-f3 from instance 770605b0-4686-4d97-9f82-7ed299482f50 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.392 225859 DEBUG nova.virt.libvirt.guest [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] detach device xml: <interface type="ethernet">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:1e:e5:d8"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <target dev="tap77056a83-f3"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: </interface>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.399 225859 DEBUG nova.virt.libvirt.guest [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:e5:d8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77056a83-f3"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.402 225859 DEBUG nova.virt.libvirt.guest [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:1e:e5:d8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77056a83-f3"/></interface>not found in domain: <domain type='kvm' id='108'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <name>instance-000000cc</name>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <uuid>770605b0-4686-4d97-9f82-7ed299482f50</uuid>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:name>tempest-TestNetworkBasicOps-server-774862138</nova:name>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 15:28:40</nova:creationTime>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:port uuid="52fb2315-9ec5-47a4-af4a-e0ed5e4caf21">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:port uuid="77056a83-f3ee-44a1-8cd0-fac2b5327a1e">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <memory unit='KiB'>131072</memory>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <vcpu placement='static'>1</vcpu>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <resource>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <partition>/machine</partition>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </resource>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <sysinfo type='smbios'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='manufacturer'>RDO</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='serial'>770605b0-4686-4d97-9f82-7ed299482f50</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='uuid'>770605b0-4686-4d97-9f82-7ed299482f50</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='family'>Virtual Machine</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <boot dev='hd'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <smbios mode='sysinfo'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <vmcoreinfo state='on'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <model fallback='forbid'>Nehalem</model>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <feature policy='require' name='x2apic'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <feature policy='require' name='hypervisor'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <feature policy='require' name='vme'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <clock offset='utc'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <timer name='hpet' present='no'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <on_poweroff>destroy</on_poweroff>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <on_reboot>restart</on_reboot>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <on_crash>destroy</on_crash>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <disk type='network' device='disk'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/770605b0-4686-4d97-9f82-7ed299482f50_disk' index='2'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target dev='vda' bus='virtio'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='virtio-disk0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <disk type='network' device='cdrom'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/770605b0-4686-4d97-9f82-7ed299482f50_disk.config' index='1'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target dev='sda' bus='sata'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <readonly/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='sata0-0-0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pcie.0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='1' port='0x10'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='2' port='0x11'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='3' port='0x12'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.3'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='4' port='0x13'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.4'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='5' port='0x14'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.5'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='6' port='0x15'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.6'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='7' port='0x16'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.7'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='8' port='0x17'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.8'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='9' port='0x18'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.9'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='10' port='0x19'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.10'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='11' port='0x1a'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.11'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='12' port='0x1b'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.12'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='13' port='0x1c'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.13'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='14' port='0x1d'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.14'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='15' port='0x1e'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.15'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='16' port='0x1f'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.16'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='17' port='0x20'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.17'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='18' port='0x21'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.18'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='19' port='0x22'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.19'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='20' port='0x23'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.20'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='21' port='0x24'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.21'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='22' port='0x25'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.22'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='23' port='0x26'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.23'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='24' port='0x27'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.24'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='25' port='0x28'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.25'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-pci-bridge'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.26'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='usb'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='sata' index='0'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='ide'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:f9:08:60'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target dev='tap52fb2315-9e'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='net0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:1e:e5:d8'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target dev='tap77056a83-f3'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='net1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <serial type='pty'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/console.log' append='off'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target type='isa-serial' port='0'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <model name='isa-serial'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      </target>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <console type='pty' tty='/dev/pts/0'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/console.log' append='off'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target type='serial' port='0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </console>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <input type='tablet' bus='usb'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='input0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='usb' bus='0' port='1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <input type='mouse' bus='ps2'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='input1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <input type='keyboard' bus='ps2'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='input2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <listen type='address' address='::0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <audio id='1' type='none'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='video0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <watchdog model='itco' action='reset'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='watchdog0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </watchdog>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <memballoon model='virtio'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <stats period='10'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='balloon0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <rng model='virtio'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <backend model='random'>/dev/urandom</backend>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='rng0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <label>system_u:system_r:svirt_t:s0:c126,c530</label>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c126,c530</imagelabel>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <label>+107:+107</label>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <imagelabel>+107:+107</imagelabel>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.404 225859 INFO nova.virt.libvirt.driver [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully detached device tap77056a83-f3 from instance 770605b0-4686-4d97-9f82-7ed299482f50 from the persistent domain config.#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.405 225859 DEBUG nova.virt.libvirt.driver [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] (1/8): Attempting to detach device tap77056a83-f3 with device alias net1 from instance 770605b0-4686-4d97-9f82-7ed299482f50 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.405 225859 DEBUG nova.virt.libvirt.guest [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] detach device xml: <interface type="ethernet">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <mac address="fa:16:3e:1e:e5:d8"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <model type="virtio"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <mtu size="1442"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <target dev="tap77056a83-f3"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: </interface>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.410 225859 DEBUG nova.network.neutron [req-6a9af8bc-ec26-41c4-ad07-087e9e24e499 req-337a11f2-6747-4b8d-b5d3-3ec4cdbe76ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updated VIF entry in instance network info cache for port 77056a83-f3ee-44a1-8cd0-fac2b5327a1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.411 225859 DEBUG nova.network.neutron [req-6a9af8bc-ec26-41c4-ad07-087e9e24e499 req-337a11f2-6747-4b8d-b5d3-3ec4cdbe76ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updating instance_info_cache with network_info: [{"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.436 225859 DEBUG oslo_concurrency.lockutils [req-6a9af8bc-ec26-41c4-ad07-087e9e24e499 req-337a11f2-6747-4b8d-b5d3-3ec4cdbe76ae 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:28:43 np0005588919 kernel: tap77056a83-f3 (unregistering): left promiscuous mode
Jan 20 10:28:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:43 np0005588919 NetworkManager[49104]: <info>  [1768922923.5130] device (tap77056a83-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.524 225859 DEBUG nova.virt.libvirt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Received event <DeviceRemovedEvent: 1768922923.5232997, 770605b0-4686-4d97-9f82-7ed299482f50 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 20 10:28:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:43Z|00935|binding|INFO|Releasing lport 77056a83-f3ee-44a1-8cd0-fac2b5327a1e from this chassis (sb_readonly=0)
Jan 20 10:28:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:43Z|00936|binding|INFO|Setting lport 77056a83-f3ee-44a1-8cd0-fac2b5327a1e down in Southbound
Jan 20 10:28:43 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:43Z|00937|binding|INFO|Removing iface tap77056a83-f3 ovn-installed in OVS
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.579 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.581 225859 DEBUG nova.virt.libvirt.driver [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Start waiting for the detach event from libvirt for device tap77056a83-f3 with device alias net1 for instance 770605b0-4686-4d97-9f82-7ed299482f50 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.581 225859 DEBUG nova.virt.libvirt.guest [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:e5:d8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77056a83-f3"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.582 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.585 225859 DEBUG nova.virt.libvirt.guest [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:1e:e5:d8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77056a83-f3"/></interface>not found in domain: <domain type='kvm' id='108'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <name>instance-000000cc</name>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <uuid>770605b0-4686-4d97-9f82-7ed299482f50</uuid>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:name>tempest-TestNetworkBasicOps-server-774862138</nova:name>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 15:28:40</nova:creationTime>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:port uuid="52fb2315-9ec5-47a4-af4a-e0ed5e4caf21">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:port uuid="77056a83-f3ee-44a1-8cd0-fac2b5327a1e">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <memory unit='KiB'>131072</memory>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <vcpu placement='static'>1</vcpu>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <resource>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <partition>/machine</partition>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </resource>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <sysinfo type='smbios'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='manufacturer'>RDO</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='serial'>770605b0-4686-4d97-9f82-7ed299482f50</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='uuid'>770605b0-4686-4d97-9f82-7ed299482f50</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <entry name='family'>Virtual Machine</entry>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <boot dev='hd'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <smbios mode='sysinfo'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <vmcoreinfo state='on'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <model fallback='forbid'>Nehalem</model>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <feature policy='require' name='x2apic'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <feature policy='require' name='hypervisor'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <feature policy='require' name='vme'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <clock offset='utc'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <timer name='hpet' present='no'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <on_poweroff>destroy</on_poweroff>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <on_reboot>restart</on_reboot>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <on_crash>destroy</on_crash>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <disk type='network' device='disk'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/770605b0-4686-4d97-9f82-7ed299482f50_disk' index='2'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target dev='vda' bus='virtio'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='virtio-disk0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <disk type='network' device='cdrom'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/770605b0-4686-4d97-9f82-7ed299482f50_disk.config' index='1'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target dev='sda' bus='sata'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <readonly/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='sata0-0-0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pcie.0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='1' port='0x10'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='2' port='0x11'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='3' port='0x12'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.3'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='4' port='0x13'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.4'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='5' port='0x14'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.5'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='6' port='0x15'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.6'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='7' port='0x16'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.7'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='8' port='0x17'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.8'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='9' port='0x18'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.9'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='10' port='0x19'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.10'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='11' port='0x1a'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.11'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='12' port='0x1b'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.12'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='13' port='0x1c'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.13'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='14' port='0x1d'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.14'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='15' port='0x1e'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.15'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='16' port='0x1f'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.16'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='17' port='0x20'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.17'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='18' port='0x21'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.18'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='19' port='0x22'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.19'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='20' port='0x23'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.20'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='21' port='0x24'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.21'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='22' port='0x25'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.22'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='23' port='0x26'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.23'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='24' port='0x27'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.24'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target chassis='25' port='0x28'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.25'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model name='pcie-pci-bridge'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='pci.26'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='usb'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <controller type='sata' index='0'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='ide'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:f9:08:60'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target dev='tap52fb2315-9e'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='net0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <serial type='pty'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/console.log' append='off'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target type='isa-serial' port='0'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:        <model name='isa-serial'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      </target>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <console type='pty' tty='/dev/pts/0'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/console.log' append='off'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <target type='serial' port='0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </console>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <input type='tablet' bus='usb'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='input0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='usb' bus='0' port='1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <input type='mouse' bus='ps2'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='input1'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <input type='keyboard' bus='ps2'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='input2'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <listen type='address' address='::0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <audio id='1' type='none'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='video0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <watchdog model='itco' action='reset'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='watchdog0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </watchdog>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <memballoon model='virtio'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <stats period='10'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='balloon0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <rng model='virtio'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <backend model='random'>/dev/urandom</backend>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <alias name='rng0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <label>system_u:system_r:svirt_t:s0:c126,c530</label>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c126,c530</imagelabel>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <label>+107:+107</label>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <imagelabel>+107:+107</imagelabel>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.585 225859 INFO nova.virt.libvirt.driver [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully detached device tap77056a83-f3 from instance 770605b0-4686-4d97-9f82-7ed299482f50 from the live domain config.#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.586 225859 DEBUG nova.virt.libvirt.vif [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-774862138',display_name='tempest-TestNetworkBasicOps-server-774862138',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-774862138',id=204,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHV4gRYgjRGDnaHxkSMJQMAVAwkk8jLZAdWo/fxENkiAFcbr1CXMhNYzs2hTTM5NoLjR4u2qMt5dKH7C9b4LHMK/3O49DJNSLMAUfk3HkTe/ulPxPFwHn3vfQCPGwBrM5A==',key_name='tempest-TestNetworkBasicOps-1250991011',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:28:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-u6z01i8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:28:13Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=770605b0-4686-4d97-9f82-7ed299482f50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.586 225859 DEBUG nova.network.os_vif_util [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.587 225859 DEBUG nova.network.os_vif_util [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.587 225859 DEBUG os_vif [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.589 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.589 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77056a83-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.590 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.592 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.595 225859 INFO os_vif [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3')#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.595 225859 DEBUG nova.virt.libvirt.guest [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:name>tempest-TestNetworkBasicOps-server-774862138</nova:name>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 15:28:43</nova:creationTime>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    <nova:port uuid="52fb2315-9ec5-47a4-af4a-e0ed5e4caf21">
Jan 20 10:28:43 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 10:28:43 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 10:28:43 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.595 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:e5:d8 10.100.0.28'], port_security=['fa:16:3e:1e:e5:d8 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': '770605b0-4686-4d97-9f82-7ed299482f50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8ab95ce-159e-451b-baf0-5271f6a3160b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aaa69ba6-9a27-441e-877e-2cd188322a42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ab7f4cc-800b-4c21-91a0-e2fd29d04e91, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=77056a83-f3ee-44a1-8cd0-fac2b5327a1e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.596 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 77056a83-f3ee-44a1-8cd0-fac2b5327a1e in datapath d8ab95ce-159e-451b-baf0-5271f6a3160b unbound from our chassis#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.597 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8ab95ce-159e-451b-baf0-5271f6a3160b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.598 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0ebf6446-90d9-4b57-90df-ff608afb8b20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.598 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b namespace which is not needed anymore#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.642 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.696 225859 DEBUG nova.compute.manager [req-7f6b7969-a575-4c67-816d-46a3ae8f2044 req-c6a518c5-ffb6-4594-abd4-48fdfeba6652 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-vif-plugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.697 225859 DEBUG oslo_concurrency.lockutils [req-7f6b7969-a575-4c67-816d-46a3ae8f2044 req-c6a518c5-ffb6-4594-abd4-48fdfeba6652 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.697 225859 DEBUG oslo_concurrency.lockutils [req-7f6b7969-a575-4c67-816d-46a3ae8f2044 req-c6a518c5-ffb6-4594-abd4-48fdfeba6652 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.697 225859 DEBUG oslo_concurrency.lockutils [req-7f6b7969-a575-4c67-816d-46a3ae8f2044 req-c6a518c5-ffb6-4594-abd4-48fdfeba6652 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.697 225859 DEBUG nova.compute.manager [req-7f6b7969-a575-4c67-816d-46a3ae8f2044 req-c6a518c5-ffb6-4594-abd4-48fdfeba6652 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] No waiting events found dispatching network-vif-plugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.698 225859 WARNING nova.compute.manager [req-7f6b7969-a575-4c67-816d-46a3ae8f2044 req-c6a518c5-ffb6-4594-abd4-48fdfeba6652 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received unexpected event network-vif-plugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e for instance with vm_state active and task_state None.#033[00m
Jan 20 10:28:43 np0005588919 neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b[316429]: [NOTICE]   (316433) : haproxy version is 2.8.14-c23fe91
Jan 20 10:28:43 np0005588919 neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b[316429]: [NOTICE]   (316433) : path to executable is /usr/sbin/haproxy
Jan 20 10:28:43 np0005588919 neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b[316429]: [WARNING]  (316433) : Exiting Master process...
Jan 20 10:28:43 np0005588919 neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b[316429]: [WARNING]  (316433) : Exiting Master process...
Jan 20 10:28:43 np0005588919 neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b[316429]: [ALERT]    (316433) : Current worker (316435) exited with code 143 (Terminated)
Jan 20 10:28:43 np0005588919 neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b[316429]: [WARNING]  (316433) : All workers exited. Exiting... (0)
Jan 20 10:28:43 np0005588919 systemd[1]: libpod-62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc.scope: Deactivated successfully.
Jan 20 10:28:43 np0005588919 podman[316469]: 2026-01-20 15:28:43.733240821 +0000 UTC m=+0.044737449 container died 62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 10:28:43 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc-userdata-shm.mount: Deactivated successfully.
Jan 20 10:28:43 np0005588919 systemd[1]: var-lib-containers-storage-overlay-ec07ec1121543b297cde9c315ec4473068cc96eb4066eb377bd1883358661b91-merged.mount: Deactivated successfully.
Jan 20 10:28:43 np0005588919 podman[316469]: 2026-01-20 15:28:43.773277344 +0000 UTC m=+0.084773972 container cleanup 62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:28:43 np0005588919 systemd[1]: libpod-conmon-62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc.scope: Deactivated successfully.
Jan 20 10:28:43 np0005588919 podman[316504]: 2026-01-20 15:28:43.833449832 +0000 UTC m=+0.039018075 container remove 62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.838 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[df4a7a0b-7f57-4b24-956d-26ee25e740d8]: (4, ('Tue Jan 20 03:28:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b (62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc)\n62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc\nTue Jan 20 03:28:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b (62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc)\n62b8365da44e9b0899fd7c930b462a3d53b7c0b3ab06aa3a347b5ba7ee813dbc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.840 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8665d42c-bf53-4666-94ef-b9a1af30562b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.841 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8ab95ce-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.843 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588919 kernel: tapd8ab95ce-10: left promiscuous mode
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.845 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.847 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6ca87c-a3f4-43ea-bd0b-1c8eab7e945f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:43 np0005588919 nova_compute[225855]: 2026-01-20 15:28:43.857 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.870 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6de378a1-9863-4905-a5b8-a7c4b1d8c778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.871 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[7004bc9b-9bc9-44ca-8d84-55fb2adfdcbe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.884 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3f42b828-b5a7-4922-a123-4a37c65a5ad0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 795946, 'reachable_time': 32719, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316520, 'error': None, 'target': 'ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:43 np0005588919 systemd[1]: run-netns-ovnmeta\x2dd8ab95ce\x2d159e\x2d451b\x2dbaf0\x2d5271f6a3160b.mount: Deactivated successfully.
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.888 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d8ab95ce-159e-451b-baf0-5271f6a3160b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:28:43 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:43.888 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[6fa7837d-6414-47ad-97b1-24b755609a8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:44.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:44.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.580 225859 DEBUG oslo_concurrency.lockutils [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.580 225859 DEBUG oslo_concurrency.lockutils [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.580 225859 DEBUG nova.network.neutron [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.683 225859 DEBUG nova.compute.manager [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-vif-deleted-77056a83-f3ee-44a1-8cd0-fac2b5327a1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.683 225859 INFO nova.compute.manager [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Neutron deleted interface 77056a83-f3ee-44a1-8cd0-fac2b5327a1e; detaching it from the instance and deleting it from the info cache#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.684 225859 DEBUG nova.network.neutron [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updating instance_info_cache with network_info: [{"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.725 225859 DEBUG nova.objects.instance [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lazy-loading 'system_metadata' on Instance uuid 770605b0-4686-4d97-9f82-7ed299482f50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.758 225859 DEBUG nova.objects.instance [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lazy-loading 'flavor' on Instance uuid 770605b0-4686-4d97-9f82-7ed299482f50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.793 225859 DEBUG nova.virt.libvirt.vif [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-774862138',display_name='tempest-TestNetworkBasicOps-server-774862138',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-774862138',id=204,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHV4gRYgjRGDnaHxkSMJQMAVAwkk8jLZAdWo/fxENkiAFcbr1CXMhNYzs2hTTM5NoLjR4u2qMt5dKH7C9b4LHMK/3O49DJNSLMAUfk3HkTe/ulPxPFwHn3vfQCPGwBrM5A==',key_name='tempest-TestNetworkBasicOps-1250991011',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:28:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-u6z01i8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:28:13Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=770605b0-4686-4d97-9f82-7ed299482f50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.793 225859 DEBUG nova.network.os_vif_util [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converting VIF {"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.794 225859 DEBUG nova.network.os_vif_util [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.797 225859 DEBUG nova.virt.libvirt.guest [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:e5:d8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77056a83-f3"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.800 225859 DEBUG nova.virt.libvirt.guest [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:1e:e5:d8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77056a83-f3"/></interface>not found in domain: <domain type='kvm' id='108'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <name>instance-000000cc</name>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <uuid>770605b0-4686-4d97-9f82-7ed299482f50</uuid>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:name>tempest-TestNetworkBasicOps-server-774862138</nova:name>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 15:28:43</nova:creationTime>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:port uuid="52fb2315-9ec5-47a4-af4a-e0ed5e4caf21">
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 10:28:45 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <memory unit='KiB'>131072</memory>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <vcpu placement='static'>1</vcpu>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <resource>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <partition>/machine</partition>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </resource>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <sysinfo type='smbios'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='manufacturer'>RDO</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='serial'>770605b0-4686-4d97-9f82-7ed299482f50</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='uuid'>770605b0-4686-4d97-9f82-7ed299482f50</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='family'>Virtual Machine</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <boot dev='hd'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <smbios mode='sysinfo'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <vmcoreinfo state='on'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <model fallback='forbid'>Nehalem</model>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <feature policy='require' name='x2apic'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <feature policy='require' name='hypervisor'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <feature policy='require' name='vme'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <clock offset='utc'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <timer name='hpet' present='no'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <on_poweroff>destroy</on_poweroff>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <on_reboot>restart</on_reboot>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <on_crash>destroy</on_crash>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <disk type='network' device='disk'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/770605b0-4686-4d97-9f82-7ed299482f50_disk' index='2'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target dev='vda' bus='virtio'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='virtio-disk0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <disk type='network' device='cdrom'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/770605b0-4686-4d97-9f82-7ed299482f50_disk.config' index='1'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target dev='sda' bus='sata'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <readonly/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='sata0-0-0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pcie.0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='1' port='0x10'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='2' port='0x11'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='3' port='0x12'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.3'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='4' port='0x13'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.4'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='5' port='0x14'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.5'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='6' port='0x15'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.6'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='7' port='0x16'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.7'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='8' port='0x17'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.8'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='9' port='0x18'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.9'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='10' port='0x19'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.10'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='11' port='0x1a'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.11'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='12' port='0x1b'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.12'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='13' port='0x1c'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.13'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='14' port='0x1d'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.14'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='15' port='0x1e'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.15'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='16' port='0x1f'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.16'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='17' port='0x20'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.17'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='18' port='0x21'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.18'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='19' port='0x22'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.19'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='20' port='0x23'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.20'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='21' port='0x24'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.21'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='22' port='0x25'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.22'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='23' port='0x26'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.23'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='24' port='0x27'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.24'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='25' port='0x28'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.25'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-pci-bridge'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.26'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='usb'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='sata' index='0'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='ide'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:f9:08:60'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target dev='tap52fb2315-9e'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='net0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <serial type='pty'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/console.log' append='off'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target type='isa-serial' port='0'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <model name='isa-serial'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      </target>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <console type='pty' tty='/dev/pts/0'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/console.log' append='off'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target type='serial' port='0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </console>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <input type='tablet' bus='usb'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='input0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='usb' bus='0' port='1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <input type='mouse' bus='ps2'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='input1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <input type='keyboard' bus='ps2'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='input2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <listen type='address' address='::0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <audio id='1' type='none'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='video0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <watchdog model='itco' action='reset'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='watchdog0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </watchdog>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <memballoon model='virtio'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <stats period='10'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='balloon0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <rng model='virtio'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <backend model='random'>/dev/urandom</backend>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='rng0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <label>system_u:system_r:svirt_t:s0:c126,c530</label>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c126,c530</imagelabel>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <label>+107:+107</label>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <imagelabel>+107:+107</imagelabel>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 10:28:45 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:28:45 np0005588919 nova_compute[225855]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.800 225859 DEBUG nova.virt.libvirt.guest [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:e5:d8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77056a83-f3"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.804 225859 DEBUG nova.compute.manager [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-vif-unplugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.804 225859 DEBUG oslo_concurrency.lockutils [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.804 225859 DEBUG oslo_concurrency.lockutils [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.804 225859 DEBUG oslo_concurrency.lockutils [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.805 225859 DEBUG nova.compute.manager [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] No waiting events found dispatching network-vif-unplugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.805 225859 WARNING nova.compute.manager [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received unexpected event network-vif-unplugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e for instance with vm_state active and task_state None.
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.805 225859 DEBUG nova.compute.manager [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-vif-plugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.805 225859 DEBUG oslo_concurrency.lockutils [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.805 225859 DEBUG oslo_concurrency.lockutils [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.805 225859 DEBUG oslo_concurrency.lockutils [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.806 225859 DEBUG nova.compute.manager [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] No waiting events found dispatching network-vif-plugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.806 225859 WARNING nova.compute.manager [req-d27a989b-3052-4ace-82cf-75d5a82a9d84 req-bc1d4cc0-04e7-4c7b-af2a-5a50cd8e24be 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received unexpected event network-vif-plugged-77056a83-f3ee-44a1-8cd0-fac2b5327a1e for instance with vm_state active and task_state None.
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.806 225859 DEBUG nova.virt.libvirt.guest [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:1e:e5:d8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77056a83-f3"/></interface>not found in domain: <domain type='kvm' id='108'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <name>instance-000000cc</name>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <uuid>770605b0-4686-4d97-9f82-7ed299482f50</uuid>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:name>tempest-TestNetworkBasicOps-server-774862138</nova:name>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 15:28:43</nova:creationTime>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:port uuid="52fb2315-9ec5-47a4-af4a-e0ed5e4caf21">
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 10:28:45 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <memory unit='KiB'>131072</memory>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <vcpu placement='static'>1</vcpu>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <resource>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <partition>/machine</partition>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </resource>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <sysinfo type='smbios'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='manufacturer'>RDO</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='product'>OpenStack Compute</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='serial'>770605b0-4686-4d97-9f82-7ed299482f50</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='uuid'>770605b0-4686-4d97-9f82-7ed299482f50</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <entry name='family'>Virtual Machine</entry>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <boot dev='hd'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <smbios mode='sysinfo'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <vmcoreinfo state='on'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <cpu mode='custom' match='exact' check='full'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <model fallback='forbid'>Nehalem</model>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <feature policy='require' name='x2apic'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <feature policy='require' name='hypervisor'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <feature policy='require' name='vme'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <clock offset='utc'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <timer name='pit' tickpolicy='delay'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <timer name='hpet' present='no'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <on_poweroff>destroy</on_poweroff>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <on_reboot>restart</on_reboot>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <on_crash>destroy</on_crash>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <disk type='network' device='disk'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/770605b0-4686-4d97-9f82-7ed299482f50_disk' index='2'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target dev='vda' bus='virtio'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='virtio-disk0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <disk type='network' device='cdrom'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <driver name='qemu' type='raw' cache='none'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <auth username='openstack'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <secret type='ceph' uuid='e399cf45-e6b6-5393-99f1-75c601d3f188'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <source protocol='rbd' name='vms/770605b0-4686-4d97-9f82-7ed299482f50_disk.config' index='1'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.100' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.102' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <host name='192.168.122.101' port='6789'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target dev='sda' bus='sata'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <readonly/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='sata0-0-0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='0' model='pcie-root'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pcie.0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='1' port='0x10'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='2' port='0x11'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='3' port='0x12'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.3'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='4' port='0x13'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.4'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='5' port='0x14'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.5'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='6' port='0x15'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.6'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='7' port='0x16'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.7'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='8' port='0x17'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.8'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='9' port='0x18'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.9'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='10' port='0x19'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.10'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='11' port='0x1a'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.11'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='12' port='0x1b'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.12'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='13' port='0x1c'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.13'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='14' port='0x1d'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.14'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='15' port='0x1e'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.15'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='16' port='0x1f'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.16'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='17' port='0x20'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.17'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='18' port='0x21'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.18'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='19' port='0x22'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.19'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='20' port='0x23'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.20'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='21' port='0x24'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.21'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='22' port='0x25'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.22'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='23' port='0x26'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.23'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='24' port='0x27'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.24'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-root-port'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target chassis='25' port='0x28'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.25'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model name='pcie-pci-bridge'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='pci.26'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='usb'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <controller type='sata' index='0'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='ide'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </controller>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <interface type='ethernet'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <mac address='fa:16:3e:f9:08:60'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target dev='tap52fb2315-9e'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model type='virtio'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <driver name='vhost' rx_queue_size='512'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <mtu size='1442'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='net0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <serial type='pty'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/console.log' append='off'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target type='isa-serial' port='0'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:        <model name='isa-serial'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      </target>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <console type='pty' tty='/dev/pts/0'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <source path='/dev/pts/0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <log file='/var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50/console.log' append='off'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <target type='serial' port='0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='serial0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </console>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <input type='tablet' bus='usb'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='input0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='usb' bus='0' port='1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <input type='mouse' bus='ps2'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='input1'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <input type='keyboard' bus='ps2'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='input2'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </input>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <listen type='address' address='::0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </graphics>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <audio id='1' type='none'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <model type='virtio' heads='1' primary='yes'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='video0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <watchdog model='itco' action='reset'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='watchdog0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </watchdog>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <memballoon model='virtio'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <stats period='10'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='balloon0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <rng model='virtio'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <backend model='random'>/dev/urandom</backend>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <alias name='rng0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <label>system_u:system_r:svirt_t:s0:c126,c530</label>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c126,c530</imagelabel>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <label>+107:+107</label>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <imagelabel>+107:+107</imagelabel>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </seclabel>
Jan 20 10:28:45 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:28:45 np0005588919 nova_compute[225855]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.807 225859 WARNING nova.virt.libvirt.driver [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Detaching interface fa:16:3e:1e:e5:d8 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap77056a83-f3' not found.
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.807 225859 DEBUG nova.virt.libvirt.vif [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-774862138',display_name='tempest-TestNetworkBasicOps-server-774862138',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-774862138',id=204,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHV4gRYgjRGDnaHxkSMJQMAVAwkk8jLZAdWo/fxENkiAFcbr1CXMhNYzs2hTTM5NoLjR4u2qMt5dKH7C9b4LHMK/3O49DJNSLMAUfk3HkTe/ulPxPFwHn3vfQCPGwBrM5A==',key_name='tempest-TestNetworkBasicOps-1250991011',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:28:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-u6z01i8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:28:13Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=770605b0-4686-4d97-9f82-7ed299482f50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.808 225859 DEBUG nova.network.os_vif_util [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converting VIF {"id": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "address": "fa:16:3e:1e:e5:d8", "network": {"id": "d8ab95ce-159e-451b-baf0-5271f6a3160b", "bridge": "br-int", "label": "tempest-network-smoke--1027589119", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77056a83-f3", "ovs_interfaceid": "77056a83-f3ee-44a1-8cd0-fac2b5327a1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.808 225859 DEBUG nova.network.os_vif_util [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.809 225859 DEBUG os_vif [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.811 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.811 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77056a83-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.811 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.813 225859 INFO os_vif [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e5:d8,bridge_name='br-int',has_traffic_filtering=True,id=77056a83-f3ee-44a1-8cd0-fac2b5327a1e,network=Network(d8ab95ce-159e-451b-baf0-5271f6a3160b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77056a83-f3')
Jan 20 10:28:45 np0005588919 nova_compute[225855]: 2026-01-20 15:28:45.814 225859 DEBUG nova.virt.libvirt.guest [req-72e41404-7980-41eb-ad27-c96cfd6fabc7 req-880a0b72-256e-40e0-aae8-c6805c21012b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:name>tempest-TestNetworkBasicOps-server-774862138</nova:name>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:creationTime>2026-01-20 15:28:45</nova:creationTime>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:flavor name="m1.nano">
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:memory>128</nova:memory>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:disk>1</nova:disk>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:swap>0</nova:swap>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:vcpus>1</nova:vcpus>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </nova:flavor>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:owner>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </nova:owner>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  <nova:ports>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    <nova:port uuid="52fb2315-9ec5-47a4-af4a-e0ed5e4caf21">
Jan 20 10:28:45 np0005588919 nova_compute[225855]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:    </nova:port>
Jan 20 10:28:45 np0005588919 nova_compute[225855]:  </nova:ports>
Jan 20 10:28:45 np0005588919 nova_compute[225855]: </nova:instance>
Jan 20 10:28:45 np0005588919 nova_compute[225855]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 20 10:28:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:46.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:46.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:47 np0005588919 podman[316545]: 2026-01-20 15:28:47.019998362 +0000 UTC m=+0.059382007 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:28:47 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:47Z|00938|binding|INFO|Releasing lport d4b30c77-23b1-48b0-a6c8-4cf53f9840de from this chassis (sb_readonly=0)
Jan 20 10:28:47 np0005588919 nova_compute[225855]: 2026-01-20 15:28:47.727 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:48.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.401 225859 INFO nova.network.neutron [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Port 77056a83-f3ee-44a1-8cd0-fac2b5327a1e from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.401 225859 DEBUG nova.network.neutron [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updating instance_info_cache with network_info: [{"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.420 225859 DEBUG oslo_concurrency.lockutils [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.440 225859 DEBUG oslo_concurrency.lockutils [None req-8a5274d3-ca85-4f1c-baeb-511d7dc60fdf 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "interface-770605b0-4686-4d97-9f82-7ed299482f50-77056a83-f3ee-44a1-8cd0-fac2b5327a1e" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:28:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:48.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.591 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.644 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.864 225859 DEBUG nova.compute.manager [req-83d551fb-3df3-4a8b-b9bf-713402802e9b req-379ec33f-9859-4069-baa6-6765adb999ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-changed-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.864 225859 DEBUG nova.compute.manager [req-83d551fb-3df3-4a8b-b9bf-713402802e9b req-379ec33f-9859-4069-baa6-6765adb999ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Refreshing instance network info cache due to event network-changed-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.864 225859 DEBUG oslo_concurrency.lockutils [req-83d551fb-3df3-4a8b-b9bf-713402802e9b req-379ec33f-9859-4069-baa6-6765adb999ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.865 225859 DEBUG oslo_concurrency.lockutils [req-83d551fb-3df3-4a8b-b9bf-713402802e9b req-379ec33f-9859-4069-baa6-6765adb999ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.865 225859 DEBUG nova.network.neutron [req-83d551fb-3df3-4a8b-b9bf-713402802e9b req-379ec33f-9859-4069-baa6-6765adb999ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Refreshing network info cache for port 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.928 225859 DEBUG oslo_concurrency.lockutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.929 225859 DEBUG oslo_concurrency.lockutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.929 225859 DEBUG oslo_concurrency.lockutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.929 225859 DEBUG oslo_concurrency.lockutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.929 225859 DEBUG oslo_concurrency.lockutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.930 225859 INFO nova.compute.manager [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Terminating instance
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.931 225859 DEBUG nova.compute.manager [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 20 10:28:48 np0005588919 kernel: tap52fb2315-9e (unregistering): left promiscuous mode
Jan 20 10:28:48 np0005588919 NetworkManager[49104]: <info>  [1768922928.9849] device (tap52fb2315-9e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:28:48 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:48Z|00939|binding|INFO|Releasing lport 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 from this chassis (sb_readonly=0)
Jan 20 10:28:48 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:48Z|00940|binding|INFO|Setting lport 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 down in Southbound
Jan 20 10:28:48 np0005588919 ovn_controller[130490]: 2026-01-20T15:28:48Z|00941|binding|INFO|Removing iface tap52fb2315-9e ovn-installed in OVS
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.991 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:48 np0005588919 nova_compute[225855]: 2026-01-20 15:28:48.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:48.998 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:08:60 10.100.0.14'], port_security=['fa:16:3e:f9:08:60 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '770605b0-4686-4d97-9f82-7ed299482f50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f19fb67c-6bab-4253-851e-ede5bb26f589', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b57a9b16-bf8b-47b4-a097-c9a9044c2225', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c4c5a39-3036-4b6a-873b-b8673f881902, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=52fb2315-9ec5-47a4-af4a-e0ed5e4caf21) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:48.999 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 in datapath f19fb67c-6bab-4253-851e-ede5bb26f589 unbound from our chassis
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.000 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f19fb67c-6bab-4253-851e-ede5bb26f589, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.001 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[612cf865-0d16-4a07-b603-58dc7876232a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.001 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589 namespace which is not needed anymore
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.019 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:28:49 np0005588919 systemd[1]: machine-qemu\x2d108\x2dinstance\x2d000000cc.scope: Deactivated successfully.
Jan 20 10:28:49 np0005588919 systemd[1]: machine-qemu\x2d108\x2dinstance\x2d000000cc.scope: Consumed 14.760s CPU time.
Jan 20 10:28:49 np0005588919 systemd-machined[194361]: Machine qemu-108-instance-000000cc terminated.
Jan 20 10:28:49 np0005588919 neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589[316172]: [NOTICE]   (316176) : haproxy version is 2.8.14-c23fe91
Jan 20 10:28:49 np0005588919 neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589[316172]: [NOTICE]   (316176) : path to executable is /usr/sbin/haproxy
Jan 20 10:28:49 np0005588919 neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589[316172]: [WARNING]  (316176) : Exiting Master process...
Jan 20 10:28:49 np0005588919 neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589[316172]: [ALERT]    (316176) : Current worker (316178) exited with code 143 (Terminated)
Jan 20 10:28:49 np0005588919 neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589[316172]: [WARNING]  (316176) : All workers exited. Exiting... (0)
Jan 20 10:28:49 np0005588919 systemd[1]: libpod-e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f.scope: Deactivated successfully.
Jan 20 10:28:49 np0005588919 podman[316618]: 2026-01-20 15:28:49.134880284 +0000 UTC m=+0.047848957 container died e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:28:49 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f-userdata-shm.mount: Deactivated successfully.
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.173 225859 INFO nova.virt.libvirt.driver [-] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Instance destroyed successfully.
Jan 20 10:28:49 np0005588919 systemd[1]: var-lib-containers-storage-overlay-987c681a0ffce47fad96bd241a3b13e2bd7d14a562c7f3c13eeeccc3550197bb-merged.mount: Deactivated successfully.
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.174 225859 DEBUG nova.objects.instance [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'resources' on Instance uuid 770605b0-4686-4d97-9f82-7ed299482f50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:28:49 np0005588919 podman[316618]: 2026-01-20 15:28:49.186598171 +0000 UTC m=+0.099566854 container cleanup e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.191 225859 DEBUG nova.virt.libvirt.vif [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-774862138',display_name='tempest-TestNetworkBasicOps-server-774862138',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-774862138',id=204,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHV4gRYgjRGDnaHxkSMJQMAVAwkk8jLZAdWo/fxENkiAFcbr1CXMhNYzs2hTTM5NoLjR4u2qMt5dKH7C9b4LHMK/3O49DJNSLMAUfk3HkTe/ulPxPFwHn3vfQCPGwBrM5A==',key_name='tempest-TestNetworkBasicOps-1250991011',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:28:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-u6z01i8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:28:13Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=770605b0-4686-4d97-9f82-7ed299482f50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.191 225859 DEBUG nova.network.os_vif_util [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.192 225859 DEBUG nova.network.os_vif_util [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:08:60,bridge_name='br-int',has_traffic_filtering=True,id=52fb2315-9ec5-47a4-af4a-e0ed5e4caf21,network=Network(f19fb67c-6bab-4253-851e-ede5bb26f589),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52fb2315-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.192 225859 DEBUG os_vif [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:08:60,bridge_name='br-int',has_traffic_filtering=True,id=52fb2315-9ec5-47a4-af4a-e0ed5e4caf21,network=Network(f19fb67c-6bab-4253-851e-ede5bb26f589),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52fb2315-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:28:49 np0005588919 systemd[1]: libpod-conmon-e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f.scope: Deactivated successfully.
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.194 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.195 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52fb2315-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.238 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.242 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.244 225859 INFO os_vif [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:08:60,bridge_name='br-int',has_traffic_filtering=True,id=52fb2315-9ec5-47a4-af4a-e0ed5e4caf21,network=Network(f19fb67c-6bab-4253-851e-ede5bb26f589),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52fb2315-9e')#033[00m
Jan 20 10:28:49 np0005588919 podman[316660]: 2026-01-20 15:28:49.254482979 +0000 UTC m=+0.047434405 container remove e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.256 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3435e13c-19e3-4ba8-b72b-fc73d31169aa]: (4, ('Tue Jan 20 03:28:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589 (e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f)\ne68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f\nTue Jan 20 03:28:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589 (e68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f)\ne68935dd090d02c7d09ee6eaa42b1b2f1beffe055ed6b7bb3242ce69225c8b7f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.259 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f41917d5-ed05-41f2-8d9e-7e3616671404]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.260 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf19fb67c-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:28:49 np0005588919 kernel: tapf19fb67c-60: left promiscuous mode
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.266 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.267 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[105f5e90-81ca-4495-a436-73536b1ad110]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.278 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.285 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa78ac9-a716-42c7-8141-350bee313e2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.286 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[790153df-4f54-4024-b577-ad01b9f93383]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.299 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[78faf0de-5559-4de3-a251-a073ba4d4fcf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 793211, 'reachable_time': 15583, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316693, 'error': None, 'target': 'ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:49 np0005588919 systemd[1]: run-netns-ovnmeta\x2df19fb67c\x2d6bab\x2d4253\x2d851e\x2dede5bb26f589.mount: Deactivated successfully.
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.303 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f19fb67c-6bab-4253-851e-ede5bb26f589 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:28:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:28:49.303 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[e65f2c15-4629-44fb-8f84-cd1e6ee0ff81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.734 225859 INFO nova.virt.libvirt.driver [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Deleting instance files /var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50_del#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.734 225859 INFO nova.virt.libvirt.driver [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Deletion of /var/lib/nova/instances/770605b0-4686-4d97-9f82-7ed299482f50_del complete#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.793 225859 INFO nova.compute.manager [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Took 0.86 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.794 225859 DEBUG oslo.service.loopingcall [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.795 225859 DEBUG nova.compute.manager [-] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.795 225859 DEBUG nova.network.neutron [-] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.867 225859 DEBUG nova.network.neutron [req-83d551fb-3df3-4a8b-b9bf-713402802e9b req-379ec33f-9859-4069-baa6-6765adb999ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updated VIF entry in instance network info cache for port 52fb2315-9ec5-47a4-af4a-e0ed5e4caf21. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.868 225859 DEBUG nova.network.neutron [req-83d551fb-3df3-4a8b-b9bf-713402802e9b req-379ec33f-9859-4069-baa6-6765adb999ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updating instance_info_cache with network_info: [{"id": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "address": "fa:16:3e:f9:08:60", "network": {"id": "f19fb67c-6bab-4253-851e-ede5bb26f589", "bridge": "br-int", "label": "tempest-network-smoke--2116756150", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52fb2315-9e", "ovs_interfaceid": "52fb2315-9ec5-47a4-af4a-e0ed5e4caf21", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:28:49 np0005588919 nova_compute[225855]: 2026-01-20 15:28:49.892 225859 DEBUG oslo_concurrency.lockutils [req-83d551fb-3df3-4a8b-b9bf-713402802e9b req-379ec33f-9859-4069-baa6-6765adb999ff 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-770605b0-4686-4d97-9f82-7ed299482f50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 20 10:28:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:50.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:50.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.852 225859 DEBUG nova.network.neutron [-] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.872 225859 INFO nova.compute.manager [-] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Took 1.08 seconds to deallocate network for instance.#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.925 225859 DEBUG oslo_concurrency.lockutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.926 225859 DEBUG oslo_concurrency.lockutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.981 225859 DEBUG nova.compute.manager [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-vif-unplugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.982 225859 DEBUG oslo_concurrency.lockutils [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.982 225859 DEBUG oslo_concurrency.lockutils [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.982 225859 DEBUG oslo_concurrency.lockutils [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.983 225859 DEBUG nova.compute.manager [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] No waiting events found dispatching network-vif-unplugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.983 225859 WARNING nova.compute.manager [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received unexpected event network-vif-unplugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.983 225859 DEBUG nova.compute.manager [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-vif-plugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.984 225859 DEBUG oslo_concurrency.lockutils [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "770605b0-4686-4d97-9f82-7ed299482f50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.984 225859 DEBUG oslo_concurrency.lockutils [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.984 225859 DEBUG oslo_concurrency.lockutils [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.984 225859 DEBUG nova.compute.manager [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] No waiting events found dispatching network-vif-plugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.985 225859 WARNING nova.compute.manager [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received unexpected event network-vif-plugged-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:28:50 np0005588919 nova_compute[225855]: 2026-01-20 15:28:50.985 225859 DEBUG nova.compute.manager [req-2433cd5c-77e2-4b69-afa4-6720f7012c54 req-76336cc7-02e5-4eec-9a66-6abbdcde6260 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Received event network-vif-deleted-52fb2315-9ec5-47a4-af4a-e0ed5e4caf21 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:28:51 np0005588919 nova_compute[225855]: 2026-01-20 15:28:51.002 225859 DEBUG oslo_concurrency.processutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:28:51 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:28:51 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/194072997' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:28:51 np0005588919 nova_compute[225855]: 2026-01-20 15:28:51.470 225859 DEBUG oslo_concurrency.processutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:28:51 np0005588919 nova_compute[225855]: 2026-01-20 15:28:51.476 225859 DEBUG nova.compute.provider_tree [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:28:51 np0005588919 nova_compute[225855]: 2026-01-20 15:28:51.496 225859 DEBUG nova.scheduler.client.report [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:28:51 np0005588919 nova_compute[225855]: 2026-01-20 15:28:51.522 225859 DEBUG oslo_concurrency.lockutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:51 np0005588919 nova_compute[225855]: 2026-01-20 15:28:51.565 225859 INFO nova.scheduler.client.report [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Deleted allocations for instance 770605b0-4686-4d97-9f82-7ed299482f50#033[00m
Jan 20 10:28:51 np0005588919 nova_compute[225855]: 2026-01-20 15:28:51.673 225859 DEBUG oslo_concurrency.lockutils [None req-58120d51-a644-4f05-98dd-40aa8321427d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "770605b0-4686-4d97-9f82-7ed299482f50" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:28:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:52.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:52.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:53 np0005588919 nova_compute[225855]: 2026-01-20 15:28:53.648 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:54 np0005588919 nova_compute[225855]: 2026-01-20 15:28:54.239 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:54.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:54.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:55 np0005588919 nova_compute[225855]: 2026-01-20 15:28:55.526 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:55 np0005588919 nova_compute[225855]: 2026-01-20 15:28:55.602 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:56.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:56.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:28:58.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:28:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:28:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:28:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:28:58.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:28:58 np0005588919 nova_compute[225855]: 2026-01-20 15:28:58.692 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:28:59 np0005588919 nova_compute[225855]: 2026-01-20 15:28:59.241 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:29:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:00.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:29:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:00.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:02.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:02.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:03 np0005588919 nova_compute[225855]: 2026-01-20 15:29:03.694 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:04 np0005588919 nova_compute[225855]: 2026-01-20 15:29:04.168 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922929.1668205, 770605b0-4686-4d97-9f82-7ed299482f50 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:29:04 np0005588919 nova_compute[225855]: 2026-01-20 15:29:04.169 225859 INFO nova.compute.manager [-] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:29:04 np0005588919 nova_compute[225855]: 2026-01-20 15:29:04.192 225859 DEBUG nova.compute.manager [None req-3f3cbd01-11b9-4ce9-9d8d-37904102a05f - - - - - -] [instance: 770605b0-4686-4d97-9f82-7ed299482f50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:29:04 np0005588919 nova_compute[225855]: 2026-01-20 15:29:04.282 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:04.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:04.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:06 np0005588919 nova_compute[225855]: 2026-01-20 15:29:06.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:06 np0005588919 nova_compute[225855]: 2026-01-20 15:29:06.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:29:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:06.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:06.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:07 np0005588919 podman[316753]: 2026-01-20 15:29:07.1990885 +0000 UTC m=+0.076629019 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 20 10:29:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:08.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:08.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:08 np0005588919 nova_compute[225855]: 2026-01-20 15:29:08.745 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:09 np0005588919 nova_compute[225855]: 2026-01-20 15:29:09.284 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:10.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:10.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.372 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.372 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.372 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.372 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.373 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:29:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1352108626' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.800 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.975 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.977 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4251MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.978 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:11 np0005588919 nova_compute[225855]: 2026-01-20 15:29:11.978 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.066 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.067 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.094 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.126 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.127 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.176 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.209 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.237 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:12.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:12.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:29:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/439750638' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.692 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.697 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.714 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.752 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 10:29:12 np0005588919 nova_compute[225855]: 2026-01-20 15:29:12.753 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.775s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:29:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:29:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/640351222' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:29:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:29:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/640351222' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:29:13 np0005588919 nova_compute[225855]: 2026-01-20 15:29:13.747 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:29:13 np0005588919 nova_compute[225855]: 2026-01-20 15:29:13.752 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:29:13 np0005588919 nova_compute[225855]: 2026-01-20 15:29:13.753 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:29:13 np0005588919 nova_compute[225855]: 2026-01-20 15:29:13.753 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:29:13 np0005588919 nova_compute[225855]: 2026-01-20 15:29:13.770 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 20 10:29:13 np0005588919 nova_compute[225855]: 2026-01-20 15:29:13.770 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:29:14 np0005588919 nova_compute[225855]: 2026-01-20 15:29:14.286 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:29:14 np0005588919 nova_compute[225855]: 2026-01-20 15:29:14.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:29:14 np0005588919 nova_compute[225855]: 2026-01-20 15:29:14.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:29:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:14.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:14.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:15 np0005588919 nova_compute[225855]: 2026-01-20 15:29:15.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:29:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:16.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:16.450 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:29:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:16.451 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:29:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:16.451 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:29:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:16.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:17 np0005588919 nova_compute[225855]: 2026-01-20 15:29:17.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:29:18 np0005588919 podman[316859]: 2026-01-20 15:29:18.028830254 +0000 UTC m=+0.071786241 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 20 10:29:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:18.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:18.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:18 np0005588919 nova_compute[225855]: 2026-01-20 15:29:18.749 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:29:19 np0005588919 nova_compute[225855]: 2026-01-20 15:29:19.288 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:29:19 np0005588919 podman[317151]: 2026-01-20 15:29:19.367246338 +0000 UTC m=+0.039433247 container create f1155693df52ee011de31a833a91bcd28894e3162c32382144f1a0e27c6c73e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_khayyam, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 10:29:19 np0005588919 systemd[1]: Started libpod-conmon-f1155693df52ee011de31a833a91bcd28894e3162c32382144f1a0e27c6c73e6.scope.
Jan 20 10:29:19 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:29:19 np0005588919 podman[317151]: 2026-01-20 15:29:19.351434426 +0000 UTC m=+0.023621355 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 10:29:19 np0005588919 podman[317151]: 2026-01-20 15:29:19.447091937 +0000 UTC m=+0.119278866 container init f1155693df52ee011de31a833a91bcd28894e3162c32382144f1a0e27c6c73e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_khayyam, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Jan 20 10:29:19 np0005588919 podman[317151]: 2026-01-20 15:29:19.454746776 +0000 UTC m=+0.126933685 container start f1155693df52ee011de31a833a91bcd28894e3162c32382144f1a0e27c6c73e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_khayyam, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:29:19 np0005588919 podman[317151]: 2026-01-20 15:29:19.45768212 +0000 UTC m=+0.129869029 container attach f1155693df52ee011de31a833a91bcd28894e3162c32382144f1a0e27c6c73e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_khayyam, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Jan 20 10:29:19 np0005588919 lucid_khayyam[317168]: 167 167
Jan 20 10:29:19 np0005588919 systemd[1]: libpod-f1155693df52ee011de31a833a91bcd28894e3162c32382144f1a0e27c6c73e6.scope: Deactivated successfully.
Jan 20 10:29:19 np0005588919 podman[317173]: 2026-01-20 15:29:19.502570151 +0000 UTC m=+0.027107535 container died f1155693df52ee011de31a833a91bcd28894e3162c32382144f1a0e27c6c73e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_khayyam, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 20 10:29:19 np0005588919 systemd[1]: var-lib-containers-storage-overlay-9a6874d1ab88d11d4755ca9acc94712eab8ba09638c6ed2633bdbf6aa513bb3c-merged.mount: Deactivated successfully.
Jan 20 10:29:19 np0005588919 podman[317173]: 2026-01-20 15:29:19.539032462 +0000 UTC m=+0.063569816 container remove f1155693df52ee011de31a833a91bcd28894e3162c32382144f1a0e27c6c73e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=lucid_khayyam, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 10:29:19 np0005588919 systemd[1]: libpod-conmon-f1155693df52ee011de31a833a91bcd28894e3162c32382144f1a0e27c6c73e6.scope: Deactivated successfully.
Jan 20 10:29:19 np0005588919 podman[317195]: 2026-01-20 15:29:19.701592614 +0000 UTC m=+0.041215468 container create 356f49d97ebffae8ce8f45012baed55fe3ac2576de3034d947434e5874995d2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_ishizaka, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3)
Jan 20 10:29:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:19 np0005588919 systemd[1]: Started libpod-conmon-356f49d97ebffae8ce8f45012baed55fe3ac2576de3034d947434e5874995d2b.scope.
Jan 20 10:29:19 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:29:19 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc8fa87a1a4194b0ea37b9c4e76bfc535b3cf7f0ab2a24755828dcb087a59a39/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 20 10:29:19 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc8fa87a1a4194b0ea37b9c4e76bfc535b3cf7f0ab2a24755828dcb087a59a39/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 20 10:29:19 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc8fa87a1a4194b0ea37b9c4e76bfc535b3cf7f0ab2a24755828dcb087a59a39/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 20 10:29:19 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc8fa87a1a4194b0ea37b9c4e76bfc535b3cf7f0ab2a24755828dcb087a59a39/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 20 10:29:19 np0005588919 podman[317195]: 2026-01-20 15:29:19.684526086 +0000 UTC m=+0.024148970 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 20 10:29:19 np0005588919 podman[317195]: 2026-01-20 15:29:19.782702069 +0000 UTC m=+0.122324953 container init 356f49d97ebffae8ce8f45012baed55fe3ac2576de3034d947434e5874995d2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_ishizaka, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 10:29:19 np0005588919 nova_compute[225855]: 2026-01-20 15:29:19.784 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "af78e376-a9fb-4854-9c34-fd8c6f63390a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:29:19 np0005588919 nova_compute[225855]: 2026-01-20 15:29:19.786 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:29:19 np0005588919 podman[317195]: 2026-01-20 15:29:19.789602157 +0000 UTC m=+0.129225011 container start 356f49d97ebffae8ce8f45012baed55fe3ac2576de3034d947434e5874995d2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_ishizaka, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0)
Jan 20 10:29:19 np0005588919 podman[317195]: 2026-01-20 15:29:19.792768577 +0000 UTC m=+0.132391431 container attach 356f49d97ebffae8ce8f45012baed55fe3ac2576de3034d947434e5874995d2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_ishizaka, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Jan 20 10:29:19 np0005588919 nova_compute[225855]: 2026-01-20 15:29:19.805 225859 DEBUG nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 20 10:29:19 np0005588919 nova_compute[225855]: 2026-01-20 15:29:19.895 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:29:19 np0005588919 nova_compute[225855]: 2026-01-20 15:29:19.896 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:29:19 np0005588919 nova_compute[225855]: 2026-01-20 15:29:19.904 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 20 10:29:19 np0005588919 nova_compute[225855]: 2026-01-20 15:29:19.904 225859 INFO nova.compute.claims [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Claim successful on node compute-1.ctlplane.example.com
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.043 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:29:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:20.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:29:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4253199867' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.489 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.496 225859 DEBUG nova.compute.provider_tree [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.517 225859 DEBUG nova.scheduler.client.report [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.553 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.554 225859 DEBUG nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 20 10:29:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:20.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.613 225859 DEBUG nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.614 225859 DEBUG nova.network.neutron [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.634 225859 INFO nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.651 225859 DEBUG nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.781 225859 DEBUG nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.783 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.783 225859 INFO nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Creating image(s)#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.807 225859 DEBUG nova.storage.rbd_utils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image af78e376-a9fb-4854-9c34-fd8c6f63390a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.837 225859 DEBUG nova.storage.rbd_utils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image af78e376-a9fb-4854-9c34-fd8c6f63390a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.866 225859 DEBUG nova.storage.rbd_utils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image af78e376-a9fb-4854-9c34-fd8c6f63390a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.872 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.941 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.943 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.944 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.944 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.975 225859 DEBUG nova.storage.rbd_utils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image af78e376-a9fb-4854-9c34-fd8c6f63390a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:29:20 np0005588919 nova_compute[225855]: 2026-01-20 15:29:20.980 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 af78e376-a9fb-4854-9c34-fd8c6f63390a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]: [
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:    {
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:        "available": false,
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:        "ceph_device": false,
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:        "lsm_data": {},
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:        "lvs": [],
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:        "path": "/dev/sr0",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:        "rejected_reasons": [
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "Has a FileSystem",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "Insufficient space (<5GB)"
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:        ],
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:        "sys_api": {
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "actuators": null,
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "device_nodes": "sr0",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "devname": "sr0",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "human_readable_size": "482.00 KB",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "id_bus": "ata",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "model": "QEMU DVD-ROM",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "nr_requests": "2",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "parent": "/dev/sr0",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "partitions": {},
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "path": "/dev/sr0",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "removable": "1",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "rev": "2.5+",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "ro": "0",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "rotational": "1",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "sas_address": "",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "sas_device_handle": "",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "scheduler_mode": "mq-deadline",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "sectors": 0,
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "sectorsize": "2048",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "size": 493568.0,
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "support_discard": "2048",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "type": "disk",
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:            "vendor": "QEMU"
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:        }
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]:    }
Jan 20 10:29:21 np0005588919 reverent_ishizaka[317211]: ]
Jan 20 10:29:21 np0005588919 systemd[1]: libpod-356f49d97ebffae8ce8f45012baed55fe3ac2576de3034d947434e5874995d2b.scope: Deactivated successfully.
Jan 20 10:29:21 np0005588919 podman[317195]: 2026-01-20 15:29:21.071445584 +0000 UTC m=+1.411068458 container died 356f49d97ebffae8ce8f45012baed55fe3ac2576de3034d947434e5874995d2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_ishizaka, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 20 10:29:21 np0005588919 systemd[1]: libpod-356f49d97ebffae8ce8f45012baed55fe3ac2576de3034d947434e5874995d2b.scope: Consumed 1.274s CPU time.
Jan 20 10:29:21 np0005588919 systemd[1]: var-lib-containers-storage-overlay-bc8fa87a1a4194b0ea37b9c4e76bfc535b3cf7f0ab2a24755828dcb087a59a39-merged.mount: Deactivated successfully.
Jan 20 10:29:21 np0005588919 podman[317195]: 2026-01-20 15:29:21.138145808 +0000 UTC m=+1.477768662 container remove 356f49d97ebffae8ce8f45012baed55fe3ac2576de3034d947434e5874995d2b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_ishizaka, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 20 10:29:21 np0005588919 systemd[1]: libpod-conmon-356f49d97ebffae8ce8f45012baed55fe3ac2576de3034d947434e5874995d2b.scope: Deactivated successfully.
Jan 20 10:29:21 np0005588919 nova_compute[225855]: 2026-01-20 15:29:21.248 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 af78e376-a9fb-4854-9c34-fd8c6f63390a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.267s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:29:21 np0005588919 nova_compute[225855]: 2026-01-20 15:29:21.301 225859 DEBUG nova.storage.rbd_utils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] resizing rbd image af78e376-a9fb-4854-9c34-fd8c6f63390a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:29:21 np0005588919 nova_compute[225855]: 2026-01-20 15:29:21.384 225859 DEBUG nova.objects.instance [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'migration_context' on Instance uuid af78e376-a9fb-4854-9c34-fd8c6f63390a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:29:21 np0005588919 nova_compute[225855]: 2026-01-20 15:29:21.399 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:29:21 np0005588919 nova_compute[225855]: 2026-01-20 15:29:21.399 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Ensure instance console log exists: /var/lib/nova/instances/af78e376-a9fb-4854-9c34-fd8c6f63390a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:29:21 np0005588919 nova_compute[225855]: 2026-01-20 15:29:21.400 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:21 np0005588919 nova_compute[225855]: 2026-01-20 15:29:21.400 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:21 np0005588919 nova_compute[225855]: 2026-01-20 15:29:21.400 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:21 np0005588919 nova_compute[225855]: 2026-01-20 15:29:21.428 225859 DEBUG nova.policy [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:29:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:29:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:22 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:29:22 np0005588919 nova_compute[225855]: 2026-01-20 15:29:22.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:22.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:22.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:22 np0005588919 nova_compute[225855]: 2026-01-20 15:29:22.938 225859 DEBUG nova.network.neutron [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Successfully created port: 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #157. Immutable memtables: 0.
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.261239) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 99] Flushing memtable with next log file: 157
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963261310, "job": 99, "event": "flush_started", "num_memtables": 1, "num_entries": 1553, "num_deletes": 250, "total_data_size": 3741755, "memory_usage": 3797104, "flush_reason": "Manual Compaction"}
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 99] Level-0 flush table #158: started
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963273290, "cf_name": "default", "job": 99, "event": "table_file_creation", "file_number": 158, "file_size": 1517641, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 76710, "largest_seqno": 78257, "table_properties": {"data_size": 1512548, "index_size": 2424, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13383, "raw_average_key_size": 21, "raw_value_size": 1501453, "raw_average_value_size": 2357, "num_data_blocks": 108, "num_entries": 637, "num_filter_entries": 637, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922833, "oldest_key_time": 1768922833, "file_creation_time": 1768922963, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 158, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 99] Flush lasted 12095 microseconds, and 4587 cpu microseconds.
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.273342) [db/flush_job.cc:967] [default] [JOB 99] Level-0 flush table #158: 1517641 bytes OK
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.273364) [db/memtable_list.cc:519] [default] Level-0 commit table #158 started
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.275926) [db/memtable_list.cc:722] [default] Level-0 commit table #158: memtable #1 done
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.275946) EVENT_LOG_v1 {"time_micros": 1768922963275940, "job": 99, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.275970) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 99] Try to delete WAL files size 3734570, prev total WAL file size 3734570, number of live WAL files 2.
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000154.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.277264) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032353037' seq:72057594037927935, type:22 .. '6D6772737461740032373538' seq:0, type:0; will stop at (end)
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 100] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 99 Base level 0, inputs: [158(1482KB)], [156(12MB)]
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963277328, "job": 100, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [158], "files_L6": [156], "score": -1, "input_data_size": 14634920, "oldest_snapshot_seqno": -1}
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 100] Generated table #159: 10030 keys, 11680406 bytes, temperature: kUnknown
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963357147, "cf_name": "default", "job": 100, "event": "table_file_creation", "file_number": 159, "file_size": 11680406, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11617661, "index_size": 36584, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25093, "raw_key_size": 263802, "raw_average_key_size": 26, "raw_value_size": 11443861, "raw_average_value_size": 1140, "num_data_blocks": 1392, "num_entries": 10030, "num_filter_entries": 10030, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768922963, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 159, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.357387) [db/compaction/compaction_job.cc:1663] [default] [JOB 100] Compacted 1@0 + 1@6 files to L6 => 11680406 bytes
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.359472) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.2 rd, 146.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 12.5 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(17.3) write-amplify(7.7) OK, records in: 10488, records dropped: 458 output_compression: NoCompression
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.359490) EVENT_LOG_v1 {"time_micros": 1768922963359482, "job": 100, "event": "compaction_finished", "compaction_time_micros": 79894, "compaction_time_cpu_micros": 33303, "output_level": 6, "num_output_files": 1, "total_output_size": 11680406, "num_input_records": 10488, "num_output_records": 10030, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000158.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963359921, "job": 100, "event": "table_file_deletion", "file_number": 158}
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000156.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768922963362037, "job": 100, "event": "table_file_deletion", "file_number": 156}
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.277117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.362209) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.362216) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.362218) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.362220) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:29:23.362222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:29:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:23 np0005588919 nova_compute[225855]: 2026-01-20 15:29:23.677 225859 DEBUG nova.network.neutron [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Successfully updated port: 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:29:23 np0005588919 nova_compute[225855]: 2026-01-20 15:29:23.697 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:29:23 np0005588919 nova_compute[225855]: 2026-01-20 15:29:23.697 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:29:23 np0005588919 nova_compute[225855]: 2026-01-20 15:29:23.697 225859 DEBUG nova.network.neutron [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:29:23 np0005588919 nova_compute[225855]: 2026-01-20 15:29:23.751 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:23 np0005588919 nova_compute[225855]: 2026-01-20 15:29:23.807 225859 DEBUG nova.compute.manager [req-a4ecdd67-9f29-43cf-b52a-122317914962 req-3b75d416-57e2-4bb2-9484-49b87fec81e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received event network-changed-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:29:23 np0005588919 nova_compute[225855]: 2026-01-20 15:29:23.808 225859 DEBUG nova.compute.manager [req-a4ecdd67-9f29-43cf-b52a-122317914962 req-3b75d416-57e2-4bb2-9484-49b87fec81e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Refreshing instance network info cache due to event network-changed-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:29:23 np0005588919 nova_compute[225855]: 2026-01-20 15:29:23.808 225859 DEBUG oslo_concurrency.lockutils [req-a4ecdd67-9f29-43cf-b52a-122317914962 req-3b75d416-57e2-4bb2-9484-49b87fec81e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:29:24 np0005588919 nova_compute[225855]: 2026-01-20 15:29:24.305 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:24 np0005588919 nova_compute[225855]: 2026-01-20 15:29:24.403 225859 DEBUG nova.network.neutron [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:29:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:24.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:24.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.429 225859 DEBUG nova.network.neutron [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Updating instance_info_cache with network_info: [{"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.452 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.453 225859 DEBUG nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Instance network_info: |[{"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.454 225859 DEBUG oslo_concurrency.lockutils [req-a4ecdd67-9f29-43cf-b52a-122317914962 req-3b75d416-57e2-4bb2-9484-49b87fec81e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.454 225859 DEBUG nova.network.neutron [req-a4ecdd67-9f29-43cf-b52a-122317914962 req-3b75d416-57e2-4bb2-9484-49b87fec81e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Refreshing network info cache for port 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.458 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Start _get_guest_xml network_info=[{"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.466 225859 WARNING nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.473 225859 DEBUG nova.virt.libvirt.host [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.474 225859 DEBUG nova.virt.libvirt.host [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.477 225859 DEBUG nova.virt.libvirt.host [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.478 225859 DEBUG nova.virt.libvirt.host [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.479 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.480 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.480 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.481 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.481 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.481 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.481 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.482 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.482 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.482 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.482 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.483 225859 DEBUG nova.virt.hardware [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.485 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:29:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1146710376' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.967 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.993 225859 DEBUG nova.storage.rbd_utils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image af78e376-a9fb-4854-9c34-fd8c6f63390a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:29:25 np0005588919 nova_compute[225855]: 2026-01-20 15:29:25.997 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:29:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/475366063' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.423 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.425 225859 DEBUG nova.virt.libvirt.vif [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:29:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-866255871',display_name='tempest-TestNetworkBasicOps-server-866255871',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-866255871',id=205,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNr82C8ft9iJVaxlnvB559FQY0a18+ddVyVQXOWpubG2vvEKlWos0eribBsrr0XJYAl5WSj1IuEMfruIIC3taSryOm9K9DYcrj57monaPm1w9c08Woz8HGduEiXhkpNC2g==',key_name='tempest-TestNetworkBasicOps-780774492',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-73qpy651',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:29:20Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=af78e376-a9fb-4854-9c34-fd8c6f63390a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:29:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.425 225859 DEBUG nova.network.os_vif_util [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.426 225859 DEBUG nova.network.os_vif_util [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:69:f6,bridge_name='br-int',has_traffic_filtering=True,id=31d3cb4c-b75a-468e-8a31-1fad4e27eb6e,network=Network(a2b49a8c-7446-4445-ae6e-d2870040582f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31d3cb4c-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:29:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:26.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.427 225859 DEBUG nova.objects.instance [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_devices' on Instance uuid af78e376-a9fb-4854-9c34-fd8c6f63390a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.438 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <uuid>af78e376-a9fb-4854-9c34-fd8c6f63390a</uuid>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <name>instance-000000cd</name>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkBasicOps-server-866255871</nova:name>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:29:25</nova:creationTime>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <nova:port uuid="31d3cb4c-b75a-468e-8a31-1fad4e27eb6e">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <entry name="serial">af78e376-a9fb-4854-9c34-fd8c6f63390a</entry>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <entry name="uuid">af78e376-a9fb-4854-9c34-fd8c6f63390a</entry>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/af78e376-a9fb-4854-9c34-fd8c6f63390a_disk">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/af78e376-a9fb-4854-9c34-fd8c6f63390a_disk.config">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:d9:69:f6"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <target dev="tap31d3cb4c-b7"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/af78e376-a9fb-4854-9c34-fd8c6f63390a/console.log" append="off"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:29:26 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:29:26 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:29:26 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:29:26 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.440 225859 DEBUG nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Preparing to wait for external event network-vif-plugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.441 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.441 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.441 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.442 225859 DEBUG nova.virt.libvirt.vif [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:29:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-866255871',display_name='tempest-TestNetworkBasicOps-server-866255871',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-866255871',id=205,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNr82C8ft9iJVaxlnvB559FQY0a18+ddVyVQXOWpubG2vvEKlWos0eribBsrr0XJYAl5WSj1IuEMfruIIC3taSryOm9K9DYcrj57monaPm1w9c08Woz8HGduEiXhkpNC2g==',key_name='tempest-TestNetworkBasicOps-780774492',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-73qpy651',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:29:20Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=af78e376-a9fb-4854-9c34-fd8c6f63390a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.442 225859 DEBUG nova.network.os_vif_util [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.443 225859 DEBUG nova.network.os_vif_util [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:69:f6,bridge_name='br-int',has_traffic_filtering=True,id=31d3cb4c-b75a-468e-8a31-1fad4e27eb6e,network=Network(a2b49a8c-7446-4445-ae6e-d2870040582f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31d3cb4c-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.443 225859 DEBUG os_vif [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:69:f6,bridge_name='br-int',has_traffic_filtering=True,id=31d3cb4c-b75a-468e-8a31-1fad4e27eb6e,network=Network(a2b49a8c-7446-4445-ae6e-d2870040582f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31d3cb4c-b7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.444 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.444 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.445 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.449 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.449 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31d3cb4c-b7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.450 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31d3cb4c-b7, col_values=(('external_ids', {'iface-id': '31d3cb4c-b75a-468e-8a31-1fad4e27eb6e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:69:f6', 'vm-uuid': 'af78e376-a9fb-4854-9c34-fd8c6f63390a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.451 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:26 np0005588919 NetworkManager[49104]: <info>  [1768922966.4525] manager: (tap31d3cb4c-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/398)
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.453 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.459 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.460 225859 INFO os_vif [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:69:f6,bridge_name='br-int',has_traffic_filtering=True,id=31d3cb4c-b75a-468e-8a31-1fad4e27eb6e,network=Network(a2b49a8c-7446-4445-ae6e-d2870040582f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31d3cb4c-b7')#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.518 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.518 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.519 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:d9:69:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.519 225859 INFO nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Using config drive#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.544 225859 DEBUG nova.storage.rbd_utils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image af78e376-a9fb-4854-9c34-fd8c6f63390a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:29:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:26.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.786 225859 DEBUG nova.network.neutron [req-a4ecdd67-9f29-43cf-b52a-122317914962 req-3b75d416-57e2-4bb2-9484-49b87fec81e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Updated VIF entry in instance network info cache for port 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.787 225859 DEBUG nova.network.neutron [req-a4ecdd67-9f29-43cf-b52a-122317914962 req-3b75d416-57e2-4bb2-9484-49b87fec81e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Updating instance_info_cache with network_info: [{"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.805 225859 DEBUG oslo_concurrency.lockutils [req-a4ecdd67-9f29-43cf-b52a-122317914962 req-3b75d416-57e2-4bb2-9484-49b87fec81e9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.888 225859 INFO nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Creating config drive at /var/lib/nova/instances/af78e376-a9fb-4854-9c34-fd8c6f63390a/disk.config#033[00m
Jan 20 10:29:26 np0005588919 nova_compute[225855]: 2026-01-20 15:29:26.893 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af78e376-a9fb-4854-9c34-fd8c6f63390a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph5ayjv58 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.025 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af78e376-a9fb-4854-9c34-fd8c6f63390a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph5ayjv58" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.061 225859 DEBUG nova.storage.rbd_utils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image af78e376-a9fb-4854-9c34-fd8c6f63390a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.065 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/af78e376-a9fb-4854-9c34-fd8c6f63390a/disk.config af78e376-a9fb-4854-9c34-fd8c6f63390a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.509 225859 DEBUG oslo_concurrency.processutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/af78e376-a9fb-4854-9c34-fd8c6f63390a/disk.config af78e376-a9fb-4854-9c34-fd8c6f63390a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.510 225859 INFO nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Deleting local config drive /var/lib/nova/instances/af78e376-a9fb-4854-9c34-fd8c6f63390a/disk.config because it was imported into RBD.#033[00m
Jan 20 10:29:27 np0005588919 kernel: tap31d3cb4c-b7: entered promiscuous mode
Jan 20 10:29:27 np0005588919 NetworkManager[49104]: <info>  [1768922967.5723] manager: (tap31d3cb4c-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/399)
Jan 20 10:29:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:27Z|00942|binding|INFO|Claiming lport 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e for this chassis.
Jan 20 10:29:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:27Z|00943|binding|INFO|31d3cb4c-b75a-468e-8a31-1fad4e27eb6e: Claiming fa:16:3e:d9:69:f6 10.100.0.10
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.573 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.578 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.587 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:69:f6 10.100.0.10'], port_security=['fa:16:3e:d9:69:f6 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'af78e376-a9fb-4854-9c34-fd8c6f63390a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2b49a8c-7446-4445-ae6e-d2870040582f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8d4ade01-8f6f-48ff-bd8a-0af514e9e9ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92632684-a601-487e-937b-036b3fd0bb35, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=31d3cb4c-b75a-468e-8a31-1fad4e27eb6e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.589 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e in datapath a2b49a8c-7446-4445-ae6e-d2870040582f bound to our chassis#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.589 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2b49a8c-7446-4445-ae6e-d2870040582f#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.600 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a949fd-6b22-47ab-8e92-9d2071ee7fff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.601 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa2b49a8c-71 in ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.602 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa2b49a8c-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.602 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2371b732-4f09-4963-bc1e-d8d64c5da540]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.603 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fcb799c5-2311-4974-a441-992e22bc8a69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 systemd-udevd[318799]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.614 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[1767c1c4-0a05-4132-9272-cd1feae6fbd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 NetworkManager[49104]: <info>  [1768922967.6222] device (tap31d3cb4c-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:29:27 np0005588919 NetworkManager[49104]: <info>  [1768922967.6227] device (tap31d3cb4c-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:29:27 np0005588919 systemd-machined[194361]: New machine qemu-109-instance-000000cd.
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.638 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3b0b65-98ba-4568-887f-cacfe3aecd74]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.645 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:27 np0005588919 systemd[1]: Started Virtual Machine qemu-109-instance-000000cd.
Jan 20 10:29:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:27Z|00944|binding|INFO|Setting lport 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e ovn-installed in OVS
Jan 20 10:29:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:27Z|00945|binding|INFO|Setting lport 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e up in Southbound
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.653 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.668 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[940424ec-6d0b-4951-b1a5-0e65b1fd5fc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 NetworkManager[49104]: <info>  [1768922967.6765] manager: (tapa2b49a8c-70): new Veth device (/org/freedesktop/NetworkManager/Devices/400)
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.675 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[99b1fcf4-450c-496e-ac85-b2dbdfd73d72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 systemd-udevd[318808]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.709 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[adfbeca7-74b7-4d88-b9bc-bea146668438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.711 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d73e3fc2-8b68-4f3d-ae1d-de8770cb49f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 NetworkManager[49104]: <info>  [1768922967.7348] device (tapa2b49a8c-70): carrier: link connected
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.740 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[367aff37-a7f9-47cd-919d-0598db4ea8b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.757 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[880ee89c-8bc3-414f-adb7-1478068a77f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2b49a8c-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:53:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 268], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 800666, 'reachable_time': 23706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 318882, 'error': None, 'target': 'ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.773 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[daddaeb5-c8f2-4545-9e04-bbdcb7de8b44]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:5337'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 800666, 'tstamp': 800666}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 318884, 'error': None, 'target': 'ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.790 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1fab6ee8-6c73-4ceb-a348-2a942ce5da59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2b49a8c-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:53:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 268], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 800666, 'reachable_time': 23706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 318885, 'error': None, 'target': 'ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.821 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[618c34a4-8566-4d12-96f0-81cb4effd289]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.878 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[40f0ae2e-dc9b-4a07-a575-3164a17ab1f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.879 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b49a8c-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.880 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.881 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2b49a8c-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:29:27 np0005588919 kernel: tapa2b49a8c-70: entered promiscuous mode
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.882 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:27 np0005588919 NetworkManager[49104]: <info>  [1768922967.8834] manager: (tapa2b49a8c-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/401)
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.885 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2b49a8c-70, col_values=(('external_ids', {'iface-id': '83da6236-f092-462b-85f8-aab29a73a3b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.886 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:27Z|00946|binding|INFO|Releasing lport 83da6236-f092-462b-85f8-aab29a73a3b5 from this chassis (sb_readonly=0)
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.900 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.902 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2b49a8c-7446-4445-ae6e-d2870040582f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2b49a8c-7446-4445-ae6e-d2870040582f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.903 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[82cecca3-dcb2-4b43-b94d-6f367e8ea3f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.904 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-a2b49a8c-7446-4445-ae6e-d2870040582f
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/a2b49a8c-7446-4445-ae6e-d2870040582f.pid.haproxy
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID a2b49a8c-7446-4445-ae6e-d2870040582f
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:29:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:27.904 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f', 'env', 'PROCESS_TAG=haproxy-a2b49a8c-7446-4445-ae6e-d2870040582f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a2b49a8c-7446-4445-ae6e-d2870040582f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.932 225859 DEBUG nova.compute.manager [req-5c67f1d9-02a7-435e-9cb6-8d60cf10bbfd req-0088dba6-0404-44d7-8fa1-757b2e088460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received event network-vif-plugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.932 225859 DEBUG oslo_concurrency.lockutils [req-5c67f1d9-02a7-435e-9cb6-8d60cf10bbfd req-0088dba6-0404-44d7-8fa1-757b2e088460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.933 225859 DEBUG oslo_concurrency.lockutils [req-5c67f1d9-02a7-435e-9cb6-8d60cf10bbfd req-0088dba6-0404-44d7-8fa1-757b2e088460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.933 225859 DEBUG oslo_concurrency.lockutils [req-5c67f1d9-02a7-435e-9cb6-8d60cf10bbfd req-0088dba6-0404-44d7-8fa1-757b2e088460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:27 np0005588919 nova_compute[225855]: 2026-01-20 15:29:27.933 225859 DEBUG nova.compute.manager [req-5c67f1d9-02a7-435e-9cb6-8d60cf10bbfd req-0088dba6-0404-44d7-8fa1-757b2e088460 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Processing event network-vif-plugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.100 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922968.1003408, af78e376-a9fb-4854-9c34-fd8c6f63390a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.106 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] VM Started (Lifecycle Event)#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.108 225859 DEBUG nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.111 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.115 225859 INFO nova.virt.libvirt.driver [-] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Instance spawned successfully.#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.116 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.129 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.133 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.134 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.135 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.135 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.136 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.136 225859 DEBUG nova.virt.libvirt.driver [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.142 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.175 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.176 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922968.1013553, af78e376-a9fb-4854-9c34-fd8c6f63390a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.176 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.200 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.204 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768922968.1111882, af78e376-a9fb-4854-9c34-fd8c6f63390a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.204 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.211 225859 INFO nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Took 7.43 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.212 225859 DEBUG nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.220 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.224 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:29:28 np0005588919 podman[318955]: 2026-01-20 15:29:28.279353067 +0000 UTC m=+0.048540277 container create 915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.295 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:29:28 np0005588919 systemd[1]: Started libpod-conmon-915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08.scope.
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.319 225859 INFO nova.compute.manager [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Took 8.46 seconds to build instance.#033[00m
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.338 225859 DEBUG oslo_concurrency.lockutils [None req-be45cfa3-d78e-4f5c-8d98-77fe199d6f8b 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:28 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:29:28 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbfcd70bae3d1e8aa8f666a8626fbab06db564cbad7eb171c298ad570d143684/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:29:28 np0005588919 podman[318955]: 2026-01-20 15:29:28.254768126 +0000 UTC m=+0.023955346 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:29:28 np0005588919 podman[318955]: 2026-01-20 15:29:28.358304892 +0000 UTC m=+0.127492142 container init 915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 20 10:29:28 np0005588919 podman[318955]: 2026-01-20 15:29:28.365497467 +0000 UTC m=+0.134684687 container start 915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:29:28 np0005588919 neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f[318971]: [NOTICE]   (318975) : New worker (318977) forked
Jan 20 10:29:28 np0005588919 neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f[318971]: [NOTICE]   (318975) : Loading success.
Jan 20 10:29:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:28.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:29:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:28.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:28 np0005588919 nova_compute[225855]: 2026-01-20 15:29:28.870 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:30 np0005588919 nova_compute[225855]: 2026-01-20 15:29:30.028 225859 DEBUG nova.compute.manager [req-17701128-912e-4c7d-b82b-5d0e0fcd35ff req-7403f88e-5b11-42b9-bb55-0ba0839ec774 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received event network-vif-plugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:29:30 np0005588919 nova_compute[225855]: 2026-01-20 15:29:30.028 225859 DEBUG oslo_concurrency.lockutils [req-17701128-912e-4c7d-b82b-5d0e0fcd35ff req-7403f88e-5b11-42b9-bb55-0ba0839ec774 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:30 np0005588919 nova_compute[225855]: 2026-01-20 15:29:30.029 225859 DEBUG oslo_concurrency.lockutils [req-17701128-912e-4c7d-b82b-5d0e0fcd35ff req-7403f88e-5b11-42b9-bb55-0ba0839ec774 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:30 np0005588919 nova_compute[225855]: 2026-01-20 15:29:30.029 225859 DEBUG oslo_concurrency.lockutils [req-17701128-912e-4c7d-b82b-5d0e0fcd35ff req-7403f88e-5b11-42b9-bb55-0ba0839ec774 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:30 np0005588919 nova_compute[225855]: 2026-01-20 15:29:30.029 225859 DEBUG nova.compute.manager [req-17701128-912e-4c7d-b82b-5d0e0fcd35ff req-7403f88e-5b11-42b9-bb55-0ba0839ec774 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] No waiting events found dispatching network-vif-plugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:29:30 np0005588919 nova_compute[225855]: 2026-01-20 15:29:30.029 225859 WARNING nova.compute.manager [req-17701128-912e-4c7d-b82b-5d0e0fcd35ff req-7403f88e-5b11-42b9-bb55-0ba0839ec774 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received unexpected event network-vif-plugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e for instance with vm_state active and task_state None.#033[00m
Jan 20 10:29:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:30.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:30.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:31 np0005588919 nova_compute[225855]: 2026-01-20 15:29:31.452 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:31 np0005588919 nova_compute[225855]: 2026-01-20 15:29:31.610 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:31 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:31Z|00947|binding|INFO|Releasing lport 83da6236-f092-462b-85f8-aab29a73a3b5 from this chassis (sb_readonly=0)
Jan 20 10:29:31 np0005588919 NetworkManager[49104]: <info>  [1768922971.6138] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/402)
Jan 20 10:29:31 np0005588919 NetworkManager[49104]: <info>  [1768922971.6147] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/403)
Jan 20 10:29:31 np0005588919 nova_compute[225855]: 2026-01-20 15:29:31.644 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:31 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:31Z|00948|binding|INFO|Releasing lport 83da6236-f092-462b-85f8-aab29a73a3b5 from this chassis (sb_readonly=0)
Jan 20 10:29:31 np0005588919 nova_compute[225855]: 2026-01-20 15:29:31.649 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:31 np0005588919 nova_compute[225855]: 2026-01-20 15:29:31.904 225859 DEBUG nova.compute.manager [req-dcdbea0a-b4f6-4b85-8454-80c7c0cc24ce req-2a525e4c-1e17-49ad-813f-15647973df12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received event network-changed-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:29:31 np0005588919 nova_compute[225855]: 2026-01-20 15:29:31.905 225859 DEBUG nova.compute.manager [req-dcdbea0a-b4f6-4b85-8454-80c7c0cc24ce req-2a525e4c-1e17-49ad-813f-15647973df12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Refreshing instance network info cache due to event network-changed-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:29:31 np0005588919 nova_compute[225855]: 2026-01-20 15:29:31.905 225859 DEBUG oslo_concurrency.lockutils [req-dcdbea0a-b4f6-4b85-8454-80c7c0cc24ce req-2a525e4c-1e17-49ad-813f-15647973df12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:29:31 np0005588919 nova_compute[225855]: 2026-01-20 15:29:31.905 225859 DEBUG oslo_concurrency.lockutils [req-dcdbea0a-b4f6-4b85-8454-80c7c0cc24ce req-2a525e4c-1e17-49ad-813f-15647973df12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:29:31 np0005588919 nova_compute[225855]: 2026-01-20 15:29:31.906 225859 DEBUG nova.network.neutron [req-dcdbea0a-b4f6-4b85-8454-80c7c0cc24ce req-2a525e4c-1e17-49ad-813f-15647973df12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Refreshing network info cache for port 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:29:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:32.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:32.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:33 np0005588919 nova_compute[225855]: 2026-01-20 15:29:33.872 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:33 np0005588919 nova_compute[225855]: 2026-01-20 15:29:33.930 225859 DEBUG nova.network.neutron [req-dcdbea0a-b4f6-4b85-8454-80c7c0cc24ce req-2a525e4c-1e17-49ad-813f-15647973df12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Updated VIF entry in instance network info cache for port 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:29:33 np0005588919 nova_compute[225855]: 2026-01-20 15:29:33.931 225859 DEBUG nova.network.neutron [req-dcdbea0a-b4f6-4b85-8454-80c7c0cc24ce req-2a525e4c-1e17-49ad-813f-15647973df12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Updating instance_info_cache with network_info: [{"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:29:33 np0005588919 nova_compute[225855]: 2026-01-20 15:29:33.955 225859 DEBUG oslo_concurrency.lockutils [req-dcdbea0a-b4f6-4b85-8454-80c7c0cc24ce req-2a525e4c-1e17-49ad-813f-15647973df12 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:29:34 np0005588919 nova_compute[225855]: 2026-01-20 15:29:34.361 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:34.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:34.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:35 np0005588919 nova_compute[225855]: 2026-01-20 15:29:35.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:35 np0005588919 nova_compute[225855]: 2026-01-20 15:29:35.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:29:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:36.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:36 np0005588919 nova_compute[225855]: 2026-01-20 15:29:36.454 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:36.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:38 np0005588919 podman[318992]: 2026-01-20 15:29:38.070689232 +0000 UTC m=+0.119192134 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:29:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:38.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:38.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:38 np0005588919 nova_compute[225855]: 2026-01-20 15:29:38.874 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:40 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Jan 20 10:29:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:40.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:40.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:41 np0005588919 nova_compute[225855]: 2026-01-20 15:29:41.503 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:42 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:42Z|00114|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d9:69:f6 10.100.0.10
Jan 20 10:29:42 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:42Z|00115|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:69:f6 10.100.0.10
Jan 20 10:29:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:42.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:42.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:43 np0005588919 nova_compute[225855]: 2026-01-20 15:29:43.924 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:44.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:44.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:46.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:46 np0005588919 nova_compute[225855]: 2026-01-20 15:29:46.505 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:46.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:48.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:29:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:48.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:29:48 np0005588919 nova_compute[225855]: 2026-01-20 15:29:48.647 225859 INFO nova.compute.manager [None req-8d4a16ca-c04e-4973-b930-2615e689c44d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Get console output#033[00m
Jan 20 10:29:48 np0005588919 nova_compute[225855]: 2026-01-20 15:29:48.652 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:29:48 np0005588919 nova_compute[225855]: 2026-01-20 15:29:48.926 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:49 np0005588919 podman[319074]: 2026-01-20 15:29:49.01171139 +0000 UTC m=+0.052401357 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Jan 20 10:29:49 np0005588919 nova_compute[225855]: 2026-01-20 15:29:49.622 225859 INFO nova.compute.manager [None req-eead650f-1964-4b99-98dc-d03510545f6a 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Get console output#033[00m
Jan 20 10:29:49 np0005588919 nova_compute[225855]: 2026-01-20 15:29:49.626 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.362 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.363 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.381 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:29:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:50.439 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.439 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:50.440 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:29:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:50.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:50.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.699 225859 DEBUG nova.compute.manager [req-ddc73362-1fdd-47fd-883f-53ae0e138efb req-d1b0c495-3541-4df6-9a1d-8c0ba5786bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received event network-changed-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.700 225859 DEBUG nova.compute.manager [req-ddc73362-1fdd-47fd-883f-53ae0e138efb req-d1b0c495-3541-4df6-9a1d-8c0ba5786bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Refreshing instance network info cache due to event network-changed-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.700 225859 DEBUG oslo_concurrency.lockutils [req-ddc73362-1fdd-47fd-883f-53ae0e138efb req-d1b0c495-3541-4df6-9a1d-8c0ba5786bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.700 225859 DEBUG oslo_concurrency.lockutils [req-ddc73362-1fdd-47fd-883f-53ae0e138efb req-d1b0c495-3541-4df6-9a1d-8c0ba5786bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.700 225859 DEBUG nova.network.neutron [req-ddc73362-1fdd-47fd-883f-53ae0e138efb req-d1b0c495-3541-4df6-9a1d-8c0ba5786bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Refreshing network info cache for port 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.790 225859 DEBUG oslo_concurrency.lockutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "af78e376-a9fb-4854-9c34-fd8c6f63390a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.791 225859 DEBUG oslo_concurrency.lockutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.791 225859 DEBUG oslo_concurrency.lockutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.792 225859 DEBUG oslo_concurrency.lockutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.792 225859 DEBUG oslo_concurrency.lockutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.793 225859 INFO nova.compute.manager [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Terminating instance#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.795 225859 DEBUG nova.compute.manager [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:29:50 np0005588919 kernel: tap31d3cb4c-b7 (unregistering): left promiscuous mode
Jan 20 10:29:50 np0005588919 NetworkManager[49104]: <info>  [1768922990.8508] device (tap31d3cb4c-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.860 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:50Z|00949|binding|INFO|Releasing lport 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e from this chassis (sb_readonly=0)
Jan 20 10:29:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:50Z|00950|binding|INFO|Setting lport 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e down in Southbound
Jan 20 10:29:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:29:50Z|00951|binding|INFO|Removing iface tap31d3cb4c-b7 ovn-installed in OVS
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.863 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:50.869 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:69:f6 10.100.0.10'], port_security=['fa:16:3e:d9:69:f6 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'af78e376-a9fb-4854-9c34-fd8c6f63390a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2b49a8c-7446-4445-ae6e-d2870040582f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8d4ade01-8f6f-48ff-bd8a-0af514e9e9ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92632684-a601-487e-937b-036b3fd0bb35, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=31d3cb4c-b75a-468e-8a31-1fad4e27eb6e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:29:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:50.871 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e in datapath a2b49a8c-7446-4445-ae6e-d2870040582f unbound from our chassis#033[00m
Jan 20 10:29:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:50.872 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2b49a8c-7446-4445-ae6e-d2870040582f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:29:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:50.873 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d14cb7-39ae-4f63-8e3f-12770c9e0ce2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:50.873 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f namespace which is not needed anymore#033[00m
Jan 20 10:29:50 np0005588919 nova_compute[225855]: 2026-01-20 15:29:50.880 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:50 np0005588919 systemd[1]: machine-qemu\x2d109\x2dinstance\x2d000000cd.scope: Deactivated successfully.
Jan 20 10:29:50 np0005588919 systemd[1]: machine-qemu\x2d109\x2dinstance\x2d000000cd.scope: Consumed 13.716s CPU time.
Jan 20 10:29:50 np0005588919 systemd-machined[194361]: Machine qemu-109-instance-000000cd terminated.
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.033 225859 INFO nova.virt.libvirt.driver [-] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Instance destroyed successfully.#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.035 225859 DEBUG nova.objects.instance [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'resources' on Instance uuid af78e376-a9fb-4854-9c34-fd8c6f63390a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.051 225859 DEBUG nova.virt.libvirt.vif [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:29:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-866255871',display_name='tempest-TestNetworkBasicOps-server-866255871',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-866255871',id=205,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNr82C8ft9iJVaxlnvB559FQY0a18+ddVyVQXOWpubG2vvEKlWos0eribBsrr0XJYAl5WSj1IuEMfruIIC3taSryOm9K9DYcrj57monaPm1w9c08Woz8HGduEiXhkpNC2g==',key_name='tempest-TestNetworkBasicOps-780774492',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:29:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-73qpy651',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:29:28Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=af78e376-a9fb-4854-9c34-fd8c6f63390a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.051 225859 DEBUG nova.network.os_vif_util [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.052 225859 DEBUG nova.network.os_vif_util [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d9:69:f6,bridge_name='br-int',has_traffic_filtering=True,id=31d3cb4c-b75a-468e-8a31-1fad4e27eb6e,network=Network(a2b49a8c-7446-4445-ae6e-d2870040582f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31d3cb4c-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.052 225859 DEBUG os_vif [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:69:f6,bridge_name='br-int',has_traffic_filtering=True,id=31d3cb4c-b75a-468e-8a31-1fad4e27eb6e,network=Network(a2b49a8c-7446-4445-ae6e-d2870040582f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31d3cb4c-b7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.053 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.054 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31d3cb4c-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.055 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.056 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.059 225859 INFO os_vif [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:69:f6,bridge_name='br-int',has_traffic_filtering=True,id=31d3cb4c-b75a-468e-8a31-1fad4e27eb6e,network=Network(a2b49a8c-7446-4445-ae6e-d2870040582f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31d3cb4c-b7')#033[00m
Jan 20 10:29:51 np0005588919 neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f[318971]: [NOTICE]   (318975) : haproxy version is 2.8.14-c23fe91
Jan 20 10:29:51 np0005588919 neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f[318971]: [NOTICE]   (318975) : path to executable is /usr/sbin/haproxy
Jan 20 10:29:51 np0005588919 neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f[318971]: [WARNING]  (318975) : Exiting Master process...
Jan 20 10:29:51 np0005588919 neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f[318971]: [ALERT]    (318975) : Current worker (318977) exited with code 143 (Terminated)
Jan 20 10:29:51 np0005588919 neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f[318971]: [WARNING]  (318975) : All workers exited. Exiting... (0)
Jan 20 10:29:51 np0005588919 systemd[1]: libpod-915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08.scope: Deactivated successfully.
Jan 20 10:29:51 np0005588919 podman[319114]: 2026-01-20 15:29:51.177105624 +0000 UTC m=+0.217167672 container died 915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:29:51 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08-userdata-shm.mount: Deactivated successfully.
Jan 20 10:29:51 np0005588919 systemd[1]: var-lib-containers-storage-overlay-fbfcd70bae3d1e8aa8f666a8626fbab06db564cbad7eb171c298ad570d143684-merged.mount: Deactivated successfully.
Jan 20 10:29:51 np0005588919 podman[319114]: 2026-01-20 15:29:51.6848169 +0000 UTC m=+0.724878988 container cleanup 915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:29:51 np0005588919 podman[319172]: 2026-01-20 15:29:51.945277307 +0000 UTC m=+0.237131462 container remove 915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:29:51 np0005588919 systemd[1]: libpod-conmon-915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08.scope: Deactivated successfully.
Jan 20 10:29:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:51.951 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bb1d5113-0bd4-4ea7-b698-d14329868e5a]: (4, ('Tue Jan 20 03:29:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f (915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08)\n915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08\nTue Jan 20 03:29:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f (915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08)\n915250c82fb88665cde5846eb4724025c3659abf7c6170479e5702a9e6426c08\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:51.953 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[16523bb2-54b1-4d92-835b-d8f7d8124b4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:51.954 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b49a8c-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.956 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:51 np0005588919 kernel: tapa2b49a8c-70: left promiscuous mode
Jan 20 10:29:51 np0005588919 nova_compute[225855]: 2026-01-20 15:29:51.969 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:51.973 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e930ae52-1890-448c-adaf-7c162d26bda0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:51.989 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d8c24beb-d864-4908-b2a5-75a625c1fa25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:51.990 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c5bdfa3e-9110-4a2f-bf8f-60f7cc3f8f9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:52.006 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0f743c32-f459-4e48-a0ff-cf4811497988]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 800659, 'reachable_time': 35451, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 319191, 'error': None, 'target': 'ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:52.010 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a2b49a8c-7446-4445-ae6e-d2870040582f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:29:52 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:52.010 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[8bdcfcac-4423-461e-814d-8ca007162627]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:29:52 np0005588919 systemd[1]: run-netns-ovnmeta\x2da2b49a8c\x2d7446\x2d4445\x2dae6e\x2dd2870040582f.mount: Deactivated successfully.
Jan 20 10:29:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:52.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.512 225859 INFO nova.virt.libvirt.driver [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Deleting instance files /var/lib/nova/instances/af78e376-a9fb-4854-9c34-fd8c6f63390a_del#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.513 225859 INFO nova.virt.libvirt.driver [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Deletion of /var/lib/nova/instances/af78e376-a9fb-4854-9c34-fd8c6f63390a_del complete#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.583 225859 INFO nova.compute.manager [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Took 1.79 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.583 225859 DEBUG oslo.service.loopingcall [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.584 225859 DEBUG nova.compute.manager [-] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.584 225859 DEBUG nova.network.neutron [-] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:29:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:29:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:52.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.797 225859 DEBUG nova.compute.manager [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received event network-vif-unplugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.798 225859 DEBUG oslo_concurrency.lockutils [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.798 225859 DEBUG oslo_concurrency.lockutils [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.798 225859 DEBUG oslo_concurrency.lockutils [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.798 225859 DEBUG nova.compute.manager [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] No waiting events found dispatching network-vif-unplugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.799 225859 DEBUG nova.compute.manager [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received event network-vif-unplugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.799 225859 DEBUG nova.compute.manager [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received event network-vif-plugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.799 225859 DEBUG oslo_concurrency.lockutils [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.799 225859 DEBUG oslo_concurrency.lockutils [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.799 225859 DEBUG oslo_concurrency.lockutils [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.800 225859 DEBUG nova.compute.manager [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] No waiting events found dispatching network-vif-plugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.800 225859 WARNING nova.compute.manager [req-50d574f9-bd10-4996-9b23-30b8ffc88df2 req-57c2b035-8abf-4800-a510-e2f818b7f616 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received unexpected event network-vif-plugged-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.843 225859 DEBUG nova.network.neutron [req-ddc73362-1fdd-47fd-883f-53ae0e138efb req-d1b0c495-3541-4df6-9a1d-8c0ba5786bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Updated VIF entry in instance network info cache for port 31d3cb4c-b75a-468e-8a31-1fad4e27eb6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.844 225859 DEBUG nova.network.neutron [req-ddc73362-1fdd-47fd-883f-53ae0e138efb req-d1b0c495-3541-4df6-9a1d-8c0ba5786bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Updating instance_info_cache with network_info: [{"id": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "address": "fa:16:3e:d9:69:f6", "network": {"id": "a2b49a8c-7446-4445-ae6e-d2870040582f", "bridge": "br-int", "label": "tempest-network-smoke--338330285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31d3cb4c-b7", "ovs_interfaceid": "31d3cb4c-b75a-468e-8a31-1fad4e27eb6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:29:52 np0005588919 nova_compute[225855]: 2026-01-20 15:29:52.866 225859 DEBUG oslo_concurrency.lockutils [req-ddc73362-1fdd-47fd-883f-53ae0e138efb req-d1b0c495-3541-4df6-9a1d-8c0ba5786bd7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-af78e376-a9fb-4854-9c34-fd8c6f63390a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.176 225859 DEBUG nova.network.neutron [-] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.194 225859 INFO nova.compute.manager [-] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Took 0.61 seconds to deallocate network for instance.#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.275 225859 DEBUG oslo_concurrency.lockutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.276 225859 DEBUG oslo_concurrency.lockutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.284 225859 DEBUG nova.compute.manager [req-42e94f68-0bcc-42b5-928a-7cc11bc028bd req-e32af194-96c2-42bc-bebd-48e4ea18a59c 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Received event network-vif-deleted-31d3cb4c-b75a-468e-8a31-1fad4e27eb6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.324 225859 DEBUG oslo_concurrency.processutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:29:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:29:53 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2846343753' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.771 225859 DEBUG oslo_concurrency.processutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.778 225859 DEBUG nova.compute.provider_tree [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.804 225859 DEBUG nova.scheduler.client.report [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.830 225859 DEBUG oslo_concurrency.lockutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.874 225859 INFO nova.scheduler.client.report [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Deleted allocations for instance af78e376-a9fb-4854-9c34-fd8c6f63390a#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.929 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:53 np0005588919 nova_compute[225855]: 2026-01-20 15:29:53.963 225859 DEBUG oslo_concurrency.lockutils [None req-be7a7344-e8aa-43c5-ad00-ad230acbd8fc 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "af78e376-a9fb-4854-9c34-fd8c6f63390a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:29:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:54.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:54.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:56 np0005588919 nova_compute[225855]: 2026-01-20 15:29:56.056 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:56 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:29:56.442 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:29:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:56.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:56.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:29:58.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:29:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:29:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:29:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:29:58.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:29:58 np0005588919 nova_compute[225855]: 2026-01-20 15:29:58.932 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:59 np0005588919 nova_compute[225855]: 2026-01-20 15:29:59.909 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:29:59 np0005588919 nova_compute[225855]: 2026-01-20 15:29:59.989 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:00.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:00.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 10:30:01 np0005588919 nova_compute[225855]: 2026-01-20 15:30:01.058 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:02.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:02.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:03 np0005588919 nova_compute[225855]: 2026-01-20 15:30:03.934 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:04.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:04.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:06 np0005588919 nova_compute[225855]: 2026-01-20 15:30:06.033 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768922991.0306811, af78e376-a9fb-4854-9c34-fd8c6f63390a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:30:06 np0005588919 nova_compute[225855]: 2026-01-20 15:30:06.034 225859 INFO nova.compute.manager [-] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:30:06 np0005588919 nova_compute[225855]: 2026-01-20 15:30:06.061 225859 DEBUG nova.compute.manager [None req-04926d27-d190-4abc-8744-3cdfa7cf4ed1 - - - - - -] [instance: af78e376-a9fb-4854-9c34-fd8c6f63390a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:30:06 np0005588919 nova_compute[225855]: 2026-01-20 15:30:06.062 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:06.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:06.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:08 np0005588919 nova_compute[225855]: 2026-01-20 15:30:08.359 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:08 np0005588919 nova_compute[225855]: 2026-01-20 15:30:08.360 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:30:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:08.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:08.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:08 np0005588919 nova_compute[225855]: 2026-01-20 15:30:08.935 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:09 np0005588919 podman[319275]: 2026-01-20 15:30:09.032539368 +0000 UTC m=+0.078861022 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 20 10:30:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:10.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:10.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.063 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.373 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.374 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.374 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.374 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.374 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:30:11 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:30:11 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3373613097' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.816 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.956 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.957 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4267MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.957 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:30:11 np0005588919 nova_compute[225855]: 2026-01-20 15:30:11.958 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:30:12 np0005588919 nova_compute[225855]: 2026-01-20 15:30:12.027 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:30:12 np0005588919 nova_compute[225855]: 2026-01-20 15:30:12.027 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:30:12 np0005588919 nova_compute[225855]: 2026-01-20 15:30:12.042 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:30:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:30:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1559663311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:30:12 np0005588919 nova_compute[225855]: 2026-01-20 15:30:12.475 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:30:12 np0005588919 nova_compute[225855]: 2026-01-20 15:30:12.480 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:30:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:12.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:12 np0005588919 nova_compute[225855]: 2026-01-20 15:30:12.504 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:30:12 np0005588919 nova_compute[225855]: 2026-01-20 15:30:12.547 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:30:12 np0005588919 nova_compute[225855]: 2026-01-20 15:30:12.547 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:30:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:12.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:30:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/705460975' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:30:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:30:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/705460975' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:30:13 np0005588919 nova_compute[225855]: 2026-01-20 15:30:13.937 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:14 np0005588919 nova_compute[225855]: 2026-01-20 15:30:14.097 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:14 np0005588919 nova_compute[225855]: 2026-01-20 15:30:14.097 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:30:14 np0005588919 nova_compute[225855]: 2026-01-20 15:30:14.097 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:30:14 np0005588919 nova_compute[225855]: 2026-01-20 15:30:14.116 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:30:14 np0005588919 nova_compute[225855]: 2026-01-20 15:30:14.117 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:14.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:14.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:15 np0005588919 nova_compute[225855]: 2026-01-20 15:30:15.353 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:15 np0005588919 nova_compute[225855]: 2026-01-20 15:30:15.354 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:15 np0005588919 nova_compute[225855]: 2026-01-20 15:30:15.354 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:15 np0005588919 nova_compute[225855]: 2026-01-20 15:30:15.354 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:16 np0005588919 nova_compute[225855]: 2026-01-20 15:30:16.065 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:16.452 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:30:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:16.452 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:30:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:16.452 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:30:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:16.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:16.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:17 np0005588919 nova_compute[225855]: 2026-01-20 15:30:17.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:30:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:18.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:18.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:18 np0005588919 nova_compute[225855]: 2026-01-20 15:30:18.939 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:20 np0005588919 podman[319353]: 2026-01-20 15:30:20.008796412 +0000 UTC m=+0.048846116 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 20 10:30:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:20.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:20.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:21 np0005588919 nova_compute[225855]: 2026-01-20 15:30:21.087 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:21 np0005588919 nova_compute[225855]: 2026-01-20 15:30:21.379 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:30:21 np0005588919 nova_compute[225855]: 2026-01-20 15:30:21.379 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:30:21 np0005588919 nova_compute[225855]: 2026-01-20 15:30:21.399 225859 DEBUG nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:30:21 np0005588919 nova_compute[225855]: 2026-01-20 15:30:21.486 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:30:21 np0005588919 nova_compute[225855]: 2026-01-20 15:30:21.487 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:30:21 np0005588919 nova_compute[225855]: 2026-01-20 15:30:21.495 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:30:21 np0005588919 nova_compute[225855]: 2026-01-20 15:30:21.495 225859 INFO nova.compute.claims [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:30:21 np0005588919 nova_compute[225855]: 2026-01-20 15:30:21.684 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:30:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:30:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1059682167' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.105 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.111 225859 DEBUG nova.compute.provider_tree [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.135 225859 DEBUG nova.scheduler.client.report [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.175 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.176 225859 DEBUG nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.247 225859 DEBUG nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.247 225859 DEBUG nova.network.neutron [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.273 225859 INFO nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.290 225859 DEBUG nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.386 225859 DEBUG nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.387 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.387 225859 INFO nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Creating image(s)
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.413 225859 DEBUG nova.storage.rbd_utils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.438 225859 DEBUG nova.storage.rbd_utils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.464 225859 DEBUG nova.storage.rbd_utils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.468 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.496 225859 DEBUG nova.policy [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 20 10:30:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:22.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.535 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
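The qemu-img probe in the entries above has two halves: build a prlimit-wrapped command line (so a malformed image cannot exhaust memory or CPU during inspection), then parse the JSON it returns. A minimal sketch of both halves; the helper names are hypothetical, and the JSON fields in the parser are the standard `qemu-img info --output=json` keys, not taken from this log:

```python
import json

def build_qemu_img_cmd(path, mem_limit=1 << 30, cpu_seconds=30):
    # Mirror the logged invocation: wrap qemu-img in oslo's prlimit helper
    # with an address-space cap (bytes) and a CPU-time cap (seconds).
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        "--as=%d" % mem_limit, "--cpu=%d" % cpu_seconds, "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path, "--force-share", "--output=json",
    ]

def parse_qemu_img_info(output):
    # qemu-img's JSON report includes "format" and "virtual-size" keys.
    info = json.loads(output)
    return info["format"], info["virtual-size"]
```

The `--as=1073741824 --cpu=30` values seen in the log correspond to the defaults sketched here (1 GiB address space, 30 CPU-seconds).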
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.536 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.536 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.537 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.563 225859 DEBUG nova.storage.rbd_utils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.567 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:30:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:22.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:22 np0005588919 nova_compute[225855]: 2026-01-20 15:30:22.920 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.353s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
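The `rbd import` call logged above seeds the instance disk in the `vms` pool from the locally cached base image. Assembling that command line can be sketched as follows; the helper name is hypothetical, but the flags mirror the logged invocation (format 2 images support cloning and resizing, which the subsequent resize step relies on):

```python
def rbd_import_cmd(pool, base_path, image_name,
                   ceph_id="openstack", conf="/etc/ceph/ceph.conf"):
    # Reconstructs the logged "rbd import" call: push a local file into
    # the given RBD pool as a format-2 image, authenticating as ceph_id.
    return [
        "rbd", "import", "--pool", pool, base_path, image_name,
        "--image-format=2", "--id", ceph_id, "--conf", conf,
    ]
```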
Jan 20 10:30:23 np0005588919 nova_compute[225855]: 2026-01-20 15:30:23.005 225859 DEBUG nova.storage.rbd_utils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] resizing rbd image c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 20 10:30:23 np0005588919 nova_compute[225855]: 2026-01-20 15:30:23.125 225859 DEBUG nova.objects.instance [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'migration_context' on Instance uuid c2074d47-58a3-49e8-82fd-6bc6145a1ea7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:30:23 np0005588919 nova_compute[225855]: 2026-01-20 15:30:23.153 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 20 10:30:23 np0005588919 nova_compute[225855]: 2026-01-20 15:30:23.154 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Ensure instance console log exists: /var/lib/nova/instances/c2074d47-58a3-49e8-82fd-6bc6145a1ea7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 20 10:30:23 np0005588919 nova_compute[225855]: 2026-01-20 15:30:23.155 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:30:23 np0005588919 nova_compute[225855]: 2026-01-20 15:30:23.155 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:30:23 np0005588919 nova_compute[225855]: 2026-01-20 15:30:23.155 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:30:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:23 np0005588919 nova_compute[225855]: 2026-01-20 15:30:23.881 225859 DEBUG nova.network.neutron [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Successfully created port: 60202d18-26b2-493b-a427-211cda112a80 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 20 10:30:23 np0005588919 nova_compute[225855]: 2026-01-20 15:30:23.941 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:24 np0005588919 nova_compute[225855]: 2026-01-20 15:30:24.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:30:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:24.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:24.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:24 np0005588919 nova_compute[225855]: 2026-01-20 15:30:24.695 225859 DEBUG nova.network.neutron [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Successfully updated port: 60202d18-26b2-493b-a427-211cda112a80 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 20 10:30:24 np0005588919 nova_compute[225855]: 2026-01-20 15:30:24.716 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:30:24 np0005588919 nova_compute[225855]: 2026-01-20 15:30:24.716 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:30:24 np0005588919 nova_compute[225855]: 2026-01-20 15:30:24.716 225859 DEBUG nova.network.neutron [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 20 10:30:24 np0005588919 nova_compute[225855]: 2026-01-20 15:30:24.813 225859 DEBUG nova.compute.manager [req-1b80bc76-6b49-49d2-bbca-9eef0c3893a0 req-4d7a5a9d-73dc-4561-8a95-d6c3b56d899d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received event network-changed-60202d18-26b2-493b-a427-211cda112a80 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:30:24 np0005588919 nova_compute[225855]: 2026-01-20 15:30:24.814 225859 DEBUG nova.compute.manager [req-1b80bc76-6b49-49d2-bbca-9eef0c3893a0 req-4d7a5a9d-73dc-4561-8a95-d6c3b56d899d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Refreshing instance network info cache due to event network-changed-60202d18-26b2-493b-a427-211cda112a80. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 20 10:30:24 np0005588919 nova_compute[225855]: 2026-01-20 15:30:24.814 225859 DEBUG oslo_concurrency.lockutils [req-1b80bc76-6b49-49d2-bbca-9eef0c3893a0 req-4d7a5a9d-73dc-4561-8a95-d6c3b56d899d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:30:24 np0005588919 nova_compute[225855]: 2026-01-20 15:30:24.857 225859 DEBUG nova.network.neutron [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.717 225859 DEBUG nova.network.neutron [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Updating instance_info_cache with network_info: [{"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.737 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.737 225859 DEBUG nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Instance network_info: |[{"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.738 225859 DEBUG oslo_concurrency.lockutils [req-1b80bc76-6b49-49d2-bbca-9eef0c3893a0 req-4d7a5a9d-73dc-4561-8a95-d6c3b56d899d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.738 225859 DEBUG nova.network.neutron [req-1b80bc76-6b49-49d2-bbca-9eef0c3893a0 req-4d7a5a9d-73dc-4561-8a95-d6c3b56d899d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Refreshing network info cache for port 60202d18-26b2-493b-a427-211cda112a80 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
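The cached `network_info` structure logged above nests addresses as `network.subnets[].ips[]` per VIF. A small sketch (hypothetical helper name, shape taken from the logged JSON) for pulling out the fixed IPs:

```python
def fixed_ips(network_info):
    # network_info is the per-instance list nova caches, as shown in the
    # log: each VIF carries a network with subnets, each subnet with ips.
    ips = []
    for vif in network_info:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                if ip.get("type") == "fixed":
                    ips.append(ip["address"])
    return ips
```

Applied to the cache entry above, this would yield the single fixed address 10.100.0.11 on the 10.100.0.0/28 subnet.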
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.740 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Start _get_guest_xml network_info=[{"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.745 225859 WARNING nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.755 225859 DEBUG nova.virt.libvirt.host [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.756 225859 DEBUG nova.virt.libvirt.host [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.759 225859 DEBUG nova.virt.libvirt.host [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.759 225859 DEBUG nova.virt.libvirt.host [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.760 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.761 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.761 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.761 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.762 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.762 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.762 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.762 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.763 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.763 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.763 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.763 225859 DEBUG nova.virt.hardware [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
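The topology search traced above (one vCPU, effectively unbounded limits of 65536 sockets/cores/threads, and a single `1:1:1` result) can be illustrated with a brute-force enumeration. This is a simplified sketch of the idea, not nova's actual `_get_possible_cpu_topologies`:

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    # Enumerate (sockets, cores, threads) splits whose product equals the
    # vCPU count, within the per-dimension limits seen in the log.
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topos.append((s, c, t))
    return topos
```

For the `m1.nano` flavor's single vCPU this yields exactly one topology, `(1, 1, 1)`, matching the "Got 1 possible topologies" line; larger vCPU counts produce multiple candidates that nova then sorts by preference.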
Jan 20 10:30:25 np0005588919 nova_compute[225855]: 2026-01-20 15:30:25.766 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.089 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:30:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:30:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2182023721' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.211 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.236 225859 DEBUG nova.storage.rbd_utils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.239 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 20 10:30:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:26.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:26.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:30:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3549875571' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.696 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.698 225859 DEBUG nova.virt.libvirt.vif [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-761395495',display_name='tempest-TestNetworkBasicOps-server-761395495',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-761395495',id=206,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA6PcyU5b6KJgECZvP75RVISUjV8spB81h3nAjsUONZi4KISBeJ3H+m9LFQCp72IhdPL4TNE6iitZI83oIzTSr0WLM1hF9NfU7ED77LiXjCqrZKn4HPslanwlp/Qjc+bCQ==',key_name='tempest-TestNetworkBasicOps-1783413232',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-mwqa18l0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:30:22Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=c2074d47-58a3-49e8-82fd-6bc6145a1ea7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.699 225859 DEBUG nova.network.os_vif_util [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.700 225859 DEBUG nova.network.os_vif_util [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:3a,bridge_name='br-int',has_traffic_filtering=True,id=60202d18-26b2-493b-a427-211cda112a80,network=Network(0228362f-0ced-4cac-bb89-96bd472df47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60202d18-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.701 225859 DEBUG nova.objects.instance [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_devices' on Instance uuid c2074d47-58a3-49e8-82fd-6bc6145a1ea7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.723 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <uuid>c2074d47-58a3-49e8-82fd-6bc6145a1ea7</uuid>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <name>instance-000000ce</name>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkBasicOps-server-761395495</nova:name>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:30:25</nova:creationTime>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <nova:port uuid="60202d18-26b2-493b-a427-211cda112a80">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <entry name="serial">c2074d47-58a3-49e8-82fd-6bc6145a1ea7</entry>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <entry name="uuid">c2074d47-58a3-49e8-82fd-6bc6145a1ea7</entry>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk.config">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:36:f9:3a"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <target dev="tap60202d18-26"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/c2074d47-58a3-49e8-82fd-6bc6145a1ea7/console.log" append="off"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:30:26 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:30:26 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:30:26 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:30:26 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.725 225859 DEBUG nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Preparing to wait for external event network-vif-plugged-60202d18-26b2-493b-a427-211cda112a80 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.726 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.727 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.727 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.728 225859 DEBUG nova.virt.libvirt.vif [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-761395495',display_name='tempest-TestNetworkBasicOps-server-761395495',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-761395495',id=206,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA6PcyU5b6KJgECZvP75RVISUjV8spB81h3nAjsUONZi4KISBeJ3H+m9LFQCp72IhdPL4TNE6iitZI83oIzTSr0WLM1hF9NfU7ED77LiXjCqrZKn4HPslanwlp/Qjc+bCQ==',key_name='tempest-TestNetworkBasicOps-1783413232',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-mwqa18l0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:30:22Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=c2074d47-58a3-49e8-82fd-6bc6145a1ea7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.729 225859 DEBUG nova.network.os_vif_util [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.730 225859 DEBUG nova.network.os_vif_util [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:3a,bridge_name='br-int',has_traffic_filtering=True,id=60202d18-26b2-493b-a427-211cda112a80,network=Network(0228362f-0ced-4cac-bb89-96bd472df47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60202d18-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.731 225859 DEBUG os_vif [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:3a,bridge_name='br-int',has_traffic_filtering=True,id=60202d18-26b2-493b-a427-211cda112a80,network=Network(0228362f-0ced-4cac-bb89-96bd472df47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60202d18-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.731 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.732 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.733 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.738 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.739 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60202d18-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.739 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60202d18-26, col_values=(('external_ids', {'iface-id': '60202d18-26b2-493b-a427-211cda112a80', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:f9:3a', 'vm-uuid': 'c2074d47-58a3-49e8-82fd-6bc6145a1ea7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.741 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:26 np0005588919 NetworkManager[49104]: <info>  [1768923026.7426] manager: (tap60202d18-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/404)
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.745 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.749 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.751 225859 INFO os_vif [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:f9:3a,bridge_name='br-int',has_traffic_filtering=True,id=60202d18-26b2-493b-a427-211cda112a80,network=Network(0228362f-0ced-4cac-bb89-96bd472df47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60202d18-26')#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.809 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.809 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.810 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:36:f9:3a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.810 225859 INFO nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Using config drive#033[00m
Jan 20 10:30:26 np0005588919 nova_compute[225855]: 2026-01-20 15:30:26.836 225859 DEBUG nova.storage.rbd_utils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:30:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:28 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.468 225859 INFO nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Creating config drive at /var/lib/nova/instances/c2074d47-58a3-49e8-82fd-6bc6145a1ea7/disk.config#033[00m
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.474 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2074d47-58a3-49e8-82fd-6bc6145a1ea7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkcze09k3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:30:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:28.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.615 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2074d47-58a3-49e8-82fd-6bc6145a1ea7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkcze09k3" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:30:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:28.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.641 225859 DEBUG nova.storage.rbd_utils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.645 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c2074d47-58a3-49e8-82fd-6bc6145a1ea7/disk.config c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.792 225859 DEBUG oslo_concurrency.processutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c2074d47-58a3-49e8-82fd-6bc6145a1ea7/disk.config c2074d47-58a3-49e8-82fd-6bc6145a1ea7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.793 225859 INFO nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Deleting local config drive /var/lib/nova/instances/c2074d47-58a3-49e8-82fd-6bc6145a1ea7/disk.config because it was imported into RBD.#033[00m
Jan 20 10:30:28 np0005588919 kernel: tap60202d18-26: entered promiscuous mode
Jan 20 10:30:28 np0005588919 NetworkManager[49104]: <info>  [1768923028.8440] manager: (tap60202d18-26): new Tun device (/org/freedesktop/NetworkManager/Devices/405)
Jan 20 10:30:28 np0005588919 systemd-udevd[319982]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:30:28 np0005588919 ovn_controller[130490]: 2026-01-20T15:30:28Z|00952|binding|INFO|Claiming lport 60202d18-26b2-493b-a427-211cda112a80 for this chassis.
Jan 20 10:30:28 np0005588919 ovn_controller[130490]: 2026-01-20T15:30:28Z|00953|binding|INFO|60202d18-26b2-493b-a427-211cda112a80: Claiming fa:16:3e:36:f9:3a 10.100.0.11
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.907 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.913 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:28.919 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:f9:3a 10.100.0.11'], port_security=['fa:16:3e:36:f9:3a 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c2074d47-58a3-49e8-82fd-6bc6145a1ea7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0228362f-0ced-4cac-bb89-96bd472df47f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6458a221-63f1-42cc-b15d-f9334e60cb66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=306abd7d-c001-4e00-b2a1-8a251fd6a022, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=60202d18-26b2-493b-a427-211cda112a80) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:30:28 np0005588919 NetworkManager[49104]: <info>  [1768923028.9212] device (tap60202d18-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:30:28 np0005588919 NetworkManager[49104]: <info>  [1768923028.9221] device (tap60202d18-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:30:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:28.921 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 60202d18-26b2-493b-a427-211cda112a80 in datapath 0228362f-0ced-4cac-bb89-96bd472df47f bound to our chassis#033[00m
Jan 20 10:30:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:28.922 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0228362f-0ced-4cac-bb89-96bd472df47f#033[00m
Jan 20 10:30:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:28.934 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e777e4bd-74f5-4d33-a6f8-df6bcee95bd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:28.936 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0228362f-01 in ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:30:28 np0005588919 systemd-machined[194361]: New machine qemu-110-instance-000000ce.
Jan 20 10:30:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:28.939 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0228362f-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:30:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:28.939 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d084c6bf-f9d2-4007-af31-a3d73ef3c12f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:28.941 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cd5f2d92-cc24-4f39-80c5-1fa896f00631]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:28.952 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[00ca2826-f65f-4b49-bb4c-e167b8a9f39c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:28 np0005588919 systemd[1]: Started Virtual Machine qemu-110-instance-000000ce.
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.976 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:28 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:28.976 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[26aa4ddd-83cc-45dd-b735-563637cdf729]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:28 np0005588919 ovn_controller[130490]: 2026-01-20T15:30:28Z|00954|binding|INFO|Setting lport 60202d18-26b2-493b-a427-211cda112a80 ovn-installed in OVS
Jan 20 10:30:28 np0005588919 ovn_controller[130490]: 2026-01-20T15:30:28Z|00955|binding|INFO|Setting lport 60202d18-26b2-493b-a427-211cda112a80 up in Southbound
Jan 20 10:30:28 np0005588919 nova_compute[225855]: 2026-01-20 15:30:28.982 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.004 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed21a24-252d-4be7-93da-e7fec78b60fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 NetworkManager[49104]: <info>  [1768923029.0099] manager: (tap0228362f-00): new Veth device (/org/freedesktop/NetworkManager/Devices/406)
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.010 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5eab8e15-4c09-4eed-a653-94646c80d377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 systemd-udevd[319985]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.040 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[10711aec-8c7b-4257-8fe8-1a35f01ae979]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.044 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[5246c544-e486-4e67-8782-232f8fca113d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 NetworkManager[49104]: <info>  [1768923029.0663] device (tap0228362f-00): carrier: link connected
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.071 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[91e55b80-80f3-4292-bc75-75d2b3e0d3fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.089 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d108989f-ab76-46e5-a438-ff0a9766209b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0228362f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:13:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 806799, 'reachable_time': 22382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 320032, 'error': None, 'target': 'ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.105 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[28b5778b-3ce3-44a0-9ad4-8d3a01955b68]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec9:1371'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 806799, 'tstamp': 806799}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 320033, 'error': None, 'target': 'ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.123 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ae1fba40-8842-46e5-9813-75e73c06d9fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0228362f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:13:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 271], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 806799, 'reachable_time': 22382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 320034, 'error': None, 'target': 'ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.156 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[987b5d2f-1aa3-4e96-bea3-a0796dd7cc36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.217 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[224dd069-30a9-4cef-a287-4b0764017f2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.219 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0228362f-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.219 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.219 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0228362f-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:30:29 np0005588919 NetworkManager[49104]: <info>  [1768923029.2224] manager: (tap0228362f-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/407)
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.221 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:29 np0005588919 kernel: tap0228362f-00: entered promiscuous mode
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.225 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.226 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0228362f-00, col_values=(('external_ids', {'iface-id': 'cd551c37-a4a7-45aa-9507-04cb570a94af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:30:29 np0005588919 ovn_controller[130490]: 2026-01-20T15:30:29Z|00956|binding|INFO|Releasing lport cd551c37-a4a7-45aa-9507-04cb570a94af from this chassis (sb_readonly=0)
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.227 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.240 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.241 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0228362f-0ced-4cac-bb89-96bd472df47f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0228362f-0ced-4cac-bb89-96bd472df47f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.242 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aa51eede-28b4-4166-bc5e-492f796560e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.243 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-0228362f-0ced-4cac-bb89-96bd472df47f
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/0228362f-0ced-4cac-bb89-96bd472df47f.pid.haproxy
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 0228362f-0ced-4cac-bb89-96bd472df47f
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:30:29 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:29.243 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f', 'env', 'PROCESS_TAG=haproxy-0228362f-0ced-4cac-bb89-96bd472df47f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0228362f-0ced-4cac-bb89-96bd472df47f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.386 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923029.3858116, c2074d47-58a3-49e8-82fd-6bc6145a1ea7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.386 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] VM Started (Lifecycle Event)#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.404 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.408 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923029.3886147, c2074d47-58a3-49e8-82fd-6bc6145a1ea7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.408 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.421 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.423 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.449 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.496 225859 DEBUG nova.network.neutron [req-1b80bc76-6b49-49d2-bbca-9eef0c3893a0 req-4d7a5a9d-73dc-4561-8a95-d6c3b56d899d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Updated VIF entry in instance network info cache for port 60202d18-26b2-493b-a427-211cda112a80. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.498 225859 DEBUG nova.network.neutron [req-1b80bc76-6b49-49d2-bbca-9eef0c3893a0 req-4d7a5a9d-73dc-4561-8a95-d6c3b56d899d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Updating instance_info_cache with network_info: [{"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.519 225859 DEBUG oslo_concurrency.lockutils [req-1b80bc76-6b49-49d2-bbca-9eef0c3893a0 req-4d7a5a9d-73dc-4561-8a95-d6c3b56d899d 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.627 225859 DEBUG nova.compute.manager [req-9e269883-a09d-4e62-ac7a-ddc280c986d4 req-0493deaa-8ae2-42e6-8b5c-8d842fa2b686 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received event network-vif-plugged-60202d18-26b2-493b-a427-211cda112a80 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.627 225859 DEBUG oslo_concurrency.lockutils [req-9e269883-a09d-4e62-ac7a-ddc280c986d4 req-0493deaa-8ae2-42e6-8b5c-8d842fa2b686 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.627 225859 DEBUG oslo_concurrency.lockutils [req-9e269883-a09d-4e62-ac7a-ddc280c986d4 req-0493deaa-8ae2-42e6-8b5c-8d842fa2b686 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.627 225859 DEBUG oslo_concurrency.lockutils [req-9e269883-a09d-4e62-ac7a-ddc280c986d4 req-0493deaa-8ae2-42e6-8b5c-8d842fa2b686 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.628 225859 DEBUG nova.compute.manager [req-9e269883-a09d-4e62-ac7a-ddc280c986d4 req-0493deaa-8ae2-42e6-8b5c-8d842fa2b686 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Processing event network-vif-plugged-60202d18-26b2-493b-a427-211cda112a80 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.628 225859 DEBUG nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.631 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923029.6315727, c2074d47-58a3-49e8-82fd-6bc6145a1ea7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.632 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.633 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.636 225859 INFO nova.virt.libvirt.driver [-] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Instance spawned successfully.#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.637 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.654 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.659 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:30:29 np0005588919 podman[320105]: 2026-01-20 15:30:29.566105305 +0000 UTC m=+0.022763971 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.663 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.663 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.663 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.664 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.664 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.665 225859 DEBUG nova.virt.libvirt.driver [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.697 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.737 225859 INFO nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Took 7.35 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.737 225859 DEBUG nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.801 225859 INFO nova.compute.manager [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Took 8.35 seconds to build instance.#033[00m
Jan 20 10:30:29 np0005588919 nova_compute[225855]: 2026-01-20 15:30:29.814 225859 DEBUG oslo_concurrency.lockutils [None req-7a7d2c37-e6b7-40f0-9877-6a6ac3ea8422 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:30:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 10:30:29 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 10:30:30 np0005588919 podman[320105]: 2026-01-20 15:30:30.044856104 +0000 UTC m=+0.501514750 container create 220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 20 10:30:30 np0005588919 systemd[1]: Started libpod-conmon-220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43.scope.
Jan 20 10:30:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:30.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:30 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:30:30 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5492aad4f22230a780618b9ba63a9c548a97b7ff7061366af2c207314fa31229/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:30:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:30.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:30 np0005588919 podman[320105]: 2026-01-20 15:30:30.676743845 +0000 UTC m=+1.133402511 container init 220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 20 10:30:30 np0005588919 podman[320105]: 2026-01-20 15:30:30.68320003 +0000 UTC m=+1.139858676 container start 220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:30:30 np0005588919 neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f[320122]: [NOTICE]   (320126) : New worker (320128) forked
Jan 20 10:30:30 np0005588919 neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f[320122]: [NOTICE]   (320126) : Loading success.
Jan 20 10:30:31 np0005588919 nova_compute[225855]: 2026-01-20 15:30:31.721 225859 DEBUG nova.compute.manager [req-726f0f16-0571-471f-896e-f046fcdf8f33 req-dc19859c-4a73-41ee-b99f-e7f335223ae0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received event network-vif-plugged-60202d18-26b2-493b-a427-211cda112a80 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:30:31 np0005588919 nova_compute[225855]: 2026-01-20 15:30:31.721 225859 DEBUG oslo_concurrency.lockutils [req-726f0f16-0571-471f-896e-f046fcdf8f33 req-dc19859c-4a73-41ee-b99f-e7f335223ae0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:30:31 np0005588919 nova_compute[225855]: 2026-01-20 15:30:31.721 225859 DEBUG oslo_concurrency.lockutils [req-726f0f16-0571-471f-896e-f046fcdf8f33 req-dc19859c-4a73-41ee-b99f-e7f335223ae0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:30:31 np0005588919 nova_compute[225855]: 2026-01-20 15:30:31.722 225859 DEBUG oslo_concurrency.lockutils [req-726f0f16-0571-471f-896e-f046fcdf8f33 req-dc19859c-4a73-41ee-b99f-e7f335223ae0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:30:31 np0005588919 nova_compute[225855]: 2026-01-20 15:30:31.722 225859 DEBUG nova.compute.manager [req-726f0f16-0571-471f-896e-f046fcdf8f33 req-dc19859c-4a73-41ee-b99f-e7f335223ae0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] No waiting events found dispatching network-vif-plugged-60202d18-26b2-493b-a427-211cda112a80 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:30:31 np0005588919 nova_compute[225855]: 2026-01-20 15:30:31.722 225859 WARNING nova.compute.manager [req-726f0f16-0571-471f-896e-f046fcdf8f33 req-dc19859c-4a73-41ee-b99f-e7f335223ae0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received unexpected event network-vif-plugged-60202d18-26b2-493b-a427-211cda112a80 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:30:31 np0005588919 nova_compute[225855]: 2026-01-20 15:30:31.742 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:32.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:32.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:33 np0005588919 nova_compute[225855]: 2026-01-20 15:30:33.978 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:30:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:30:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:34.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:34.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:34 np0005588919 ovn_controller[130490]: 2026-01-20T15:30:34Z|00957|binding|INFO|Releasing lport cd551c37-a4a7-45aa-9507-04cb570a94af from this chassis (sb_readonly=0)
Jan 20 10:30:34 np0005588919 nova_compute[225855]: 2026-01-20 15:30:34.919 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:34 np0005588919 NetworkManager[49104]: <info>  [1768923034.9238] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/408)
Jan 20 10:30:34 np0005588919 NetworkManager[49104]: <info>  [1768923034.9247] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/409)
Jan 20 10:30:34 np0005588919 ovn_controller[130490]: 2026-01-20T15:30:34Z|00958|binding|INFO|Releasing lport cd551c37-a4a7-45aa-9507-04cb570a94af from this chassis (sb_readonly=0)
Jan 20 10:30:34 np0005588919 nova_compute[225855]: 2026-01-20 15:30:34.954 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:34 np0005588919 nova_compute[225855]: 2026-01-20 15:30:34.958 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:35 np0005588919 nova_compute[225855]: 2026-01-20 15:30:35.915 225859 DEBUG nova.compute.manager [req-9d25b58a-3146-46dc-86ed-535e078a311e req-27928c24-da8a-4dbe-9574-63e5daba4373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received event network-changed-60202d18-26b2-493b-a427-211cda112a80 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:30:35 np0005588919 nova_compute[225855]: 2026-01-20 15:30:35.915 225859 DEBUG nova.compute.manager [req-9d25b58a-3146-46dc-86ed-535e078a311e req-27928c24-da8a-4dbe-9574-63e5daba4373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Refreshing instance network info cache due to event network-changed-60202d18-26b2-493b-a427-211cda112a80. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:30:35 np0005588919 nova_compute[225855]: 2026-01-20 15:30:35.915 225859 DEBUG oslo_concurrency.lockutils [req-9d25b58a-3146-46dc-86ed-535e078a311e req-27928c24-da8a-4dbe-9574-63e5daba4373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:30:35 np0005588919 nova_compute[225855]: 2026-01-20 15:30:35.916 225859 DEBUG oslo_concurrency.lockutils [req-9d25b58a-3146-46dc-86ed-535e078a311e req-27928c24-da8a-4dbe-9574-63e5daba4373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:30:35 np0005588919 nova_compute[225855]: 2026-01-20 15:30:35.916 225859 DEBUG nova.network.neutron [req-9d25b58a-3146-46dc-86ed-535e078a311e req-27928c24-da8a-4dbe-9574-63e5daba4373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Refreshing network info cache for port 60202d18-26b2-493b-a427-211cda112a80 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:30:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:36.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:36.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:36 np0005588919 nova_compute[225855]: 2026-01-20 15:30:36.745 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:37 np0005588919 nova_compute[225855]: 2026-01-20 15:30:37.508 225859 DEBUG nova.network.neutron [req-9d25b58a-3146-46dc-86ed-535e078a311e req-27928c24-da8a-4dbe-9574-63e5daba4373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Updated VIF entry in instance network info cache for port 60202d18-26b2-493b-a427-211cda112a80. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:30:37 np0005588919 nova_compute[225855]: 2026-01-20 15:30:37.509 225859 DEBUG nova.network.neutron [req-9d25b58a-3146-46dc-86ed-535e078a311e req-27928c24-da8a-4dbe-9574-63e5daba4373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Updating instance_info_cache with network_info: [{"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:30:37 np0005588919 nova_compute[225855]: 2026-01-20 15:30:37.535 225859 DEBUG oslo_concurrency.lockutils [req-9d25b58a-3146-46dc-86ed-535e078a311e req-27928c24-da8a-4dbe-9574-63e5daba4373 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:30:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:38.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:38.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:38 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:38 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:30:38 np0005588919 nova_compute[225855]: 2026-01-20 15:30:38.980 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:40 np0005588919 podman[320193]: 2026-01-20 15:30:40.041040777 +0000 UTC m=+0.082785895 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:30:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:40.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:40.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:41 np0005588919 nova_compute[225855]: 2026-01-20 15:30:41.780 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:41 np0005588919 ovn_controller[130490]: 2026-01-20T15:30:41Z|00116|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:36:f9:3a 10.100.0.11
Jan 20 10:30:41 np0005588919 ovn_controller[130490]: 2026-01-20T15:30:41Z|00117|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:36:f9:3a 10.100.0.11
Jan 20 10:30:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:42.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:42.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:43 np0005588919 nova_compute[225855]: 2026-01-20 15:30:43.986 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:44.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:44.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #160. Immutable memtables: 0.
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.736922) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 101] Flushing memtable with next log file: 160
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044737043, "job": 101, "event": "flush_started", "num_memtables": 1, "num_entries": 1070, "num_deletes": 251, "total_data_size": 2334552, "memory_usage": 2374080, "flush_reason": "Manual Compaction"}
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 101] Level-0 flush table #161: started
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044764217, "cf_name": "default", "job": 101, "event": "table_file_creation", "file_number": 161, "file_size": 1521192, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 78262, "largest_seqno": 79327, "table_properties": {"data_size": 1516323, "index_size": 2456, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10746, "raw_average_key_size": 20, "raw_value_size": 1506527, "raw_average_value_size": 2805, "num_data_blocks": 107, "num_entries": 537, "num_filter_entries": 537, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768922964, "oldest_key_time": 1768922964, "file_creation_time": 1768923044, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 161, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 101] Flush lasted 27344 microseconds, and 6707 cpu microseconds.
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.764281) [db/flush_job.cc:967] [default] [JOB 101] Level-0 flush table #161: 1521192 bytes OK
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.764302) [db/memtable_list.cc:519] [default] Level-0 commit table #161 started
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.765808) [db/memtable_list.cc:722] [default] Level-0 commit table #161: memtable #1 done
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.765826) EVENT_LOG_v1 {"time_micros": 1768923044765820, "job": 101, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.765849) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 101] Try to delete WAL files size 2329283, prev total WAL file size 2329283, number of live WAL files 2.
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000157.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.766641) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036373737' seq:72057594037927935, type:22 .. '7061786F730037303239' seq:0, type:0; will stop at (end)
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 102] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 101 Base level 0, inputs: [161(1485KB)], [159(11MB)]
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044766702, "job": 102, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [161], "files_L6": [159], "score": -1, "input_data_size": 13201598, "oldest_snapshot_seqno": -1}
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 102] Generated table #162: 10048 keys, 11281013 bytes, temperature: kUnknown
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044894169, "cf_name": "default", "job": 102, "event": "table_file_creation", "file_number": 162, "file_size": 11281013, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11218443, "index_size": 36345, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25157, "raw_key_size": 264870, "raw_average_key_size": 26, "raw_value_size": 11044649, "raw_average_value_size": 1099, "num_data_blocks": 1377, "num_entries": 10048, "num_filter_entries": 10048, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768923044, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 162, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.894472) [db/compaction/compaction_job.cc:1663] [default] [JOB 102] Compacted 1@0 + 1@6 files to L6 => 11281013 bytes
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.911669) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 103.5 rd, 88.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 11.1 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(16.1) write-amplify(7.4) OK, records in: 10567, records dropped: 519 output_compression: NoCompression
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.911717) EVENT_LOG_v1 {"time_micros": 1768923044911701, "job": 102, "event": "compaction_finished", "compaction_time_micros": 127575, "compaction_time_cpu_micros": 67581, "output_level": 6, "num_output_files": 1, "total_output_size": 11281013, "num_input_records": 10567, "num_output_records": 10048, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000161.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044912229, "job": 102, "event": "table_file_deletion", "file_number": 161}
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000159.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923044914437, "job": 102, "event": "table_file_deletion", "file_number": 159}
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.766535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.914595) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.914602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.914605) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.914606) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:30:44.914608) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:30:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:46.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:46.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:46 np0005588919 nova_compute[225855]: 2026-01-20 15:30:46.782 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:48.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:48.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:48 np0005588919 nova_compute[225855]: 2026-01-20 15:30:48.988 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:49 np0005588919 nova_compute[225855]: 2026-01-20 15:30:49.062 225859 INFO nova.compute.manager [None req-b7c0f359-6fea-4f3c-b9a6-227e347dbb63 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Get console output#033[00m
Jan 20 10:30:49 np0005588919 nova_compute[225855]: 2026-01-20 15:30:49.068 263775 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 20 10:30:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:50.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:30:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:50.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:30:50 np0005588919 nova_compute[225855]: 2026-01-20 15:30:50.869 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:50.870 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:30:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:50.871 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:30:50 np0005588919 nova_compute[225855]: 2026-01-20 15:30:50.908 225859 DEBUG nova.compute.manager [req-4cfb5cd0-6dd6-4e37-a494-d2a93c433147 req-b28bd661-8e4c-422b-88b3-3d6303a03183 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received event network-changed-60202d18-26b2-493b-a427-211cda112a80 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:30:50 np0005588919 nova_compute[225855]: 2026-01-20 15:30:50.908 225859 DEBUG nova.compute.manager [req-4cfb5cd0-6dd6-4e37-a494-d2a93c433147 req-b28bd661-8e4c-422b-88b3-3d6303a03183 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Refreshing instance network info cache due to event network-changed-60202d18-26b2-493b-a427-211cda112a80. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:30:50 np0005588919 nova_compute[225855]: 2026-01-20 15:30:50.908 225859 DEBUG oslo_concurrency.lockutils [req-4cfb5cd0-6dd6-4e37-a494-d2a93c433147 req-b28bd661-8e4c-422b-88b3-3d6303a03183 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:30:50 np0005588919 nova_compute[225855]: 2026-01-20 15:30:50.908 225859 DEBUG oslo_concurrency.lockutils [req-4cfb5cd0-6dd6-4e37-a494-d2a93c433147 req-b28bd661-8e4c-422b-88b3-3d6303a03183 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:30:50 np0005588919 nova_compute[225855]: 2026-01-20 15:30:50.909 225859 DEBUG nova.network.neutron [req-4cfb5cd0-6dd6-4e37-a494-d2a93c433147 req-b28bd661-8e4c-422b-88b3-3d6303a03183 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Refreshing network info cache for port 60202d18-26b2-493b-a427-211cda112a80 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:30:51 np0005588919 podman[320274]: 2026-01-20 15:30:51.002671385 +0000 UTC m=+0.045250053 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 20 10:30:51 np0005588919 nova_compute[225855]: 2026-01-20 15:30:51.784 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:52.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:52 np0005588919 nova_compute[225855]: 2026-01-20 15:30:52.579 225859 DEBUG nova.network.neutron [req-4cfb5cd0-6dd6-4e37-a494-d2a93c433147 req-b28bd661-8e4c-422b-88b3-3d6303a03183 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Updated VIF entry in instance network info cache for port 60202d18-26b2-493b-a427-211cda112a80. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:30:52 np0005588919 nova_compute[225855]: 2026-01-20 15:30:52.579 225859 DEBUG nova.network.neutron [req-4cfb5cd0-6dd6-4e37-a494-d2a93c433147 req-b28bd661-8e4c-422b-88b3-3d6303a03183 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Updating instance_info_cache with network_info: [{"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:30:52 np0005588919 nova_compute[225855]: 2026-01-20 15:30:52.603 225859 DEBUG oslo_concurrency.lockutils [req-4cfb5cd0-6dd6-4e37-a494-d2a93c433147 req-b28bd661-8e4c-422b-88b3-3d6303a03183 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:30:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:52.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:53 np0005588919 nova_compute[225855]: 2026-01-20 15:30:53.991 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:54.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:54.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:56.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:56.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:56 np0005588919 nova_compute[225855]: 2026-01-20 15:30:56.787 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:30:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:30:57.873 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:30:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:30:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:30:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:30:58.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:30:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:30:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:30:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:30:58.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:30:58 np0005588919 nova_compute[225855]: 2026-01-20 15:30:58.992 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:00.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:00.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:01 np0005588919 nova_compute[225855]: 2026-01-20 15:31:01.789 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:02.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:02.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:03 np0005588919 nova_compute[225855]: 2026-01-20 15:31:03.994 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:04.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:04.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:06.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:06.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:06 np0005588919 nova_compute[225855]: 2026-01-20 15:31:06.791 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:08.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:08.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:08 np0005588919 nova_compute[225855]: 2026-01-20 15:31:08.996 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:10 np0005588919 nova_compute[225855]: 2026-01-20 15:31:10.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:10 np0005588919 nova_compute[225855]: 2026-01-20 15:31:10.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:31:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:10.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:31:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:10.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:31:11 np0005588919 podman[320354]: 2026-01-20 15:31:11.090431942 +0000 UTC m=+0.105734320 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:31:11 np0005588919 nova_compute[225855]: 2026-01-20 15:31:11.793 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:12 np0005588919 nova_compute[225855]: 2026-01-20 15:31:12.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:12 np0005588919 nova_compute[225855]: 2026-01-20 15:31:12.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:12 np0005588919 nova_compute[225855]: 2026-01-20 15:31:12.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:12 np0005588919 nova_compute[225855]: 2026-01-20 15:31:12.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:12 np0005588919 nova_compute[225855]: 2026-01-20 15:31:12.369 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:31:12 np0005588919 nova_compute[225855]: 2026-01-20 15:31:12.369 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:31:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:12.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:12.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:31:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/46401669' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:31:12 np0005588919 nova_compute[225855]: 2026-01-20 15:31:12.809 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:31:12 np0005588919 nova_compute[225855]: 2026-01-20 15:31:12.885 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000ce as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:31:12 np0005588919 nova_compute[225855]: 2026-01-20 15:31:12.886 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000ce as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.023 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.024 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4034MB free_disk=20.921817779541016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.025 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.025 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.103 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance c2074d47-58a3-49e8-82fd-6bc6145a1ea7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.103 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.104 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.155 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:31:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:31:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3338679589' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.589 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.596 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.611 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.633 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:31:13 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.634 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:14 np0005588919 nova_compute[225855]: 2026-01-20 15:31:13.999 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:14.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:14 np0005588919 nova_compute[225855]: 2026-01-20 15:31:14.634 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:14 np0005588919 nova_compute[225855]: 2026-01-20 15:31:14.634 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:31:14 np0005588919 nova_compute[225855]: 2026-01-20 15:31:14.635 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:31:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:14.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:14 np0005588919 nova_compute[225855]: 2026-01-20 15:31:14.822 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:31:14 np0005588919 nova_compute[225855]: 2026-01-20 15:31:14.823 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:31:14 np0005588919 nova_compute[225855]: 2026-01-20 15:31:14.823 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:31:14 np0005588919 nova_compute[225855]: 2026-01-20 15:31:14.824 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2074d47-58a3-49e8-82fd-6bc6145a1ea7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:31:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:31:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2264811836' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:31:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:16.454 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:16.454 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:16.455 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:16.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:16.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:16 np0005588919 nova_compute[225855]: 2026-01-20 15:31:16.795 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:16 np0005588919 nova_compute[225855]: 2026-01-20 15:31:16.971 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Updating instance_info_cache with network_info: [{"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:31:16 np0005588919 nova_compute[225855]: 2026-01-20 15:31:16.984 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-c2074d47-58a3-49e8-82fd-6bc6145a1ea7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:31:16 np0005588919 nova_compute[225855]: 2026-01-20 15:31:16.984 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:31:16 np0005588919 nova_compute[225855]: 2026-01-20 15:31:16.985 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:16 np0005588919 nova_compute[225855]: 2026-01-20 15:31:16.985 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:17 np0005588919 nova_compute[225855]: 2026-01-20 15:31:17.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:17 np0005588919 nova_compute[225855]: 2026-01-20 15:31:17.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:17 np0005588919 nova_compute[225855]: 2026-01-20 15:31:17.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:18.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:18.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:19 np0005588919 nova_compute[225855]: 2026-01-20 15:31:19.001 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:20.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:20.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:21 np0005588919 nova_compute[225855]: 2026-01-20 15:31:21.797 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:22 np0005588919 podman[320432]: 2026-01-20 15:31:22.016796882 +0000 UTC m=+0.059242902 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:31:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:22.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:22.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:24 np0005588919 nova_compute[225855]: 2026-01-20 15:31:24.005 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:24 np0005588919 nova_compute[225855]: 2026-01-20 15:31:24.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:24.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:24.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:26.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:26.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:26 np0005588919 nova_compute[225855]: 2026-01-20 15:31:26.800 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:28.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:28.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:29 np0005588919 nova_compute[225855]: 2026-01-20 15:31:29.007 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:30.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:30.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:31 np0005588919 nova_compute[225855]: 2026-01-20 15:31:31.802 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:32.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:32.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:33 np0005588919 nova_compute[225855]: 2026-01-20 15:31:33.841 225859 DEBUG oslo_concurrency.lockutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:33 np0005588919 nova_compute[225855]: 2026-01-20 15:31:33.841 225859 DEBUG oslo_concurrency.lockutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:33 np0005588919 nova_compute[225855]: 2026-01-20 15:31:33.841 225859 DEBUG oslo_concurrency.lockutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:33 np0005588919 nova_compute[225855]: 2026-01-20 15:31:33.842 225859 DEBUG oslo_concurrency.lockutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:33 np0005588919 nova_compute[225855]: 2026-01-20 15:31:33.842 225859 DEBUG oslo_concurrency.lockutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:33 np0005588919 nova_compute[225855]: 2026-01-20 15:31:33.843 225859 INFO nova.compute.manager [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Terminating instance#033[00m
Jan 20 10:31:33 np0005588919 nova_compute[225855]: 2026-01-20 15:31:33.844 225859 DEBUG nova.compute.manager [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:31:33 np0005588919 kernel: tap60202d18-26 (unregistering): left promiscuous mode
Jan 20 10:31:33 np0005588919 NetworkManager[49104]: <info>  [1768923093.9103] device (tap60202d18-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:31:33 np0005588919 ovn_controller[130490]: 2026-01-20T15:31:33Z|00959|binding|INFO|Releasing lport 60202d18-26b2-493b-a427-211cda112a80 from this chassis (sb_readonly=0)
Jan 20 10:31:33 np0005588919 ovn_controller[130490]: 2026-01-20T15:31:33Z|00960|binding|INFO|Setting lport 60202d18-26b2-493b-a427-211cda112a80 down in Southbound
Jan 20 10:31:33 np0005588919 nova_compute[225855]: 2026-01-20 15:31:33.917 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:33 np0005588919 ovn_controller[130490]: 2026-01-20T15:31:33Z|00961|binding|INFO|Removing iface tap60202d18-26 ovn-installed in OVS
Jan 20 10:31:33 np0005588919 nova_compute[225855]: 2026-01-20 15:31:33.920 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:33.933 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:f9:3a 10.100.0.11'], port_security=['fa:16:3e:36:f9:3a 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c2074d47-58a3-49e8-82fd-6bc6145a1ea7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0228362f-0ced-4cac-bb89-96bd472df47f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6458a221-63f1-42cc-b15d-f9334e60cb66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=306abd7d-c001-4e00-b2a1-8a251fd6a022, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=60202d18-26b2-493b-a427-211cda112a80) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:31:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:33.935 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 60202d18-26b2-493b-a427-211cda112a80 in datapath 0228362f-0ced-4cac-bb89-96bd472df47f unbound from our chassis#033[00m
Jan 20 10:31:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:33.936 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0228362f-0ced-4cac-bb89-96bd472df47f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:31:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:33.938 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[51c1519d-13d9-4dda-b968-80b1f1a9981c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:31:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:33.938 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f namespace which is not needed anymore#033[00m
Jan 20 10:31:33 np0005588919 nova_compute[225855]: 2026-01-20 15:31:33.940 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:33 np0005588919 systemd[1]: machine-qemu\x2d110\x2dinstance\x2d000000ce.scope: Deactivated successfully.
Jan 20 10:31:33 np0005588919 systemd[1]: machine-qemu\x2d110\x2dinstance\x2d000000ce.scope: Consumed 15.218s CPU time.
Jan 20 10:31:33 np0005588919 systemd-machined[194361]: Machine qemu-110-instance-000000ce terminated.
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.008 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:34 np0005588919 neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f[320122]: [NOTICE]   (320126) : haproxy version is 2.8.14-c23fe91
Jan 20 10:31:34 np0005588919 neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f[320122]: [NOTICE]   (320126) : path to executable is /usr/sbin/haproxy
Jan 20 10:31:34 np0005588919 neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f[320122]: [WARNING]  (320126) : Exiting Master process...
Jan 20 10:31:34 np0005588919 neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f[320122]: [ALERT]    (320126) : Current worker (320128) exited with code 143 (Terminated)
Jan 20 10:31:34 np0005588919 neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f[320122]: [WARNING]  (320126) : All workers exited. Exiting... (0)
Jan 20 10:31:34 np0005588919 systemd[1]: libpod-220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43.scope: Deactivated successfully.
Jan 20 10:31:34 np0005588919 podman[320532]: 2026-01-20 15:31:34.05796173 +0000 UTC m=+0.042626288 container died 220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.062 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.069 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.076 225859 INFO nova.virt.libvirt.driver [-] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Instance destroyed successfully.#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.076 225859 DEBUG nova.objects.instance [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'resources' on Instance uuid c2074d47-58a3-49e8-82fd-6bc6145a1ea7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:31:34 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43-userdata-shm.mount: Deactivated successfully.
Jan 20 10:31:34 np0005588919 systemd[1]: var-lib-containers-storage-overlay-5492aad4f22230a780618b9ba63a9c548a97b7ff7061366af2c207314fa31229-merged.mount: Deactivated successfully.
Jan 20 10:31:34 np0005588919 podman[320532]: 2026-01-20 15:31:34.096705376 +0000 UTC m=+0.081369924 container cleanup 220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.099 225859 DEBUG nova.virt.libvirt.vif [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-761395495',display_name='tempest-TestNetworkBasicOps-server-761395495',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-761395495',id=206,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA6PcyU5b6KJgECZvP75RVISUjV8spB81h3nAjsUONZi4KISBeJ3H+m9LFQCp72IhdPL4TNE6iitZI83oIzTSr0WLM1hF9NfU7ED77LiXjCqrZKn4HPslanwlp/Qjc+bCQ==',key_name='tempest-TestNetworkBasicOps-1783413232',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:30:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-mwqa18l0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:30:29Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=c2074d47-58a3-49e8-82fd-6bc6145a1ea7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.099 225859 DEBUG nova.network.os_vif_util [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "60202d18-26b2-493b-a427-211cda112a80", "address": "fa:16:3e:36:f9:3a", "network": {"id": "0228362f-0ced-4cac-bb89-96bd472df47f", "bridge": "br-int", "label": "tempest-network-smoke--197661202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60202d18-26", "ovs_interfaceid": "60202d18-26b2-493b-a427-211cda112a80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.100 225859 DEBUG nova.network.os_vif_util [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:f9:3a,bridge_name='br-int',has_traffic_filtering=True,id=60202d18-26b2-493b-a427-211cda112a80,network=Network(0228362f-0ced-4cac-bb89-96bd472df47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60202d18-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.101 225859 DEBUG os_vif [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:f9:3a,bridge_name='br-int',has_traffic_filtering=True,id=60202d18-26b2-493b-a427-211cda112a80,network=Network(0228362f-0ced-4cac-bb89-96bd472df47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60202d18-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.103 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.103 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60202d18-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:31:34 np0005588919 systemd[1]: libpod-conmon-220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43.scope: Deactivated successfully.
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.105 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.106 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.109 225859 INFO os_vif [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:f9:3a,bridge_name='br-int',has_traffic_filtering=True,id=60202d18-26b2-493b-a427-211cda112a80,network=Network(0228362f-0ced-4cac-bb89-96bd472df47f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60202d18-26')#033[00m
Jan 20 10:31:34 np0005588919 podman[320573]: 2026-01-20 15:31:34.1542926 +0000 UTC m=+0.036551824 container remove 220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 10:31:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:34.158 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[36a4dc8d-bbc2-4e25-b931-a4f043eafbd1]: (4, ('Tue Jan 20 03:31:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f (220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43)\n220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43\nTue Jan 20 03:31:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f (220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43)\n220fb88226bb4adc08a6d6e8c007dceb68fb2fe39546e4b37d2a422566ff7b43\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:31:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:34.160 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0d80f30e-aa83-442a-af55-6d0ae3291714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:31:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:34.161 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0228362f-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.162 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:34 np0005588919 kernel: tap0228362f-00: left promiscuous mode
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.174 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:34.176 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5e67beb3-26a6-4229-9222-369609682e27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:31:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:34.188 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fcbfd4fd-c729-424c-a0a1-7fc5d42be9f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:31:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:34.189 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[98e0856f-ca3d-411a-895e-dc88459ccb57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:31:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:34.205 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[67abcb9b-ac55-4ad9-9ba5-9bf789037656]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 806792, 'reachable_time': 36443, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 320606, 'error': None, 'target': 'ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:31:34 np0005588919 systemd[1]: run-netns-ovnmeta\x2d0228362f\x2d0ced\x2d4cac\x2dbb89\x2d96bd472df47f.mount: Deactivated successfully.
Jan 20 10:31:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:34.209 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0228362f-0ced-4cac-bb89-96bd472df47f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:31:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:34.209 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[08532f09-7406-4251-a509-7af40de05aba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.470 225859 INFO nova.virt.libvirt.driver [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Deleting instance files /var/lib/nova/instances/c2074d47-58a3-49e8-82fd-6bc6145a1ea7_del#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.471 225859 INFO nova.virt.libvirt.driver [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Deletion of /var/lib/nova/instances/c2074d47-58a3-49e8-82fd-6bc6145a1ea7_del complete#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.560 225859 INFO nova.compute.manager [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Took 0.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.561 225859 DEBUG oslo.service.loopingcall [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.561 225859 DEBUG nova.compute.manager [-] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.561 225859 DEBUG nova.network.neutron [-] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:31:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:34.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.643 225859 DEBUG nova.compute.manager [req-6aa51df5-ffdf-436b-b095-c10f9d525a39 req-e33871ab-e95e-4e57-b70c-f4225524f920 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received event network-vif-unplugged-60202d18-26b2-493b-a427-211cda112a80 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.644 225859 DEBUG oslo_concurrency.lockutils [req-6aa51df5-ffdf-436b-b095-c10f9d525a39 req-e33871ab-e95e-4e57-b70c-f4225524f920 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.644 225859 DEBUG oslo_concurrency.lockutils [req-6aa51df5-ffdf-436b-b095-c10f9d525a39 req-e33871ab-e95e-4e57-b70c-f4225524f920 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.644 225859 DEBUG oslo_concurrency.lockutils [req-6aa51df5-ffdf-436b-b095-c10f9d525a39 req-e33871ab-e95e-4e57-b70c-f4225524f920 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.645 225859 DEBUG nova.compute.manager [req-6aa51df5-ffdf-436b-b095-c10f9d525a39 req-e33871ab-e95e-4e57-b70c-f4225524f920 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] No waiting events found dispatching network-vif-unplugged-60202d18-26b2-493b-a427-211cda112a80 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:31:34 np0005588919 nova_compute[225855]: 2026-01-20 15:31:34.645 225859 DEBUG nova.compute.manager [req-6aa51df5-ffdf-436b-b095-c10f9d525a39 req-e33871ab-e95e-4e57-b70c-f4225524f920 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received event network-vif-unplugged-60202d18-26b2-493b-a427-211cda112a80 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:31:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:34.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:35 np0005588919 nova_compute[225855]: 2026-01-20 15:31:35.729 225859 DEBUG nova.network.neutron [-] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:31:35 np0005588919 nova_compute[225855]: 2026-01-20 15:31:35.765 225859 INFO nova.compute.manager [-] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Took 1.20 seconds to deallocate network for instance.#033[00m
Jan 20 10:31:35 np0005588919 nova_compute[225855]: 2026-01-20 15:31:35.830 225859 DEBUG oslo_concurrency.lockutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:35 np0005588919 nova_compute[225855]: 2026-01-20 15:31:35.831 225859 DEBUG oslo_concurrency.lockutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:35 np0005588919 nova_compute[225855]: 2026-01-20 15:31:35.848 225859 DEBUG nova.compute.manager [req-c4f4b203-3d85-4adc-9d31-425cc8af382c req-b1a0bc6e-518b-480a-a0fc-0f8e5633c5e0 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received event network-vif-deleted-60202d18-26b2-493b-a427-211cda112a80 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:31:35 np0005588919 nova_compute[225855]: 2026-01-20 15:31:35.888 225859 DEBUG oslo_concurrency.processutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:31:36 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:31:36 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/768340541' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.330 225859 DEBUG oslo_concurrency.processutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.336 225859 DEBUG nova.compute.provider_tree [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.365 225859 DEBUG nova.scheduler.client.report [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.392 225859 DEBUG oslo_concurrency.lockutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.426 225859 INFO nova.scheduler.client.report [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Deleted allocations for instance c2074d47-58a3-49e8-82fd-6bc6145a1ea7#033[00m
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.528 225859 DEBUG oslo_concurrency.lockutils [None req-97a779a1-cb81-4dd3-846a-094c02b4513d 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:36.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:36.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.764 225859 DEBUG nova.compute.manager [req-7068a6eb-8ffe-49e5-950a-8c2a848a2bd4 req-0d9c7789-b765-47d3-bcba-62eae64c725e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received event network-vif-plugged-60202d18-26b2-493b-a427-211cda112a80 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.765 225859 DEBUG oslo_concurrency.lockutils [req-7068a6eb-8ffe-49e5-950a-8c2a848a2bd4 req-0d9c7789-b765-47d3-bcba-62eae64c725e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.765 225859 DEBUG oslo_concurrency.lockutils [req-7068a6eb-8ffe-49e5-950a-8c2a848a2bd4 req-0d9c7789-b765-47d3-bcba-62eae64c725e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.765 225859 DEBUG oslo_concurrency.lockutils [req-7068a6eb-8ffe-49e5-950a-8c2a848a2bd4 req-0d9c7789-b765-47d3-bcba-62eae64c725e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "c2074d47-58a3-49e8-82fd-6bc6145a1ea7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.765 225859 DEBUG nova.compute.manager [req-7068a6eb-8ffe-49e5-950a-8c2a848a2bd4 req-0d9c7789-b765-47d3-bcba-62eae64c725e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] No waiting events found dispatching network-vif-plugged-60202d18-26b2-493b-a427-211cda112a80 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:31:36 np0005588919 nova_compute[225855]: 2026-01-20 15:31:36.765 225859 WARNING nova.compute.manager [req-7068a6eb-8ffe-49e5-950a-8c2a848a2bd4 req-0d9c7789-b765-47d3-bcba-62eae64c725e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Received unexpected event network-vif-plugged-60202d18-26b2-493b-a427-211cda112a80 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:31:37 np0005588919 nova_compute[225855]: 2026-01-20 15:31:37.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:31:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:38.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:31:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:38.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:31:39 np0005588919 nova_compute[225855]: 2026-01-20 15:31:39.010 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:39 np0005588919 nova_compute[225855]: 2026-01-20 15:31:39.104 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:31:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:31:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:31:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:31:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:40.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:40.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:40 np0005588919 nova_compute[225855]: 2026-01-20 15:31:40.938 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:41 np0005588919 nova_compute[225855]: 2026-01-20 15:31:41.049 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:41.687 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:31:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:41.688 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:31:41 np0005588919 nova_compute[225855]: 2026-01-20 15:31:41.746 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:42 np0005588919 podman[320769]: 2026-01-20 15:31:42.069942663 +0000 UTC m=+0.112416131 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:31:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:42.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:31:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:42.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:31:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:44 np0005588919 nova_compute[225855]: 2026-01-20 15:31:44.067 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:44 np0005588919 nova_compute[225855]: 2026-01-20 15:31:44.105 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:44.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:44.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:31:45.689 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:31:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:46.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:46.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:31:46 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:31:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:48.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:31:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:48.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:31:49 np0005588919 nova_compute[225855]: 2026-01-20 15:31:49.070 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:49 np0005588919 nova_compute[225855]: 2026-01-20 15:31:49.075 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923094.0744, c2074d47-58a3-49e8-82fd-6bc6145a1ea7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:31:49 np0005588919 nova_compute[225855]: 2026-01-20 15:31:49.076 225859 INFO nova.compute.manager [-] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:31:49 np0005588919 nova_compute[225855]: 2026-01-20 15:31:49.103 225859 DEBUG nova.compute.manager [None req-c44a54da-2c96-4712-b243-b3e332e10686 - - - - - -] [instance: c2074d47-58a3-49e8-82fd-6bc6145a1ea7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:31:49 np0005588919 nova_compute[225855]: 2026-01-20 15:31:49.108 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:50.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:31:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:50.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:31:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:52.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:52.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:52 np0005588919 podman[320900]: 2026-01-20 15:31:52.993737111 +0000 UTC m=+0.045500320 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:31:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:54 np0005588919 nova_compute[225855]: 2026-01-20 15:31:54.072 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:54 np0005588919 nova_compute[225855]: 2026-01-20 15:31:54.110 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:54.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:54.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:56.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:56.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:31:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:31:58.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:31:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:31:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:31:58.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:31:59 np0005588919 nova_compute[225855]: 2026-01-20 15:31:59.074 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:31:59 np0005588919 nova_compute[225855]: 2026-01-20 15:31:59.112 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:00.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:00.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:02.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:02.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:04 np0005588919 nova_compute[225855]: 2026-01-20 15:32:04.075 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:04 np0005588919 nova_compute[225855]: 2026-01-20 15:32:04.113 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:04.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:04.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:06.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:06.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:08.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:08.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:09 np0005588919 nova_compute[225855]: 2026-01-20 15:32:09.114 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:32:09 np0005588919 nova_compute[225855]: 2026-01-20 15:32:09.116 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:32:09 np0005588919 nova_compute[225855]: 2026-01-20 15:32:09.116 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 20 10:32:09 np0005588919 nova_compute[225855]: 2026-01-20 15:32:09.116 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:32:09 np0005588919 nova_compute[225855]: 2026-01-20 15:32:09.133 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:09 np0005588919 nova_compute[225855]: 2026-01-20 15:32:09.134 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 20 10:32:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:10.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:10.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:12 np0005588919 nova_compute[225855]: 2026-01-20 15:32:12.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:12 np0005588919 nova_compute[225855]: 2026-01-20 15:32:12.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:32:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:12.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:12.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:13 np0005588919 podman[320981]: 2026-01-20 15:32:13.036727712 +0000 UTC m=+0.081614281 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.367 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.367 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.368 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.368 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:32:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3001522942' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.795 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.940 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.942 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4266MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.942 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:32:13 np0005588919 nova_compute[225855]: 2026-01-20 15:32:13.942 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:32:14 np0005588919 nova_compute[225855]: 2026-01-20 15:32:14.060 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:32:14 np0005588919 nova_compute[225855]: 2026-01-20 15:32:14.060 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:32:14 np0005588919 nova_compute[225855]: 2026-01-20 15:32:14.123 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:14 np0005588919 nova_compute[225855]: 2026-01-20 15:32:14.146 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:14 np0005588919 nova_compute[225855]: 2026-01-20 15:32:14.149 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:32:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:32:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3050195879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:32:14 np0005588919 nova_compute[225855]: 2026-01-20 15:32:14.563 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:32:14 np0005588919 nova_compute[225855]: 2026-01-20 15:32:14.568 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:32:14 np0005588919 nova_compute[225855]: 2026-01-20 15:32:14.595 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:32:14 np0005588919 nova_compute[225855]: 2026-01-20 15:32:14.618 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 10:32:14 np0005588919 nova_compute[225855]: 2026-01-20 15:32:14.618 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:32:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:14.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:14.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:16 np0005588919 ovn_controller[130490]: 2026-01-20T15:32:16Z|00962|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 20 10:32:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:16.455 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:32:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:16.455 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:32:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:16.455 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:32:16 np0005588919 nova_compute[225855]: 2026-01-20 15:32:16.619 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:32:16 np0005588919 nova_compute[225855]: 2026-01-20 15:32:16.620 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:32:16 np0005588919 nova_compute[225855]: 2026-01-20 15:32:16.620 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:32:16 np0005588919 nova_compute[225855]: 2026-01-20 15:32:16.633 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 20 10:32:16 np0005588919 nova_compute[225855]: 2026-01-20 15:32:16.633 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:32:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:16.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:16.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:17 np0005588919 nova_compute[225855]: 2026-01-20 15:32:17.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:32:18 np0005588919 nova_compute[225855]: 2026-01-20 15:32:18.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:32:18 np0005588919 nova_compute[225855]: 2026-01-20 15:32:18.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:32:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:18.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:18.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:19 np0005588919 nova_compute[225855]: 2026-01-20 15:32:19.137 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:32:19 np0005588919 nova_compute[225855]: 2026-01-20 15:32:19.148 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:32:19 np0005588919 nova_compute[225855]: 2026-01-20 15:32:19.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:32:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:20.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:20.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:22.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:22.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:24 np0005588919 podman[321058]: 2026-01-20 15:32:24.004411673 +0000 UTC m=+0.053422736 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:32:24 np0005588919 nova_compute[225855]: 2026-01-20 15:32:24.139 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:32:24 np0005588919 nova_compute[225855]: 2026-01-20 15:32:24.149 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:32:24 np0005588919 nova_compute[225855]: 2026-01-20 15:32:24.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:32:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:24.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:24.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:26.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:26.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:28.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:28.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:29 np0005588919 nova_compute[225855]: 2026-01-20 15:32:29.141 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:32:29 np0005588919 nova_compute[225855]: 2026-01-20 15:32:29.151 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:32:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:30.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:30.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:32.510 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:32:32 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:32.511 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 10:32:32 np0005588919 nova_compute[225855]: 2026-01-20 15:32:32.510 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:32:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:32.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:32.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #163. Immutable memtables: 0.
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.314553) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 103] Flushing memtable with next log file: 163
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153314611, "job": 103, "event": "flush_started", "num_memtables": 1, "num_entries": 1257, "num_deletes": 256, "total_data_size": 2756920, "memory_usage": 2790464, "flush_reason": "Manual Compaction"}
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 103] Level-0 flush table #164: started
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153331816, "cf_name": "default", "job": 103, "event": "table_file_creation", "file_number": 164, "file_size": 1819587, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 79332, "largest_seqno": 80584, "table_properties": {"data_size": 1814121, "index_size": 2861, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11617, "raw_average_key_size": 19, "raw_value_size": 1803194, "raw_average_value_size": 3025, "num_data_blocks": 127, "num_entries": 596, "num_filter_entries": 596, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923045, "oldest_key_time": 1768923045, "file_creation_time": 1768923153, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 164, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 103] Flush lasted 17344 microseconds, and 4721 cpu microseconds.
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.331889) [db/flush_job.cc:967] [default] [JOB 103] Level-0 flush table #164: 1819587 bytes OK
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.331914) [db/memtable_list.cc:519] [default] Level-0 commit table #164 started
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.332902) [db/memtable_list.cc:722] [default] Level-0 commit table #164: memtable #1 done
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.332916) EVENT_LOG_v1 {"time_micros": 1768923153332912, "job": 103, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.332937) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 103] Try to delete WAL files size 2750937, prev total WAL file size 2750937, number of live WAL files 2.
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000160.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.333586) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033303232' seq:72057594037927935, type:22 .. '6C6F676D0033323734' seq:0, type:0; will stop at (end)
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 104] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 103 Base level 0, inputs: [164(1776KB)], [162(10MB)]
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153333662, "job": 104, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [164], "files_L6": [162], "score": -1, "input_data_size": 13100600, "oldest_snapshot_seqno": -1}
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 104] Generated table #165: 10119 keys, 12969330 bytes, temperature: kUnknown
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153484960, "cf_name": "default", "job": 104, "event": "table_file_creation", "file_number": 165, "file_size": 12969330, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12904237, "index_size": 38676, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25349, "raw_key_size": 267308, "raw_average_key_size": 26, "raw_value_size": 12727179, "raw_average_value_size": 1257, "num_data_blocks": 1473, "num_entries": 10119, "num_filter_entries": 10119, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768923153, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 165, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.485624) [db/compaction/compaction_job.cc:1663] [default] [JOB 104] Compacted 1@0 + 1@6 files to L6 => 12969330 bytes
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.487146) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 86.5 rd, 85.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 10.8 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(14.3) write-amplify(7.1) OK, records in: 10644, records dropped: 525 output_compression: NoCompression
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.487180) EVENT_LOG_v1 {"time_micros": 1768923153487167, "job": 104, "event": "compaction_finished", "compaction_time_micros": 151418, "compaction_time_cpu_micros": 32496, "output_level": 6, "num_output_files": 1, "total_output_size": 12969330, "num_input_records": 10644, "num_output_records": 10119, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000164.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153487623, "job": 104, "event": "table_file_deletion", "file_number": 164}
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000162.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923153489520, "job": 104, "event": "table_file_deletion", "file_number": 162}
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.333504) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.489719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.489728) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.489730) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.489732) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:32:33.489735) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:32:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:34 np0005588919 nova_compute[225855]: 2026-01-20 15:32:34.143 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:32:34 np0005588919 nova_compute[225855]: 2026-01-20 15:32:34.152 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:32:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:34.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:34.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:36.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:36.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:37 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:37.512 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:32:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:38.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:39 np0005588919 nova_compute[225855]: 2026-01-20 15:32:39.144 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:39 np0005588919 nova_compute[225855]: 2026-01-20 15:32:39.152 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:40.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:40.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:42.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:42.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:44 np0005588919 podman[321138]: 2026-01-20 15:32:44.034650752 +0000 UTC m=+0.079469570 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:32:44 np0005588919 nova_compute[225855]: 2026-01-20 15:32:44.146 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:44 np0005588919 nova_compute[225855]: 2026-01-20 15:32:44.153 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:44.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:44.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:46.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:46.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:47 np0005588919 podman[321336]: 2026-01-20 15:32:47.187789278 +0000 UTC m=+0.070241707 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 10:32:47 np0005588919 podman[321336]: 2026-01-20 15:32:47.326298733 +0000 UTC m=+0.208751122 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 20 10:32:47 np0005588919 nova_compute[225855]: 2026-01-20 15:32:47.407 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:32:47 np0005588919 nova_compute[225855]: 2026-01-20 15:32:47.409 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:32:47 np0005588919 nova_compute[225855]: 2026-01-20 15:32:47.428 225859 DEBUG nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:32:47 np0005588919 nova_compute[225855]: 2026-01-20 15:32:47.512 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:32:47 np0005588919 nova_compute[225855]: 2026-01-20 15:32:47.513 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:32:47 np0005588919 nova_compute[225855]: 2026-01-20 15:32:47.537 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:32:47 np0005588919 nova_compute[225855]: 2026-01-20 15:32:47.537 225859 INFO nova.compute.claims [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:32:47 np0005588919 nova_compute[225855]: 2026-01-20 15:32:47.690 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:47 np0005588919 podman[321496]: 2026-01-20 15:32:47.859586738 +0000 UTC m=+0.052781228 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:32:47 np0005588919 podman[321496]: 2026-01-20 15:32:47.869347316 +0000 UTC m=+0.062541796 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:32:48 np0005588919 podman[321580]: 2026-01-20 15:32:48.074320149 +0000 UTC m=+0.063443763 container exec e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=keepalived-container, name=keepalived, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, release=1793, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, version=2.2.4, description=keepalived for Ceph, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 20 10:32:48 np0005588919 podman[321580]: 2026-01-20 15:32:48.08733003 +0000 UTC m=+0.076453664 container exec_died e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, architecture=x86_64, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, version=2.2.4, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, name=keepalived, vcs-type=git, vendor=Red Hat, Inc.)
Jan 20 10:32:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:32:48 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1085350766' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.114 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.120 225859 DEBUG nova.compute.provider_tree [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.152 225859 DEBUG nova.scheduler.client.report [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.191 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.192 225859 DEBUG nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.246 225859 DEBUG nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.247 225859 DEBUG nova.network.neutron [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.287 225859 INFO nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.314 225859 DEBUG nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.396 225859 DEBUG nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.397 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.398 225859 INFO nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Creating image(s)#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.427 225859 DEBUG nova.storage.rbd_utils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.457 225859 DEBUG nova.storage.rbd_utils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.483 225859 DEBUG nova.storage.rbd_utils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.487 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.552 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.553 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.554 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.554 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.582 225859 DEBUG nova.storage.rbd_utils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.586 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.613 225859 DEBUG nova.policy [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:32:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:48.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:48.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.866 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.280s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:32:48 np0005588919 nova_compute[225855]: 2026-01-20 15:32:48.940 225859 DEBUG nova.storage.rbd_utils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] resizing rbd image 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:32:49 np0005588919 nova_compute[225855]: 2026-01-20 15:32:49.047 225859 DEBUG nova.objects.instance [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'migration_context' on Instance uuid 054e01d8-c9d1-4fb3-99e1-d417718d48c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:32:49 np0005588919 nova_compute[225855]: 2026-01-20 15:32:49.064 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:32:49 np0005588919 nova_compute[225855]: 2026-01-20 15:32:49.065 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Ensure instance console log exists: /var/lib/nova/instances/054e01d8-c9d1-4fb3-99e1-d417718d48c9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:32:49 np0005588919 nova_compute[225855]: 2026-01-20 15:32:49.066 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:32:49 np0005588919 nova_compute[225855]: 2026-01-20 15:32:49.066 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:32:49 np0005588919 nova_compute[225855]: 2026-01-20 15:32:49.067 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:32:49 np0005588919 nova_compute[225855]: 2026-01-20 15:32:49.148 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:49 np0005588919 nova_compute[225855]: 2026-01-20 15:32:49.154 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:49 np0005588919 nova_compute[225855]: 2026-01-20 15:32:49.737 225859 DEBUG nova.network.neutron [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Successfully created port: 71bbd457-6ff9-4170-b4f0-18fb471606d4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:32:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:32:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:32:50 np0005588919 nova_compute[225855]: 2026-01-20 15:32:50.664 225859 DEBUG nova.network.neutron [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Successfully updated port: 71bbd457-6ff9-4170-b4f0-18fb471606d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:32:50 np0005588919 nova_compute[225855]: 2026-01-20 15:32:50.681 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-054e01d8-c9d1-4fb3-99e1-d417718d48c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:32:50 np0005588919 nova_compute[225855]: 2026-01-20 15:32:50.681 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-054e01d8-c9d1-4fb3-99e1-d417718d48c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:32:50 np0005588919 nova_compute[225855]: 2026-01-20 15:32:50.681 225859 DEBUG nova.network.neutron [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:32:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:50.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:50 np0005588919 nova_compute[225855]: 2026-01-20 15:32:50.761 225859 DEBUG nova.compute.manager [req-8ddf2c85-37d3-46ae-be05-39ce818013fc req-83ace2c5-1087-4649-b9fa-54d0ca7cfcea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Received event network-changed-71bbd457-6ff9-4170-b4f0-18fb471606d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:32:50 np0005588919 nova_compute[225855]: 2026-01-20 15:32:50.761 225859 DEBUG nova.compute.manager [req-8ddf2c85-37d3-46ae-be05-39ce818013fc req-83ace2c5-1087-4649-b9fa-54d0ca7cfcea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Refreshing instance network info cache due to event network-changed-71bbd457-6ff9-4170-b4f0-18fb471606d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:32:50 np0005588919 nova_compute[225855]: 2026-01-20 15:32:50.761 225859 DEBUG oslo_concurrency.lockutils [req-8ddf2c85-37d3-46ae-be05-39ce818013fc req-83ace2c5-1087-4649-b9fa-54d0ca7cfcea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-054e01d8-c9d1-4fb3-99e1-d417718d48c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:32:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:50.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:50 np0005588919 nova_compute[225855]: 2026-01-20 15:32:50.820 225859 DEBUG nova.network.neutron [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.746 225859 DEBUG nova.network.neutron [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Updating instance_info_cache with network_info: [{"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.769 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-054e01d8-c9d1-4fb3-99e1-d417718d48c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.770 225859 DEBUG nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Instance network_info: |[{"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.771 225859 DEBUG oslo_concurrency.lockutils [req-8ddf2c85-37d3-46ae-be05-39ce818013fc req-83ace2c5-1087-4649-b9fa-54d0ca7cfcea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-054e01d8-c9d1-4fb3-99e1-d417718d48c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.772 225859 DEBUG nova.network.neutron [req-8ddf2c85-37d3-46ae-be05-39ce818013fc req-83ace2c5-1087-4649-b9fa-54d0ca7cfcea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Refreshing network info cache for port 71bbd457-6ff9-4170-b4f0-18fb471606d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.777 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Start _get_guest_xml network_info=[{"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.784 225859 WARNING nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.791 225859 DEBUG nova.virt.libvirt.host [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.792 225859 DEBUG nova.virt.libvirt.host [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.802 225859 DEBUG nova.virt.libvirt.host [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.803 225859 DEBUG nova.virt.libvirt.host [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.805 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.805 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.806 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.807 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.807 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.808 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.808 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.809 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.809 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.810 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.811 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.811 225859 DEBUG nova.virt.hardware [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:32:51 np0005588919 nova_compute[225855]: 2026-01-20 15:32:51.816 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:32:52 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/62047383' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.279 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.304 225859 DEBUG nova.storage.rbd_utils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.308 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:52 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:32:52 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1025644885' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.725 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.727 225859 DEBUG nova.virt.libvirt.vif [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:32:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1975990603',display_name='tempest-TestNetworkBasicOps-server-1975990603',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1975990603',id=209,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIIq5p1Z8aKbdSJMUPSMnjWUaTZorIMa+mXmK10gXmX/oHg+Z5q1Rmf+/0TauJDUZqczNGvwDzE8yxRK1lxgnRI2fdz8rl+BuPz+yhlF83YWDX8Jzvo5YEkj80ZkenoXA==',key_name='tempest-TestNetworkBasicOps-1869687346',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-ox7roysq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:32:48Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=054e01d8-c9d1-4fb3-99e1-d417718d48c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.727 225859 DEBUG nova.network.os_vif_util [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.728 225859 DEBUG nova.network.os_vif_util [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:ea:56,bridge_name='br-int',has_traffic_filtering=True,id=71bbd457-6ff9-4170-b4f0-18fb471606d4,network=Network(e1610a22-2f29-4495-85e7-ab2081f73701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71bbd457-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.729 225859 DEBUG nova.objects.instance [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_devices' on Instance uuid 054e01d8-c9d1-4fb3-99e1-d417718d48c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:32:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:52.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.745 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <uuid>054e01d8-c9d1-4fb3-99e1-d417718d48c9</uuid>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <name>instance-000000d1</name>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkBasicOps-server-1975990603</nova:name>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:32:51</nova:creationTime>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <nova:port uuid="71bbd457-6ff9-4170-b4f0-18fb471606d4">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <entry name="serial">054e01d8-c9d1-4fb3-99e1-d417718d48c9</entry>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <entry name="uuid">054e01d8-c9d1-4fb3-99e1-d417718d48c9</entry>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk.config">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:65:ea:56"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <target dev="tap71bbd457-6f"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/054e01d8-c9d1-4fb3-99e1-d417718d48c9/console.log" append="off"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:32:52 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:32:52 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:32:52 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:32:52 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.747 225859 DEBUG nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Preparing to wait for external event network-vif-plugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.747 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.747 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.747 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.748 225859 DEBUG nova.virt.libvirt.vif [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:32:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1975990603',display_name='tempest-TestNetworkBasicOps-server-1975990603',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1975990603',id=209,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIIq5p1Z8aKbdSJMUPSMnjWUaTZorIMa+mXmK10gXmX/oHg+Z5q1Rmf+/0TauJDUZqczNGvwDzE8yxRK1lxgnRI2fdz8rl+BuPz+yhlF83YWDX8Jzvo5YEkj80ZkenoXA==',key_name='tempest-TestNetworkBasicOps-1869687346',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-ox7roysq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:32:48Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=054e01d8-c9d1-4fb3-99e1-d417718d48c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.748 225859 DEBUG nova.network.os_vif_util [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.749 225859 DEBUG nova.network.os_vif_util [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:ea:56,bridge_name='br-int',has_traffic_filtering=True,id=71bbd457-6ff9-4170-b4f0-18fb471606d4,network=Network(e1610a22-2f29-4495-85e7-ab2081f73701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71bbd457-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.749 225859 DEBUG os_vif [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:ea:56,bridge_name='br-int',has_traffic_filtering=True,id=71bbd457-6ff9-4170-b4f0-18fb471606d4,network=Network(e1610a22-2f29-4495-85e7-ab2081f73701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71bbd457-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.750 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.751 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.751 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.754 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.754 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71bbd457-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.755 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap71bbd457-6f, col_values=(('external_ids', {'iface-id': '71bbd457-6ff9-4170-b4f0-18fb471606d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:ea:56', 'vm-uuid': '054e01d8-c9d1-4fb3-99e1-d417718d48c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.758 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:32:52 np0005588919 NetworkManager[49104]: <info>  [1768923172.7582] manager: (tap71bbd457-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/410)
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.762 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.764 225859 INFO os_vif [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:ea:56,bridge_name='br-int',has_traffic_filtering=True,id=71bbd457-6ff9-4170-b4f0-18fb471606d4,network=Network(e1610a22-2f29-4495-85e7-ab2081f73701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71bbd457-6f')#033[00m
Jan 20 10:32:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:52.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.808 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.808 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.809 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:65:ea:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.809 225859 INFO nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Using config drive#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.831 225859 DEBUG nova.storage.rbd_utils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.961 225859 DEBUG nova.network.neutron [req-8ddf2c85-37d3-46ae-be05-39ce818013fc req-83ace2c5-1087-4649-b9fa-54d0ca7cfcea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Updated VIF entry in instance network info cache for port 71bbd457-6ff9-4170-b4f0-18fb471606d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.962 225859 DEBUG nova.network.neutron [req-8ddf2c85-37d3-46ae-be05-39ce818013fc req-83ace2c5-1087-4649-b9fa-54d0ca7cfcea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Updating instance_info_cache with network_info: [{"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:32:52 np0005588919 nova_compute[225855]: 2026-01-20 15:32:52.994 225859 DEBUG oslo_concurrency.lockutils [req-8ddf2c85-37d3-46ae-be05-39ce818013fc req-83ace2c5-1087-4649-b9fa-54d0ca7cfcea 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-054e01d8-c9d1-4fb3-99e1-d417718d48c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.232 225859 INFO nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Creating config drive at /var/lib/nova/instances/054e01d8-c9d1-4fb3-99e1-d417718d48c9/disk.config#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.240 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/054e01d8-c9d1-4fb3-99e1-d417718d48c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphke05hdm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.375 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/054e01d8-c9d1-4fb3-99e1-d417718d48c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphke05hdm" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.401 225859 DEBUG nova.storage.rbd_utils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.405 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/054e01d8-c9d1-4fb3-99e1-d417718d48c9/disk.config 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:32:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.549 225859 DEBUG oslo_concurrency.processutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/054e01d8-c9d1-4fb3-99e1-d417718d48c9/disk.config 054e01d8-c9d1-4fb3-99e1-d417718d48c9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.550 225859 INFO nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Deleting local config drive /var/lib/nova/instances/054e01d8-c9d1-4fb3-99e1-d417718d48c9/disk.config because it was imported into RBD.#033[00m
Jan 20 10:32:53 np0005588919 kernel: tap71bbd457-6f: entered promiscuous mode
Jan 20 10:32:53 np0005588919 NetworkManager[49104]: <info>  [1768923173.6015] manager: (tap71bbd457-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/411)
Jan 20 10:32:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:32:53Z|00963|binding|INFO|Claiming lport 71bbd457-6ff9-4170-b4f0-18fb471606d4 for this chassis.
Jan 20 10:32:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:32:53Z|00964|binding|INFO|71bbd457-6ff9-4170-b4f0-18fb471606d4: Claiming fa:16:3e:65:ea:56 10.100.0.20
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.645 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.651 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.657 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:ea:56 10.100.0.20'], port_security=['fa:16:3e:65:ea:56 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': '054e01d8-c9d1-4fb3-99e1-d417718d48c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e1610a22-2f29-4495-85e7-ab2081f73701', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '436420ae-5ad2-462b-90ca-5a96acbe39fc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=733f71a0-4d98-4c07-b692-f20cf2a632ed, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=71bbd457-6ff9-4170-b4f0-18fb471606d4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.658 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 71bbd457-6ff9-4170-b4f0-18fb471606d4 in datapath e1610a22-2f29-4495-85e7-ab2081f73701 bound to our chassis#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.659 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e1610a22-2f29-4495-85e7-ab2081f73701#033[00m
Jan 20 10:32:53 np0005588919 systemd-udevd[322098]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.670 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d9381682-eb95-4fde-bebd-847671125bd3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.672 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape1610a22-21 in ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:32:53 np0005588919 systemd-machined[194361]: New machine qemu-111-instance-000000d1.
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.673 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape1610a22-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.673 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cfefeb7c-df45-42ba-800a-fd678ae3e1ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.674 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d2cf90d8-5e56-4d7d-9f64-0de3777f3530]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 NetworkManager[49104]: <info>  [1768923173.6808] device (tap71bbd457-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:32:53 np0005588919 NetworkManager[49104]: <info>  [1768923173.6814] device (tap71bbd457-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:32:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:32:53Z|00965|binding|INFO|Setting lport 71bbd457-6ff9-4170-b4f0-18fb471606d4 ovn-installed in OVS
Jan 20 10:32:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:32:53Z|00966|binding|INFO|Setting lport 71bbd457-6ff9-4170-b4f0-18fb471606d4 up in Southbound
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.687 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[6f35cfb8-b8f4-43ae-9a58-2f072325c39d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.688 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:53 np0005588919 systemd[1]: Started Virtual Machine qemu-111-instance-000000d1.
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.701 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c1c04e-ec40-4656-81f0-6b6c6616b294]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.726 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4a4b69-6df7-4d93-bf30-d9ada83b5767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.731 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f8bb43-39ff-45d8-af2b-41fe705d20e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 NetworkManager[49104]: <info>  [1768923173.7339] manager: (tape1610a22-20): new Veth device (/org/freedesktop/NetworkManager/Devices/412)
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.763 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[42b0b8c1-da54-4c8a-8fb1-372af8de7f3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.766 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[111d5739-b32c-4c58-8551-5d999fdfe785]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 NetworkManager[49104]: <info>  [1768923173.7864] device (tape1610a22-20): carrier: link connected
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.792 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[ba136932-5c24-461c-9199-fe74d0e116cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.812 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c87b00a6-038d-4279-a316-5a55339ac340]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape1610a22-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:09:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 274], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 821271, 'reachable_time': 16693, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 322132, 'error': None, 'target': 'ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.825 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[febe7bf7-59d9-44f5-90ec-4889aeca7298]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed5:961'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 821271, 'tstamp': 821271}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 322133, 'error': None, 'target': 'ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.841 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b8763b90-c8cc-4330-bcef-326da40e3d33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape1610a22-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:09:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 274], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 821271, 'reachable_time': 16693, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 322134, 'error': None, 'target': 'ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.876 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[647ff6a8-5c39-456d-a99b-6ae1c626fa96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.927 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[71b1efb6-970b-4e02-bfea-208ce73e9903]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.929 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1610a22-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.929 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.930 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape1610a22-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.931 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:53 np0005588919 kernel: tape1610a22-20: entered promiscuous mode
Jan 20 10:32:53 np0005588919 NetworkManager[49104]: <info>  [1768923173.9325] manager: (tape1610a22-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/413)
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.934 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.939 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape1610a22-20, col_values=(('external_ids', {'iface-id': '6d7499a4-3049-4825-9ec9-301fdceff3a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.940 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:32:53Z|00967|binding|INFO|Releasing lport 6d7499a4-3049-4825-9ec9-301fdceff3a8 from this chassis (sb_readonly=0)
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.942 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e1610a22-2f29-4495-85e7-ab2081f73701.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e1610a22-2f29-4495-85e7-ab2081f73701.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.943 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cec34c11-32b6-44f9-9c5f-935fe56ed34b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.943 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-e1610a22-2f29-4495-85e7-ab2081f73701
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/e1610a22-2f29-4495-85e7-ab2081f73701.pid.haproxy
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID e1610a22-2f29-4495-85e7-ab2081f73701
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:32:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:32:53.944 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701', 'env', 'PROCESS_TAG=haproxy-e1610a22-2f29-4495-85e7-ab2081f73701', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e1610a22-2f29-4495-85e7-ab2081f73701.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:32:53 np0005588919 nova_compute[225855]: 2026-01-20 15:32:53.955 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.035 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923174.034733, 054e01d8-c9d1-4fb3-99e1-d417718d48c9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.035 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] VM Started (Lifecycle Event)#033[00m
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.062 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.065 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923174.0348258, 054e01d8-c9d1-4fb3-99e1-d417718d48c9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.066 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.096 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.100 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.129 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.150 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:32:54 np0005588919 podman[322208]: 2026-01-20 15:32:54.299724672 +0000 UTC m=+0.066069868 container create 137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 10:32:54 np0005588919 systemd[1]: Started libpod-conmon-137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162.scope.
Jan 20 10:32:54 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:32:54 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c24838984a067b8901522e09bee329bb5043ee4222b2b8328f8d5882b1349f3b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:32:54 np0005588919 podman[322208]: 2026-01-20 15:32:54.274365908 +0000 UTC m=+0.040711124 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:32:54 np0005588919 podman[322208]: 2026-01-20 15:32:54.378317046 +0000 UTC m=+0.144662272 container init 137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:32:54 np0005588919 podman[322208]: 2026-01-20 15:32:54.384998386 +0000 UTC m=+0.151343582 container start 137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:32:54 np0005588919 podman[322219]: 2026-01-20 15:32:54.396052142 +0000 UTC m=+0.058496001 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 20 10:32:54 np0005588919 neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701[322222]: [NOTICE]   (322243) : New worker (322247) forked
Jan 20 10:32:54 np0005588919 neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701[322222]: [NOTICE]   (322243) : Loading success.
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.703 225859 DEBUG nova.compute.manager [req-3f493aaf-4bab-4659-8e93-c1b3d88511a0 req-9556ff14-6f9b-4d92-a9e2-aa06810e326e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Received event network-vif-plugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.703 225859 DEBUG oslo_concurrency.lockutils [req-3f493aaf-4bab-4659-8e93-c1b3d88511a0 req-9556ff14-6f9b-4d92-a9e2-aa06810e326e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.704 225859 DEBUG oslo_concurrency.lockutils [req-3f493aaf-4bab-4659-8e93-c1b3d88511a0 req-9556ff14-6f9b-4d92-a9e2-aa06810e326e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.704 225859 DEBUG oslo_concurrency.lockutils [req-3f493aaf-4bab-4659-8e93-c1b3d88511a0 req-9556ff14-6f9b-4d92-a9e2-aa06810e326e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.704 225859 DEBUG nova.compute.manager [req-3f493aaf-4bab-4659-8e93-c1b3d88511a0 req-9556ff14-6f9b-4d92-a9e2-aa06810e326e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Processing event network-vif-plugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.705 225859 DEBUG nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.710 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923174.710074, 054e01d8-c9d1-4fb3-99e1-d417718d48c9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.711 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] VM Resumed (Lifecycle Event)
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.712 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.717 225859 INFO nova.virt.libvirt.driver [-] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Instance spawned successfully.
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.718 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 20 10:32:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:32:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:54.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.739 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.743 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.750 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.751 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.751 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.752 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.752 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.752 225859 DEBUG nova.virt.libvirt.driver [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.759 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 20 10:32:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:54.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.798 225859 INFO nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Took 6.40 seconds to spawn the instance on the hypervisor.
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.799 225859 DEBUG nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.879 225859 INFO nova.compute.manager [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Took 7.40 seconds to build instance.
Jan 20 10:32:54 np0005588919 nova_compute[225855]: 2026-01-20 15:32:54.896 225859 DEBUG oslo_concurrency.lockutils [None req-f04d4685-41d2-430c-9a80-84505e728853 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.486s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:32:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:32:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:56.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:32:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:56.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:56 np0005588919 nova_compute[225855]: 2026-01-20 15:32:56.856 225859 DEBUG nova.compute.manager [req-44bbdca5-9374-48fe-98dc-127b6934573c req-a6559dbc-75e8-4408-82f1-ef7bf1829420 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Received event network-vif-plugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 20 10:32:56 np0005588919 nova_compute[225855]: 2026-01-20 15:32:56.857 225859 DEBUG oslo_concurrency.lockutils [req-44bbdca5-9374-48fe-98dc-127b6934573c req-a6559dbc-75e8-4408-82f1-ef7bf1829420 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:32:56 np0005588919 nova_compute[225855]: 2026-01-20 15:32:56.857 225859 DEBUG oslo_concurrency.lockutils [req-44bbdca5-9374-48fe-98dc-127b6934573c req-a6559dbc-75e8-4408-82f1-ef7bf1829420 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:32:56 np0005588919 nova_compute[225855]: 2026-01-20 15:32:56.858 225859 DEBUG oslo_concurrency.lockutils [req-44bbdca5-9374-48fe-98dc-127b6934573c req-a6559dbc-75e8-4408-82f1-ef7bf1829420 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:32:56 np0005588919 nova_compute[225855]: 2026-01-20 15:32:56.858 225859 DEBUG nova.compute.manager [req-44bbdca5-9374-48fe-98dc-127b6934573c req-a6559dbc-75e8-4408-82f1-ef7bf1829420 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] No waiting events found dispatching network-vif-plugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:32:56 np0005588919 nova_compute[225855]: 2026-01-20 15:32:56.858 225859 WARNING nova.compute.manager [req-44bbdca5-9374-48fe-98dc-127b6934573c req-a6559dbc-75e8-4408-82f1-ef7bf1829420 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Received unexpected event network-vif-plugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 for instance with vm_state active and task_state None.
Jan 20 10:32:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:32:57 np0005588919 nova_compute[225855]: 2026-01-20 15:32:57.757 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:32:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:32:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:32:58.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:32:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:32:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:32:58.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:32:59 np0005588919 nova_compute[225855]: 2026-01-20 15:32:59.153 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:00.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:00.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:02.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:02 np0005588919 nova_compute[225855]: 2026-01-20 15:33:02.761 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:02.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:04 np0005588919 nova_compute[225855]: 2026-01-20 15:33:04.155 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:04.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:04.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:06.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:06.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:07 np0005588919 nova_compute[225855]: 2026-01-20 15:33:07.764 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:33:08Z|00118|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:65:ea:56 10.100.0.20
Jan 20 10:33:08 np0005588919 ovn_controller[130490]: 2026-01-20T15:33:08Z|00119|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:ea:56 10.100.0.20
Jan 20 10:33:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:08.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:08.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:09 np0005588919 nova_compute[225855]: 2026-01-20 15:33:09.156 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:10.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:10.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:12 np0005588919 nova_compute[225855]: 2026-01-20 15:33:12.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:33:12 np0005588919 nova_compute[225855]: 2026-01-20 15:33:12.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 20 10:33:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:12.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:12 np0005588919 nova_compute[225855]: 2026-01-20 15:33:12.767 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:12.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:14 np0005588919 nova_compute[225855]: 2026-01-20 15:33:14.159 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:14.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:14.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:15 np0005588919 podman[322370]: 2026-01-20 15:33:15.042602513 +0000 UTC m=+0.077405061 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller)
Jan 20 10:33:15 np0005588919 nova_compute[225855]: 2026-01-20 15:33:15.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:33:15 np0005588919 nova_compute[225855]: 2026-01-20 15:33:15.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 20 10:33:15 np0005588919 nova_compute[225855]: 2026-01-20 15:33:15.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 20 10:33:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:16.456 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 20 10:33:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:16.457 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 20 10:33:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:16.457 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:33:16 np0005588919 nova_compute[225855]: 2026-01-20 15:33:16.549 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-054e01d8-c9d1-4fb3-99e1-d417718d48c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 20 10:33:16 np0005588919 nova_compute[225855]: 2026-01-20 15:33:16.550 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-054e01d8-c9d1-4fb3-99e1-d417718d48c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 20 10:33:16 np0005588919 nova_compute[225855]: 2026-01-20 15:33:16.550 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 20 10:33:16 np0005588919 nova_compute[225855]: 2026-01-20 15:33:16.550 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 054e01d8-c9d1-4fb3-99e1-d417718d48c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 20 10:33:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:16.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:16.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:17 np0005588919 nova_compute[225855]: 2026-01-20 15:33:17.769 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:18.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:18.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.161 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.537 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Updating instance_info_cache with network_info: [{"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.553 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-054e01d8-c9d1-4fb3-99e1-d417718d48c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.554 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.554 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.555 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.555 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.555 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.555 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.575 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.576 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.576 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.576 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:33:19 np0005588919 nova_compute[225855]: 2026-01-20 15:33:19.577 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:33:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:33:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1180419469' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.048 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.119 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000d1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.119 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000d1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.264 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.266 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4052MB free_disk=20.897071838378906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.266 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.267 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.367 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 054e01d8-c9d1-4fb3-99e1-d417718d48c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.368 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.368 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.419 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:33:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:20.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:20.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:33:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3427014143' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.864 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.870 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.883 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.905 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:33:20 np0005588919 nova_compute[225855]: 2026-01-20 15:33:20.906 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:33:21 np0005588919 nova_compute[225855]: 2026-01-20 15:33:21.691 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.558 225859 DEBUG oslo_concurrency.lockutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.558 225859 DEBUG oslo_concurrency.lockutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.559 225859 DEBUG oslo_concurrency.lockutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.559 225859 DEBUG oslo_concurrency.lockutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.559 225859 DEBUG oslo_concurrency.lockutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.560 225859 INFO nova.compute.manager [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Terminating instance#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.561 225859 DEBUG nova.compute.manager [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:33:22 np0005588919 kernel: tap71bbd457-6f (unregistering): left promiscuous mode
Jan 20 10:33:22 np0005588919 NetworkManager[49104]: <info>  [1768923202.6144] device (tap71bbd457-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:33:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:33:22Z|00968|binding|INFO|Releasing lport 71bbd457-6ff9-4170-b4f0-18fb471606d4 from this chassis (sb_readonly=0)
Jan 20 10:33:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:33:22Z|00969|binding|INFO|Setting lport 71bbd457-6ff9-4170-b4f0-18fb471606d4 down in Southbound
Jan 20 10:33:22 np0005588919 ovn_controller[130490]: 2026-01-20T15:33:22Z|00970|binding|INFO|Removing iface tap71bbd457-6f ovn-installed in OVS
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.621 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.628 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:ea:56 10.100.0.20'], port_security=['fa:16:3e:65:ea:56 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': '054e01d8-c9d1-4fb3-99e1-d417718d48c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e1610a22-2f29-4495-85e7-ab2081f73701', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '436420ae-5ad2-462b-90ca-5a96acbe39fc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=733f71a0-4d98-4c07-b692-f20cf2a632ed, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=71bbd457-6ff9-4170-b4f0-18fb471606d4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.629 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 71bbd457-6ff9-4170-b4f0-18fb471606d4 in datapath e1610a22-2f29-4495-85e7-ab2081f73701 unbound from our chassis#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.630 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e1610a22-2f29-4495-85e7-ab2081f73701, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.631 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[883d867b-9b80-4fac-a1ee-f769419dc773]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.632 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701 namespace which is not needed anymore#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.643 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 systemd[1]: machine-qemu\x2d111\x2dinstance\x2d000000d1.scope: Deactivated successfully.
Jan 20 10:33:22 np0005588919 systemd[1]: machine-qemu\x2d111\x2dinstance\x2d000000d1.scope: Consumed 13.789s CPU time.
Jan 20 10:33:22 np0005588919 systemd-machined[194361]: Machine qemu-111-instance-000000d1 terminated.
Jan 20 10:33:22 np0005588919 neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701[322222]: [NOTICE]   (322243) : haproxy version is 2.8.14-c23fe91
Jan 20 10:33:22 np0005588919 neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701[322222]: [NOTICE]   (322243) : path to executable is /usr/sbin/haproxy
Jan 20 10:33:22 np0005588919 neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701[322222]: [WARNING]  (322243) : Exiting Master process...
Jan 20 10:33:22 np0005588919 neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701[322222]: [ALERT]    (322243) : Current worker (322247) exited with code 143 (Terminated)
Jan 20 10:33:22 np0005588919 neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701[322222]: [WARNING]  (322243) : All workers exited. Exiting... (0)
Jan 20 10:33:22 np0005588919 systemd[1]: libpod-137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162.scope: Deactivated successfully.
Jan 20 10:33:22 np0005588919 podman[322471]: 2026-01-20 15:33:22.76197491 +0000 UTC m=+0.047137297 container died 137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.770 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.780 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:22.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.785 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162-userdata-shm.mount: Deactivated successfully.
Jan 20 10:33:22 np0005588919 systemd[1]: var-lib-containers-storage-overlay-c24838984a067b8901522e09bee329bb5043ee4222b2b8328f8d5882b1349f3b-merged.mount: Deactivated successfully.
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.795 225859 INFO nova.virt.libvirt.driver [-] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Instance destroyed successfully.#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.796 225859 DEBUG nova.objects.instance [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'resources' on Instance uuid 054e01d8-c9d1-4fb3-99e1-d417718d48c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:33:22 np0005588919 podman[322471]: 2026-01-20 15:33:22.800677585 +0000 UTC m=+0.085839962 container cleanup 137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.811 225859 DEBUG nova.virt.libvirt.vif [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:32:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1975990603',display_name='tempest-TestNetworkBasicOps-server-1975990603',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1975990603',id=209,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIIq5p1Z8aKbdSJMUPSMnjWUaTZorIMa+mXmK10gXmX/oHg+Z5q1Rmf+/0TauJDUZqczNGvwDzE8yxRK1lxgnRI2fdz8rl+BuPz+yhlF83YWDX8Jzvo5YEkj80ZkenoXA==',key_name='tempest-TestNetworkBasicOps-1869687346',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:32:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-ox7roysq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:32:54Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=054e01d8-c9d1-4fb3-99e1-d417718d48c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.811 225859 DEBUG nova.network.os_vif_util [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "address": "fa:16:3e:65:ea:56", "network": {"id": "e1610a22-2f29-4495-85e7-ab2081f73701", "bridge": "br-int", "label": "tempest-network-smoke--313389391", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71bbd457-6f", "ovs_interfaceid": "71bbd457-6ff9-4170-b4f0-18fb471606d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.812 225859 DEBUG nova.network.os_vif_util [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:ea:56,bridge_name='br-int',has_traffic_filtering=True,id=71bbd457-6ff9-4170-b4f0-18fb471606d4,network=Network(e1610a22-2f29-4495-85e7-ab2081f73701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71bbd457-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.812 225859 DEBUG os_vif [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:ea:56,bridge_name='br-int',has_traffic_filtering=True,id=71bbd457-6ff9-4170-b4f0-18fb471606d4,network=Network(e1610a22-2f29-4495-85e7-ab2081f73701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71bbd457-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.814 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.815 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71bbd457-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.816 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.817 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 systemd[1]: libpod-conmon-137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162.scope: Deactivated successfully.
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.821 225859 INFO os_vif [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:ea:56,bridge_name='br-int',has_traffic_filtering=True,id=71bbd457-6ff9-4170-b4f0-18fb471606d4,network=Network(e1610a22-2f29-4495-85e7-ab2081f73701),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71bbd457-6f')#033[00m
Jan 20 10:33:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:22.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:22 np0005588919 podman[322509]: 2026-01-20 15:33:22.866274248 +0000 UTC m=+0.044298326 container remove 137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.871 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[aeea1e19-56c6-4dc2-ab27-1dbf1093693d]: (4, ('Tue Jan 20 03:33:22 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701 (137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162)\n137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162\nTue Jan 20 03:33:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701 (137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162)\n137c6f3d853ad68f70cddfd1e2ee96838bb27d94b52a9a2af830bfd96ffb6162\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.872 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f6270262-9c6d-459b-b7f2-120fd0deede4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.874 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1610a22-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.876 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 kernel: tape1610a22-20: left promiscuous mode
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.878 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.880 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ed939647-7ca1-4bf2-ad76-9284ca861e37]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:33:22 np0005588919 nova_compute[225855]: 2026-01-20 15:33:22.891 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.898 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[305ffadd-d82f-4901-bfd4-e8e6487759fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.900 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0562e063-6067-4aab-9f22-2bbe0f7f2efe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.913 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[73358e28-b6ea-40f2-83f9-f3367f5830dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 821265, 'reachable_time': 40240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 322542, 'error': None, 'target': 'ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:33:22 np0005588919 systemd[1]: run-netns-ovnmeta\x2de1610a22\x2d2f29\x2d4495\x2d85e7\x2dab2081f73701.mount: Deactivated successfully.
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.917 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e1610a22-2f29-4495-85e7-ab2081f73701 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:33:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:22.917 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[6820b3f4-9b89-41c4-b2c7-67822c7dfda1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.067 225859 DEBUG nova.compute.manager [req-823a2485-fb95-4e01-9de9-ffbf725eec05 req-23f1469c-47e8-4be0-8e41-31082ec38813 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Received event network-vif-unplugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.068 225859 DEBUG oslo_concurrency.lockutils [req-823a2485-fb95-4e01-9de9-ffbf725eec05 req-23f1469c-47e8-4be0-8e41-31082ec38813 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.068 225859 DEBUG oslo_concurrency.lockutils [req-823a2485-fb95-4e01-9de9-ffbf725eec05 req-23f1469c-47e8-4be0-8e41-31082ec38813 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.068 225859 DEBUG oslo_concurrency.lockutils [req-823a2485-fb95-4e01-9de9-ffbf725eec05 req-23f1469c-47e8-4be0-8e41-31082ec38813 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.069 225859 DEBUG nova.compute.manager [req-823a2485-fb95-4e01-9de9-ffbf725eec05 req-23f1469c-47e8-4be0-8e41-31082ec38813 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] No waiting events found dispatching network-vif-unplugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.069 225859 DEBUG nova.compute.manager [req-823a2485-fb95-4e01-9de9-ffbf725eec05 req-23f1469c-47e8-4be0-8e41-31082ec38813 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Received event network-vif-unplugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.155 225859 INFO nova.virt.libvirt.driver [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Deleting instance files /var/lib/nova/instances/054e01d8-c9d1-4fb3-99e1-d417718d48c9_del#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.156 225859 INFO nova.virt.libvirt.driver [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Deletion of /var/lib/nova/instances/054e01d8-c9d1-4fb3-99e1-d417718d48c9_del complete#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.222 225859 INFO nova.compute.manager [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Took 0.66 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.223 225859 DEBUG oslo.service.loopingcall [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.223 225859 DEBUG nova.compute.manager [-] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:33:23 np0005588919 nova_compute[225855]: 2026-01-20 15:33:23.223 225859 DEBUG nova.network.neutron [-] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:33:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:24 np0005588919 nova_compute[225855]: 2026-01-20 15:33:24.163 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:33:24 np0005588919 nova_compute[225855]: 2026-01-20 15:33:24.479 225859 DEBUG nova.network.neutron [-] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:33:24 np0005588919 nova_compute[225855]: 2026-01-20 15:33:24.499 225859 INFO nova.compute.manager [-] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Took 1.28 seconds to deallocate network for instance.#033[00m
Jan 20 10:33:24 np0005588919 nova_compute[225855]: 2026-01-20 15:33:24.555 225859 DEBUG oslo_concurrency.lockutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:33:24 np0005588919 nova_compute[225855]: 2026-01-20 15:33:24.555 225859 DEBUG oslo_concurrency.lockutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:33:24 np0005588919 nova_compute[225855]: 2026-01-20 15:33:24.565 225859 DEBUG nova.compute.manager [req-eb6c2f2a-20fe-45b0-8efb-4fd4d1615165 req-e00c9aa2-fe15-4b77-b0f0-7d5a0229b768 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Received event network-vif-deleted-71bbd457-6ff9-4170-b4f0-18fb471606d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:33:24 np0005588919 nova_compute[225855]: 2026-01-20 15:33:24.600 225859 DEBUG oslo_concurrency.processutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:33:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:24.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:24.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:25 np0005588919 podman[322565]: 2026-01-20 15:33:25.019964478 +0000 UTC m=+0.060832678 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 20 10:33:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:33:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2267802654' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.053 225859 DEBUG oslo_concurrency.processutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.060 225859 DEBUG nova.compute.provider_tree [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.101 225859 DEBUG nova.scheduler.client.report [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.168 225859 DEBUG oslo_concurrency.lockutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.197 225859 INFO nova.scheduler.client.report [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Deleted allocations for instance 054e01d8-c9d1-4fb3-99e1-d417718d48c9#033[00m
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.253 225859 DEBUG nova.compute.manager [req-acca13a9-b37c-440e-a2af-6d207ff4ce61 req-56250356-8b17-467f-b81b-ad9be27f6109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Received event network-vif-plugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.254 225859 DEBUG oslo_concurrency.lockutils [req-acca13a9-b37c-440e-a2af-6d207ff4ce61 req-56250356-8b17-467f-b81b-ad9be27f6109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.254 225859 DEBUG oslo_concurrency.lockutils [req-acca13a9-b37c-440e-a2af-6d207ff4ce61 req-56250356-8b17-467f-b81b-ad9be27f6109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.255 225859 DEBUG oslo_concurrency.lockutils [req-acca13a9-b37c-440e-a2af-6d207ff4ce61 req-56250356-8b17-467f-b81b-ad9be27f6109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.255 225859 DEBUG nova.compute.manager [req-acca13a9-b37c-440e-a2af-6d207ff4ce61 req-56250356-8b17-467f-b81b-ad9be27f6109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] No waiting events found dispatching network-vif-plugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.256 225859 WARNING nova.compute.manager [req-acca13a9-b37c-440e-a2af-6d207ff4ce61 req-56250356-8b17-467f-b81b-ad9be27f6109 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Received unexpected event network-vif-plugged-71bbd457-6ff9-4170-b4f0-18fb471606d4 for instance with vm_state deleted and task_state None.
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.278 225859 DEBUG oslo_concurrency.lockutils [None req-14700326-29a8-4b3d-b34a-acbcbf3acbaa 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "054e01d8-c9d1-4fb3-99e1-d417718d48c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.720s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:33:25 np0005588919 nova_compute[225855]: 2026-01-20 15:33:25.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:33:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:26.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:26.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:27.747 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:33:27 np0005588919 nova_compute[225855]: 2026-01-20 15:33:27.748 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:27.748 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 10:33:27 np0005588919 nova_compute[225855]: 2026-01-20 15:33:27.816 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:28.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:28.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:29 np0005588919 nova_compute[225855]: 2026-01-20 15:33:29.166 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:30 np0005588919 nova_compute[225855]: 2026-01-20 15:33:30.225 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:30.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:30.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:32.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:32 np0005588919 nova_compute[225855]: 2026-01-20 15:33:32.820 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:32.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:34 np0005588919 nova_compute[225855]: 2026-01-20 15:33:34.168 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:34 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:33:34.751 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:33:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:34.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:34.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:33:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:36.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:33:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:36.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:37 np0005588919 nova_compute[225855]: 2026-01-20 15:33:37.795 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923202.794088, 054e01d8-c9d1-4fb3-99e1-d417718d48c9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 20 10:33:37 np0005588919 nova_compute[225855]: 2026-01-20 15:33:37.796 225859 INFO nova.compute.manager [-] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] VM Stopped (Lifecycle Event)
Jan 20 10:33:37 np0005588919 nova_compute[225855]: 2026-01-20 15:33:37.823 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:37 np0005588919 nova_compute[225855]: 2026-01-20 15:33:37.836 225859 DEBUG nova.compute.manager [None req-f7441665-7223-4c0b-ae47-2f33138da596 - - - - - -] [instance: 054e01d8-c9d1-4fb3-99e1-d417718d48c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 20 10:33:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:33:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:38.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:33:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:38.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:39 np0005588919 nova_compute[225855]: 2026-01-20 15:33:39.169 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:39 np0005588919 nova_compute[225855]: 2026-01-20 15:33:39.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:33:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:40.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:40.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:42.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:42 np0005588919 nova_compute[225855]: 2026-01-20 15:33:42.826 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:42.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:44 np0005588919 nova_compute[225855]: 2026-01-20 15:33:44.169 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:44.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:44.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:46 np0005588919 podman[322647]: 2026-01-20 15:33:46.033415199 +0000 UTC m=+0.083682721 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:33:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:46.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:46.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:47 np0005588919 nova_compute[225855]: 2026-01-20 15:33:47.876 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:48.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:48.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:49 np0005588919 nova_compute[225855]: 2026-01-20 15:33:49.170 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:50.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:50.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:52.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:52.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:52 np0005588919 nova_compute[225855]: 2026-01-20 15:33:52.879 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:54 np0005588919 nova_compute[225855]: 2026-01-20 15:33:54.172 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:54.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:54.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:56 np0005588919 podman[322728]: 2026-01-20 15:33:56.009088045 +0000 UTC m=+0.055894677 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:33:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:56.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:56.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:57 np0005588919 nova_compute[225855]: 2026-01-20 15:33:57.882 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:33:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:33:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:33:57 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:33:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:33:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:33:58.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:33:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:33:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:33:58.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:33:59 np0005588919 nova_compute[225855]: 2026-01-20 15:33:59.174 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:00.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:00.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:02.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:02.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:02 np0005588919 nova_compute[225855]: 2026-01-20 15:34:02.884 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:04 np0005588919 nova_compute[225855]: 2026-01-20 15:34:04.176 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:34:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:34:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:04.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:04.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:06.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:06.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:07 np0005588919 nova_compute[225855]: 2026-01-20 15:34:07.886 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:34:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:08.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:08.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:09 np0005588919 nova_compute[225855]: 2026-01-20 15:34:09.178 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:10.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:10.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:12.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:12.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:12 np0005588919 nova_compute[225855]: 2026-01-20 15:34:12.890 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:14 np0005588919 nova_compute[225855]: 2026-01-20 15:34:14.179 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:14 np0005588919 nova_compute[225855]: 2026-01-20 15:34:14.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:14 np0005588919 nova_compute[225855]: 2026-01-20 15:34:14.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:34:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:14.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:14.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:16 np0005588919 nova_compute[225855]: 2026-01-20 15:34:16.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:16 np0005588919 nova_compute[225855]: 2026-01-20 15:34:16.399 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:16 np0005588919 nova_compute[225855]: 2026-01-20 15:34:16.400 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:16 np0005588919 nova_compute[225855]: 2026-01-20 15:34:16.400 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:16 np0005588919 nova_compute[225855]: 2026-01-20 15:34:16.400 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:34:16 np0005588919 nova_compute[225855]: 2026-01-20 15:34:16.401 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:16.457 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:16.457 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:16.457 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:34:16 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2281346832' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:34:16 np0005588919 nova_compute[225855]: 2026-01-20 15:34:16.833 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:16.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:16.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:16 np0005588919 podman[323011]: 2026-01-20 15:34:16.955660875 +0000 UTC m=+0.084055791 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.044 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.046 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4268MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.047 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.048 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.135 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.136 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.177 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.212 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.213 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.258 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.281 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.304 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:34:17 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/431993206' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.770 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.776 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.809 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.860 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.861 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:17 np0005588919 nova_compute[225855]: 2026-01-20 15:34:17.892 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:18 np0005588919 nova_compute[225855]: 2026-01-20 15:34:18.861 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:18 np0005588919 nova_compute[225855]: 2026-01-20 15:34:18.862 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:34:18 np0005588919 nova_compute[225855]: 2026-01-20 15:34:18.862 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:34:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:18.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:18 np0005588919 nova_compute[225855]: 2026-01-20 15:34:18.876 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:34:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:18.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:19 np0005588919 nova_compute[225855]: 2026-01-20 15:34:19.181 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:19 np0005588919 nova_compute[225855]: 2026-01-20 15:34:19.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #166. Immutable memtables: 0.
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:19.611994) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 105] Flushing memtable with next log file: 166
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923259612038, "job": 105, "event": "flush_started", "num_memtables": 1, "num_entries": 1355, "num_deletes": 251, "total_data_size": 3006932, "memory_usage": 3053952, "flush_reason": "Manual Compaction"}
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 105] Level-0 flush table #167: started
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923259739527, "cf_name": "default", "job": 105, "event": "table_file_creation", "file_number": 167, "file_size": 1962624, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 80589, "largest_seqno": 81939, "table_properties": {"data_size": 1956773, "index_size": 3181, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12789, "raw_average_key_size": 20, "raw_value_size": 1944902, "raw_average_value_size": 3062, "num_data_blocks": 140, "num_entries": 635, "num_filter_entries": 635, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923153, "oldest_key_time": 1768923153, "file_creation_time": 1768923259, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 167, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 105] Flush lasted 127617 microseconds, and 6022 cpu microseconds.
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:19.739603) [db/flush_job.cc:967] [default] [JOB 105] Level-0 flush table #167: 1962624 bytes OK
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:19.739636) [db/memtable_list.cc:519] [default] Level-0 commit table #167 started
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:19.817032) [db/memtable_list.cc:722] [default] Level-0 commit table #167: memtable #1 done
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:19.817108) EVENT_LOG_v1 {"time_micros": 1768923259817093, "job": 105, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:19.817143) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 105] Try to delete WAL files size 3000532, prev total WAL file size 3000532, number of live WAL files 2.
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000163.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:19.818345) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037303238' seq:72057594037927935, type:22 .. '7061786F730037323830' seq:0, type:0; will stop at (end)
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 106] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 105 Base level 0, inputs: [167(1916KB)], [165(12MB)]
Jan 20 10:34:19 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923259818418, "job": 106, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [167], "files_L6": [165], "score": -1, "input_data_size": 14931954, "oldest_snapshot_seqno": -1}
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 106] Generated table #168: 10231 keys, 12957264 bytes, temperature: kUnknown
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923260302451, "cf_name": "default", "job": 106, "event": "table_file_creation", "file_number": 168, "file_size": 12957264, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12891458, "index_size": 39133, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25605, "raw_key_size": 270322, "raw_average_key_size": 26, "raw_value_size": 12712502, "raw_average_value_size": 1242, "num_data_blocks": 1487, "num_entries": 10231, "num_filter_entries": 10231, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768923259, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 168, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:34:20 np0005588919 nova_compute[225855]: 2026-01-20 15:34:20.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:20 np0005588919 nova_compute[225855]: 2026-01-20 15:34:20.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:20 np0005588919 nova_compute[225855]: 2026-01-20 15:34:20.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:20.302965) [db/compaction/compaction_job.cc:1663] [default] [JOB 106] Compacted 1@0 + 1@6 files to L6 => 12957264 bytes
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:20.457712) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 30.8 rd, 26.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.4 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(14.2) write-amplify(6.6) OK, records in: 10754, records dropped: 523 output_compression: NoCompression
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:20.457755) EVENT_LOG_v1 {"time_micros": 1768923260457740, "job": 106, "event": "compaction_finished", "compaction_time_micros": 484170, "compaction_time_cpu_micros": 47587, "output_level": 6, "num_output_files": 1, "total_output_size": 12957264, "num_input_records": 10754, "num_output_records": 10231, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000167.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923260458324, "job": 106, "event": "table_file_deletion", "file_number": 167}
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000165.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923260460899, "job": 106, "event": "table_file_deletion", "file_number": 165}
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:19.818228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:20.461013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:20.461021) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:20.461025) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:20.461029) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:34:20.461033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:34:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:20.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:20.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:22 np0005588919 nova_compute[225855]: 2026-01-20 15:34:22.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:22.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:22.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:22 np0005588919 nova_compute[225855]: 2026-01-20 15:34:22.895 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:23 np0005588919 ovn_controller[130490]: 2026-01-20T15:34:23Z|00971|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 20 10:34:24 np0005588919 nova_compute[225855]: 2026-01-20 15:34:24.183 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:24.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:24.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:25 np0005588919 nova_compute[225855]: 2026-01-20 15:34:25.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:26.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:26.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:27 np0005588919 podman[323065]: 2026-01-20 15:34:27.00583491 +0000 UTC m=+0.055862436 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:34:27 np0005588919 nova_compute[225855]: 2026-01-20 15:34:27.898 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:28.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:28.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:29 np0005588919 nova_compute[225855]: 2026-01-20 15:34:29.186 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:30.215 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:34:30 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:30.216 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:34:30 np0005588919 nova_compute[225855]: 2026-01-20 15:34:30.216 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:30.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:30.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:31 np0005588919 nova_compute[225855]: 2026-01-20 15:34:31.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:32.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:32.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:32 np0005588919 nova_compute[225855]: 2026-01-20 15:34:32.900 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:34 np0005588919 nova_compute[225855]: 2026-01-20 15:34:34.187 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:34.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:34.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:36 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:36.218 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:36.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:36.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:37 np0005588919 nova_compute[225855]: 2026-01-20 15:34:37.359 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:34:37 np0005588919 nova_compute[225855]: 2026-01-20 15:34:37.360 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:34:37 np0005588919 nova_compute[225855]: 2026-01-20 15:34:37.903 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:38.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:38.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:39 np0005588919 nova_compute[225855]: 2026-01-20 15:34:39.189 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:40 np0005588919 nova_compute[225855]: 2026-01-20 15:34:40.677 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "2003d484-9afb-4f49-8410-6e8c6aa813d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:40 np0005588919 nova_compute[225855]: 2026-01-20 15:34:40.677 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:40 np0005588919 nova_compute[225855]: 2026-01-20 15:34:40.703 225859 DEBUG nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:34:40 np0005588919 nova_compute[225855]: 2026-01-20 15:34:40.811 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:40 np0005588919 nova_compute[225855]: 2026-01-20 15:34:40.812 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:40 np0005588919 nova_compute[225855]: 2026-01-20 15:34:40.819 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:34:40 np0005588919 nova_compute[225855]: 2026-01-20 15:34:40.819 225859 INFO nova.compute.claims [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:34:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:40.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6e4d6f0 =====
Jan 20 10:34:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6e4d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:40 np0005588919 radosgw[83787]: beast: 0x7f09c6e4d6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:40.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:40 np0005588919 nova_compute[225855]: 2026-01-20 15:34:40.916 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:34:41 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3059552772' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:34:41 np0005588919 nova_compute[225855]: 2026-01-20 15:34:41.365 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:41 np0005588919 nova_compute[225855]: 2026-01-20 15:34:41.371 225859 DEBUG nova.compute.provider_tree [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:34:41 np0005588919 nova_compute[225855]: 2026-01-20 15:34:41.555 225859 DEBUG nova.scheduler.client.report [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:34:41 np0005588919 nova_compute[225855]: 2026-01-20 15:34:41.843 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:41 np0005588919 nova_compute[225855]: 2026-01-20 15:34:41.844 225859 DEBUG nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:34:41 np0005588919 nova_compute[225855]: 2026-01-20 15:34:41.946 225859 DEBUG nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:34:41 np0005588919 nova_compute[225855]: 2026-01-20 15:34:41.947 225859 DEBUG nova.network.neutron [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:34:42 np0005588919 nova_compute[225855]: 2026-01-20 15:34:42.035 225859 INFO nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:34:42 np0005588919 nova_compute[225855]: 2026-01-20 15:34:42.321 225859 DEBUG nova.policy [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5338aa65dc0e4326a66ce79053787f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:34:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6e4d6f0 =====
Jan 20 10:34:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:42.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6e4d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:42 np0005588919 radosgw[83787]: beast: 0x7f09c6e4d6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:42.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:42 np0005588919 nova_compute[225855]: 2026-01-20 15:34:42.906 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.190 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.400 225859 DEBUG nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.693 225859 DEBUG nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.694 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.695 225859 INFO nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Creating image(s)#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.723 225859 DEBUG nova.storage.rbd_utils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.753 225859 DEBUG nova.storage.rbd_utils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.778 225859 DEBUG nova.storage.rbd_utils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.782 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.852 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.854 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.855 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.855 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.880 225859 DEBUG nova.storage.rbd_utils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:44 np0005588919 nova_compute[225855]: 2026-01-20 15:34:44.884 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:44.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:44.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:45 np0005588919 nova_compute[225855]: 2026-01-20 15:34:45.621 225859 DEBUG nova.network.neutron [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Successfully updated port: 1dee9c67-fb01-4fcd-8f35-805a326ee235 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:34:45 np0005588919 nova_compute[225855]: 2026-01-20 15:34:45.643 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "refresh_cache-2003d484-9afb-4f49-8410-6e8c6aa813d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:34:45 np0005588919 nova_compute[225855]: 2026-01-20 15:34:45.644 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquired lock "refresh_cache-2003d484-9afb-4f49-8410-6e8c6aa813d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:34:45 np0005588919 nova_compute[225855]: 2026-01-20 15:34:45.644 225859 DEBUG nova.network.neutron [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:34:45 np0005588919 nova_compute[225855]: 2026-01-20 15:34:45.849 225859 DEBUG nova.compute.manager [req-8eb53864-9c37-4448-b563-cc75612a40ab req-243503e8-488f-4f11-8943-44df69856886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Received event network-changed-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:34:45 np0005588919 nova_compute[225855]: 2026-01-20 15:34:45.850 225859 DEBUG nova.compute.manager [req-8eb53864-9c37-4448-b563-cc75612a40ab req-243503e8-488f-4f11-8943-44df69856886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Refreshing instance network info cache due to event network-changed-1dee9c67-fb01-4fcd-8f35-805a326ee235. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:34:45 np0005588919 nova_compute[225855]: 2026-01-20 15:34:45.850 225859 DEBUG oslo_concurrency.lockutils [req-8eb53864-9c37-4448-b563-cc75612a40ab req-243503e8-488f-4f11-8943-44df69856886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-2003d484-9afb-4f49-8410-6e8c6aa813d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:34:45 np0005588919 nova_compute[225855]: 2026-01-20 15:34:45.913 225859 DEBUG nova.network.neutron [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:34:45 np0005588919 nova_compute[225855]: 2026-01-20 15:34:45.955 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.054 225859 DEBUG nova.storage.rbd_utils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] resizing rbd image 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.648 225859 DEBUG nova.objects.instance [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'migration_context' on Instance uuid 2003d484-9afb-4f49-8410-6e8c6aa813d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.667 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.668 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Ensure instance console log exists: /var/lib/nova/instances/2003d484-9afb-4f49-8410-6e8c6aa813d0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.668 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.669 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.669 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:46.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:46.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.963 225859 DEBUG nova.network.neutron [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Updating instance_info_cache with network_info: [{"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.994 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Releasing lock "refresh_cache-2003d484-9afb-4f49-8410-6e8c6aa813d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.994 225859 DEBUG nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Instance network_info: |[{"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.995 225859 DEBUG oslo_concurrency.lockutils [req-8eb53864-9c37-4448-b563-cc75612a40ab req-243503e8-488f-4f11-8943-44df69856886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-2003d484-9afb-4f49-8410-6e8c6aa813d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.995 225859 DEBUG nova.network.neutron [req-8eb53864-9c37-4448-b563-cc75612a40ab req-243503e8-488f-4f11-8943-44df69856886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Refreshing network info cache for port 1dee9c67-fb01-4fcd-8f35-805a326ee235 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:34:46 np0005588919 nova_compute[225855]: 2026-01-20 15:34:46.997 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Start _get_guest_xml network_info=[{"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.002 225859 WARNING nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.006 225859 DEBUG nova.virt.libvirt.host [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.006 225859 DEBUG nova.virt.libvirt.host [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.011 225859 DEBUG nova.virt.libvirt.host [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.011 225859 DEBUG nova.virt.libvirt.host [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.012 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.012 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.012 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.012 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.013 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.013 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.013 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.013 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.013 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.013 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.014 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.014 225859 DEBUG nova.virt.hardware [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.017 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:34:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3471853459' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.471 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.500 225859 DEBUG nova.storage.rbd_utils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.505 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.908 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:47 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:34:47 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3172465850' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.937 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.939 225859 DEBUG nova.virt.libvirt.vif [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:34:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1912465458',display_name='tempest-TestNetworkBasicOps-server-1912465458',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1912465458',id=211,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKCclivZajv/oNiXd0J0tpc9M442c8dbYXCbsYeHEo3g2nh4Rcq6ISUBBO6XIX8RmCdEtQzJtRlazxR/MdQkZGMMo5bsdyOhXnm5vgMIIsHetJR9AEpVwxFDAVbRX9E2EQ==',key_name='tempest-TestNetworkBasicOps-204665299',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-g29h7bs4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:34:44Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=2003d484-9afb-4f49-8410-6e8c6aa813d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.940 225859 DEBUG nova.network.os_vif_util [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.940 225859 DEBUG nova.network.os_vif_util [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.941 225859 DEBUG nova.objects.instance [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'pci_devices' on Instance uuid 2003d484-9afb-4f49-8410-6e8c6aa813d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.959 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <uuid>2003d484-9afb-4f49-8410-6e8c6aa813d0</uuid>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <name>instance-000000d3</name>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <nova:name>tempest-TestNetworkBasicOps-server-1912465458</nova:name>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:34:47</nova:creationTime>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <nova:user uuid="5338aa65dc0e4326a66ce79053787f14">tempest-TestNetworkBasicOps-807695970-project-member</nova:user>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <nova:project uuid="3168f57421fb49bfb94b85daedd1fe7d">tempest-TestNetworkBasicOps-807695970</nova:project>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <nova:port uuid="1dee9c67-fb01-4fcd-8f35-805a326ee235">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <entry name="serial">2003d484-9afb-4f49-8410-6e8c6aa813d0</entry>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <entry name="uuid">2003d484-9afb-4f49-8410-6e8c6aa813d0</entry>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2003d484-9afb-4f49-8410-6e8c6aa813d0_disk">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/2003d484-9afb-4f49-8410-6e8c6aa813d0_disk.config">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:ea:3a:75"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <target dev="tap1dee9c67-fb"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/2003d484-9afb-4f49-8410-6e8c6aa813d0/console.log" append="off"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:34:47 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:34:47 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:34:47 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:34:47 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.961 225859 DEBUG nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Preparing to wait for external event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.962 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.962 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.962 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.963 225859 DEBUG nova.virt.libvirt.vif [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:34:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1912465458',display_name='tempest-TestNetworkBasicOps-server-1912465458',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1912465458',id=211,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKCclivZajv/oNiXd0J0tpc9M442c8dbYXCbsYeHEo3g2nh4Rcq6ISUBBO6XIX8RmCdEtQzJtRlazxR/MdQkZGMMo5bsdyOhXnm5vgMIIsHetJR9AEpVwxFDAVbRX9E2EQ==',key_name='tempest-TestNetworkBasicOps-204665299',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-g29h7bs4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:34:44Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=2003d484-9afb-4f49-8410-6e8c6aa813d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.963 225859 DEBUG nova.network.os_vif_util [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.964 225859 DEBUG nova.network.os_vif_util [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.965 225859 DEBUG os_vif [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.965 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.966 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.966 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.969 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.970 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1dee9c67-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.970 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1dee9c67-fb, col_values=(('external_ids', {'iface-id': '1dee9c67-fb01-4fcd-8f35-805a326ee235', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:3a:75', 'vm-uuid': '2003d484-9afb-4f49-8410-6e8c6aa813d0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.972 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:47 np0005588919 NetworkManager[49104]: <info>  [1768923287.9727] manager: (tap1dee9c67-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/414)
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.974 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.981 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:47 np0005588919 nova_compute[225855]: 2026-01-20 15:34:47.982 225859 INFO os_vif [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb')#033[00m
Jan 20 10:34:48 np0005588919 podman[323395]: 2026-01-20 15:34:48.056031891 +0000 UTC m=+0.103904297 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 20 10:34:48 np0005588919 nova_compute[225855]: 2026-01-20 15:34:48.080 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:34:48 np0005588919 nova_compute[225855]: 2026-01-20 15:34:48.081 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:34:48 np0005588919 nova_compute[225855]: 2026-01-20 15:34:48.081 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] No VIF found with MAC fa:16:3e:ea:3a:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:34:48 np0005588919 nova_compute[225855]: 2026-01-20 15:34:48.081 225859 INFO nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Using config drive#033[00m
Jan 20 10:34:48 np0005588919 nova_compute[225855]: 2026-01-20 15:34:48.106 225859 DEBUG nova.storage.rbd_utils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:48.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:48.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:49 np0005588919 nova_compute[225855]: 2026-01-20 15:34:49.192 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:49 np0005588919 nova_compute[225855]: 2026-01-20 15:34:49.791 225859 INFO nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Creating config drive at /var/lib/nova/instances/2003d484-9afb-4f49-8410-6e8c6aa813d0/disk.config#033[00m
Jan 20 10:34:49 np0005588919 nova_compute[225855]: 2026-01-20 15:34:49.795 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2003d484-9afb-4f49-8410-6e8c6aa813d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvw16danw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:49 np0005588919 nova_compute[225855]: 2026-01-20 15:34:49.927 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2003d484-9afb-4f49-8410-6e8c6aa813d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvw16danw" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:49 np0005588919 nova_compute[225855]: 2026-01-20 15:34:49.960 225859 DEBUG nova.storage.rbd_utils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] rbd image 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:34:49 np0005588919 nova_compute[225855]: 2026-01-20 15:34:49.963 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2003d484-9afb-4f49-8410-6e8c6aa813d0/disk.config 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.092 225859 DEBUG oslo_concurrency.processutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2003d484-9afb-4f49-8410-6e8c6aa813d0/disk.config 2003d484-9afb-4f49-8410-6e8c6aa813d0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.093 225859 INFO nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Deleting local config drive /var/lib/nova/instances/2003d484-9afb-4f49-8410-6e8c6aa813d0/disk.config because it was imported into RBD.#033[00m
Jan 20 10:34:50 np0005588919 kernel: tap1dee9c67-fb: entered promiscuous mode
Jan 20 10:34:50 np0005588919 NetworkManager[49104]: <info>  [1768923290.1465] manager: (tap1dee9c67-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/415)
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.146 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:34:50Z|00972|binding|INFO|Claiming lport 1dee9c67-fb01-4fcd-8f35-805a326ee235 for this chassis.
Jan 20 10:34:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:34:50Z|00973|binding|INFO|1dee9c67-fb01-4fcd-8f35-805a326ee235: Claiming fa:16:3e:ea:3a:75 10.100.0.12
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.151 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.154 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588919 NetworkManager[49104]: <info>  [1768923290.1599] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/416)
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.159 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588919 NetworkManager[49104]: <info>  [1768923290.1613] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/417)
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.164 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:3a:75 10.100.0.12'], port_security=['fa:16:3e:ea:3a:75 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-703015767', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2003d484-9afb-4f49-8410-6e8c6aa813d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99dd5684-1685-443e-9373-f548d80784f6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-703015767', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'aaa69ba6-9a27-441e-877e-2cd188322a42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.217'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44d2010b-16ff-4152-8c6b-d6e8ffb1b3ca, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=1dee9c67-fb01-4fcd-8f35-805a326ee235) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.165 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 1dee9c67-fb01-4fcd-8f35-805a326ee235 in datapath 99dd5684-1685-443e-9373-f548d80784f6 bound to our chassis#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.166 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 99dd5684-1685-443e-9373-f548d80784f6#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.179 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a1626301-7cca-4e44-b928-d7261722b327]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.180 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap99dd5684-11 in ovnmeta-99dd5684-1685-443e-9373-f548d80784f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.182 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap99dd5684-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.183 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5b33ad76-597b-4281-8120-4d15816757b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 systemd-udevd[323544]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.184 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a0390ff4-b446-459b-9da8-c5d2e9d15f9c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 systemd-machined[194361]: New machine qemu-112-instance-000000d3.
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.195 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[43159564-1b2d-4949-8569-2b62ecb0404f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 NetworkManager[49104]: <info>  [1768923290.1995] device (tap1dee9c67-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:34:50 np0005588919 NetworkManager[49104]: <info>  [1768923290.2000] device (tap1dee9c67-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.219 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[05219e07-e736-48e5-96ad-d5a7493a67e4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 systemd[1]: Started Virtual Machine qemu-112-instance-000000d3.
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.235 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.246 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:34:50Z|00974|binding|INFO|Setting lport 1dee9c67-fb01-4fcd-8f35-805a326ee235 ovn-installed in OVS
Jan 20 10:34:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:34:50Z|00975|binding|INFO|Setting lport 1dee9c67-fb01-4fcd-8f35-805a326ee235 up in Southbound
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.258 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.258 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[060fce39-a7ba-4bee-ace7-aae44275fcf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 NetworkManager[49104]: <info>  [1768923290.2650] manager: (tap99dd5684-10): new Veth device (/org/freedesktop/NetworkManager/Devices/418)
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.264 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[763688d4-fc26-4824-af80-52fe132b68c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.298 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[6a61aa99-d7f9-4cff-87a6-e54e14889afe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.301 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[febd096e-0140-4532-9df3-2c22803dd2be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 NetworkManager[49104]: <info>  [1768923290.3244] device (tap99dd5684-10): carrier: link connected
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.330 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[06989b20-ae79-485d-952f-e34de152a4b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.349 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c88d7e28-1d10-402a-9d13-303f5ba0f16d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99dd5684-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b4:88:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 277], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 832925, 'reachable_time': 26050, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 323576, 'error': None, 'target': 'ovnmeta-99dd5684-1685-443e-9373-f548d80784f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.367 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a57d040f-109a-4fa5-a5bf-e178eedb4e5d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb4:8896'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 832925, 'tstamp': 832925}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 323577, 'error': None, 'target': 'ovnmeta-99dd5684-1685-443e-9373-f548d80784f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.383 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4d60f09d-753e-4660-876b-7ede5a0335c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99dd5684-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b4:88:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 277], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 832925, 'reachable_time': 26050, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 323578, 'error': None, 'target': 'ovnmeta-99dd5684-1685-443e-9373-f548d80784f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.411 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec7091a-c009-4e82-bd14-4cd3a45c09aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.469 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b4ef91-b5e9-42c3-bcf8-e64b0cc42428]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.471 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99dd5684-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.471 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.471 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99dd5684-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.473 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588919 NetworkManager[49104]: <info>  [1768923290.4738] manager: (tap99dd5684-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/419)
Jan 20 10:34:50 np0005588919 kernel: tap99dd5684-10: entered promiscuous mode
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.477 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap99dd5684-10, col_values=(('external_ids', {'iface-id': 'b36be382-7937-4c5c-b0f7-fc4a6e68a050'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:50 np0005588919 ovn_controller[130490]: 2026-01-20T15:34:50Z|00976|binding|INFO|Releasing lport b36be382-7937-4c5c-b0f7-fc4a6e68a050 from this chassis (sb_readonly=0)
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.478 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.480 225859 DEBUG nova.network.neutron [req-8eb53864-9c37-4448-b563-cc75612a40ab req-243503e8-488f-4f11-8943-44df69856886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Updated VIF entry in instance network info cache for port 1dee9c67-fb01-4fcd-8f35-805a326ee235. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.480 225859 DEBUG nova.network.neutron [req-8eb53864-9c37-4448-b563-cc75612a40ab req-243503e8-488f-4f11-8943-44df69856886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Updating instance_info_cache with network_info: [{"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.493 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.494 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/99dd5684-1685-443e-9373-f548d80784f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/99dd5684-1685-443e-9373-f548d80784f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.494 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[333cb462-ca21-49b7-a4b5-0de0ad3343d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.495 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-99dd5684-1685-443e-9373-f548d80784f6
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/99dd5684-1685-443e-9373-f548d80784f6.pid.haproxy
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 99dd5684-1685-443e-9373-f548d80784f6
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:34:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:50.496 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-99dd5684-1685-443e-9373-f548d80784f6', 'env', 'PROCESS_TAG=haproxy-99dd5684-1685-443e-9373-f548d80784f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/99dd5684-1685-443e-9373-f548d80784f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.502 225859 DEBUG oslo_concurrency.lockutils [req-8eb53864-9c37-4448-b563-cc75612a40ab req-243503e8-488f-4f11-8943-44df69856886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-2003d484-9afb-4f49-8410-6e8c6aa813d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.569 225859 DEBUG nova.compute.manager [req-51a38df6-e2c9-4651-ad85-cc9a6579735d req-5fda12b5-aeea-4d19-b4d8-c74f482f92a8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Received event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.570 225859 DEBUG oslo_concurrency.lockutils [req-51a38df6-e2c9-4651-ad85-cc9a6579735d req-5fda12b5-aeea-4d19-b4d8-c74f482f92a8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.570 225859 DEBUG oslo_concurrency.lockutils [req-51a38df6-e2c9-4651-ad85-cc9a6579735d req-5fda12b5-aeea-4d19-b4d8-c74f482f92a8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.570 225859 DEBUG oslo_concurrency.lockutils [req-51a38df6-e2c9-4651-ad85-cc9a6579735d req-5fda12b5-aeea-4d19-b4d8-c74f482f92a8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.571 225859 DEBUG nova.compute.manager [req-51a38df6-e2c9-4651-ad85-cc9a6579735d req-5fda12b5-aeea-4d19-b4d8-c74f482f92a8 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Processing event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.639 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923290.6387074, 2003d484-9afb-4f49-8410-6e8c6aa813d0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.639 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] VM Started (Lifecycle Event)#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.642 225859 DEBUG nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.647 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.650 225859 INFO nova.virt.libvirt.driver [-] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Instance spawned successfully.#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.650 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.662 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.667 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.672 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.672 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.673 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.673 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.674 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.674 225859 DEBUG nova.virt.libvirt.driver [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.723 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.724 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923290.6388218, 2003d484-9afb-4f49-8410-6e8c6aa813d0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.724 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.755 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.759 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923290.64678, 2003d484-9afb-4f49-8410-6e8c6aa813d0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.760 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.768 225859 INFO nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Took 6.07 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.770 225859 DEBUG nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.778 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.782 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.805 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.839 225859 INFO nova.compute.manager [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Took 10.07 seconds to build instance.#033[00m
Jan 20 10:34:50 np0005588919 nova_compute[225855]: 2026-01-20 15:34:50.865 225859 DEBUG oslo_concurrency.lockutils [None req-73ba7bc4-81d4-4e3d-b438-3c908544fde7 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:50 np0005588919 podman[323652]: 2026-01-20 15:34:50.875519031 +0000 UTC m=+0.048556637 container create eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 10:34:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6e4d6f0 =====
Jan 20 10:34:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:50.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6e4d6f0 op status=0 http_status=200 latency=0.002000058s ======
Jan 20 10:34:50 np0005588919 radosgw[83787]: beast: 0x7f09c6e4d6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:50.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Jan 20 10:34:50 np0005588919 systemd[1]: Started libpod-conmon-eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f.scope.
Jan 20 10:34:50 np0005588919 podman[323652]: 2026-01-20 15:34:50.849629252 +0000 UTC m=+0.022666878 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:34:50 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:34:50 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90f8910fb7a7fc3ab8ba620a83841e990aceea1fa270b9dab15095ed60da470a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:34:50 np0005588919 podman[323652]: 2026-01-20 15:34:50.965747807 +0000 UTC m=+0.138785433 container init eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 20 10:34:50 np0005588919 podman[323652]: 2026-01-20 15:34:50.971819131 +0000 UTC m=+0.144856737 container start eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 20 10:34:50 np0005588919 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[323667]: [NOTICE]   (323671) : New worker (323673) forked
Jan 20 10:34:50 np0005588919 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[323667]: [NOTICE]   (323671) : Loading success.
Jan 20 10:34:52 np0005588919 nova_compute[225855]: 2026-01-20 15:34:52.697 225859 DEBUG nova.compute.manager [req-2fc235bd-29a9-4a82-bae2-a3fd1dd5e952 req-6933f12d-e588-4c27-a5fa-0fe3063737ee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Received event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:34:52 np0005588919 nova_compute[225855]: 2026-01-20 15:34:52.697 225859 DEBUG oslo_concurrency.lockutils [req-2fc235bd-29a9-4a82-bae2-a3fd1dd5e952 req-6933f12d-e588-4c27-a5fa-0fe3063737ee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:52 np0005588919 nova_compute[225855]: 2026-01-20 15:34:52.697 225859 DEBUG oslo_concurrency.lockutils [req-2fc235bd-29a9-4a82-bae2-a3fd1dd5e952 req-6933f12d-e588-4c27-a5fa-0fe3063737ee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:52 np0005588919 nova_compute[225855]: 2026-01-20 15:34:52.698 225859 DEBUG oslo_concurrency.lockutils [req-2fc235bd-29a9-4a82-bae2-a3fd1dd5e952 req-6933f12d-e588-4c27-a5fa-0fe3063737ee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:52 np0005588919 nova_compute[225855]: 2026-01-20 15:34:52.698 225859 DEBUG nova.compute.manager [req-2fc235bd-29a9-4a82-bae2-a3fd1dd5e952 req-6933f12d-e588-4c27-a5fa-0fe3063737ee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] No waiting events found dispatching network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:34:52 np0005588919 nova_compute[225855]: 2026-01-20 15:34:52.698 225859 WARNING nova.compute.manager [req-2fc235bd-29a9-4a82-bae2-a3fd1dd5e952 req-6933f12d-e588-4c27-a5fa-0fe3063737ee 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Received unexpected event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:34:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:52.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:52.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:52 np0005588919 nova_compute[225855]: 2026-01-20 15:34:52.973 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.242 225859 DEBUG oslo_concurrency.lockutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "2003d484-9afb-4f49-8410-6e8c6aa813d0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.243 225859 DEBUG oslo_concurrency.lockutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.243 225859 DEBUG oslo_concurrency.lockutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.244 225859 DEBUG oslo_concurrency.lockutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.244 225859 DEBUG oslo_concurrency.lockutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.246 225859 INFO nova.compute.manager [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Terminating instance#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.247 225859 DEBUG nova.compute.manager [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:34:53 np0005588919 kernel: tap1dee9c67-fb (unregistering): left promiscuous mode
Jan 20 10:34:53 np0005588919 NetworkManager[49104]: <info>  [1768923293.2867] device (tap1dee9c67-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:34:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:34:53Z|00977|binding|INFO|Releasing lport 1dee9c67-fb01-4fcd-8f35-805a326ee235 from this chassis (sb_readonly=0)
Jan 20 10:34:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:34:53Z|00978|binding|INFO|Setting lport 1dee9c67-fb01-4fcd-8f35-805a326ee235 down in Southbound
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.298 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 ovn_controller[130490]: 2026-01-20T15:34:53Z|00979|binding|INFO|Removing iface tap1dee9c67-fb ovn-installed in OVS
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.302 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.306 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:3a:75 10.100.0.12'], port_security=['fa:16:3e:ea:3a:75 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-703015767', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2003d484-9afb-4f49-8410-6e8c6aa813d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99dd5684-1685-443e-9373-f548d80784f6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-703015767', 'neutron:project_id': '3168f57421fb49bfb94b85daedd1fe7d', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'aaa69ba6-9a27-441e-877e-2cd188322a42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.217', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44d2010b-16ff-4152-8c6b-d6e8ffb1b3ca, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=1dee9c67-fb01-4fcd-8f35-805a326ee235) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.308 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 1dee9c67-fb01-4fcd-8f35-805a326ee235 in datapath 99dd5684-1685-443e-9373-f548d80784f6 unbound from our chassis#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.309 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 99dd5684-1685-443e-9373-f548d80784f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.310 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ae15e720-870f-41a2-9706-4bcc74474b53]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.310 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-99dd5684-1685-443e-9373-f548d80784f6 namespace which is not needed anymore#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.320 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 systemd[1]: machine-qemu\x2d112\x2dinstance\x2d000000d3.scope: Deactivated successfully.
Jan 20 10:34:53 np0005588919 systemd[1]: machine-qemu\x2d112\x2dinstance\x2d000000d3.scope: Consumed 3.055s CPU time.
Jan 20 10:34:53 np0005588919 systemd-machined[194361]: Machine qemu-112-instance-000000d3 terminated.
Jan 20 10:34:53 np0005588919 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[323667]: [NOTICE]   (323671) : haproxy version is 2.8.14-c23fe91
Jan 20 10:34:53 np0005588919 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[323667]: [NOTICE]   (323671) : path to executable is /usr/sbin/haproxy
Jan 20 10:34:53 np0005588919 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[323667]: [WARNING]  (323671) : Exiting Master process...
Jan 20 10:34:53 np0005588919 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[323667]: [WARNING]  (323671) : Exiting Master process...
Jan 20 10:34:53 np0005588919 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[323667]: [ALERT]    (323671) : Current worker (323673) exited with code 143 (Terminated)
Jan 20 10:34:53 np0005588919 neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6[323667]: [WARNING]  (323671) : All workers exited. Exiting... (0)
Jan 20 10:34:53 np0005588919 systemd[1]: libpod-eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f.scope: Deactivated successfully.
Jan 20 10:34:53 np0005588919 podman[323707]: 2026-01-20 15:34:53.440667379 +0000 UTC m=+0.045963993 container died eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.467 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.472 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.481 225859 INFO nova.virt.libvirt.driver [-] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Instance destroyed successfully.#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.482 225859 DEBUG nova.objects.instance [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lazy-loading 'resources' on Instance uuid 2003d484-9afb-4f49-8410-6e8c6aa813d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.500 225859 DEBUG nova.virt.libvirt.vif [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:34:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1912465458',display_name='tempest-TestNetworkBasicOps-server-1912465458',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1912465458',id=211,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKCclivZajv/oNiXd0J0tpc9M442c8dbYXCbsYeHEo3g2nh4Rcq6ISUBBO6XIX8RmCdEtQzJtRlazxR/MdQkZGMMo5bsdyOhXnm5vgMIIsHetJR9AEpVwxFDAVbRX9E2EQ==',key_name='tempest-TestNetworkBasicOps-204665299',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:34:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3168f57421fb49bfb94b85daedd1fe7d',ramdisk_id='',reservation_id='r-g29h7bs4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-807695970',owner_user_name='tempest-TestNetworkBasicOps-807695970-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:34:50Z,user_data=None,user_id='5338aa65dc0e4326a66ce79053787f14',uuid=2003d484-9afb-4f49-8410-6e8c6aa813d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.501 225859 DEBUG nova.network.os_vif_util [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converting VIF {"id": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "address": "fa:16:3e:ea:3a:75", "network": {"id": "99dd5684-1685-443e-9373-f548d80784f6", "bridge": "br-int", "label": "tempest-network-smoke--802350272", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3168f57421fb49bfb94b85daedd1fe7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dee9c67-fb", "ovs_interfaceid": "1dee9c67-fb01-4fcd-8f35-805a326ee235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.501 225859 DEBUG nova.network.os_vif_util [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.502 225859 DEBUG os_vif [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.503 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.503 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1dee9c67-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.504 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.505 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.507 225859 INFO os_vif [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:3a:75,bridge_name='br-int',has_traffic_filtering=True,id=1dee9c67-fb01-4fcd-8f35-805a326ee235,network=Network(99dd5684-1685-443e-9373-f548d80784f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1dee9c67-fb')#033[00m
Jan 20 10:34:53 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f-userdata-shm.mount: Deactivated successfully.
Jan 20 10:34:53 np0005588919 systemd[1]: var-lib-containers-storage-overlay-90f8910fb7a7fc3ab8ba620a83841e990aceea1fa270b9dab15095ed60da470a-merged.mount: Deactivated successfully.
Jan 20 10:34:53 np0005588919 podman[323707]: 2026-01-20 15:34:53.625146186 +0000 UTC m=+0.230442780 container cleanup eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:34:53 np0005588919 systemd[1]: libpod-conmon-eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f.scope: Deactivated successfully.
Jan 20 10:34:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:53 np0005588919 podman[323767]: 2026-01-20 15:34:53.831259211 +0000 UTC m=+0.185017584 container remove eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.839 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd58055-2978-49c5-bd30-b1dd4d1979a9]: (4, ('Tue Jan 20 03:34:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6 (eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f)\neafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f\nTue Jan 20 03:34:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-99dd5684-1685-443e-9373-f548d80784f6 (eafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f)\neafc6272744e7f80e260b39d508cf6343a4ca7b82c03acb636647827481af26f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.841 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4ce807-8ec8-4577-af88-b103afe6ea66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.842 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99dd5684-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.844 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 kernel: tap99dd5684-10: left promiscuous mode
Jan 20 10:34:53 np0005588919 nova_compute[225855]: 2026-01-20 15:34:53.857 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.861 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf0a75f-3201-488c-b29d-9c49f7308b58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.876 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[35994666-406c-433e-8375-2ba56397bdde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.877 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9b078761-6b2c-47d6-bf48-ec314da0bc6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.894 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d0490458-4665-48eb-b15f-628c7d2c66a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 832917, 'reachable_time': 24456, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 323783, 'error': None, 'target': 'ovnmeta-99dd5684-1685-443e-9373-f548d80784f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:53 np0005588919 systemd[1]: run-netns-ovnmeta\x2d99dd5684\x2d1685\x2d443e\x2d9373\x2df548d80784f6.mount: Deactivated successfully.
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.896 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-99dd5684-1685-443e-9373-f548d80784f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:34:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:34:53.897 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[3abcade1-e462-42c8-b562-37e667bc455e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.194 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.202 225859 INFO nova.virt.libvirt.driver [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Deleting instance files /var/lib/nova/instances/2003d484-9afb-4f49-8410-6e8c6aa813d0_del#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.203 225859 INFO nova.virt.libvirt.driver [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Deletion of /var/lib/nova/instances/2003d484-9afb-4f49-8410-6e8c6aa813d0_del complete#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.292 225859 INFO nova.compute.manager [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Took 1.04 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.292 225859 DEBUG oslo.service.loopingcall [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.292 225859 DEBUG nova.compute.manager [-] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.293 225859 DEBUG nova.network.neutron [-] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.828 225859 DEBUG nova.compute.manager [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Received event network-vif-unplugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.828 225859 DEBUG oslo_concurrency.lockutils [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.829 225859 DEBUG oslo_concurrency.lockutils [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.829 225859 DEBUG oslo_concurrency.lockutils [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.829 225859 DEBUG nova.compute.manager [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] No waiting events found dispatching network-vif-unplugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.829 225859 DEBUG nova.compute.manager [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Received event network-vif-unplugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.830 225859 DEBUG nova.compute.manager [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Received event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.830 225859 DEBUG oslo_concurrency.lockutils [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.830 225859 DEBUG oslo_concurrency.lockutils [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.830 225859 DEBUG oslo_concurrency.lockutils [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.830 225859 DEBUG nova.compute.manager [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] No waiting events found dispatching network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:34:54 np0005588919 nova_compute[225855]: 2026-01-20 15:34:54.831 225859 WARNING nova.compute.manager [req-7d8ffcc1-67dd-4ff1-b833-05f5010007de req-a0e0c5f4-35f6-4b66-8668-40e688531957 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Received unexpected event network-vif-plugged-1dee9c67-fb01-4fcd-8f35-805a326ee235 for instance with vm_state active and task_state deleting.#033[00m
Jan 20 10:34:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:54.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:54.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:56.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:34:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:56.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:34:57 np0005588919 nova_compute[225855]: 2026-01-20 15:34:57.549 225859 DEBUG nova.network.neutron [-] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:34:57 np0005588919 nova_compute[225855]: 2026-01-20 15:34:57.569 225859 INFO nova.compute.manager [-] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Took 3.28 seconds to deallocate network for instance.#033[00m
Jan 20 10:34:57 np0005588919 nova_compute[225855]: 2026-01-20 15:34:57.618 225859 DEBUG oslo_concurrency.lockutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:34:57 np0005588919 nova_compute[225855]: 2026-01-20 15:34:57.618 225859 DEBUG oslo_concurrency.lockutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:34:57 np0005588919 nova_compute[225855]: 2026-01-20 15:34:57.685 225859 DEBUG oslo_concurrency.processutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:34:58 np0005588919 podman[323808]: 2026-01-20 15:34:58.00583234 +0000 UTC m=+0.050412040 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:34:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:34:58 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1237257214' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:34:58 np0005588919 nova_compute[225855]: 2026-01-20 15:34:58.118 225859 DEBUG oslo_concurrency.processutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:34:58 np0005588919 nova_compute[225855]: 2026-01-20 15:34:58.124 225859 DEBUG nova.compute.provider_tree [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:34:58 np0005588919 nova_compute[225855]: 2026-01-20 15:34:58.167 225859 DEBUG nova.scheduler.client.report [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:34:58 np0005588919 nova_compute[225855]: 2026-01-20 15:34:58.209 225859 DEBUG oslo_concurrency.lockutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:58 np0005588919 nova_compute[225855]: 2026-01-20 15:34:58.259 225859 INFO nova.scheduler.client.report [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Deleted allocations for instance 2003d484-9afb-4f49-8410-6e8c6aa813d0#033[00m
Jan 20 10:34:58 np0005588919 nova_compute[225855]: 2026-01-20 15:34:58.346 225859 DEBUG oslo_concurrency.lockutils [None req-b63da538-d914-4572-80f0-4968c7623ce0 5338aa65dc0e4326a66ce79053787f14 3168f57421fb49bfb94b85daedd1fe7d - - default default] Lock "2003d484-9afb-4f49-8410-6e8c6aa813d0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:34:58 np0005588919 nova_compute[225855]: 2026-01-20 15:34:58.506 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:34:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:34:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:34:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:34:58.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:34:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:34:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:34:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:34:58.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:34:59 np0005588919 nova_compute[225855]: 2026-01-20 15:34:59.196 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:00.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:00.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:01 np0005588919 nova_compute[225855]: 2026-01-20 15:35:01.358 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:01 np0005588919 nova_compute[225855]: 2026-01-20 15:35:01.358 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:35:01 np0005588919 nova_compute[225855]: 2026-01-20 15:35:01.382 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:35:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:02.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:02.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:03 np0005588919 nova_compute[225855]: 2026-01-20 15:35:03.511 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:04 np0005588919 nova_compute[225855]: 2026-01-20 15:35:04.197 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:04.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:04.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:35:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:35:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:35:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:06.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:06.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:08 np0005588919 nova_compute[225855]: 2026-01-20 15:35:08.480 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923293.4797463, 2003d484-9afb-4f49-8410-6e8c6aa813d0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:35:08 np0005588919 nova_compute[225855]: 2026-01-20 15:35:08.480 225859 INFO nova.compute.manager [-] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:35:08 np0005588919 nova_compute[225855]: 2026-01-20 15:35:08.515 225859 DEBUG nova.compute.manager [None req-fce46557-cd3f-47a5-a448-3729a39e0469 - - - - - -] [instance: 2003d484-9afb-4f49-8410-6e8c6aa813d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:35:08 np0005588919 nova_compute[225855]: 2026-01-20 15:35:08.516 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:08.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:08.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:09 np0005588919 nova_compute[225855]: 2026-01-20 15:35:09.198 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:10 np0005588919 nova_compute[225855]: 2026-01-20 15:35:10.526 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:10 np0005588919 nova_compute[225855]: 2026-01-20 15:35:10.602 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:10.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:10.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:35:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:35:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:12.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:12.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:13 np0005588919 nova_compute[225855]: 2026-01-20 15:35:13.519 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:35:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3805897643' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:35:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:35:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3805897643' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:35:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:14 np0005588919 nova_compute[225855]: 2026-01-20 15:35:14.200 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:14.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:14.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:15 np0005588919 nova_compute[225855]: 2026-01-20 15:35:15.364 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:15 np0005588919 nova_compute[225855]: 2026-01-20 15:35:15.365 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:35:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:35:16.457 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:35:16.458 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:35:16.458 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:16.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:16.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.401 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.402 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.422 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.423 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.423 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.423 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.423 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:35:17 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3132927951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:35:17 np0005588919 nova_compute[225855]: 2026-01-20 15:35:17.954 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.128 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.129 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4248MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.129 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.129 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.264 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.264 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.315 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.523 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:35:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/962082559' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.752 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.758 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.772 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.795 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 20 10:35:18 np0005588919 nova_compute[225855]: 2026-01-20 15:35:18.796 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 20 10:35:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:18.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:18.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:19 np0005588919 podman[324121]: 2026-01-20 15:35:19.03153554 +0000 UTC m=+0.078522193 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:35:19 np0005588919 nova_compute[225855]: 2026-01-20 15:35:19.200 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:20 np0005588919 nova_compute[225855]: 2026-01-20 15:35:20.734 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:35:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:20.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:20.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:21 np0005588919 nova_compute[225855]: 2026-01-20 15:35:21.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:35:22 np0005588919 nova_compute[225855]: 2026-01-20 15:35:22.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:35:22 np0005588919 nova_compute[225855]: 2026-01-20 15:35:22.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:35:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:22.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:22.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:23 np0005588919 nova_compute[225855]: 2026-01-20 15:35:23.526 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:24 np0005588919 nova_compute[225855]: 2026-01-20 15:35:24.202 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:24 np0005588919 nova_compute[225855]: 2026-01-20 15:35:24.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:35:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:24.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:24.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:26.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:26.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:27 np0005588919 nova_compute[225855]: 2026-01-20 15:35:27.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:35:28 np0005588919 nova_compute[225855]: 2026-01-20 15:35:28.530 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:28.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:28.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:29 np0005588919 podman[324152]: 2026-01-20 15:35:29.005649691 +0000 UTC m=+0.054521398 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:35:29 np0005588919 nova_compute[225855]: 2026-01-20 15:35:29.203 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:30.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:30.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:32.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:32.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:33 np0005588919 nova_compute[225855]: 2026-01-20 15:35:33.577 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:34 np0005588919 nova_compute[225855]: 2026-01-20 15:35:34.204 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:34.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:34.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:36.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:36.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:38 np0005588919 nova_compute[225855]: 2026-01-20 15:35:38.580 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:38.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:38.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:39 np0005588919 nova_compute[225855]: 2026-01-20 15:35:39.206 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:40.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:40.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:42.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:42.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:43 np0005588919 nova_compute[225855]: 2026-01-20 15:35:43.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 20 10:35:43 np0005588919 nova_compute[225855]: 2026-01-20 15:35:43.584 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:44 np0005588919 nova_compute[225855]: 2026-01-20 15:35:44.241 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:44.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:44.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:46.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:46.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:48 np0005588919 nova_compute[225855]: 2026-01-20 15:35:48.587 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:35:48.747 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 20 10:35:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:48 np0005588919 nova_compute[225855]: 2026-01-20 15:35:48.747 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:48 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:35:48.749 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 20 10:35:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:48.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:35:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:48.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:35:49 np0005588919 nova_compute[225855]: 2026-01-20 15:35:49.242 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:49 np0005588919 ovn_controller[130490]: 2026-01-20T15:35:49Z|00980|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 20 10:35:50 np0005588919 podman[324252]: 2026-01-20 15:35:50.085636499 +0000 UTC m=+0.128509501 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:35:50 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:35:50.751 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 20 10:35:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:50.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:50.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:52.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:52.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:53 np0005588919 nova_compute[225855]: 2026-01-20 15:35:53.634 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:54 np0005588919 nova_compute[225855]: 2026-01-20 15:35:54.245 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 20 10:35:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:54.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:54.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:56.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:56.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:58 np0005588919 nova_compute[225855]: 2026-01-20 15:35:58.638 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:35:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:35:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:35:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:35:58.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:35:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:35:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:35:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:35:58.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:35:59 np0005588919 nova_compute[225855]: 2026-01-20 15:35:59.247 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:00 np0005588919 podman[324315]: 2026-01-20 15:35:59.999928443 +0000 UTC m=+0.048089694 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 20 10:36:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:00.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:00.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:02.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:02.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:03 np0005588919 nova_compute[225855]: 2026-01-20 15:36:03.642 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:04 np0005588919 nova_compute[225855]: 2026-01-20 15:36:04.249 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:04.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:05.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:06.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:07.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:08 np0005588919 nova_compute[225855]: 2026-01-20 15:36:08.673 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:08.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:09.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:09 np0005588919 nova_compute[225855]: 2026-01-20 15:36:09.251 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:10.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:11.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:36:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:36:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:36:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:12.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:13.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:13 np0005588919 nova_compute[225855]: 2026-01-20 15:36:13.718 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:14 np0005588919 nova_compute[225855]: 2026-01-20 15:36:14.251 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:15.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:15.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:36:16.458 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:36:16.458 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:36:16.458 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:17.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:17.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:17 np0005588919 nova_compute[225855]: 2026-01-20 15:36:17.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:17 np0005588919 nova_compute[225855]: 2026-01-20 15:36:17.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:36:17 np0005588919 nova_compute[225855]: 2026-01-20 15:36:17.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:36:17 np0005588919 nova_compute[225855]: 2026-01-20 15:36:17.358 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:36:17 np0005588919 nova_compute[225855]: 2026-01-20 15:36:17.359 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:17 np0005588919 nova_compute[225855]: 2026-01-20 15:36:17.359 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:36:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:36:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:36:18 np0005588919 nova_compute[225855]: 2026-01-20 15:36:18.723 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:19.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:19.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.253 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.376 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.377 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:36:19 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1836750772' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.822 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.988 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.989 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4262MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.990 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:36:19 np0005588919 nova_compute[225855]: 2026-01-20 15:36:19.990 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:36:20 np0005588919 nova_compute[225855]: 2026-01-20 15:36:20.057 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:36:20 np0005588919 nova_compute[225855]: 2026-01-20 15:36:20.057 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:36:20 np0005588919 nova_compute[225855]: 2026-01-20 15:36:20.073 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:36:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:36:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2691880414' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:36:20 np0005588919 nova_compute[225855]: 2026-01-20 15:36:20.529 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:36:20 np0005588919 nova_compute[225855]: 2026-01-20 15:36:20.535 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:36:20 np0005588919 nova_compute[225855]: 2026-01-20 15:36:20.551 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:36:20 np0005588919 nova_compute[225855]: 2026-01-20 15:36:20.553 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:36:20 np0005588919 nova_compute[225855]: 2026-01-20 15:36:20.553 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:36:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:21.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:21.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:21 np0005588919 podman[324620]: 2026-01-20 15:36:21.072805297 +0000 UTC m=+0.119596006 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:36:21 np0005588919 nova_compute[225855]: 2026-01-20 15:36:21.554 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:22 np0005588919 nova_compute[225855]: 2026-01-20 15:36:22.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:23.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:23.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:23 np0005588919 nova_compute[225855]: 2026-01-20 15:36:23.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:23 np0005588919 nova_compute[225855]: 2026-01-20 15:36:23.728 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:24 np0005588919 nova_compute[225855]: 2026-01-20 15:36:24.254 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:24 np0005588919 nova_compute[225855]: 2026-01-20 15:36:24.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:24 np0005588919 nova_compute[225855]: 2026-01-20 15:36:24.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:25.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:25.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:27.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:27.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:28 np0005588919 nova_compute[225855]: 2026-01-20 15:36:28.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:36:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:28 np0005588919 nova_compute[225855]: 2026-01-20 15:36:28.776 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:29.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:29.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:29 np0005588919 nova_compute[225855]: 2026-01-20 15:36:29.257 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:30 np0005588919 podman[324676]: 2026-01-20 15:36:30.314286741 +0000 UTC m=+0.052503880 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:36:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:31.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:31.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:33.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:33.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:33 np0005588919 nova_compute[225855]: 2026-01-20 15:36:33.779 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:34 np0005588919 nova_compute[225855]: 2026-01-20 15:36:34.259 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:35.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:35.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:37.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:37.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:38 np0005588919 nova_compute[225855]: 2026-01-20 15:36:38.806 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:39.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:39.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:39 np0005588919 nova_compute[225855]: 2026-01-20 15:36:39.261 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:41.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:41.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:43.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:43.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:43 np0005588919 nova_compute[225855]: 2026-01-20 15:36:43.810 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:44 np0005588919 nova_compute[225855]: 2026-01-20 15:36:44.261 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:45.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:45.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:36:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:47.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:36:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:47.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:48 np0005588919 nova_compute[225855]: 2026-01-20 15:36:48.814 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:49.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:49.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:49 np0005588919 nova_compute[225855]: 2026-01-20 15:36:49.264 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:50 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:36:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:51.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:51.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:52 np0005588919 podman[324783]: 2026-01-20 15:36:52.029003556 +0000 UTC m=+0.079664514 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 20 10:36:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:53.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:53.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:53 np0005588919 nova_compute[225855]: 2026-01-20 15:36:53.818 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:54 np0005588919 nova_compute[225855]: 2026-01-20 15:36:54.265 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:55.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:55.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:57.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:57.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:36:58 np0005588919 nova_compute[225855]: 2026-01-20 15:36:58.822 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:36:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:36:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:36:59.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:36:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:36:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:36:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:36:59.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:36:59 np0005588919 nova_compute[225855]: 2026-01-20 15:36:59.267 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:00 np0005588919 podman[324813]: 2026-01-20 15:37:00.994832098 +0000 UTC m=+0.047383170 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 20 10:37:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:01.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:01.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:03.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:03.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:03 np0005588919 nova_compute[225855]: 2026-01-20 15:37:03.876 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:04 np0005588919 nova_compute[225855]: 2026-01-20 15:37:04.270 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:05.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:05.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:07.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:07.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:08 np0005588919 nova_compute[225855]: 2026-01-20 15:37:08.879 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:09.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:09.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:09 np0005588919 nova_compute[225855]: 2026-01-20 15:37:09.272 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:11.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:11.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:13.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:13.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:37:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3673359103' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:37:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:37:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3673359103' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:37:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:13 np0005588919 nova_compute[225855]: 2026-01-20 15:37:13.883 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:14 np0005588919 nova_compute[225855]: 2026-01-20 15:37:14.274 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:15.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:15.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:37:16.459 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:37:16.459 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:37:16.459 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:17.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:17.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:17 np0005588919 nova_compute[225855]: 2026-01-20 15:37:17.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:17 np0005588919 nova_compute[225855]: 2026-01-20 15:37:17.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:37:18 np0005588919 nova_compute[225855]: 2026-01-20 15:37:18.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:18 np0005588919 nova_compute[225855]: 2026-01-20 15:37:18.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:37:18 np0005588919 nova_compute[225855]: 2026-01-20 15:37:18.342 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:37:18 np0005588919 nova_compute[225855]: 2026-01-20 15:37:18.360 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:37:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:18 np0005588919 nova_compute[225855]: 2026-01-20 15:37:18.887 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:19.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:19.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:19 np0005588919 nova_compute[225855]: 2026-01-20 15:37:19.276 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:37:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:37:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.367 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.368 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.368 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.368 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:37:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:37:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1298775489' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.797 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.951 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.952 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4274MB free_disk=20.921852111816406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.952 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:37:20 np0005588919 nova_compute[225855]: 2026-01-20 15:37:20.953 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:37:21 np0005588919 nova_compute[225855]: 2026-01-20 15:37:21.064 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:37:21 np0005588919 nova_compute[225855]: 2026-01-20 15:37:21.064 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:37:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:21.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:21 np0005588919 nova_compute[225855]: 2026-01-20 15:37:21.081 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:37:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:21.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:37:21 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/351549409' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:37:21 np0005588919 nova_compute[225855]: 2026-01-20 15:37:21.518 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:37:21 np0005588919 nova_compute[225855]: 2026-01-20 15:37:21.524 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:37:21 np0005588919 nova_compute[225855]: 2026-01-20 15:37:21.538 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:37:21 np0005588919 nova_compute[225855]: 2026-01-20 15:37:21.541 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:37:21 np0005588919 nova_compute[225855]: 2026-01-20 15:37:21.541 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:37:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:37:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 67K writes, 258K keys, 67K commit groups, 1.0 writes per commit group, ingest: 0.24 GB, 0.04 MB/s#012Cumulative WAL: 67K writes, 25K syncs, 2.63 writes per sync, written: 0.24 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2415 writes, 9201 keys, 2415 commit groups, 1.0 writes per commit group, ingest: 8.90 MB, 0.01 MB/s#012Interval WAL: 2415 writes, 999 syncs, 2.42 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 0.000216 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 0.000216 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction,
Jan 20 10:37:23 np0005588919 podman[325071]: 2026-01-20 15:37:23.07542054 +0000 UTC m=+0.120249190 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:37:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:23.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:23.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:23 np0005588919 nova_compute[225855]: 2026-01-20 15:37:23.541 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:23 np0005588919 nova_compute[225855]: 2026-01-20 15:37:23.890 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:24 np0005588919 nova_compute[225855]: 2026-01-20 15:37:24.279 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:24 np0005588919 nova_compute[225855]: 2026-01-20 15:37:24.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:25.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:25.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:25 np0005588919 nova_compute[225855]: 2026-01-20 15:37:25.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:25 np0005588919 nova_compute[225855]: 2026-01-20 15:37:25.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:26 np0005588919 nova_compute[225855]: 2026-01-20 15:37:26.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:37:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:37:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:27.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:27.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:28 np0005588919 nova_compute[225855]: 2026-01-20 15:37:28.893 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:29.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:29.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:29 np0005588919 nova_compute[225855]: 2026-01-20 15:37:29.283 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:29 np0005588919 nova_compute[225855]: 2026-01-20 15:37:29.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:31.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:31.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:37:31.896 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:37:31 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:37:31.897 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:37:31 np0005588919 nova_compute[225855]: 2026-01-20 15:37:31.898 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:32 np0005588919 podman[325202]: 2026-01-20 15:37:32.022367538 +0000 UTC m=+0.063624890 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 20 10:37:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:33.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:33.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:33 np0005588919 nova_compute[225855]: 2026-01-20 15:37:33.929 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:34 np0005588919 nova_compute[225855]: 2026-01-20 15:37:34.283 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:35.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:35.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:37.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:37.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:38 np0005588919 nova_compute[225855]: 2026-01-20 15:37:38.933 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:39.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:39.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:39 np0005588919 nova_compute[225855]: 2026-01-20 15:37:39.283 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:41.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:41.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:41 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:37:41.899 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:37:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:43.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:43.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:43 np0005588919 nova_compute[225855]: 2026-01-20 15:37:43.936 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:44 np0005588919 nova_compute[225855]: 2026-01-20 15:37:44.285 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:45.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:45.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:47.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:47.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:48 np0005588919 nova_compute[225855]: 2026-01-20 15:37:48.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #169. Immutable memtables: 0.
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.421330) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 107] Flushing memtable with next log file: 169
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468421360, "job": 107, "event": "flush_started", "num_memtables": 1, "num_entries": 2200, "num_deletes": 252, "total_data_size": 5409694, "memory_usage": 5485344, "flush_reason": "Manual Compaction"}
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 107] Level-0 flush table #170: started
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468444022, "cf_name": "default", "job": 107, "event": "table_file_creation", "file_number": 170, "file_size": 2036286, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 81944, "largest_seqno": 84139, "table_properties": {"data_size": 2029966, "index_size": 3201, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16729, "raw_average_key_size": 20, "raw_value_size": 2015882, "raw_average_value_size": 2513, "num_data_blocks": 145, "num_entries": 802, "num_filter_entries": 802, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923260, "oldest_key_time": 1768923260, "file_creation_time": 1768923468, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 170, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 107] Flush lasted 22764 microseconds, and 5032 cpu microseconds.
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.444088) [db/flush_job.cc:967] [default] [JOB 107] Level-0 flush table #170: 2036286 bytes OK
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.444109) [db/memtable_list.cc:519] [default] Level-0 commit table #170 started
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.446024) [db/memtable_list.cc:722] [default] Level-0 commit table #170: memtable #1 done
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.446036) EVENT_LOG_v1 {"time_micros": 1768923468446033, "job": 107, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.446055) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 107] Try to delete WAL files size 5400006, prev total WAL file size 5400006, number of live WAL files 2.
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000166.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.447347) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032373537' seq:72057594037927935, type:22 .. '6D6772737461740033303130' seq:0, type:0; will stop at (end)
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 108] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 107 Base level 0, inputs: [170(1988KB)], [168(12MB)]
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468447425, "job": 108, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [170], "files_L6": [168], "score": -1, "input_data_size": 14993550, "oldest_snapshot_seqno": -1}
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 108] Generated table #171: 10615 keys, 12636913 bytes, temperature: kUnknown
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468593370, "cf_name": "default", "job": 108, "event": "table_file_creation", "file_number": 171, "file_size": 12636913, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12570509, "index_size": 38771, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26565, "raw_key_size": 278523, "raw_average_key_size": 26, "raw_value_size": 12386874, "raw_average_value_size": 1166, "num_data_blocks": 1478, "num_entries": 10615, "num_filter_entries": 10615, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768923468, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 171, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.593691) [db/compaction/compaction_job.cc:1663] [default] [JOB 108] Compacted 1@0 + 1@6 files to L6 => 12636913 bytes
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.595469) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 102.7 rd, 86.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.4 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(13.6) write-amplify(6.2) OK, records in: 11033, records dropped: 418 output_compression: NoCompression
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.595493) EVENT_LOG_v1 {"time_micros": 1768923468595482, "job": 108, "event": "compaction_finished", "compaction_time_micros": 146025, "compaction_time_cpu_micros": 32018, "output_level": 6, "num_output_files": 1, "total_output_size": 12636913, "num_input_records": 11033, "num_output_records": 10615, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000170.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468596197, "job": 108, "event": "table_file_deletion", "file_number": 170}
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000168.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923468599238, "job": 108, "event": "table_file_deletion", "file_number": 168}
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.447207) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.599352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.599360) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.599364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.599368) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:37:48.599371) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:37:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:48 np0005588919 nova_compute[225855]: 2026-01-20 15:37:48.940 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:49.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:49.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:49 np0005588919 nova_compute[225855]: 2026-01-20 15:37:49.286 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:51.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:51.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:37:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:53.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:37:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:53.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:53 np0005588919 nova_compute[225855]: 2026-01-20 15:37:53.943 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:54 np0005588919 podman[325280]: 2026-01-20 15:37:54.019970533 +0000 UTC m=+0.061793118 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:37:54 np0005588919 nova_compute[225855]: 2026-01-20 15:37:54.288 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:37:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.0 total, 600.0 interval#012Cumulative writes: 16K writes, 84K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.16 GB, 0.03 MB/s#012Cumulative WAL: 16K writes, 16K syncs, 1.00 writes per sync, written: 0.16 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1487 writes, 7136 keys, 1487 commit groups, 1.0 writes per commit group, ingest: 15.62 MB, 0.03 MB/s#012Interval WAL: 1487 writes, 1487 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     74.6      1.37              0.35        54    0.025       0      0       0.0       0.0#012  L6      1/0   12.05 MB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   5.3     96.5     82.7      6.55              1.74        53    0.124    407K    28K       0.0       0.0#012 Sum      1/0   12.05 MB   0.0      0.6     0.1      0.5       0.6      0.1       0.0   6.3     79.8     81.3      7.92              2.09       107    0.074    407K    28K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9     56.5     56.1      1.20              0.24        10    0.120     53K   2443       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   0.0     96.5     82.7      6.55              1.74        53    0.124    407K    28K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     74.7      1.37              0.35        53    0.026       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.0 total, 600.0 interval#012Flush(GB): cumulative 0.100, interval 0.008#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.63 GB write, 0.11 MB/s write, 0.62 GB read, 0.11 MB/s read, 7.9 seconds#012Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.11 MB/s read, 1.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 304.00 MB usage: 68.19 MB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 0 last_secs: 0.000434 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3896,65.29 MB,21.4756%) FilterBlock(107,1.09 MB,0.358717%) IndexBlock(107,1.81 MB,0.596282%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 10:37:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:55.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:55.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:37:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:57.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:37:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:57.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:37:59 np0005588919 nova_compute[225855]: 2026-01-20 15:37:59.006 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:37:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:37:59.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:37:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:37:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:37:59.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:37:59 np0005588919 nova_compute[225855]: 2026-01-20 15:37:59.292 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:01.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:01.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:03 np0005588919 podman[325312]: 2026-01-20 15:38:03.006578301 +0000 UTC m=+0.049965214 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:38:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:03.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:03.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:04 np0005588919 nova_compute[225855]: 2026-01-20 15:38:04.011 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:04 np0005588919 nova_compute[225855]: 2026-01-20 15:38:04.293 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #172. Immutable memtables: 0.
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:04.686599) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 109] Flushing memtable with next log file: 172
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923484686646, "job": 109, "event": "flush_started", "num_memtables": 1, "num_entries": 396, "num_deletes": 251, "total_data_size": 409157, "memory_usage": 416728, "flush_reason": "Manual Compaction"}
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 109] Level-0 flush table #173: started
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923484736998, "cf_name": "default", "job": 109, "event": "table_file_creation", "file_number": 173, "file_size": 269559, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 84145, "largest_seqno": 84535, "table_properties": {"data_size": 267271, "index_size": 451, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5634, "raw_average_key_size": 18, "raw_value_size": 262744, "raw_average_value_size": 867, "num_data_blocks": 20, "num_entries": 303, "num_filter_entries": 303, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923468, "oldest_key_time": 1768923468, "file_creation_time": 1768923484, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 173, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 109] Flush lasted 50442 microseconds, and 1914 cpu microseconds.
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:04.737045) [db/flush_job.cc:967] [default] [JOB 109] Level-0 flush table #173: 269559 bytes OK
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:04.737064) [db/memtable_list.cc:519] [default] Level-0 commit table #173 started
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:04.810983) [db/memtable_list.cc:722] [default] Level-0 commit table #173: memtable #1 done
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:04.811054) EVENT_LOG_v1 {"time_micros": 1768923484811037, "job": 109, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:04.811093) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 109] Try to delete WAL files size 406602, prev total WAL file size 406602, number of live WAL files 2.
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000169.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:04.811927) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037323739' seq:72057594037927935, type:22 .. '7061786F730037353331' seq:0, type:0; will stop at (end)
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 110] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 109 Base level 0, inputs: [173(263KB)], [171(12MB)]
Jan 20 10:38:04 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923484812093, "job": 110, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [173], "files_L6": [171], "score": -1, "input_data_size": 12906472, "oldest_snapshot_seqno": -1}
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 110] Generated table #174: 10408 keys, 10873904 bytes, temperature: kUnknown
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923485110497, "cf_name": "default", "job": 110, "event": "table_file_creation", "file_number": 174, "file_size": 10873904, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10810369, "index_size": 36414, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 274903, "raw_average_key_size": 26, "raw_value_size": 10631734, "raw_average_value_size": 1021, "num_data_blocks": 1371, "num_entries": 10408, "num_filter_entries": 10408, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768923484, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 174, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:05.110726) [db/compaction/compaction_job.cc:1663] [default] [JOB 110] Compacted 1@0 + 1@6 files to L6 => 10873904 bytes
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:05.112258) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 43.2 rd, 36.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 12.1 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(88.2) write-amplify(40.3) OK, records in: 10918, records dropped: 510 output_compression: NoCompression
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:05.112306) EVENT_LOG_v1 {"time_micros": 1768923485112290, "job": 110, "event": "compaction_finished", "compaction_time_micros": 298432, "compaction_time_cpu_micros": 26590, "output_level": 6, "num_output_files": 1, "total_output_size": 10873904, "num_input_records": 10918, "num_output_records": 10408, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000173.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923485112590, "job": 110, "event": "table_file_deletion", "file_number": 173}
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000171.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923485115420, "job": 110, "event": "table_file_deletion", "file_number": 171}
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:04.811780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:05.115602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:05.115613) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:05.115616) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:05.115620) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:05 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:38:05.115624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:38:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:05.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:05.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:07.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:07.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:09 np0005588919 nova_compute[225855]: 2026-01-20 15:38:09.016 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:09.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:09.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:09 np0005588919 nova_compute[225855]: 2026-01-20 15:38:09.296 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:11.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:11.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:13.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:13.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:14 np0005588919 nova_compute[225855]: 2026-01-20 15:38:14.019 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:14 np0005588919 nova_compute[225855]: 2026-01-20 15:38:14.298 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:15.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:15.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:38:16.459 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:38:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:38:16.460 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:38:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:38:16.461 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:38:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:17.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:17.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:18 np0005588919 nova_compute[225855]: 2026-01-20 15:38:18.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:18 np0005588919 nova_compute[225855]: 2026-01-20 15:38:18.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:38:18 np0005588919 nova_compute[225855]: 2026-01-20 15:38:18.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:38:18 np0005588919 nova_compute[225855]: 2026-01-20 15:38:18.574 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:38:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:19 np0005588919 nova_compute[225855]: 2026-01-20 15:38:19.022 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:19.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:19.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:19 np0005588919 nova_compute[225855]: 2026-01-20 15:38:19.300 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:19 np0005588919 nova_compute[225855]: 2026-01-20 15:38:19.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:19 np0005588919 nova_compute[225855]: 2026-01-20 15:38:19.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:38:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:21.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:21.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:22 np0005588919 nova_compute[225855]: 2026-01-20 15:38:22.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:22 np0005588919 nova_compute[225855]: 2026-01-20 15:38:22.399 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:38:22 np0005588919 nova_compute[225855]: 2026-01-20 15:38:22.399 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:38:22 np0005588919 nova_compute[225855]: 2026-01-20 15:38:22.400 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:38:22 np0005588919 nova_compute[225855]: 2026-01-20 15:38:22.400 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:38:22 np0005588919 nova_compute[225855]: 2026-01-20 15:38:22.400 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:38:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:38:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3824174375' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:38:22 np0005588919 nova_compute[225855]: 2026-01-20 15:38:22.847 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.017 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.019 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4292MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.019 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.020 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:38:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:23.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:23.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.272 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.273 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.329 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:38:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:38:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/374964423' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.862 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.867 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.958 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.959 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:38:23 np0005588919 nova_compute[225855]: 2026-01-20 15:38:23.959 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:38:24 np0005588919 nova_compute[225855]: 2026-01-20 15:38:24.024 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:24 np0005588919 nova_compute[225855]: 2026-01-20 15:38:24.302 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:24 np0005588919 nova_compute[225855]: 2026-01-20 15:38:24.959 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:25 np0005588919 podman[325437]: 2026-01-20 15:38:25.081470104 +0000 UTC m=+0.118173652 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:38:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:25.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:25.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:26 np0005588919 nova_compute[225855]: 2026-01-20 15:38:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:26 np0005588919 nova_compute[225855]: 2026-01-20 15:38:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:26 np0005588919 nova_compute[225855]: 2026-01-20 15:38:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:26 np0005588919 nova_compute[225855]: 2026-01-20 15:38:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:27.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:27.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:38:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:38:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:38:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:29 np0005588919 nova_compute[225855]: 2026-01-20 15:38:29.027 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:29.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:29.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:29 np0005588919 nova_compute[225855]: 2026-01-20 15:38:29.304 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:31.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:31.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:31 np0005588919 nova_compute[225855]: 2026-01-20 15:38:31.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:38:32 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:38:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:33.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:38:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:33.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:38:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:34 np0005588919 podman[325699]: 2026-01-20 15:38:34.030680716 +0000 UTC m=+0.069098565 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 20 10:38:34 np0005588919 nova_compute[225855]: 2026-01-20 15:38:34.030 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:34 np0005588919 nova_compute[225855]: 2026-01-20 15:38:34.334 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:35.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:35.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:37.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:37.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.036 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:39.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:39.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.335 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.339 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.340 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.341 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.341 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.342 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.342 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.382 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.382 225859 WARNING nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.383 225859 WARNING nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.383 225859 WARNING nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.383 225859 INFO nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Removable base files: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.383 225859 INFO nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.383 225859 INFO nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/a4ed0d2b98aa460c005e878d78a49ccb6f511f7c#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.384 225859 INFO nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/a522636f3423dd1eea3b834dfd08917146e09c47#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.384 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.384 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 20 10:38:39 np0005588919 nova_compute[225855]: 2026-01-20 15:38:39.384 225859 DEBUG nova.virt.libvirt.imagecache [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 20 10:38:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:41.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:41.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:43.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:43.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:44 np0005588919 nova_compute[225855]: 2026-01-20 15:38:44.039 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:44 np0005588919 nova_compute[225855]: 2026-01-20 15:38:44.336 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:38:45.041 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:38:45 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:38:45.042 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:38:45 np0005588919 nova_compute[225855]: 2026-01-20 15:38:45.042 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:45.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:45.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:47.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:47.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:49 np0005588919 nova_compute[225855]: 2026-01-20 15:38:49.042 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:49.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:49.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:49 np0005588919 nova_compute[225855]: 2026-01-20 15:38:49.336 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 20 10:38:50 np0005588919 radosgw[83787]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 20 10:38:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:51.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:51.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:53 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:38:53.044 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:38:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:53.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:53.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:54 np0005588919 nova_compute[225855]: 2026-01-20 15:38:54.046 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:54 np0005588919 nova_compute[225855]: 2026-01-20 15:38:54.338 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:55.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:38:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:55.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:38:56 np0005588919 podman[325780]: 2026-01-20 15:38:56.031626296 +0000 UTC m=+0.082442861 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 10:38:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:57.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:38:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:57.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:38:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:38:59 np0005588919 nova_compute[225855]: 2026-01-20 15:38:59.050 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:38:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:38:59.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:38:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:38:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:38:59.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:38:59 np0005588919 nova_compute[225855]: 2026-01-20 15:38:59.382 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:01.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:01.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:03.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:03.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:04 np0005588919 nova_compute[225855]: 2026-01-20 15:39:04.054 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:04 np0005588919 nova_compute[225855]: 2026-01-20 15:39:04.384 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:05 np0005588919 podman[325810]: 2026-01-20 15:39:05.010379643 +0000 UTC m=+0.058121924 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 20 10:39:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:05.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:05.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:07.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:07.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:09 np0005588919 nova_compute[225855]: 2026-01-20 15:39:09.058 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:09.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:09.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:09 np0005588919 nova_compute[225855]: 2026-01-20 15:39:09.385 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:11.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:11.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:13.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:13.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:14 np0005588919 nova_compute[225855]: 2026-01-20 15:39:14.061 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:14 np0005588919 nova_compute[225855]: 2026-01-20 15:39:14.388 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:15.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:15.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:39:16.460 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:39:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:39:16.461 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:39:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:39:16.461 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:39:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:17.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:17.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:19 np0005588919 nova_compute[225855]: 2026-01-20 15:39:19.064 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:19.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:19.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:19 np0005588919 nova_compute[225855]: 2026-01-20 15:39:19.385 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:19 np0005588919 nova_compute[225855]: 2026-01-20 15:39:19.386 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:39:19 np0005588919 nova_compute[225855]: 2026-01-20 15:39:19.386 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:39:19 np0005588919 nova_compute[225855]: 2026-01-20 15:39:19.390 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:19 np0005588919 nova_compute[225855]: 2026-01-20 15:39:19.411 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:39:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:21.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:21 np0005588919 nova_compute[225855]: 2026-01-20 15:39:21.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:21 np0005588919 nova_compute[225855]: 2026-01-20 15:39:21.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:39:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:21.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:23.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:23.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.068 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.360 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.360 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.360 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.361 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.361 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.392 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:39:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1540194721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.791 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.931 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.932 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4289MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.932 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.933 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.991 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:39:24 np0005588919 nova_compute[225855]: 2026-01-20 15:39:24.991 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.011 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.042 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.042 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.061 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.105 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.135 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:39:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:25.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:25.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:39:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/312578530' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.571 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.577 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.594 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.596 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:39:25 np0005588919 nova_compute[225855]: 2026-01-20 15:39:25.596 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:39:26 np0005588919 nova_compute[225855]: 2026-01-20 15:39:26.596 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:26 np0005588919 nova_compute[225855]: 2026-01-20 15:39:26.597 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:26 np0005588919 nova_compute[225855]: 2026-01-20 15:39:26.597 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:27 np0005588919 podman[325934]: 2026-01-20 15:39:27.015636116 +0000 UTC m=+0.068701313 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:39:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:27.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:27 np0005588919 nova_compute[225855]: 2026-01-20 15:39:27.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:27 np0005588919 nova_compute[225855]: 2026-01-20 15:39:27.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:27.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:29 np0005588919 nova_compute[225855]: 2026-01-20 15:39:29.071 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:29.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:29.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:29 np0005588919 nova_compute[225855]: 2026-01-20 15:39:29.394 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:31.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:31.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:33.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:33 np0005588919 nova_compute[225855]: 2026-01-20 15:39:33.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:33 np0005588919 nova_compute[225855]: 2026-01-20 15:39:33.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:33.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:34 np0005588919 nova_compute[225855]: 2026-01-20 15:39:34.074 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:34 np0005588919 nova_compute[225855]: 2026-01-20 15:39:34.409 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:39:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:39:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:39:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:39:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:39:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:35.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:35.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:36 np0005588919 podman[326148]: 2026-01-20 15:39:36.001634977 +0000 UTC m=+0.048736538 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:39:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:37.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:37.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:39 np0005588919 nova_compute[225855]: 2026-01-20 15:39:39.102 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:39.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:39.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:39 np0005588919 nova_compute[225855]: 2026-01-20 15:39:39.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:39:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:39:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:41.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:41.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:43.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:43.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:44 np0005588919 nova_compute[225855]: 2026-01-20 15:39:44.144 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:44 np0005588919 nova_compute[225855]: 2026-01-20 15:39:44.413 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:45.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:45.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:39:46.957 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=84, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=83) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:39:46 np0005588919 nova_compute[225855]: 2026-01-20 15:39:46.959 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:46 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:39:46.959 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:39:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:47.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:47 np0005588919 nova_compute[225855]: 2026-01-20 15:39:47.389 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:47 np0005588919 nova_compute[225855]: 2026-01-20 15:39:47.389 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:39:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:47.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:48 np0005588919 nova_compute[225855]: 2026-01-20 15:39:48.350 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #175. Immutable memtables: 0.
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.475593) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 111] Flushing memtable with next log file: 175
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588475631, "job": 111, "event": "flush_started", "num_memtables": 1, "num_entries": 1261, "num_deletes": 255, "total_data_size": 2798984, "memory_usage": 2841408, "flush_reason": "Manual Compaction"}
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 111] Level-0 flush table #176: started
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588513978, "cf_name": "default", "job": 111, "event": "table_file_creation", "file_number": 176, "file_size": 1825659, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 84540, "largest_seqno": 85796, "table_properties": {"data_size": 1820203, "index_size": 2851, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11654, "raw_average_key_size": 19, "raw_value_size": 1809208, "raw_average_value_size": 3025, "num_data_blocks": 127, "num_entries": 598, "num_filter_entries": 598, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923485, "oldest_key_time": 1768923485, "file_creation_time": 1768923588, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 176, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 111] Flush lasted 38428 microseconds, and 4449 cpu microseconds.
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.514018) [db/flush_job.cc:967] [default] [JOB 111] Level-0 flush table #176: 1825659 bytes OK
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.514038) [db/memtable_list.cc:519] [default] Level-0 commit table #176 started
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.516518) [db/memtable_list.cc:722] [default] Level-0 commit table #176: memtable #1 done
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.516533) EVENT_LOG_v1 {"time_micros": 1768923588516528, "job": 111, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.516551) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 111] Try to delete WAL files size 2792962, prev total WAL file size 2792962, number of live WAL files 2.
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000172.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.517282) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033323733' seq:72057594037927935, type:22 .. '6C6F676D0033353234' seq:0, type:0; will stop at (end)
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 112] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 111 Base level 0, inputs: [176(1782KB)], [174(10MB)]
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588517309, "job": 112, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [176], "files_L6": [174], "score": -1, "input_data_size": 12699563, "oldest_snapshot_seqno": -1}
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 112] Generated table #177: 10483 keys, 12578541 bytes, temperature: kUnknown
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588640020, "cf_name": "default", "job": 112, "event": "table_file_creation", "file_number": 177, "file_size": 12578541, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12512471, "index_size": 38757, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26245, "raw_key_size": 277412, "raw_average_key_size": 26, "raw_value_size": 12330508, "raw_average_value_size": 1176, "num_data_blocks": 1471, "num_entries": 10483, "num_filter_entries": 10483, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768923588, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 177, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.640385) [db/compaction/compaction_job.cc:1663] [default] [JOB 112] Compacted 1@0 + 1@6 files to L6 => 12578541 bytes
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.641842) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 103.4 rd, 102.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 10.4 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(13.8) write-amplify(6.9) OK, records in: 11006, records dropped: 523 output_compression: NoCompression
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.641897) EVENT_LOG_v1 {"time_micros": 1768923588641856, "job": 112, "event": "compaction_finished", "compaction_time_micros": 122811, "compaction_time_cpu_micros": 29671, "output_level": 6, "num_output_files": 1, "total_output_size": 12578541, "num_input_records": 11006, "num_output_records": 10483, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000176.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588642518, "job": 112, "event": "table_file_deletion", "file_number": 176}
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000174.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923588645201, "job": 112, "event": "table_file_deletion", "file_number": 174}
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.517237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.645284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.645290) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.645292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.645294) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:39:48.645295) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:39:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:49 np0005588919 nova_compute[225855]: 2026-01-20 15:39:49.147 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:49.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:49.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:49 np0005588919 nova_compute[225855]: 2026-01-20 15:39:49.415 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:51.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:51.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:53.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:53.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:54 np0005588919 nova_compute[225855]: 2026-01-20 15:39:54.150 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:54 np0005588919 nova_compute[225855]: 2026-01-20 15:39:54.417 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:55.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:55.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:55 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:39:55.961 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '84'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:39:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:57.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:39:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:57.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:39:58 np0005588919 podman[326278]: 2026-01-20 15:39:58.111817522 +0000 UTC m=+0.150243426 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:39:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:39:59 np0005588919 nova_compute[225855]: 2026-01-20 15:39:59.192 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:39:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:39:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:39:59.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:39:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:39:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:39:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:39:59.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:39:59 np0005588919 nova_compute[225855]: 2026-01-20 15:39:59.419 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 10:40:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:01.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:01 np0005588919 nova_compute[225855]: 2026-01-20 15:40:01.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:01 np0005588919 nova_compute[225855]: 2026-01-20 15:40:01.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:40:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:01.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:01 np0005588919 nova_compute[225855]: 2026-01-20 15:40:01.466 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:40:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:03.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:03.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:04 np0005588919 nova_compute[225855]: 2026-01-20 15:40:04.241 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:04 np0005588919 nova_compute[225855]: 2026-01-20 15:40:04.422 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:05.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:05.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:06 np0005588919 podman[326308]: 2026-01-20 15:40:06.601792832 +0000 UTC m=+0.070345329 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:40:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:07.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:07.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:09 np0005588919 nova_compute[225855]: 2026-01-20 15:40:09.243 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:09.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:09.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:09 np0005588919 nova_compute[225855]: 2026-01-20 15:40:09.423 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:11.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:11.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:13 np0005588919 nova_compute[225855]: 2026-01-20 15:40:13.015 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:13.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:13.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:14 np0005588919 nova_compute[225855]: 2026-01-20 15:40:14.293 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:14 np0005588919 nova_compute[225855]: 2026-01-20 15:40:14.426 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:15.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:15.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:16.461 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:16.462 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:16.462 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:17.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:17.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:19.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:19 np0005588919 nova_compute[225855]: 2026-01-20 15:40:19.296 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:19 np0005588919 nova_compute[225855]: 2026-01-20 15:40:19.428 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:19.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:21.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:21 np0005588919 nova_compute[225855]: 2026-01-20 15:40:21.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:21 np0005588919 nova_compute[225855]: 2026-01-20 15:40:21.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:40:21 np0005588919 nova_compute[225855]: 2026-01-20 15:40:21.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:40:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:21.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:21 np0005588919 nova_compute[225855]: 2026-01-20 15:40:21.582 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:40:21 np0005588919 nova_compute[225855]: 2026-01-20 15:40:21.582 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:21 np0005588919 nova_compute[225855]: 2026-01-20 15:40:21.582 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:40:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:23.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:23.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.336 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.360 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.360 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.360 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.360 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.361 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.430 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:40:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/415358669' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.832 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.983 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.984 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4287MB free_disk=20.942752838134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.985 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:24 np0005588919 nova_compute[225855]: 2026-01-20 15:40:24.985 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:25 np0005588919 nova_compute[225855]: 2026-01-20 15:40:25.055 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:40:25 np0005588919 nova_compute[225855]: 2026-01-20 15:40:25.056 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:40:25 np0005588919 nova_compute[225855]: 2026-01-20 15:40:25.075 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:25.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:25.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:40:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2103467019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:40:25 np0005588919 nova_compute[225855]: 2026-01-20 15:40:25.555 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:25 np0005588919 nova_compute[225855]: 2026-01-20 15:40:25.562 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:40:25 np0005588919 nova_compute[225855]: 2026-01-20 15:40:25.760 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:40:25 np0005588919 nova_compute[225855]: 2026-01-20 15:40:25.763 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:40:25 np0005588919 nova_compute[225855]: 2026-01-20 15:40:25.763 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:26 np0005588919 nova_compute[225855]: 2026-01-20 15:40:26.764 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:27.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:27.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:28 np0005588919 nova_compute[225855]: 2026-01-20 15:40:28.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:28 np0005588919 nova_compute[225855]: 2026-01-20 15:40:28.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:28 np0005588919 nova_compute[225855]: 2026-01-20 15:40:28.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:28 np0005588919 nova_compute[225855]: 2026-01-20 15:40:28.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:29 np0005588919 podman[326434]: 2026-01-20 15:40:29.08117587 +0000 UTC m=+0.113388135 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 20 10:40:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:29.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:29 np0005588919 nova_compute[225855]: 2026-01-20 15:40:29.338 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:29 np0005588919 nova_compute[225855]: 2026-01-20 15:40:29.431 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:29.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:31.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:31.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:32 np0005588919 nova_compute[225855]: 2026-01-20 15:40:32.378 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Acquiring lock "eae4f8ad-34d0-4893-b039-a371c87ba22e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:32 np0005588919 nova_compute[225855]: 2026-01-20 15:40:32.379 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:32 np0005588919 nova_compute[225855]: 2026-01-20 15:40:32.399 225859 DEBUG nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:40:32 np0005588919 nova_compute[225855]: 2026-01-20 15:40:32.476 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:32 np0005588919 nova_compute[225855]: 2026-01-20 15:40:32.476 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:32 np0005588919 nova_compute[225855]: 2026-01-20 15:40:32.482 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:40:32 np0005588919 nova_compute[225855]: 2026-01-20 15:40:32.483 225859 INFO nova.compute.claims [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:40:32 np0005588919 nova_compute[225855]: 2026-01-20 15:40:32.719 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:40:33 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/522043476' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.189 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.195 225859 DEBUG nova.compute.provider_tree [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.251 225859 DEBUG nova.scheduler.client.report [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.277 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.278 225859 DEBUG nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:40:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:33.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.376 225859 DEBUG nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.376 225859 DEBUG nova.network.neutron [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.398 225859 INFO nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.421 225859 DEBUG nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:40:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:33.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.531 225859 DEBUG nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.533 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.533 225859 INFO nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Creating image(s)#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.561 225859 DEBUG nova.storage.rbd_utils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] rbd image eae4f8ad-34d0-4893-b039-a371c87ba22e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.592 225859 DEBUG nova.storage.rbd_utils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] rbd image eae4f8ad-34d0-4893-b039-a371c87ba22e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.621 225859 DEBUG nova.storage.rbd_utils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] rbd image eae4f8ad-34d0-4893-b039-a371c87ba22e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.625 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.688 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.689 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.690 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.690 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.720 225859 DEBUG nova.storage.rbd_utils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] rbd image eae4f8ad-34d0-4893-b039-a371c87ba22e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.727 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 eae4f8ad-34d0-4893-b039-a371c87ba22e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:33 np0005588919 nova_compute[225855]: 2026-01-20 15:40:33.764 225859 DEBUG nova.policy [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8b010c120d8488bb889b23fb6abfc7f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '124217db76ec4d598d94591670b51957', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:40:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:34 np0005588919 nova_compute[225855]: 2026-01-20 15:40:34.026 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 eae4f8ad-34d0-4893-b039-a371c87ba22e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.298s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:34 np0005588919 nova_compute[225855]: 2026-01-20 15:40:34.099 225859 DEBUG nova.storage.rbd_utils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] resizing rbd image eae4f8ad-34d0-4893-b039-a371c87ba22e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:40:34 np0005588919 nova_compute[225855]: 2026-01-20 15:40:34.201 225859 DEBUG nova.objects.instance [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lazy-loading 'migration_context' on Instance uuid eae4f8ad-34d0-4893-b039-a371c87ba22e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:40:34 np0005588919 nova_compute[225855]: 2026-01-20 15:40:34.267 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:40:34 np0005588919 nova_compute[225855]: 2026-01-20 15:40:34.267 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Ensure instance console log exists: /var/lib/nova/instances/eae4f8ad-34d0-4893-b039-a371c87ba22e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:40:34 np0005588919 nova_compute[225855]: 2026-01-20 15:40:34.267 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:34 np0005588919 nova_compute[225855]: 2026-01-20 15:40:34.268 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:34 np0005588919 nova_compute[225855]: 2026-01-20 15:40:34.268 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:34 np0005588919 nova_compute[225855]: 2026-01-20 15:40:34.341 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:34 np0005588919 nova_compute[225855]: 2026-01-20 15:40:34.434 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:35.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:35 np0005588919 nova_compute[225855]: 2026-01-20 15:40:35.364 225859 DEBUG nova.network.neutron [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Successfully created port: 2f6f66d9-264a-4c11-ba21-8cef740517bf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:40:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:35.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:37 np0005588919 podman[326702]: 2026-01-20 15:40:37.052152101 +0000 UTC m=+0.091878797 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:40:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:37.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:37 np0005588919 nova_compute[225855]: 2026-01-20 15:40:37.412 225859 DEBUG nova.network.neutron [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Successfully updated port: 2f6f66d9-264a-4c11-ba21-8cef740517bf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:40:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:37.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:37 np0005588919 nova_compute[225855]: 2026-01-20 15:40:37.914 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Acquiring lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:40:37 np0005588919 nova_compute[225855]: 2026-01-20 15:40:37.914 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Acquired lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:40:37 np0005588919 nova_compute[225855]: 2026-01-20 15:40:37.914 225859 DEBUG nova.network.neutron [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:40:38 np0005588919 nova_compute[225855]: 2026-01-20 15:40:38.162 225859 DEBUG nova.compute.manager [req-4eb5c8a7-e00c-406e-94a8-7c73a07aecb1 req-745af3b2-0e18-40f7-be22-1d50f7b3dcac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received event network-changed-2f6f66d9-264a-4c11-ba21-8cef740517bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:40:38 np0005588919 nova_compute[225855]: 2026-01-20 15:40:38.162 225859 DEBUG nova.compute.manager [req-4eb5c8a7-e00c-406e-94a8-7c73a07aecb1 req-745af3b2-0e18-40f7-be22-1d50f7b3dcac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Refreshing instance network info cache due to event network-changed-2f6f66d9-264a-4c11-ba21-8cef740517bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:40:38 np0005588919 nova_compute[225855]: 2026-01-20 15:40:38.163 225859 DEBUG oslo_concurrency.lockutils [req-4eb5c8a7-e00c-406e-94a8-7c73a07aecb1 req-745af3b2-0e18-40f7-be22-1d50f7b3dcac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:40:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:38 np0005588919 nova_compute[225855]: 2026-01-20 15:40:38.998 225859 DEBUG nova.network.neutron [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:40:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:39.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:39 np0005588919 nova_compute[225855]: 2026-01-20 15:40:39.345 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:39 np0005588919 nova_compute[225855]: 2026-01-20 15:40:39.436 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:39.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:41.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:40:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:41.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:40:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 10:40:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 10:40:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:40:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:41 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.048 225859 DEBUG nova.network.neutron [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Updating instance_info_cache with network_info: [{"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.090 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Releasing lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.091 225859 DEBUG nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Instance network_info: |[{"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.091 225859 DEBUG oslo_concurrency.lockutils [req-4eb5c8a7-e00c-406e-94a8-7c73a07aecb1 req-745af3b2-0e18-40f7-be22-1d50f7b3dcac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.091 225859 DEBUG nova.network.neutron [req-4eb5c8a7-e00c-406e-94a8-7c73a07aecb1 req-745af3b2-0e18-40f7-be22-1d50f7b3dcac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Refreshing network info cache for port 2f6f66d9-264a-4c11-ba21-8cef740517bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.094 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Start _get_guest_xml network_info=[{"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.098 225859 WARNING nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.123 225859 DEBUG nova.virt.libvirt.host [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.124 225859 DEBUG nova.virt.libvirt.host [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.128 225859 DEBUG nova.virt.libvirt.host [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.128 225859 DEBUG nova.virt.libvirt.host [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.130 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.130 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.130 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.131 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.131 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.131 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.131 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.132 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.132 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.132 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.132 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.133 225859 DEBUG nova.virt.hardware [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.136 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:40:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4232303621' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.577 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.613 225859 DEBUG nova.storage.rbd_utils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] rbd image eae4f8ad-34d0-4893-b039-a371c87ba22e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:40:42 np0005588919 nova_compute[225855]: 2026-01-20 15:40:42.618 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:40:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3309998384' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.089 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.090 225859 DEBUG nova.virt.libvirt.vif [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:40:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1131874044-access_point-243412062',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1131874044-access_point-243412062',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1131874044-ac',id=217,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBI8BHGvvVMk8YauPGCgxR3A4v1f0rWghzxaRGrntFPvXTTSIDvApqOQBjhgH6T1AnKeJJSHdUCD1AT2JA7XK7b8l6gUg2nLAeLoR1LsiUcTGAlR1hX0RRwuXqUe5lWZMg==',key_name='tempest-TestSecurityGroupsBasicOps-32974910',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='124217db76ec4d598d94591670b51957',ramdisk_id='',reservation_id='r-egebxdzr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1131874044',owner_user_name='tempest-TestSecurityGroupsBasicOps-1131874044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:40:33Z,user_data=None,user_id='a8b010c120d8488bb889b23fb6abfc7f',uuid=eae4f8ad-34d0-4893-b039-a371c87ba22e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.091 225859 DEBUG nova.network.os_vif_util [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Converting VIF {"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.092 225859 DEBUG nova.network.os_vif_util [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ea:47,bridge_name='br-int',has_traffic_filtering=True,id=2f6f66d9-264a-4c11-ba21-8cef740517bf,network=Network(7887623c-0aac-4bfc-b122-61e1bb0418eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f6f66d9-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.093 225859 DEBUG nova.objects.instance [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lazy-loading 'pci_devices' on Instance uuid eae4f8ad-34d0-4893-b039-a371c87ba22e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.122 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <uuid>eae4f8ad-34d0-4893-b039-a371c87ba22e</uuid>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <name>instance-000000d9</name>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1131874044-access_point-243412062</nova:name>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:40:42</nova:creationTime>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <nova:user uuid="a8b010c120d8488bb889b23fb6abfc7f">tempest-TestSecurityGroupsBasicOps-1131874044-project-member</nova:user>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <nova:project uuid="124217db76ec4d598d94591670b51957">tempest-TestSecurityGroupsBasicOps-1131874044</nova:project>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <nova:port uuid="2f6f66d9-264a-4c11-ba21-8cef740517bf">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <entry name="serial">eae4f8ad-34d0-4893-b039-a371c87ba22e</entry>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <entry name="uuid">eae4f8ad-34d0-4893-b039-a371c87ba22e</entry>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/eae4f8ad-34d0-4893-b039-a371c87ba22e_disk">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/eae4f8ad-34d0-4893-b039-a371c87ba22e_disk.config">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:e0:ea:47"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <target dev="tap2f6f66d9-26"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/eae4f8ad-34d0-4893-b039-a371c87ba22e/console.log" append="off"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:40:43 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:40:43 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:40:43 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:40:43 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.124 225859 DEBUG nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Preparing to wait for external event network-vif-plugged-2f6f66d9-264a-4c11-ba21-8cef740517bf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.125 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Acquiring lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.125 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.125 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.126 225859 DEBUG nova.virt.libvirt.vif [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:40:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1131874044-access_point-243412062',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1131874044-access_point-243412062',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1131874044-ac',id=217,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBI8BHGvvVMk8YauPGCgxR3A4v1f0rWghzxaRGrntFPvXTTSIDvApqOQBjhgH6T1AnKeJJSHdUCD1AT2JA7XK7b8l6gUg2nLAeLoR1LsiUcTGAlR1hX0RRwuXqUe5lWZMg==',key_name='tempest-TestSecurityGroupsBasicOps-32974910',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='124217db76ec4d598d94591670b51957',ramdisk_id='',reservation_id='r-egebxdzr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1131874044',owner_user_name='tempest-TestSecurityGroupsBasicOps-1131874044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:40:33Z,user_data=None,user_id='a8b010c120d8488bb889b23fb6abfc7f',uuid=eae4f8ad-34d0-4893-b039-a371c87ba22e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.127 225859 DEBUG nova.network.os_vif_util [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Converting VIF {"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.127 225859 DEBUG nova.network.os_vif_util [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ea:47,bridge_name='br-int',has_traffic_filtering=True,id=2f6f66d9-264a-4c11-ba21-8cef740517bf,network=Network(7887623c-0aac-4bfc-b122-61e1bb0418eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f6f66d9-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.128 225859 DEBUG os_vif [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ea:47,bridge_name='br-int',has_traffic_filtering=True,id=2f6f66d9-264a-4c11-ba21-8cef740517bf,network=Network(7887623c-0aac-4bfc-b122-61e1bb0418eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f6f66d9-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.129 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.129 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.130 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.134 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.134 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f6f66d9-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.135 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2f6f66d9-26, col_values=(('external_ids', {'iface-id': '2f6f66d9-264a-4c11-ba21-8cef740517bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:ea:47', 'vm-uuid': 'eae4f8ad-34d0-4893-b039-a371c87ba22e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.136 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:43 np0005588919 NetworkManager[49104]: <info>  [1768923643.1378] manager: (tap2f6f66d9-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/420)
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.139 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.144 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.145 225859 INFO os_vif [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ea:47,bridge_name='br-int',has_traffic_filtering=True,id=2f6f66d9-264a-4c11-ba21-8cef740517bf,network=Network(7887623c-0aac-4bfc-b122-61e1bb0418eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f6f66d9-26')#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.272 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.273 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.273 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] No VIF found with MAC fa:16:3e:e0:ea:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.274 225859 INFO nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Using config drive#033[00m
Jan 20 10:40:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:43.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.303 225859 DEBUG nova.storage.rbd_utils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] rbd image eae4f8ad-34d0-4893-b039-a371c87ba22e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:40:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:43.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.735 225859 INFO nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Creating config drive at /var/lib/nova/instances/eae4f8ad-34d0-4893-b039-a371c87ba22e/disk.config#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.745 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eae4f8ad-34d0-4893-b039-a371c87ba22e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp489gev6m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.775 225859 DEBUG nova.network.neutron [req-4eb5c8a7-e00c-406e-94a8-7c73a07aecb1 req-745af3b2-0e18-40f7-be22-1d50f7b3dcac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Updated VIF entry in instance network info cache for port 2f6f66d9-264a-4c11-ba21-8cef740517bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.776 225859 DEBUG nova.network.neutron [req-4eb5c8a7-e00c-406e-94a8-7c73a07aecb1 req-745af3b2-0e18-40f7-be22-1d50f7b3dcac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Updating instance_info_cache with network_info: [{"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.812 225859 DEBUG oslo_concurrency.lockutils [req-4eb5c8a7-e00c-406e-94a8-7c73a07aecb1 req-745af3b2-0e18-40f7-be22-1d50f7b3dcac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:40:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.883 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eae4f8ad-34d0-4893-b039-a371c87ba22e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp489gev6m" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.912 225859 DEBUG nova.storage.rbd_utils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] rbd image eae4f8ad-34d0-4893-b039-a371c87ba22e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:40:43 np0005588919 nova_compute[225855]: 2026-01-20 15:40:43.916 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/eae4f8ad-34d0-4893-b039-a371c87ba22e/disk.config eae4f8ad-34d0-4893-b039-a371c87ba22e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.070 225859 DEBUG oslo_concurrency.processutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/eae4f8ad-34d0-4893-b039-a371c87ba22e/disk.config eae4f8ad-34d0-4893-b039-a371c87ba22e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.071 225859 INFO nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Deleting local config drive /var/lib/nova/instances/eae4f8ad-34d0-4893-b039-a371c87ba22e/disk.config because it was imported into RBD.#033[00m
Jan 20 10:40:44 np0005588919 kernel: tap2f6f66d9-26: entered promiscuous mode
Jan 20 10:40:44 np0005588919 NetworkManager[49104]: <info>  [1768923644.1251] manager: (tap2f6f66d9-26): new Tun device (/org/freedesktop/NetworkManager/Devices/421)
Jan 20 10:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:40:44Z|00981|binding|INFO|Claiming lport 2f6f66d9-264a-4c11-ba21-8cef740517bf for this chassis.
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.124 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:40:44Z|00982|binding|INFO|2f6f66d9-264a-4c11-ba21-8cef740517bf: Claiming fa:16:3e:e0:ea:47 10.100.0.6
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.128 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.133 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.145 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:ea:47 10.100.0.6'], port_security=['fa:16:3e:e0:ea:47 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eae4f8ad-34d0-4893-b039-a371c87ba22e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7887623c-0aac-4bfc-b122-61e1bb0418eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '124217db76ec4d598d94591670b51957', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16a3d061-4fac-44b6-8559-a0d83ddd3ce6 9098e587-0497-4bd0-abe2-7bb17bf96b42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ca3022e-756e-490f-aed0-0aadecee6965, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=2f6f66d9-264a-4c11-ba21-8cef740517bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.147 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 2f6f66d9-264a-4c11-ba21-8cef740517bf in datapath 7887623c-0aac-4bfc-b122-61e1bb0418eb bound to our chassis#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.148 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7887623c-0aac-4bfc-b122-61e1bb0418eb#033[00m
Jan 20 10:40:44 np0005588919 systemd-udevd[327110]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:40:44 np0005588919 systemd-machined[194361]: New machine qemu-113-instance-000000d9.
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.160 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb2349d-8248-4fd8-81ad-c690a7515c07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.161 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7887623c-01 in ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.163 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7887623c-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.163 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[48131b4e-061f-43aa-a7d3-96b7059feaca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.164 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7c424f-b0f4-4ffd-92d0-e784bae4edc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 NetworkManager[49104]: <info>  [1768923644.1706] device (tap2f6f66d9-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:40:44 np0005588919 NetworkManager[49104]: <info>  [1768923644.1722] device (tap2f6f66d9-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.176 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[f43074c4-2401-464e-9fd2-6bcf6fb5b270]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 systemd[1]: Started Virtual Machine qemu-113-instance-000000d9.
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.201 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[84869386-35e3-4e29-ac2a-90e935e442f3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.209 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:40:44Z|00983|binding|INFO|Setting lport 2f6f66d9-264a-4c11-ba21-8cef740517bf ovn-installed in OVS
Jan 20 10:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:40:44Z|00984|binding|INFO|Setting lport 2f6f66d9-264a-4c11-ba21-8cef740517bf up in Southbound
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.213 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.230 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[d212b9c6-87a6-4945-a328-ac074d0f743c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.235 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8b806365-fdb1-486e-8080-d800ec133d04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 systemd-udevd[327114]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:40:44 np0005588919 NetworkManager[49104]: <info>  [1768923644.2374] manager: (tap7887623c-00): new Veth device (/org/freedesktop/NetworkManager/Devices/422)
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.266 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[35203a60-984d-4e0c-aff4-dea7a83fc2c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.269 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[3d3dd9db-53cf-45d8-afcf-866173a3624d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 NetworkManager[49104]: <info>  [1768923644.2911] device (tap7887623c-00): carrier: link connected
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.294 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[c312275a-2e55-4d7d-bd61-5c715b725d80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.310 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[57cb103c-b22f-41bc-87cf-fe8ee9337041]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7887623c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:56:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 280], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 868321, 'reachable_time': 23841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 327143, 'error': None, 'target': 'ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.328 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[fbf64a52-359f-4377-8d2b-d9a4a37f8fb6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:5695'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 868321, 'tstamp': 868321}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 327144, 'error': None, 'target': 'ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.346 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5ee92d-dc5f-47ef-9e19-ed7d56966776]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7887623c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:56:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 280], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 868321, 'reachable_time': 23841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 327145, 'error': None, 'target': 'ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.378 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa2838f-d8b3-4911-99c6-2d5b9ea114c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.435 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a00a56ba-fc6b-4297-97e4-446c9171a955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.436 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7887623c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.437 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.438 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.438 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7887623c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:40:44 np0005588919 kernel: tap7887623c-00: entered promiscuous mode
Jan 20 10:40:44 np0005588919 NetworkManager[49104]: <info>  [1768923644.4405] manager: (tap7887623c-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/423)
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.441 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.442 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7887623c-00, col_values=(('external_ids', {'iface-id': 'b64ac449-7e2b-4185-951f-151c787a165d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.443 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:40:44Z|00985|binding|INFO|Releasing lport b64ac449-7e2b-4185-951f-151c787a165d from this chassis (sb_readonly=0)
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.457 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.458 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7887623c-0aac-4bfc-b122-61e1bb0418eb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7887623c-0aac-4bfc-b122-61e1bb0418eb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.459 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[8e185db9-8a44-45cc-8342-fe0c410a9751]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.460 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-7887623c-0aac-4bfc-b122-61e1bb0418eb
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/7887623c-0aac-4bfc-b122-61e1bb0418eb.pid.haproxy
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 7887623c-0aac-4bfc-b122-61e1bb0418eb
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:40:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:44.461 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb', 'env', 'PROCESS_TAG=haproxy-7887623c-0aac-4bfc-b122-61e1bb0418eb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7887623c-0aac-4bfc-b122-61e1bb0418eb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.618 225859 DEBUG nova.compute.manager [req-cca5b4c8-df7a-45ee-a915-1916b6efab6c req-a3c16476-6d51-4e35-ba27-c24f467678cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received event network-vif-plugged-2f6f66d9-264a-4c11-ba21-8cef740517bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.619 225859 DEBUG oslo_concurrency.lockutils [req-cca5b4c8-df7a-45ee-a915-1916b6efab6c req-a3c16476-6d51-4e35-ba27-c24f467678cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.619 225859 DEBUG oslo_concurrency.lockutils [req-cca5b4c8-df7a-45ee-a915-1916b6efab6c req-a3c16476-6d51-4e35-ba27-c24f467678cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.619 225859 DEBUG oslo_concurrency.lockutils [req-cca5b4c8-df7a-45ee-a915-1916b6efab6c req-a3c16476-6d51-4e35-ba27-c24f467678cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:44 np0005588919 nova_compute[225855]: 2026-01-20 15:40:44.620 225859 DEBUG nova.compute.manager [req-cca5b4c8-df7a-45ee-a915-1916b6efab6c req-a3c16476-6d51-4e35-ba27-c24f467678cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Processing event network-vif-plugged-2f6f66d9-264a-4c11-ba21-8cef740517bf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:40:44 np0005588919 podman[327177]: 2026-01-20 15:40:44.849335531 +0000 UTC m=+0.078720665 container create e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:40:44 np0005588919 systemd[1]: Started libpod-conmon-e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d.scope.
Jan 20 10:40:44 np0005588919 podman[327177]: 2026-01-20 15:40:44.80081618 +0000 UTC m=+0.030201344 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:40:44 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:40:44 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a41ca13b23ec89cab9c006d19667afe76dbef3f5eb560ea287f9b1085cdff420/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:40:44 np0005588919 podman[327177]: 2026-01-20 15:40:44.934367065 +0000 UTC m=+0.163752219 container init e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 20 10:40:44 np0005588919 podman[327177]: 2026-01-20 15:40:44.940191029 +0000 UTC m=+0.169576173 container start e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:40:44 np0005588919 neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb[327193]: [NOTICE]   (327197) : New worker (327199) forked
Jan 20 10:40:44 np0005588919 neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb[327193]: [NOTICE]   (327197) : Loading success.
Jan 20 10:40:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:45.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:45.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.066 225859 DEBUG nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.067 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923646.0656903, eae4f8ad-34d0-4893-b039-a371c87ba22e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.067 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] VM Started (Lifecycle Event)#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.070 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.075 225859 INFO nova.virt.libvirt.driver [-] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Instance spawned successfully.#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.076 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.100 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.104 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.104 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.105 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.105 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.106 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.106 225859 DEBUG nova.virt.libvirt.driver [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.112 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.174 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.175 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923646.06819, eae4f8ad-34d0-4893-b039-a371c87ba22e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.175 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.225 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.230 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923646.0700994, eae4f8ad-34d0-4893-b039-a371c87ba22e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.230 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.250 225859 INFO nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Took 12.72 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.251 225859 DEBUG nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.262 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.266 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.295 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.331 225859 INFO nova.compute.manager [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Took 13.89 seconds to build instance.#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.388 225859 DEBUG oslo_concurrency.lockutils [None req-e3aa03ba-f6bc-4ddf-96ec-477e46733c14 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.716 225859 DEBUG nova.compute.manager [req-abd76668-deab-48eb-93c6-22b3006e1ffc req-4aa7251c-a823-4108-99e4-b4079e02c1a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received event network-vif-plugged-2f6f66d9-264a-4c11-ba21-8cef740517bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.716 225859 DEBUG oslo_concurrency.lockutils [req-abd76668-deab-48eb-93c6-22b3006e1ffc req-4aa7251c-a823-4108-99e4-b4079e02c1a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.717 225859 DEBUG oslo_concurrency.lockutils [req-abd76668-deab-48eb-93c6-22b3006e1ffc req-4aa7251c-a823-4108-99e4-b4079e02c1a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.717 225859 DEBUG oslo_concurrency.lockutils [req-abd76668-deab-48eb-93c6-22b3006e1ffc req-4aa7251c-a823-4108-99e4-b4079e02c1a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.717 225859 DEBUG nova.compute.manager [req-abd76668-deab-48eb-93c6-22b3006e1ffc req-4aa7251c-a823-4108-99e4-b4079e02c1a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] No waiting events found dispatching network-vif-plugged-2f6f66d9-264a-4c11-ba21-8cef740517bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:40:46 np0005588919 nova_compute[225855]: 2026-01-20 15:40:46.717 225859 WARNING nova.compute.manager [req-abd76668-deab-48eb-93c6-22b3006e1ffc req-4aa7251c-a823-4108-99e4-b4079e02c1a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received unexpected event network-vif-plugged-2f6f66d9-264a-4c11-ba21-8cef740517bf for instance with vm_state active and task_state None.#033[00m
Jan 20 10:40:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:47.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:47.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:48 np0005588919 nova_compute[225855]: 2026-01-20 15:40:48.138 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:48 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:40:48 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:49.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:49 np0005588919 nova_compute[225855]: 2026-01-20 15:40:49.441 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:49.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:51.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:51.349 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=85, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=84) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:40:51 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:51.350 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:40:51 np0005588919 nova_compute[225855]: 2026-01-20 15:40:51.349 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:51.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:52 np0005588919 NetworkManager[49104]: <info>  [1768923652.3095] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/424)
Jan 20 10:40:52 np0005588919 NetworkManager[49104]: <info>  [1768923652.3114] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/425)
Jan 20 10:40:52 np0005588919 nova_compute[225855]: 2026-01-20 15:40:52.308 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:52 np0005588919 nova_compute[225855]: 2026-01-20 15:40:52.380 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:52 np0005588919 ovn_controller[130490]: 2026-01-20T15:40:52Z|00986|binding|INFO|Releasing lport b64ac449-7e2b-4185-951f-151c787a165d from this chassis (sb_readonly=0)
Jan 20 10:40:52 np0005588919 nova_compute[225855]: 2026-01-20 15:40:52.391 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:53 np0005588919 nova_compute[225855]: 2026-01-20 15:40:53.105 225859 DEBUG nova.compute.manager [req-7110c713-14c1-492b-b483-520999b9649c req-a44ce692-5bee-4068-a331-73120313c60e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received event network-changed-2f6f66d9-264a-4c11-ba21-8cef740517bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:40:53 np0005588919 nova_compute[225855]: 2026-01-20 15:40:53.106 225859 DEBUG nova.compute.manager [req-7110c713-14c1-492b-b483-520999b9649c req-a44ce692-5bee-4068-a331-73120313c60e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Refreshing instance network info cache due to event network-changed-2f6f66d9-264a-4c11-ba21-8cef740517bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:40:53 np0005588919 nova_compute[225855]: 2026-01-20 15:40:53.106 225859 DEBUG oslo_concurrency.lockutils [req-7110c713-14c1-492b-b483-520999b9649c req-a44ce692-5bee-4068-a331-73120313c60e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:40:53 np0005588919 nova_compute[225855]: 2026-01-20 15:40:53.106 225859 DEBUG oslo_concurrency.lockutils [req-7110c713-14c1-492b-b483-520999b9649c req-a44ce692-5bee-4068-a331-73120313c60e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:40:53 np0005588919 nova_compute[225855]: 2026-01-20 15:40:53.106 225859 DEBUG nova.network.neutron [req-7110c713-14c1-492b-b483-520999b9649c req-a44ce692-5bee-4068-a331-73120313c60e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Refreshing network info cache for port 2f6f66d9-264a-4c11-ba21-8cef740517bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:40:53 np0005588919 nova_compute[225855]: 2026-01-20 15:40:53.182 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:53.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:53.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:53 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:54 np0005588919 nova_compute[225855]: 2026-01-20 15:40:54.444 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:55 np0005588919 nova_compute[225855]: 2026-01-20 15:40:55.025 225859 DEBUG nova.network.neutron [req-7110c713-14c1-492b-b483-520999b9649c req-a44ce692-5bee-4068-a331-73120313c60e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Updated VIF entry in instance network info cache for port 2f6f66d9-264a-4c11-ba21-8cef740517bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:40:55 np0005588919 nova_compute[225855]: 2026-01-20 15:40:55.026 225859 DEBUG nova.network.neutron [req-7110c713-14c1-492b-b483-520999b9649c req-a44ce692-5bee-4068-a331-73120313c60e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Updating instance_info_cache with network_info: [{"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:40:55 np0005588919 nova_compute[225855]: 2026-01-20 15:40:55.126 225859 DEBUG oslo_concurrency.lockutils [req-7110c713-14c1-492b-b483-520999b9649c req-a44ce692-5bee-4068-a331-73120313c60e 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:40:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:55.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:55.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:40:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:57.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:40:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:57.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:58 np0005588919 nova_compute[225855]: 2026-01-20 15:40:58.185 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:58 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:40:58.353 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '85'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T15:40:58Z|00120|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e0:ea:47 10.100.0.6
Jan 20 10:40:58 np0005588919 ovn_controller[130490]: 2026-01-20T15:40:58Z|00121|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:ea:47 10.100.0.6
Jan 20 10:40:58 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:40:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:40:59.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:40:59 np0005588919 nova_compute[225855]: 2026-01-20 15:40:59.467 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:40:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:40:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:40:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:40:59.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:00 np0005588919 podman[327362]: 2026-01-20 15:41:00.093954085 +0000 UTC m=+0.135219573 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:41:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:01.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:01.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:03 np0005588919 nova_compute[225855]: 2026-01-20 15:41:03.188 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:03.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:03.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:03 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:04 np0005588919 nova_compute[225855]: 2026-01-20 15:41:04.468 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:05.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:05.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:07.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:07.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:08 np0005588919 podman[327392]: 2026-01-20 15:41:08.021737297 +0000 UTC m=+0.061803728 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:41:08 np0005588919 nova_compute[225855]: 2026-01-20 15:41:08.249 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:09.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:09 np0005588919 nova_compute[225855]: 2026-01-20 15:41:09.470 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:09.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:11.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:11.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:13 np0005588919 nova_compute[225855]: 2026-01-20 15:41:13.253 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:13.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:13.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:41:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3873927125' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:41:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:41:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3873927125' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:41:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:14 np0005588919 nova_compute[225855]: 2026-01-20 15:41:14.471 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.277 225859 DEBUG nova.compute.manager [req-8caac3aa-6d96-4dc6-bcce-12672339a92d req-f75162f6-4773-4072-8fd5-455289e8f644 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received event network-changed-2f6f66d9-264a-4c11-ba21-8cef740517bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.278 225859 DEBUG nova.compute.manager [req-8caac3aa-6d96-4dc6-bcce-12672339a92d req-f75162f6-4773-4072-8fd5-455289e8f644 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Refreshing instance network info cache due to event network-changed-2f6f66d9-264a-4c11-ba21-8cef740517bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.278 225859 DEBUG oslo_concurrency.lockutils [req-8caac3aa-6d96-4dc6-bcce-12672339a92d req-f75162f6-4773-4072-8fd5-455289e8f644 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.278 225859 DEBUG oslo_concurrency.lockutils [req-8caac3aa-6d96-4dc6-bcce-12672339a92d req-f75162f6-4773-4072-8fd5-455289e8f644 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.279 225859 DEBUG nova.network.neutron [req-8caac3aa-6d96-4dc6-bcce-12672339a92d req-f75162f6-4773-4072-8fd5-455289e8f644 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Refreshing network info cache for port 2f6f66d9-264a-4c11-ba21-8cef740517bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:41:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:15.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.341 225859 DEBUG oslo_concurrency.lockutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Acquiring lock "eae4f8ad-34d0-4893-b039-a371c87ba22e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.342 225859 DEBUG oslo_concurrency.lockutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.342 225859 DEBUG oslo_concurrency.lockutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Acquiring lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.342 225859 DEBUG oslo_concurrency.lockutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.343 225859 DEBUG oslo_concurrency.lockutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.344 225859 INFO nova.compute.manager [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Terminating instance#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.345 225859 DEBUG nova.compute.manager [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:41:15 np0005588919 kernel: tap2f6f66d9-26 (unregistering): left promiscuous mode
Jan 20 10:41:15 np0005588919 NetworkManager[49104]: <info>  [1768923675.3903] device (tap2f6f66d9-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:41:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:41:15Z|00987|binding|INFO|Releasing lport 2f6f66d9-264a-4c11-ba21-8cef740517bf from this chassis (sb_readonly=0)
Jan 20 10:41:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:41:15Z|00988|binding|INFO|Setting lport 2f6f66d9-264a-4c11-ba21-8cef740517bf down in Southbound
Jan 20 10:41:15 np0005588919 ovn_controller[130490]: 2026-01-20T15:41:15Z|00989|binding|INFO|Removing iface tap2f6f66d9-26 ovn-installed in OVS
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.399 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.406 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:ea:47 10.100.0.6'], port_security=['fa:16:3e:e0:ea:47 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eae4f8ad-34d0-4893-b039-a371c87ba22e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7887623c-0aac-4bfc-b122-61e1bb0418eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '124217db76ec4d598d94591670b51957', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16a3d061-4fac-44b6-8559-a0d83ddd3ce6 9098e587-0497-4bd0-abe2-7bb17bf96b42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ca3022e-756e-490f-aed0-0aadecee6965, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=2f6f66d9-264a-4c11-ba21-8cef740517bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.408 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 2f6f66d9-264a-4c11-ba21-8cef740517bf in datapath 7887623c-0aac-4bfc-b122-61e1bb0418eb unbound from our chassis#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.409 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7887623c-0aac-4bfc-b122-61e1bb0418eb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.410 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[15c3a541-9ab5-4154-bba5-225d74ebc3a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.410 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb namespace which is not needed anymore#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.416 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:15 np0005588919 systemd[1]: machine-qemu\x2d113\x2dinstance\x2d000000d9.scope: Deactivated successfully.
Jan 20 10:41:15 np0005588919 systemd[1]: machine-qemu\x2d113\x2dinstance\x2d000000d9.scope: Consumed 14.941s CPU time.
Jan 20 10:41:15 np0005588919 systemd-machined[194361]: Machine qemu-113-instance-000000d9 terminated.
Jan 20 10:41:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:15.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:15 np0005588919 neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb[327193]: [NOTICE]   (327197) : haproxy version is 2.8.14-c23fe91
Jan 20 10:41:15 np0005588919 neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb[327193]: [NOTICE]   (327197) : path to executable is /usr/sbin/haproxy
Jan 20 10:41:15 np0005588919 neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb[327193]: [WARNING]  (327197) : Exiting Master process...
Jan 20 10:41:15 np0005588919 neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb[327193]: [ALERT]    (327197) : Current worker (327199) exited with code 143 (Terminated)
Jan 20 10:41:15 np0005588919 neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb[327193]: [WARNING]  (327197) : All workers exited. Exiting... (0)
Jan 20 10:41:15 np0005588919 systemd[1]: libpod-e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d.scope: Deactivated successfully.
Jan 20 10:41:15 np0005588919 podman[327487]: 2026-01-20 15:41:15.540979271 +0000 UTC m=+0.048560523 container died e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 20 10:41:15 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d-userdata-shm.mount: Deactivated successfully.
Jan 20 10:41:15 np0005588919 systemd[1]: var-lib-containers-storage-overlay-a41ca13b23ec89cab9c006d19667afe76dbef3f5eb560ea287f9b1085cdff420-merged.mount: Deactivated successfully.
Jan 20 10:41:15 np0005588919 podman[327487]: 2026-01-20 15:41:15.594288988 +0000 UTC m=+0.101870220 container cleanup e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.592 225859 INFO nova.virt.libvirt.driver [-] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Instance destroyed successfully.#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.593 225859 DEBUG nova.objects.instance [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lazy-loading 'resources' on Instance uuid eae4f8ad-34d0-4893-b039-a371c87ba22e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:41:15 np0005588919 systemd[1]: libpod-conmon-e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d.scope: Deactivated successfully.
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.612 225859 DEBUG nova.virt.libvirt.vif [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:40:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1131874044-access_point-243412062',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1131874044-access_point-243412062',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1131874044-ac',id=217,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBI8BHGvvVMk8YauPGCgxR3A4v1f0rWghzxaRGrntFPvXTTSIDvApqOQBjhgH6T1AnKeJJSHdUCD1AT2JA7XK7b8l6gUg2nLAeLoR1LsiUcTGAlR1hX0RRwuXqUe5lWZMg==',key_name='tempest-TestSecurityGroupsBasicOps-32974910',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:40:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='124217db76ec4d598d94591670b51957',ramdisk_id='',reservation_id='r-egebxdzr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1131874044',owner_user_name='tempest-TestSecurityGroupsBasicOps-1131874044-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:40:46Z,user_data=None,user_id='a8b010c120d8488bb889b23fb6abfc7f',uuid=eae4f8ad-34d0-4893-b039-a371c87ba22e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.613 225859 DEBUG nova.network.os_vif_util [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Converting VIF {"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.613 225859 DEBUG nova.network.os_vif_util [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e0:ea:47,bridge_name='br-int',has_traffic_filtering=True,id=2f6f66d9-264a-4c11-ba21-8cef740517bf,network=Network(7887623c-0aac-4bfc-b122-61e1bb0418eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f6f66d9-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.614 225859 DEBUG os_vif [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:ea:47,bridge_name='br-int',has_traffic_filtering=True,id=2f6f66d9-264a-4c11-ba21-8cef740517bf,network=Network(7887623c-0aac-4bfc-b122-61e1bb0418eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f6f66d9-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.615 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.616 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f6f66d9-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.617 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.620 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.623 225859 INFO os_vif [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:ea:47,bridge_name='br-int',has_traffic_filtering=True,id=2f6f66d9-264a-4c11-ba21-8cef740517bf,network=Network(7887623c-0aac-4bfc-b122-61e1bb0418eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f6f66d9-26')#033[00m
Jan 20 10:41:15 np0005588919 podman[327526]: 2026-01-20 15:41:15.666409086 +0000 UTC m=+0.045435225 container remove e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.676 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[22d578ef-9b0b-4725-bfe1-58fd0bbaa213]: (4, ('Tue Jan 20 03:41:15 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb (e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d)\ne50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d\nTue Jan 20 03:41:15 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb (e50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d)\ne50f16a5fb9f5df17d9e12509c39a365ee2814f233d00bebc285b4f3f143706d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.678 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e5db5f8c-ab71-4043-858a-5431aad116b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.679 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7887623c-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:41:15 np0005588919 kernel: tap7887623c-00: left promiscuous mode
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.681 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.683 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.686 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9018f9-ae31-452f-9953-6ef34e5c7893]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:41:15 np0005588919 nova_compute[225855]: 2026-01-20 15:41:15.696 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.702 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[b66335df-0bc2-4956-a1bf-3f6d929f82fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.703 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[f86b4785-917a-4de0-a803-d1093c8315c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.719 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[93df383c-28a0-4359-a55b-ed0b67f6f73b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 868315, 'reachable_time': 38139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 327557, 'error': None, 'target': 'ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:41:15 np0005588919 systemd[1]: run-netns-ovnmeta\x2d7887623c\x2d0aac\x2d4bfc\x2db122\x2d61e1bb0418eb.mount: Deactivated successfully.
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.724 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7887623c-0aac-4bfc-b122-61e1bb0418eb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:41:15 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:15.724 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[564e33be-ad80-4d63-8a9a-88684a79cb0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:41:16 np0005588919 nova_compute[225855]: 2026-01-20 15:41:16.018 225859 INFO nova.virt.libvirt.driver [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Deleting instance files /var/lib/nova/instances/eae4f8ad-34d0-4893-b039-a371c87ba22e_del#033[00m
Jan 20 10:41:16 np0005588919 nova_compute[225855]: 2026-01-20 15:41:16.019 225859 INFO nova.virt.libvirt.driver [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Deletion of /var/lib/nova/instances/eae4f8ad-34d0-4893-b039-a371c87ba22e_del complete#033[00m
Jan 20 10:41:16 np0005588919 nova_compute[225855]: 2026-01-20 15:41:16.103 225859 INFO nova.compute.manager [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Took 0.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:41:16 np0005588919 nova_compute[225855]: 2026-01-20 15:41:16.103 225859 DEBUG oslo.service.loopingcall [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:41:16 np0005588919 nova_compute[225855]: 2026-01-20 15:41:16.104 225859 DEBUG nova.compute.manager [-] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:41:16 np0005588919 nova_compute[225855]: 2026-01-20 15:41:16.104 225859 DEBUG nova.network.neutron [-] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:41:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:16.462 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:16.463 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:16.463 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.279 225859 DEBUG nova.network.neutron [-] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.304 225859 INFO nova.compute.manager [-] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Took 1.20 seconds to deallocate network for instance.#033[00m
Jan 20 10:41:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:17.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.358 225859 DEBUG oslo_concurrency.lockutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.358 225859 DEBUG oslo_concurrency.lockutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.409 225859 DEBUG nova.compute.manager [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received event network-vif-unplugged-2f6f66d9-264a-4c11-ba21-8cef740517bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.409 225859 DEBUG oslo_concurrency.lockutils [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.409 225859 DEBUG oslo_concurrency.lockutils [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.410 225859 DEBUG oslo_concurrency.lockutils [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.410 225859 DEBUG nova.compute.manager [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] No waiting events found dispatching network-vif-unplugged-2f6f66d9-264a-4c11-ba21-8cef740517bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.410 225859 WARNING nova.compute.manager [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received unexpected event network-vif-unplugged-2f6f66d9-264a-4c11-ba21-8cef740517bf for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.411 225859 DEBUG nova.compute.manager [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received event network-vif-plugged-2f6f66d9-264a-4c11-ba21-8cef740517bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.411 225859 DEBUG oslo_concurrency.lockutils [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.411 225859 DEBUG oslo_concurrency.lockutils [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.411 225859 DEBUG oslo_concurrency.lockutils [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.412 225859 DEBUG nova.compute.manager [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] No waiting events found dispatching network-vif-plugged-2f6f66d9-264a-4c11-ba21-8cef740517bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.412 225859 WARNING nova.compute.manager [req-4bf7d939-cd60-4a30-8dc0-9e368e129012 req-1c929245-1c97-4143-bcbc-807eb7ff56a9 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received unexpected event network-vif-plugged-2f6f66d9-264a-4c11-ba21-8cef740517bf for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.443 225859 DEBUG nova.compute.manager [req-1b3fc75f-dcfb-4dc0-b35c-83dc81ea76fe req-bbbd344c-bc9b-4b70-9729-21d0a7b62aac 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Received event network-vif-deleted-2f6f66d9-264a-4c11-ba21-8cef740517bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.465 225859 DEBUG oslo_concurrency.processutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.498 225859 DEBUG nova.network.neutron [req-8caac3aa-6d96-4dc6-bcce-12672339a92d req-f75162f6-4773-4072-8fd5-455289e8f644 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Updated VIF entry in instance network info cache for port 2f6f66d9-264a-4c11-ba21-8cef740517bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.500 225859 DEBUG nova.network.neutron [req-8caac3aa-6d96-4dc6-bcce-12672339a92d req-f75162f6-4773-4072-8fd5-455289e8f644 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Updating instance_info_cache with network_info: [{"id": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "address": "fa:16:3e:e0:ea:47", "network": {"id": "7887623c-0aac-4bfc-b122-61e1bb0418eb", "bridge": "br-int", "label": "tempest-network-smoke--1885521634", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "124217db76ec4d598d94591670b51957", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f6f66d9-26", "ovs_interfaceid": "2f6f66d9-264a-4c11-ba21-8cef740517bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:41:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:17.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.525 225859 DEBUG oslo_concurrency.lockutils [req-8caac3aa-6d96-4dc6-bcce-12672339a92d req-f75162f6-4773-4072-8fd5-455289e8f644 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-eae4f8ad-34d0-4893-b039-a371c87ba22e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:41:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:41:17 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1600213521' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.938 225859 DEBUG oslo_concurrency.processutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.945 225859 DEBUG nova.compute.provider_tree [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.968 225859 DEBUG nova.scheduler.client.report [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:41:17 np0005588919 nova_compute[225855]: 2026-01-20 15:41:17.993 225859 DEBUG oslo_concurrency.lockutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:18 np0005588919 nova_compute[225855]: 2026-01-20 15:41:18.026 225859 INFO nova.scheduler.client.report [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Deleted allocations for instance eae4f8ad-34d0-4893-b039-a371c87ba22e#033[00m
Jan 20 10:41:18 np0005588919 nova_compute[225855]: 2026-01-20 15:41:18.106 225859 DEBUG oslo_concurrency.lockutils [None req-2d8dd99a-9c66-4120-8ec7-3958f0ca8179 a8b010c120d8488bb889b23fb6abfc7f 124217db76ec4d598d94591670b51957 - - default default] Lock "eae4f8ad-34d0-4893-b039-a371c87ba22e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.764s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:19.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:19 np0005588919 nova_compute[225855]: 2026-01-20 15:41:19.473 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:19.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:20 np0005588919 nova_compute[225855]: 2026-01-20 15:41:20.619 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:21.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:21.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:22 np0005588919 nova_compute[225855]: 2026-01-20 15:41:22.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:22 np0005588919 nova_compute[225855]: 2026-01-20 15:41:22.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:41:22 np0005588919 nova_compute[225855]: 2026-01-20 15:41:22.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:41:22 np0005588919 nova_compute[225855]: 2026-01-20 15:41:22.363 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:41:22 np0005588919 nova_compute[225855]: 2026-01-20 15:41:22.365 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:22 np0005588919 nova_compute[225855]: 2026-01-20 15:41:22.365 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:41:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:23.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:23.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:24 np0005588919 nova_compute[225855]: 2026-01-20 15:41:24.475 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:24 np0005588919 nova_compute[225855]: 2026-01-20 15:41:24.535 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:24 np0005588919 nova_compute[225855]: 2026-01-20 15:41:24.650 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:25.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.376 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.376 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:41:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:25.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.622 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:25 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:41:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3545405961' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.798 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.960 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.961 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4227MB free_disk=20.94268798828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.961 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:41:25 np0005588919 nova_compute[225855]: 2026-01-20 15:41:25.962 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:41:26 np0005588919 nova_compute[225855]: 2026-01-20 15:41:26.062 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:41:26 np0005588919 nova_compute[225855]: 2026-01-20 15:41:26.062 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:41:26 np0005588919 nova_compute[225855]: 2026-01-20 15:41:26.118 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:41:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:41:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3284217090' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:41:26 np0005588919 nova_compute[225855]: 2026-01-20 15:41:26.554 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:41:26 np0005588919 nova_compute[225855]: 2026-01-20 15:41:26.560 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:41:26 np0005588919 nova_compute[225855]: 2026-01-20 15:41:26.577 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:41:26 np0005588919 nova_compute[225855]: 2026-01-20 15:41:26.605 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:41:26 np0005588919 nova_compute[225855]: 2026-01-20 15:41:26.605 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:41:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:41:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:27.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:41:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:27.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:27 np0005588919 nova_compute[225855]: 2026-01-20 15:41:27.606 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:28 np0005588919 nova_compute[225855]: 2026-01-20 15:41:28.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:28 np0005588919 nova_compute[225855]: 2026-01-20 15:41:28.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:29 np0005588919 nova_compute[225855]: 2026-01-20 15:41:29.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:29.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:29 np0005588919 nova_compute[225855]: 2026-01-20 15:41:29.476 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:29.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:30 np0005588919 nova_compute[225855]: 2026-01-20 15:41:30.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:30 np0005588919 nova_compute[225855]: 2026-01-20 15:41:30.589 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923675.5871499, eae4f8ad-34d0-4893-b039-a371c87ba22e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:41:30 np0005588919 nova_compute[225855]: 2026-01-20 15:41:30.589 225859 INFO nova.compute.manager [-] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:41:30 np0005588919 nova_compute[225855]: 2026-01-20 15:41:30.615 225859 DEBUG nova.compute.manager [None req-82719537-6f0b-47e1-849b-fd851d759790 - - - - - -] [instance: eae4f8ad-34d0-4893-b039-a371c87ba22e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:41:30 np0005588919 nova_compute[225855]: 2026-01-20 15:41:30.624 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:31 np0005588919 podman[327637]: 2026-01-20 15:41:31.125010726 +0000 UTC m=+0.156690710 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 20 10:41:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:41:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:31.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:41:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:31.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:33 np0005588919 nova_compute[225855]: 2026-01-20 15:41:33.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:33.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:33.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:34 np0005588919 nova_compute[225855]: 2026-01-20 15:41:34.513 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:41:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:35.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:41:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:35.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:35 np0005588919 nova_compute[225855]: 2026-01-20 15:41:35.673 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:37.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:37.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:39 np0005588919 podman[327717]: 2026-01-20 15:41:39.155074147 +0000 UTC m=+0.204287544 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 20 10:41:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:39.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:39 np0005588919 nova_compute[225855]: 2026-01-20 15:41:39.516 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:39.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:40 np0005588919 nova_compute[225855]: 2026-01-20 15:41:40.677 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:41.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #178. Immutable memtables: 0.
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.521088) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 113] Flushing memtable with next log file: 178
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701521251, "job": 113, "event": "flush_started", "num_memtables": 1, "num_entries": 1347, "num_deletes": 251, "total_data_size": 3039612, "memory_usage": 3068128, "flush_reason": "Manual Compaction"}
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 113] Level-0 flush table #179: started
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701539809, "cf_name": "default", "job": 113, "event": "table_file_creation", "file_number": 179, "file_size": 2006340, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 85801, "largest_seqno": 87143, "table_properties": {"data_size": 2000478, "index_size": 3192, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12603, "raw_average_key_size": 20, "raw_value_size": 1988788, "raw_average_value_size": 3176, "num_data_blocks": 139, "num_entries": 626, "num_filter_entries": 626, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923588, "oldest_key_time": 1768923588, "file_creation_time": 1768923701, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 179, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 113] Flush lasted 18839 microseconds, and 10893 cpu microseconds.
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.539927) [db/flush_job.cc:967] [default] [JOB 113] Level-0 flush table #179: 2006340 bytes OK
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.539962) [db/memtable_list.cc:519] [default] Level-0 commit table #179 started
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.541336) [db/memtable_list.cc:722] [default] Level-0 commit table #179: memtable #1 done
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.541353) EVENT_LOG_v1 {"time_micros": 1768923701541347, "job": 113, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.541397) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 113] Try to delete WAL files size 3033280, prev total WAL file size 3033280, number of live WAL files 2.
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000175.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.542535) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037353330' seq:72057594037927935, type:22 .. '7061786F730037373832' seq:0, type:0; will stop at (end)
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 114] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 113 Base level 0, inputs: [179(1959KB)], [177(11MB)]
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701542578, "job": 114, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [179], "files_L6": [177], "score": -1, "input_data_size": 14584881, "oldest_snapshot_seqno": -1}
Jan 20 10:41:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:41.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 114] Generated table #180: 10590 keys, 12624895 bytes, temperature: kUnknown
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701617448, "cf_name": "default", "job": 114, "event": "table_file_creation", "file_number": 180, "file_size": 12624895, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12558089, "index_size": 39237, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26501, "raw_key_size": 280337, "raw_average_key_size": 26, "raw_value_size": 12374248, "raw_average_value_size": 1168, "num_data_blocks": 1485, "num_entries": 10590, "num_filter_entries": 10590, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768923701, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 180, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.617859) [db/compaction/compaction_job.cc:1663] [default] [JOB 114] Compacted 1@0 + 1@6 files to L6 => 12624895 bytes
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.619939) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 194.5 rd, 168.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.0 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(13.6) write-amplify(6.3) OK, records in: 11109, records dropped: 519 output_compression: NoCompression
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.619974) EVENT_LOG_v1 {"time_micros": 1768923701619958, "job": 114, "event": "compaction_finished", "compaction_time_micros": 74994, "compaction_time_cpu_micros": 42643, "output_level": 6, "num_output_files": 1, "total_output_size": 12624895, "num_input_records": 11109, "num_output_records": 10590, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000179.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701620789, "job": 114, "event": "table_file_deletion", "file_number": 179}
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000177.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923701625140, "job": 114, "event": "table_file_deletion", "file_number": 177}
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.542420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.625239) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.625245) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.625247) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.625248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:41 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:41:41.625250) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:41:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:43.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:43.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:44 np0005588919 nova_compute[225855]: 2026-01-20 15:41:44.518 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:45.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:45.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:45 np0005588919 nova_compute[225855]: 2026-01-20 15:41:45.680 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:47.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:47.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:49.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:49 np0005588919 nova_compute[225855]: 2026-01-20 15:41:49.521 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:49.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:41:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:41:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:41:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:41:50 np0005588919 nova_compute[225855]: 2026-01-20 15:41:50.683 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:51.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:41:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:51.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:41:53 np0005588919 nova_compute[225855]: 2026-01-20 15:41:53.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:41:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:41:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:53.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:41:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:53.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:54 np0005588919 nova_compute[225855]: 2026-01-20 15:41:54.523 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:55.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:55.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:55 np0005588919 nova_compute[225855]: 2026-01-20 15:41:55.685 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:55 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:41:55 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:41:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:57.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:57.542 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=86, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=85) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:41:57 np0005588919 nova_compute[225855]: 2026-01-20 15:41:57.543 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:57 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:57.543 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:41:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:57.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:41:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:41:59.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:41:59 np0005588919 nova_compute[225855]: 2026-01-20 15:41:59.525 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:41:59 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:41:59.545 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '86'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:41:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:41:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:41:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:41:59.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:00 np0005588919 nova_compute[225855]: 2026-01-20 15:42:00.687 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:01.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:01.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:02 np0005588919 podman[327982]: 2026-01-20 15:42:02.051690118 +0000 UTC m=+0.103135155 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 20 10:42:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:03.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:03.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:04 np0005588919 nova_compute[225855]: 2026-01-20 15:42:04.558 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:05.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:05.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:05 np0005588919 nova_compute[225855]: 2026-01-20 15:42:05.726 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:07.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:07.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:09.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:09 np0005588919 nova_compute[225855]: 2026-01-20 15:42:09.559 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:09.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:10 np0005588919 podman[328012]: 2026-01-20 15:42:10.027135195 +0000 UTC m=+0.062284782 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 20 10:42:10 np0005588919 nova_compute[225855]: 2026-01-20 15:42:10.728 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:11.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:11.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:12 np0005588919 ovn_controller[130490]: 2026-01-20T15:42:12Z|00990|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 20 10:42:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:13.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:13.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:14 np0005588919 nova_compute[225855]: 2026-01-20 15:42:14.562 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:15.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:15.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:15 np0005588919 nova_compute[225855]: 2026-01-20 15:42:15.730 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:16.463 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:16.463 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:16.463 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:17.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:17.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:19.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:19 np0005588919 nova_compute[225855]: 2026-01-20 15:42:19.564 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:19.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:20 np0005588919 nova_compute[225855]: 2026-01-20 15:42:20.732 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:21.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:21.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:23.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:23.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:24 np0005588919 nova_compute[225855]: 2026-01-20 15:42:24.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:24 np0005588919 nova_compute[225855]: 2026-01-20 15:42:24.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:42:24 np0005588919 nova_compute[225855]: 2026-01-20 15:42:24.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:42:24 np0005588919 nova_compute[225855]: 2026-01-20 15:42:24.352 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:42:24 np0005588919 nova_compute[225855]: 2026-01-20 15:42:24.353 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:24 np0005588919 nova_compute[225855]: 2026-01-20 15:42:24.353 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:42:24 np0005588919 nova_compute[225855]: 2026-01-20 15:42:24.564 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:25.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:25.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:25 np0005588919 nova_compute[225855]: 2026-01-20 15:42:25.734 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.376 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.376 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.376 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:27.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:27.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:42:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1965861902' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.799 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.964 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.966 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4269MB free_disk=20.942886352539062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.966 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:27 np0005588919 nova_compute[225855]: 2026-01-20 15:42:27.967 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:28 np0005588919 nova_compute[225855]: 2026-01-20 15:42:28.031 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:42:28 np0005588919 nova_compute[225855]: 2026-01-20 15:42:28.031 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:42:28 np0005588919 nova_compute[225855]: 2026-01-20 15:42:28.122 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:42:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3529525207' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:42:28 np0005588919 nova_compute[225855]: 2026-01-20 15:42:28.552 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:42:28 np0005588919 nova_compute[225855]: 2026-01-20 15:42:28.559 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:42:28 np0005588919 nova_compute[225855]: 2026-01-20 15:42:28.587 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:42:28 np0005588919 nova_compute[225855]: 2026-01-20 15:42:28.588 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:42:28 np0005588919 nova_compute[225855]: 2026-01-20 15:42:28.588 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:29.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:29 np0005588919 nova_compute[225855]: 2026-01-20 15:42:29.566 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:29 np0005588919 nova_compute[225855]: 2026-01-20 15:42:29.587 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:29 np0005588919 nova_compute[225855]: 2026-01-20 15:42:29.588 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:29 np0005588919 nova_compute[225855]: 2026-01-20 15:42:29.588 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:29.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:30 np0005588919 nova_compute[225855]: 2026-01-20 15:42:30.737 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:31 np0005588919 nova_compute[225855]: 2026-01-20 15:42:31.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:31.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:31.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:32 np0005588919 nova_compute[225855]: 2026-01-20 15:42:32.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:33 np0005588919 podman[328138]: 2026-01-20 15:42:33.054383798 +0000 UTC m=+0.098752682 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:42:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:33.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:33.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:34 np0005588919 nova_compute[225855]: 2026-01-20 15:42:34.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:42:34 np0005588919 nova_compute[225855]: 2026-01-20 15:42:34.567 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:35.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:35.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:35 np0005588919 nova_compute[225855]: 2026-01-20 15:42:35.739 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:37.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:37.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:37 np0005588919 nova_compute[225855]: 2026-01-20 15:42:37.761 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:37 np0005588919 nova_compute[225855]: 2026-01-20 15:42:37.762 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:37 np0005588919 nova_compute[225855]: 2026-01-20 15:42:37.784 225859 DEBUG nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:42:37 np0005588919 nova_compute[225855]: 2026-01-20 15:42:37.902 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:37 np0005588919 nova_compute[225855]: 2026-01-20 15:42:37.903 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:37 np0005588919 nova_compute[225855]: 2026-01-20 15:42:37.914 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:42:37 np0005588919 nova_compute[225855]: 2026-01-20 15:42:37.914 225859 INFO nova.compute.claims [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.068 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:42:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2809331689' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.523 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.531 225859 DEBUG nova.compute.provider_tree [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.554 225859 DEBUG nova.scheduler.client.report [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.587 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.587 225859 DEBUG nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.636 225859 DEBUG nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.637 225859 DEBUG nova.network.neutron [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.658 225859 INFO nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.678 225859 DEBUG nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.770 225859 DEBUG nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.772 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.772 225859 INFO nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Creating image(s)#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.796 225859 DEBUG nova.storage.rbd_utils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.822 225859 DEBUG nova.storage.rbd_utils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.845 225859 DEBUG nova.storage.rbd_utils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.849 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.915 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.916 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.917 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.917 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.943 225859 DEBUG nova.storage.rbd_utils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:42:38 np0005588919 nova_compute[225855]: 2026-01-20 15:42:38.947 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.153 225859 DEBUG nova.policy [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5985ef736503499a9f1d734cabc33ce5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.241 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.293s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.321 225859 DEBUG nova.storage.rbd_utils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] resizing rbd image 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:42:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:39.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.426 225859 DEBUG nova.objects.instance [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'migration_context' on Instance uuid 6c6a79bf-04d3-4839-84cc-ab8b383d602c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.440 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.441 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Ensure instance console log exists: /var/lib/nova/instances/6c6a79bf-04d3-4839-84cc-ab8b383d602c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.441 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.442 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.442 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.569 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:39.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:39.792 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=87, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=86) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:42:39 np0005588919 nova_compute[225855]: 2026-01-20 15:42:39.792 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:39 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:39.793 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:42:40 np0005588919 nova_compute[225855]: 2026-01-20 15:42:40.199 225859 DEBUG nova.network.neutron [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Successfully created port: 9af895a3-cca7-495f-ab5a-68e04355f005 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:42:40 np0005588919 nova_compute[225855]: 2026-01-20 15:42:40.741 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:41 np0005588919 podman[328406]: 2026-01-20 15:42:41.004658794 +0000 UTC m=+0.056672233 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:42:41 np0005588919 nova_compute[225855]: 2026-01-20 15:42:41.233 225859 DEBUG nova.network.neutron [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Successfully updated port: 9af895a3-cca7-495f-ab5a-68e04355f005 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:42:41 np0005588919 nova_compute[225855]: 2026-01-20 15:42:41.254 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "refresh_cache-6c6a79bf-04d3-4839-84cc-ab8b383d602c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:42:41 np0005588919 nova_compute[225855]: 2026-01-20 15:42:41.254 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquired lock "refresh_cache-6c6a79bf-04d3-4839-84cc-ab8b383d602c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:42:41 np0005588919 nova_compute[225855]: 2026-01-20 15:42:41.254 225859 DEBUG nova.network.neutron [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:42:41 np0005588919 nova_compute[225855]: 2026-01-20 15:42:41.374 225859 DEBUG nova.compute.manager [req-fe27a064-2531-409e-b4ea-10a7562a43fa req-9349ded2-faf5-4942-89ae-d68358c868cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Received event network-changed-9af895a3-cca7-495f-ab5a-68e04355f005 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:42:41 np0005588919 nova_compute[225855]: 2026-01-20 15:42:41.375 225859 DEBUG nova.compute.manager [req-fe27a064-2531-409e-b4ea-10a7562a43fa req-9349ded2-faf5-4942-89ae-d68358c868cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Refreshing instance network info cache due to event network-changed-9af895a3-cca7-495f-ab5a-68e04355f005. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:42:41 np0005588919 nova_compute[225855]: 2026-01-20 15:42:41.375 225859 DEBUG oslo_concurrency.lockutils [req-fe27a064-2531-409e-b4ea-10a7562a43fa req-9349ded2-faf5-4942-89ae-d68358c868cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-6c6a79bf-04d3-4839-84cc-ab8b383d602c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:42:41 np0005588919 nova_compute[225855]: 2026-01-20 15:42:41.407 225859 DEBUG nova.network.neutron [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:42:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:41.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:41.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.321 225859 DEBUG nova.network.neutron [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Updating instance_info_cache with network_info: [{"id": "9af895a3-cca7-495f-ab5a-68e04355f005", "address": "fa:16:3e:48:2c:f1", "network": {"id": "b5fb4ee9-fa45-4797-871a-53247ebaf43e", "bridge": "br-int", "label": "tempest-network-smoke--639703376", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9af895a3-cc", "ovs_interfaceid": "9af895a3-cca7-495f-ab5a-68e04355f005", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.340 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Releasing lock "refresh_cache-6c6a79bf-04d3-4839-84cc-ab8b383d602c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.341 225859 DEBUG nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Instance network_info: |[{"id": "9af895a3-cca7-495f-ab5a-68e04355f005", "address": "fa:16:3e:48:2c:f1", "network": {"id": "b5fb4ee9-fa45-4797-871a-53247ebaf43e", "bridge": "br-int", "label": "tempest-network-smoke--639703376", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9af895a3-cc", "ovs_interfaceid": "9af895a3-cca7-495f-ab5a-68e04355f005", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.342 225859 DEBUG oslo_concurrency.lockutils [req-fe27a064-2531-409e-b4ea-10a7562a43fa req-9349ded2-faf5-4942-89ae-d68358c868cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-6c6a79bf-04d3-4839-84cc-ab8b383d602c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.342 225859 DEBUG nova.network.neutron [req-fe27a064-2531-409e-b4ea-10a7562a43fa req-9349ded2-faf5-4942-89ae-d68358c868cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Refreshing network info cache for port 9af895a3-cca7-495f-ab5a-68e04355f005 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.346 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Start _get_guest_xml network_info=[{"id": "9af895a3-cca7-495f-ab5a-68e04355f005", "address": "fa:16:3e:48:2c:f1", "network": {"id": "b5fb4ee9-fa45-4797-871a-53247ebaf43e", "bridge": "br-int", "label": "tempest-network-smoke--639703376", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9af895a3-cc", "ovs_interfaceid": "9af895a3-cca7-495f-ab5a-68e04355f005", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.350 225859 WARNING nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.355 225859 DEBUG nova.virt.libvirt.host [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.355 225859 DEBUG nova.virt.libvirt.host [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.362 225859 DEBUG nova.virt.libvirt.host [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.363 225859 DEBUG nova.virt.libvirt.host [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.364 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.365 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.365 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.366 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.366 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.366 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.367 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.367 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.367 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.368 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.368 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.369 225859 DEBUG nova.virt.hardware [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
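The debug lines above show nova.virt.hardware enumerating candidate guest CPU topologies for the 1-vCPU m1.nano flavor: with no flavor or image preference (0:0:0) and effectively unbounded limits (65536 each), the only valid topology is sockets=1, cores=1, threads=1. A minimal sketch of that enumeration (a simplified stand-in, not Nova's actual `_get_possible_cpu_topologies` code):

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Yield (sockets, cores, threads) triples whose product equals vcpus,
    mirroring the shape of Nova's topology enumeration under the logged limits."""
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    yield (s, c, t)

# For vcpus=1, as in the log, exactly one topology is possible: (1, 1, 1).
print(list(possible_topologies(1)))
```

For larger vCPU counts Nova additionally sorts the candidates against the preferred topology; this sketch only reproduces the enumeration step visible in the log.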
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.372 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:42 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:42:42 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2863498301' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.824 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.859 225859 DEBUG nova.storage.rbd_utils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:42:42 np0005588919 nova_compute[225855]: 2026-01-20 15:42:42.864 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:43 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:42:43 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2419632345' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.359 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
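Each `ceph mon dump` invocation above is wrapped by oslo_concurrency.processutils, which logs the command, its exit code, and wall time ("returned: 0 in 0.452s"). A rough sketch of that execute-and-time pattern, with a harmless `echo` standing in for the Ceph call (running the real `ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf` requires a reachable cluster):

```python
import shlex
import subprocess
import time

def execute(cmd):
    """Run a command and return (exit_code, stdout, elapsed_seconds).
    A simplified stand-in for oslo_concurrency.processutils.execute."""
    start = time.monotonic()
    proc = subprocess.run(shlex.split(cmd), capture_output=True, text=True)
    return proc.returncode, proc.stdout, time.monotonic() - start

rc, out, elapsed = execute("echo ok")
# Log in the same shape as processutils: CMD "..." returned: <rc> in <t>s
print(f'CMD "echo ok" returned: {rc} in {elapsed:.3f}s')
```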
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.362 225859 DEBUG nova.virt.libvirt.vif [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:42:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-0-248124310',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-0-248124310',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-gen',id=219,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNPrC26E5zjpds8PmYXeLNQKBwLdgsc+VcubrdKnriEXDiMjUXGvx1Qk1D9X7eLck7XYpiSHt4U9t1SsZB3lsAeahV1YqeLst2/p8UQkxJjHaCXNOlF5uwsraAqiSop7uA==',key_name='tempest-TestSecurityGroupsBasicOps-681797586',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-s0krodc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:42:38Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=6c6a79bf-04d3-4839-84cc-ab8b383d602c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9af895a3-cca7-495f-ab5a-68e04355f005", "address": "fa:16:3e:48:2c:f1", "network": {"id": "b5fb4ee9-fa45-4797-871a-53247ebaf43e", "bridge": "br-int", "label": "tempest-network-smoke--639703376", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9af895a3-cc", "ovs_interfaceid": "9af895a3-cca7-495f-ab5a-68e04355f005", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.362 225859 DEBUG nova.network.os_vif_util [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "9af895a3-cca7-495f-ab5a-68e04355f005", "address": "fa:16:3e:48:2c:f1", "network": {"id": "b5fb4ee9-fa45-4797-871a-53247ebaf43e", "bridge": "br-int", "label": "tempest-network-smoke--639703376", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9af895a3-cc", "ovs_interfaceid": "9af895a3-cca7-495f-ab5a-68e04355f005", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.363 225859 DEBUG nova.network.os_vif_util [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:2c:f1,bridge_name='br-int',has_traffic_filtering=True,id=9af895a3-cca7-495f-ab5a-68e04355f005,network=Network(b5fb4ee9-fa45-4797-871a-53247ebaf43e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9af895a3-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
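The "Converting VIF ... Converted object VIFOpenVSwitch(...)" pair shows nova.network.os_vif_util mapping Nova's JSON VIF into a typed os-vif object. A toy version of that mapping, using a hypothetical dataclass rather than the real os_vif model, fed the fields from the VIF in the log:

```python
from dataclasses import dataclass

@dataclass
class VIFOpenVSwitch:
    """Simplified stand-in for os_vif's VIFOpenVSwitch object."""
    id: str
    address: str
    bridge_name: str
    vif_name: str
    has_traffic_filtering: bool

def nova_to_osvif_vif(vif):
    """Sketch of the vif_type=ovs branch of the conversion seen in the log."""
    return VIFOpenVSwitch(
        id=vif["id"],
        address=vif["address"],
        bridge_name=vif["details"]["bridge_name"],
        vif_name=vif["devname"],
        has_traffic_filtering=vif["details"]["port_filter"],
    )

# The relevant subset of the VIF dict logged above:
vif = {
    "id": "9af895a3-cca7-495f-ab5a-68e04355f005",
    "address": "fa:16:3e:48:2c:f1",
    "devname": "tap9af895a3-cc",
    "details": {"port_filter": True, "bridge_name": "br-int"},
}
print(nova_to_osvif_vif(vif))
```

The real converter also builds the nested Network/Subnet objects and the OVS port profile; this sketch covers only the fields that appear in the converted-object log line.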
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.365 225859 DEBUG nova.objects.instance [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'pci_devices' on Instance uuid 6c6a79bf-04d3-4839-84cc-ab8b383d602c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.382 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <uuid>6c6a79bf-04d3-4839-84cc-ab8b383d602c</uuid>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <name>instance-000000db</name>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-0-248124310</nova:name>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:42:42</nova:creationTime>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <nova:user uuid="5985ef736503499a9f1d734cabc33ce5">tempest-TestSecurityGroupsBasicOps-342561427-project-member</nova:user>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <nova:project uuid="728662ec7f654a3fb2e53a90b8707d7e">tempest-TestSecurityGroupsBasicOps-342561427</nova:project>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <nova:port uuid="9af895a3-cca7-495f-ab5a-68e04355f005">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <entry name="serial">6c6a79bf-04d3-4839-84cc-ab8b383d602c</entry>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <entry name="uuid">6c6a79bf-04d3-4839-84cc-ab8b383d602c</entry>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk.config">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:48:2c:f1"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <target dev="tap9af895a3-cc"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/6c6a79bf-04d3-4839-84cc-ab8b383d602c/console.log" append="off"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:42:43 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:42:43 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:42:43 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:42:43 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.384 225859 DEBUG nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Preparing to wait for external event network-vif-plugged-9af895a3-cca7-495f-ab5a-68e04355f005 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.385 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.385 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.385 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.386 225859 DEBUG nova.virt.libvirt.vif [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:42:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-0-248124310',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-0-248124310',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-gen',id=219,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNPrC26E5zjpds8PmYXeLNQKBwLdgsc+VcubrdKnriEXDiMjUXGvx1Qk1D9X7eLck7XYpiSHt4U9t1SsZB3lsAeahV1YqeLst2/p8UQkxJjHaCXNOlF5uwsraAqiSop7uA==',key_name='tempest-TestSecurityGroupsBasicOps-681797586',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-s0krodc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:42:38Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=6c6a79bf-04d3-4839-84cc-ab8b383d602c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9af895a3-cca7-495f-ab5a-68e04355f005", "address": "fa:16:3e:48:2c:f1", "network": {"id": "b5fb4ee9-fa45-4797-871a-53247ebaf43e", "bridge": "br-int", "label": "tempest-network-smoke--639703376", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9af895a3-cc", "ovs_interfaceid": "9af895a3-cca7-495f-ab5a-68e04355f005", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.386 225859 DEBUG nova.network.os_vif_util [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "9af895a3-cca7-495f-ab5a-68e04355f005", "address": "fa:16:3e:48:2c:f1", "network": {"id": "b5fb4ee9-fa45-4797-871a-53247ebaf43e", "bridge": "br-int", "label": "tempest-network-smoke--639703376", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9af895a3-cc", "ovs_interfaceid": "9af895a3-cca7-495f-ab5a-68e04355f005", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.387 225859 DEBUG nova.network.os_vif_util [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:2c:f1,bridge_name='br-int',has_traffic_filtering=True,id=9af895a3-cca7-495f-ab5a-68e04355f005,network=Network(b5fb4ee9-fa45-4797-871a-53247ebaf43e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9af895a3-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.388 225859 DEBUG os_vif [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:2c:f1,bridge_name='br-int',has_traffic_filtering=True,id=9af895a3-cca7-495f-ab5a-68e04355f005,network=Network(b5fb4ee9-fa45-4797-871a-53247ebaf43e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9af895a3-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.389 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.389 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.390 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.393 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.394 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9af895a3-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.395 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9af895a3-cc, col_values=(('external_ids', {'iface-id': '9af895a3-cca7-495f-ab5a-68e04355f005', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:2c:f1', 'vm-uuid': '6c6a79bf-04d3-4839-84cc-ab8b383d602c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.396 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:43 np0005588919 NetworkManager[49104]: <info>  [1768923763.3977] manager: (tap9af895a3-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/426)
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.400 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.403 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.404 225859 INFO os_vif [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:2c:f1,bridge_name='br-int',has_traffic_filtering=True,id=9af895a3-cca7-495f-ab5a-68e04355f005,network=Network(b5fb4ee9-fa45-4797-871a-53247ebaf43e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9af895a3-cc')#033[00m
Jan 20 10:42:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:43.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.456 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.457 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.457 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No VIF found with MAC fa:16:3e:48:2c:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.457 225859 INFO nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Using config drive#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.487 225859 DEBUG nova.storage.rbd_utils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.507 225859 DEBUG nova.network.neutron [req-fe27a064-2531-409e-b4ea-10a7562a43fa req-9349ded2-faf5-4942-89ae-d68358c868cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Updated VIF entry in instance network info cache for port 9af895a3-cca7-495f-ab5a-68e04355f005. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.507 225859 DEBUG nova.network.neutron [req-fe27a064-2531-409e-b4ea-10a7562a43fa req-9349ded2-faf5-4942-89ae-d68358c868cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Updating instance_info_cache with network_info: [{"id": "9af895a3-cca7-495f-ab5a-68e04355f005", "address": "fa:16:3e:48:2c:f1", "network": {"id": "b5fb4ee9-fa45-4797-871a-53247ebaf43e", "bridge": "br-int", "label": "tempest-network-smoke--639703376", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9af895a3-cc", "ovs_interfaceid": "9af895a3-cca7-495f-ab5a-68e04355f005", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.530 225859 DEBUG oslo_concurrency.lockutils [req-fe27a064-2531-409e-b4ea-10a7562a43fa req-9349ded2-faf5-4942-89ae-d68358c868cb 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-6c6a79bf-04d3-4839-84cc-ab8b383d602c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:42:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:43.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.810 225859 INFO nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Creating config drive at /var/lib/nova/instances/6c6a79bf-04d3-4839-84cc-ab8b383d602c/disk.config#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.816 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6c6a79bf-04d3-4839-84cc-ab8b383d602c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfcg06clv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:43 np0005588919 nova_compute[225855]: 2026-01-20 15:42:43.968 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6c6a79bf-04d3-4839-84cc-ab8b383d602c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfcg06clv" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.002 225859 DEBUG nova.storage.rbd_utils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.008 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6c6a79bf-04d3-4839-84cc-ab8b383d602c/disk.config 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:42:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.200 225859 DEBUG oslo_concurrency.processutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6c6a79bf-04d3-4839-84cc-ab8b383d602c/disk.config 6c6a79bf-04d3-4839-84cc-ab8b383d602c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.201 225859 INFO nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Deleting local config drive /var/lib/nova/instances/6c6a79bf-04d3-4839-84cc-ab8b383d602c/disk.config because it was imported into RBD.#033[00m
Jan 20 10:42:44 np0005588919 kernel: tap9af895a3-cc: entered promiscuous mode
Jan 20 10:42:44 np0005588919 NetworkManager[49104]: <info>  [1768923764.2592] manager: (tap9af895a3-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/427)
Jan 20 10:42:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:42:44Z|00991|binding|INFO|Claiming lport 9af895a3-cca7-495f-ab5a-68e04355f005 for this chassis.
Jan 20 10:42:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:42:44Z|00992|binding|INFO|9af895a3-cca7-495f-ab5a-68e04355f005: Claiming fa:16:3e:48:2c:f1 10.100.0.3
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.259 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.264 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.269 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588919 NetworkManager[49104]: <info>  [1768923764.2705] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/428)
Jan 20 10:42:44 np0005588919 NetworkManager[49104]: <info>  [1768923764.2713] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/429)
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.282 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:2c:f1 10.100.0.3'], port_security=['fa:16:3e:48:2c:f1 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6c6a79bf-04d3-4839-84cc-ab8b383d602c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5fb4ee9-fa45-4797-871a-53247ebaf43e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b7424c3a-5aee-4d68-a5d7-51752094553b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=909829b9-c0dd-4f89-9095-7f817ccefae3, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=9af895a3-cca7-495f-ab5a-68e04355f005) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.288 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 9af895a3-cca7-495f-ab5a-68e04355f005 in datapath b5fb4ee9-fa45-4797-871a-53247ebaf43e bound to our chassis#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.289 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5fb4ee9-fa45-4797-871a-53247ebaf43e#033[00m
Jan 20 10:42:44 np0005588919 systemd-udevd[328564]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.305 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0d5a253a-5f03-4aa0-8eae-b1e32862bc37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.307 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5fb4ee9-f1 in ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:42:44 np0005588919 systemd-machined[194361]: New machine qemu-114-instance-000000db.
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.309 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5fb4ee9-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.309 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[35ff0543-6b70-4eda-9bf0-08e4cf246e15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.310 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff5be4a-f9c6-4152-a5ff-7933df0adef9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 NetworkManager[49104]: <info>  [1768923764.3259] device (tap9af895a3-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.324 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[35fefe0c-7af3-4f27-8567-60921b972a64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 NetworkManager[49104]: <info>  [1768923764.3265] device (tap9af895a3-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:42:44 np0005588919 systemd[1]: Started Virtual Machine qemu-114-instance-000000db.
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.350 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5d26acf2-dd72-4653-929b-a76c12af15b5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.364 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.374 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:42:44Z|00993|binding|INFO|Setting lport 9af895a3-cca7-495f-ab5a-68e04355f005 ovn-installed in OVS
Jan 20 10:42:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:42:44Z|00994|binding|INFO|Setting lport 9af895a3-cca7-495f-ab5a-68e04355f005 up in Southbound
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.389 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.397 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[1acdaeec-ccb0-43fc-a5b5-7f12d6266128]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 NetworkManager[49104]: <info>  [1768923764.4036] manager: (tapb5fb4ee9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/430)
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.402 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[c15397cf-691f-4355-9122-2a4d302f81b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.439 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[812d3161-0afb-4dd9-a279-5d9604fdde49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.443 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[177202a8-4b9e-433f-91b4-4e2b2958c2c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 NetworkManager[49104]: <info>  [1768923764.4700] device (tapb5fb4ee9-f0): carrier: link connected
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.476 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[e7df1dcb-0f80-4913-8d97-482340ff741e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.496 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[4994cfa7-33d1-4dc3-a12f-1c4de87818e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5fb4ee9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:42:58:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 283], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 880339, 'reachable_time': 34549, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 328596, 'error': None, 'target': 'ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.524 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2b1a45-e0e2-4e35-bf39-58f8be0030c1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe42:585a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 880339, 'tstamp': 880339}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 328597, 'error': None, 'target': 'ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.557 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[efdf8cc8-0ce3-4693-92d0-bd1dec40fd6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5fb4ee9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:42:58:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 283], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 880339, 'reachable_time': 34549, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 328598, 'error': None, 'target': 'ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.570 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.607 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e697b7e9-db74-4219-8f4b-a1409365fd28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.691 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[6de1a2aa-067c-49f7-87aa-28e8a423ae1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.694 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5fb4ee9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.694 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.695 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5fb4ee9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.697 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588919 NetworkManager[49104]: <info>  [1768923764.6985] manager: (tapb5fb4ee9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/431)
Jan 20 10:42:44 np0005588919 kernel: tapb5fb4ee9-f0: entered promiscuous mode
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.702 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5fb4ee9-f0, col_values=(('external_ids', {'iface-id': '92b999b0-5595-47ff-ac54-cb52d2ba58ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:42:44 np0005588919 ovn_controller[130490]: 2026-01-20T15:42:44Z|00995|binding|INFO|Releasing lport 92b999b0-5595-47ff-ac54-cb52d2ba58ba from this chassis (sb_readonly=0)
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.704 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.706 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5fb4ee9-fa45-4797-871a-53247ebaf43e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5fb4ee9-fa45-4797-871a-53247ebaf43e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.708 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[bd889d86-9e88-435e-8628-634de0ca366a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.709 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-b5fb4ee9-fa45-4797-871a-53247ebaf43e
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/b5fb4ee9-fa45-4797-871a-53247ebaf43e.pid.haproxy
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID b5fb4ee9-fa45-4797-871a-53247ebaf43e
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:42:44 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:44.712 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e', 'env', 'PROCESS_TAG=haproxy-b5fb4ee9-fa45-4797-871a-53247ebaf43e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5fb4ee9-fa45-4797-871a-53247ebaf43e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.717 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.792 225859 DEBUG nova.compute.manager [req-4fb91efe-8ee7-41fa-ba82-f31139ae9958 req-123402e4-aabd-41cf-9f52-a2a1f1e6709a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Received event network-vif-plugged-9af895a3-cca7-495f-ab5a-68e04355f005 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.793 225859 DEBUG oslo_concurrency.lockutils [req-4fb91efe-8ee7-41fa-ba82-f31139ae9958 req-123402e4-aabd-41cf-9f52-a2a1f1e6709a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.793 225859 DEBUG oslo_concurrency.lockutils [req-4fb91efe-8ee7-41fa-ba82-f31139ae9958 req-123402e4-aabd-41cf-9f52-a2a1f1e6709a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.793 225859 DEBUG oslo_concurrency.lockutils [req-4fb91efe-8ee7-41fa-ba82-f31139ae9958 req-123402e4-aabd-41cf-9f52-a2a1f1e6709a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.793 225859 DEBUG nova.compute.manager [req-4fb91efe-8ee7-41fa-ba82-f31139ae9958 req-123402e4-aabd-41cf-9f52-a2a1f1e6709a 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Processing event network-vif-plugged-9af895a3-cca7-495f-ab5a-68e04355f005 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.859 225859 DEBUG nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.860 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923764.8585937, 6c6a79bf-04d3-4839-84cc-ab8b383d602c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.860 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] VM Started (Lifecycle Event)#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.865 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.869 225859 INFO nova.virt.libvirt.driver [-] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Instance spawned successfully.#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.869 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.882 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.888 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.893 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.893 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.894 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.894 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.894 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.895 225859 DEBUG nova.virt.libvirt.driver [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.936 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.936 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923764.858916, 6c6a79bf-04d3-4839-84cc-ab8b383d602c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.936 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.961 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.964 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923764.8642828, 6c6a79bf-04d3-4839-84cc-ab8b383d602c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.964 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.970 225859 INFO nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Took 6.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.971 225859 DEBUG nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.980 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:42:44 np0005588919 nova_compute[225855]: 2026-01-20 15:42:44.983 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:42:45 np0005588919 nova_compute[225855]: 2026-01-20 15:42:45.011 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:42:45 np0005588919 podman[328672]: 2026-01-20 15:42:45.069714728 +0000 UTC m=+0.056724054 container create 03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:42:45 np0005588919 systemd[1]: Started libpod-conmon-03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75.scope.
Jan 20 10:42:45 np0005588919 podman[328672]: 2026-01-20 15:42:45.036194171 +0000 UTC m=+0.023203527 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:42:45 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:42:45 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b5e8f4f94618ce44091fb433750cac7f4bda279532d391abb9a2cd7dfef8589/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:42:45 np0005588919 podman[328672]: 2026-01-20 15:42:45.165439114 +0000 UTC m=+0.152448520 container init 03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:42:45 np0005588919 podman[328672]: 2026-01-20 15:42:45.173421769 +0000 UTC m=+0.160431125 container start 03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 20 10:42:45 np0005588919 neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e[328687]: [NOTICE]   (328691) : New worker (328693) forked
Jan 20 10:42:45 np0005588919 neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e[328687]: [NOTICE]   (328691) : Loading success.
Jan 20 10:42:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:45.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:45.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:45 np0005588919 nova_compute[225855]: 2026-01-20 15:42:45.713 225859 INFO nova.compute.manager [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Took 7.85 seconds to build instance.#033[00m
Jan 20 10:42:46 np0005588919 nova_compute[225855]: 2026-01-20 15:42:46.485 225859 DEBUG oslo_concurrency.lockutils [None req-7aa3daef-dee4-4751-afd5-b672c4cbb8af 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:46 np0005588919 nova_compute[225855]: 2026-01-20 15:42:46.883 225859 DEBUG nova.compute.manager [req-95cc77aa-584b-4791-bdb8-970c47fd8621 req-c1fe608a-c649-4f5e-aa71-7c13b714f6f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Received event network-vif-plugged-9af895a3-cca7-495f-ab5a-68e04355f005 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:42:46 np0005588919 nova_compute[225855]: 2026-01-20 15:42:46.884 225859 DEBUG oslo_concurrency.lockutils [req-95cc77aa-584b-4791-bdb8-970c47fd8621 req-c1fe608a-c649-4f5e-aa71-7c13b714f6f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:42:46 np0005588919 nova_compute[225855]: 2026-01-20 15:42:46.884 225859 DEBUG oslo_concurrency.lockutils [req-95cc77aa-584b-4791-bdb8-970c47fd8621 req-c1fe608a-c649-4f5e-aa71-7c13b714f6f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:42:46 np0005588919 nova_compute[225855]: 2026-01-20 15:42:46.885 225859 DEBUG oslo_concurrency.lockutils [req-95cc77aa-584b-4791-bdb8-970c47fd8621 req-c1fe608a-c649-4f5e-aa71-7c13b714f6f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:42:46 np0005588919 nova_compute[225855]: 2026-01-20 15:42:46.885 225859 DEBUG nova.compute.manager [req-95cc77aa-584b-4791-bdb8-970c47fd8621 req-c1fe608a-c649-4f5e-aa71-7c13b714f6f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] No waiting events found dispatching network-vif-plugged-9af895a3-cca7-495f-ab5a-68e04355f005 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:42:46 np0005588919 nova_compute[225855]: 2026-01-20 15:42:46.886 225859 WARNING nova.compute.manager [req-95cc77aa-584b-4791-bdb8-970c47fd8621 req-c1fe608a-c649-4f5e-aa71-7c13b714f6f7 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Received unexpected event network-vif-plugged-9af895a3-cca7-495f-ab5a-68e04355f005 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:42:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:47.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:47.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:48 np0005588919 nova_compute[225855]: 2026-01-20 15:42:48.398 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:49.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:49 np0005588919 nova_compute[225855]: 2026-01-20 15:42:49.572 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:49.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:49 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:42:49.795 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '87'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:42:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:42:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:51.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:42:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:51.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:53 np0005588919 nova_compute[225855]: 2026-01-20 15:42:53.401 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:53.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 20 10:42:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:53.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 20 10:42:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:54 np0005588919 nova_compute[225855]: 2026-01-20 15:42:54.574 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:42:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:55.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:42:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:55.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:56 np0005588919 podman[328933]: 2026-01-20 15:42:56.110569118 +0000 UTC m=+0.067455087 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default)
Jan 20 10:42:56 np0005588919 podman[328933]: 2026-01-20 15:42:56.207254591 +0000 UTC m=+0.164140540 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 20 10:42:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:56 np0005588919 podman[329084]: 2026-01-20 15:42:56.833607171 +0000 UTC m=+0.052843744 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:42:56 np0005588919 podman[329084]: 2026-01-20 15:42:56.844272323 +0000 UTC m=+0.063508896 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:42:57 np0005588919 podman[329150]: 2026-01-20 15:42:57.03629394 +0000 UTC m=+0.051503547 container exec e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, version=2.2.4, name=keepalived, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git)
Jan 20 10:42:57 np0005588919 podman[329150]: 2026-01-20 15:42:57.049231065 +0000 UTC m=+0.064440682 container exec_died e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, distribution-scope=public, name=keepalived, release=1793, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2)
Jan 20 10:42:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:57.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:57.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:42:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:42:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:42:58 np0005588919 nova_compute[225855]: 2026-01-20 15:42:58.405 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:58 np0005588919 ovn_controller[130490]: 2026-01-20T15:42:58Z|00122|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:48:2c:f1 10.100.0.3
Jan 20 10:42:58 np0005588919 ovn_controller[130490]: 2026-01-20T15:42:58Z|00123|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:2c:f1 10.100.0.3
Jan 20 10:42:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:42:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:42:59.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:42:59 np0005588919 nova_compute[225855]: 2026-01-20 15:42:59.576 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:42:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:42:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:42:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:42:59.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:01.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:01.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:03 np0005588919 nova_compute[225855]: 2026-01-20 15:43:03.411 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 20 10:43:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:03.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 20 10:43:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:03.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:04 np0005588919 podman[329317]: 2026-01-20 15:43:04.168501836 +0000 UTC m=+0.194534389 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 20 10:43:04 np0005588919 nova_compute[225855]: 2026-01-20 15:43:04.580 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:43:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.439 225859 DEBUG oslo_concurrency.lockutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.440 225859 DEBUG oslo_concurrency.lockutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.440 225859 DEBUG oslo_concurrency.lockutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.440 225859 DEBUG oslo_concurrency.lockutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:05.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.441 225859 DEBUG oslo_concurrency.lockutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.442 225859 INFO nova.compute.manager [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Terminating instance#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.443 225859 DEBUG nova.compute.manager [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:43:05 np0005588919 kernel: tap9af895a3-cc (unregistering): left promiscuous mode
Jan 20 10:43:05 np0005588919 NetworkManager[49104]: <info>  [1768923785.5031] device (tap9af895a3-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.518 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:43:05Z|00996|binding|INFO|Releasing lport 9af895a3-cca7-495f-ab5a-68e04355f005 from this chassis (sb_readonly=0)
Jan 20 10:43:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:43:05Z|00997|binding|INFO|Setting lport 9af895a3-cca7-495f-ab5a-68e04355f005 down in Southbound
Jan 20 10:43:05 np0005588919 ovn_controller[130490]: 2026-01-20T15:43:05Z|00998|binding|INFO|Removing iface tap9af895a3-cc ovn-installed in OVS
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.521 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.558 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.563 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:2c:f1 10.100.0.3'], port_security=['fa:16:3e:48:2c:f1 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6c6a79bf-04d3-4839-84cc-ab8b383d602c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5fb4ee9-fa45-4797-871a-53247ebaf43e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b7424c3a-5aee-4d68-a5d7-51752094553b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=909829b9-c0dd-4f89-9095-7f817ccefae3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=9af895a3-cca7-495f-ab5a-68e04355f005) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.565 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 9af895a3-cca7-495f-ab5a-68e04355f005 in datapath b5fb4ee9-fa45-4797-871a-53247ebaf43e unbound from our chassis#033[00m
Jan 20 10:43:05 np0005588919 systemd[1]: machine-qemu\x2d114\x2dinstance\x2d000000db.scope: Deactivated successfully.
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.568 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5fb4ee9-fa45-4797-871a-53247ebaf43e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:43:05 np0005588919 systemd[1]: machine-qemu\x2d114\x2dinstance\x2d000000db.scope: Consumed 14.573s CPU time.
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.570 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1f888135-93ff-4f2b-9988-cb3084c2d1b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.571 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e namespace which is not needed anymore#033[00m
Jan 20 10:43:05 np0005588919 systemd-machined[194361]: Machine qemu-114-instance-000000db terminated.
Jan 20 10:43:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:05.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.687 225859 INFO nova.virt.libvirt.driver [-] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Instance destroyed successfully.#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.688 225859 DEBUG nova.objects.instance [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'resources' on Instance uuid 6c6a79bf-04d3-4839-84cc-ab8b383d602c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.705 225859 DEBUG nova.virt.libvirt.vif [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:42:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-0-248124310',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-0-248124310',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-gen',id=219,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNPrC26E5zjpds8PmYXeLNQKBwLdgsc+VcubrdKnriEXDiMjUXGvx1Qk1D9X7eLck7XYpiSHt4U9t1SsZB3lsAeahV1YqeLst2/p8UQkxJjHaCXNOlF5uwsraAqiSop7uA==',key_name='tempest-TestSecurityGroupsBasicOps-681797586',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:42:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-s0krodc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:42:45Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=6c6a79bf-04d3-4839-84cc-ab8b383d602c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9af895a3-cca7-495f-ab5a-68e04355f005", "address": "fa:16:3e:48:2c:f1", "network": {"id": "b5fb4ee9-fa45-4797-871a-53247ebaf43e", "bridge": "br-int", "label": "tempest-network-smoke--639703376", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9af895a3-cc", "ovs_interfaceid": "9af895a3-cca7-495f-ab5a-68e04355f005", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.706 225859 DEBUG nova.network.os_vif_util [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "9af895a3-cca7-495f-ab5a-68e04355f005", "address": "fa:16:3e:48:2c:f1", "network": {"id": "b5fb4ee9-fa45-4797-871a-53247ebaf43e", "bridge": "br-int", "label": "tempest-network-smoke--639703376", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9af895a3-cc", "ovs_interfaceid": "9af895a3-cca7-495f-ab5a-68e04355f005", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.707 225859 DEBUG nova.network.os_vif_util [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:2c:f1,bridge_name='br-int',has_traffic_filtering=True,id=9af895a3-cca7-495f-ab5a-68e04355f005,network=Network(b5fb4ee9-fa45-4797-871a-53247ebaf43e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9af895a3-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.708 225859 DEBUG os_vif [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:2c:f1,bridge_name='br-int',has_traffic_filtering=True,id=9af895a3-cca7-495f-ab5a-68e04355f005,network=Network(b5fb4ee9-fa45-4797-871a-53247ebaf43e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9af895a3-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.710 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.710 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9af895a3-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.713 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.716 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.718 225859 INFO os_vif [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:2c:f1,bridge_name='br-int',has_traffic_filtering=True,id=9af895a3-cca7-495f-ab5a-68e04355f005,network=Network(b5fb4ee9-fa45-4797-871a-53247ebaf43e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9af895a3-cc')#033[00m
Jan 20 10:43:05 np0005588919 neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e[328687]: [NOTICE]   (328691) : haproxy version is 2.8.14-c23fe91
Jan 20 10:43:05 np0005588919 neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e[328687]: [NOTICE]   (328691) : path to executable is /usr/sbin/haproxy
Jan 20 10:43:05 np0005588919 neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e[328687]: [WARNING]  (328691) : Exiting Master process...
Jan 20 10:43:05 np0005588919 neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e[328687]: [ALERT]    (328691) : Current worker (328693) exited with code 143 (Terminated)
Jan 20 10:43:05 np0005588919 neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e[328687]: [WARNING]  (328691) : All workers exited. Exiting... (0)
Jan 20 10:43:05 np0005588919 systemd[1]: libpod-03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75.scope: Deactivated successfully.
Jan 20 10:43:05 np0005588919 podman[329419]: 2026-01-20 15:43:05.72977551 +0000 UTC m=+0.052532966 container died 03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:43:05 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75-userdata-shm.mount: Deactivated successfully.
Jan 20 10:43:05 np0005588919 systemd[1]: var-lib-containers-storage-overlay-4b5e8f4f94618ce44091fb433750cac7f4bda279532d391abb9a2cd7dfef8589-merged.mount: Deactivated successfully.
Jan 20 10:43:05 np0005588919 podman[329419]: 2026-01-20 15:43:05.76801655 +0000 UTC m=+0.090773996 container cleanup 03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:43:05 np0005588919 systemd[1]: libpod-conmon-03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75.scope: Deactivated successfully.
Jan 20 10:43:05 np0005588919 podman[329474]: 2026-01-20 15:43:05.821635726 +0000 UTC m=+0.035097353 container remove 03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.827 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[d5604fbd-240e-4516-b55c-e977d1c758d2]: (4, ('Tue Jan 20 03:43:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e (03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75)\n03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75\nTue Jan 20 03:43:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e (03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75)\n03cd23c47de70dcff18cd70c75122fd429c6ff5e41750880fef510538ae5ef75\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.828 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ba94d232-79a2-4ae4-a26a-f004fde9e135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.829 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5fb4ee9-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:43:05 np0005588919 kernel: tapb5fb4ee9-f0: left promiscuous mode
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.833 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:05 np0005588919 nova_compute[225855]: 2026-01-20 15:43:05.843 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.847 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a7d8f79c-2b06-4550-aca5-0d81c4d1f4fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.864 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[9a159ce1-3b23-4a2e-b9a1-be6eb7bfc0fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.865 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[38098c07-1784-4e95-946b-185fc0ef5d12]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.882 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[795445f3-9453-4cd2-abb6-ff297bce20e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 880331, 'reachable_time': 44384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 329490, 'error': None, 'target': 'ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.885 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5fb4ee9-fa45-4797-871a-53247ebaf43e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:43:05 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:05.886 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[16dc837e-0af1-4db2-8c3b-fe64673e7dc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:43:05 np0005588919 systemd[1]: run-netns-ovnmeta\x2db5fb4ee9\x2dfa45\x2d4797\x2d871a\x2d53247ebaf43e.mount: Deactivated successfully.
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.078 225859 INFO nova.virt.libvirt.driver [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Deleting instance files /var/lib/nova/instances/6c6a79bf-04d3-4839-84cc-ab8b383d602c_del#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.079 225859 INFO nova.virt.libvirt.driver [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Deletion of /var/lib/nova/instances/6c6a79bf-04d3-4839-84cc-ab8b383d602c_del complete#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.352 225859 INFO nova.compute.manager [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Took 0.91 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.353 225859 DEBUG oslo.service.loopingcall [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.354 225859 DEBUG nova.compute.manager [-] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.354 225859 DEBUG nova.network.neutron [-] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.442 225859 DEBUG nova.compute.manager [req-83a5d9a7-fcc0-45de-bee9-e3eac7b07931 req-4a5775bf-736b-4def-8e67-7e9d4f4d2886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Received event network-vif-unplugged-9af895a3-cca7-495f-ab5a-68e04355f005 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.442 225859 DEBUG oslo_concurrency.lockutils [req-83a5d9a7-fcc0-45de-bee9-e3eac7b07931 req-4a5775bf-736b-4def-8e67-7e9d4f4d2886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.443 225859 DEBUG oslo_concurrency.lockutils [req-83a5d9a7-fcc0-45de-bee9-e3eac7b07931 req-4a5775bf-736b-4def-8e67-7e9d4f4d2886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.443 225859 DEBUG oslo_concurrency.lockutils [req-83a5d9a7-fcc0-45de-bee9-e3eac7b07931 req-4a5775bf-736b-4def-8e67-7e9d4f4d2886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.443 225859 DEBUG nova.compute.manager [req-83a5d9a7-fcc0-45de-bee9-e3eac7b07931 req-4a5775bf-736b-4def-8e67-7e9d4f4d2886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] No waiting events found dispatching network-vif-unplugged-9af895a3-cca7-495f-ab5a-68e04355f005 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:43:06 np0005588919 nova_compute[225855]: 2026-01-20 15:43:06.443 225859 DEBUG nova.compute.manager [req-83a5d9a7-fcc0-45de-bee9-e3eac7b07931 req-4a5775bf-736b-4def-8e67-7e9d4f4d2886 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Received event network-vif-unplugged-9af895a3-cca7-495f-ab5a-68e04355f005 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:43:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:07.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:07 np0005588919 nova_compute[225855]: 2026-01-20 15:43:07.448 225859 DEBUG nova.network.neutron [-] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:43:07 np0005588919 nova_compute[225855]: 2026-01-20 15:43:07.463 225859 INFO nova.compute.manager [-] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Took 1.11 seconds to deallocate network for instance.#033[00m
Jan 20 10:43:07 np0005588919 nova_compute[225855]: 2026-01-20 15:43:07.515 225859 DEBUG oslo_concurrency.lockutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:07 np0005588919 nova_compute[225855]: 2026-01-20 15:43:07.516 225859 DEBUG oslo_concurrency.lockutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:07 np0005588919 nova_compute[225855]: 2026-01-20 15:43:07.581 225859 DEBUG oslo_concurrency.processutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:43:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:07.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:08 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:43:08 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4178163900' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.068 225859 DEBUG oslo_concurrency.processutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.075 225859 DEBUG nova.compute.provider_tree [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.137 225859 DEBUG nova.scheduler.client.report [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.160 225859 DEBUG oslo_concurrency.lockutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.189 225859 INFO nova.scheduler.client.report [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Deleted allocations for instance 6c6a79bf-04d3-4839-84cc-ab8b383d602c#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.257 225859 DEBUG oslo_concurrency.lockutils [None req-011eca6b-40b4-4a38-bba5-ce034d3c98eb 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.524 225859 DEBUG nova.compute.manager [req-7b4f883c-3014-4002-a0b2-08c791dafe63 req-c6584fb1-9b0b-404f-90de-19f41af7fc94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Received event network-vif-plugged-9af895a3-cca7-495f-ab5a-68e04355f005 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.525 225859 DEBUG oslo_concurrency.lockutils [req-7b4f883c-3014-4002-a0b2-08c791dafe63 req-c6584fb1-9b0b-404f-90de-19f41af7fc94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.525 225859 DEBUG oslo_concurrency.lockutils [req-7b4f883c-3014-4002-a0b2-08c791dafe63 req-c6584fb1-9b0b-404f-90de-19f41af7fc94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.525 225859 DEBUG oslo_concurrency.lockutils [req-7b4f883c-3014-4002-a0b2-08c791dafe63 req-c6584fb1-9b0b-404f-90de-19f41af7fc94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "6c6a79bf-04d3-4839-84cc-ab8b383d602c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.526 225859 DEBUG nova.compute.manager [req-7b4f883c-3014-4002-a0b2-08c791dafe63 req-c6584fb1-9b0b-404f-90de-19f41af7fc94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] No waiting events found dispatching network-vif-plugged-9af895a3-cca7-495f-ab5a-68e04355f005 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.526 225859 WARNING nova.compute.manager [req-7b4f883c-3014-4002-a0b2-08c791dafe63 req-c6584fb1-9b0b-404f-90de-19f41af7fc94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Received unexpected event network-vif-plugged-9af895a3-cca7-495f-ab5a-68e04355f005 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:43:08 np0005588919 nova_compute[225855]: 2026-01-20 15:43:08.527 225859 DEBUG nova.compute.manager [req-7b4f883c-3014-4002-a0b2-08c791dafe63 req-c6584fb1-9b0b-404f-90de-19f41af7fc94 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Received event network-vif-deleted-9af895a3-cca7-495f-ab5a-68e04355f005 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:43:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:09.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:09 np0005588919 nova_compute[225855]: 2026-01-20 15:43:09.583 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:09.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:10 np0005588919 nova_compute[225855]: 2026-01-20 15:43:10.713 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:11.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:11.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:12 np0005588919 podman[329517]: 2026-01-20 15:43:12.003680759 +0000 UTC m=+0.052261518 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:43:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:13.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:43:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/454088694' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:43:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:43:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/454088694' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:43:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:13.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:14 np0005588919 nova_compute[225855]: 2026-01-20 15:43:14.583 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:15.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:15.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:15 np0005588919 nova_compute[225855]: 2026-01-20 15:43:15.716 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:16.465 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:16.465 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:16.465 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:17.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:17.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:19.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:19 np0005588919 nova_compute[225855]: 2026-01-20 15:43:19.585 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:19 np0005588919 nova_compute[225855]: 2026-01-20 15:43:19.694 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:19.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:19 np0005588919 nova_compute[225855]: 2026-01-20 15:43:19.768 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:20 np0005588919 nova_compute[225855]: 2026-01-20 15:43:20.684 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923785.6833277, 6c6a79bf-04d3-4839-84cc-ab8b383d602c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:43:20 np0005588919 nova_compute[225855]: 2026-01-20 15:43:20.684 225859 INFO nova.compute.manager [-] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:43:20 np0005588919 nova_compute[225855]: 2026-01-20 15:43:20.718 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:20 np0005588919 nova_compute[225855]: 2026-01-20 15:43:20.983 225859 DEBUG nova.compute.manager [None req-c5a9e3e5-fc1b-467a-b510-071df4d8b152 - - - - - -] [instance: 6c6a79bf-04d3-4839-84cc-ab8b383d602c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:43:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:21.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:21.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:23.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:23.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:24 np0005588919 nova_compute[225855]: 2026-01-20 15:43:24.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:24 np0005588919 nova_compute[225855]: 2026-01-20 15:43:24.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:43:24 np0005588919 nova_compute[225855]: 2026-01-20 15:43:24.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:43:24 np0005588919 nova_compute[225855]: 2026-01-20 15:43:24.407 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:43:24 np0005588919 nova_compute[225855]: 2026-01-20 15:43:24.588 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:25.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:25.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:25 np0005588919 nova_compute[225855]: 2026-01-20 15:43:25.721 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:26 np0005588919 nova_compute[225855]: 2026-01-20 15:43:26.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:26 np0005588919 nova_compute[225855]: 2026-01-20 15:43:26.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:43:27 np0005588919 ceph-mgr[82135]: client.0 ms_handle_reset on v2:192.168.122.100:6800/2542147622
Jan 20 10:43:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:27.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:27.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.370 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.371 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.371 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.372 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.372 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:43:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:43:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3199149310' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.831 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.990 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.991 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4230MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.991 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:43:28 np0005588919 nova_compute[225855]: 2026-01-20 15:43:28.991 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:43:29 np0005588919 nova_compute[225855]: 2026-01-20 15:43:29.150 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:43:29 np0005588919 nova_compute[225855]: 2026-01-20 15:43:29.151 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:43:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:29 np0005588919 nova_compute[225855]: 2026-01-20 15:43:29.291 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:43:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:29.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:29 np0005588919 nova_compute[225855]: 2026-01-20 15:43:29.589 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:43:29 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/204372452' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:43:29 np0005588919 nova_compute[225855]: 2026-01-20 15:43:29.706 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:43:29 np0005588919 nova_compute[225855]: 2026-01-20 15:43:29.711 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:43:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:29.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:29 np0005588919 nova_compute[225855]: 2026-01-20 15:43:29.727 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:43:29 np0005588919 nova_compute[225855]: 2026-01-20 15:43:29.753 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:43:29 np0005588919 nova_compute[225855]: 2026-01-20 15:43:29.753 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.762s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:43:30 np0005588919 nova_compute[225855]: 2026-01-20 15:43:30.722 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:30 np0005588919 nova_compute[225855]: 2026-01-20 15:43:30.754 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:30 np0005588919 nova_compute[225855]: 2026-01-20 15:43:30.754 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:31.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:31.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:33.221 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=88, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=87) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:43:33 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:33.222 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:43:33 np0005588919 nova_compute[225855]: 2026-01-20 15:43:33.223 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:33 np0005588919 nova_compute[225855]: 2026-01-20 15:43:33.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:33 np0005588919 nova_compute[225855]: 2026-01-20 15:43:33.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:33.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:33.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:34 np0005588919 nova_compute[225855]: 2026-01-20 15:43:34.592 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:35 np0005588919 podman[329694]: 2026-01-20 15:43:35.033258866 +0000 UTC m=+0.079032205 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Jan 20 10:43:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:35.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:35 np0005588919 nova_compute[225855]: 2026-01-20 15:43:35.725 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:35.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:36 np0005588919 nova_compute[225855]: 2026-01-20 15:43:36.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:37.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:37.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:39.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:39 np0005588919 nova_compute[225855]: 2026-01-20 15:43:39.593 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:39.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:40 np0005588919 nova_compute[225855]: 2026-01-20 15:43:40.727 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:43:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:41.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:43:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:41.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:43:42.224 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '88'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:43:43 np0005588919 podman[329727]: 2026-01-20 15:43:43.003900977 +0000 UTC m=+0.043115559 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:43:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:43.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:43.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:44 np0005588919 nova_compute[225855]: 2026-01-20 15:43:44.595 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:45.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:45.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:45 np0005588919 nova_compute[225855]: 2026-01-20 15:43:45.771 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:47.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:47.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:49.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:49 np0005588919 nova_compute[225855]: 2026-01-20 15:43:49.642 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:43:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:49.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:43:50 np0005588919 nova_compute[225855]: 2026-01-20 15:43:50.813 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:51.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:51.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:53.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:43:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:53.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:43:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:54 np0005588919 nova_compute[225855]: 2026-01-20 15:43:54.644 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:55 np0005588919 nova_compute[225855]: 2026-01-20 15:43:55.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:43:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:55.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:55.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:55 np0005588919 nova_compute[225855]: 2026-01-20 15:43:55.860 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:57.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:57.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:43:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:43:59.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:43:59 np0005588919 nova_compute[225855]: 2026-01-20 15:43:59.691 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:43:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:43:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:43:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:43:59.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:00 np0005588919 nova_compute[225855]: 2026-01-20 15:44:00.899 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:01 np0005588919 ovn_controller[130490]: 2026-01-20T15:44:01Z|00999|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Jan 20 10:44:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.004000113s ======
Jan 20 10:44:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:01.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000113s
Jan 20 10:44:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:01.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:03.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:03.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:04 np0005588919 nova_compute[225855]: 2026-01-20 15:44:04.730 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:05.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:44:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:44:05 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:44:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:05.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:05 np0005588919 nova_compute[225855]: 2026-01-20 15:44:05.943 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:06 np0005588919 podman[329941]: 2026-01-20 15:44:06.055941768 +0000 UTC m=+0.088184572 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 20 10:44:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:07.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:07.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:09.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:09 np0005588919 nova_compute[225855]: 2026-01-20 15:44:09.733 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:09.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:10 np0005588919 nova_compute[225855]: 2026-01-20 15:44:10.946 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:11.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:11.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:44:11 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:44:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:13.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:44:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/889081625' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:44:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:44:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/889081625' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:44:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:13.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:13 np0005588919 podman[330046]: 2026-01-20 15:44:13.850882754 +0000 UTC m=+0.046476774 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:44:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:14 np0005588919 nova_compute[225855]: 2026-01-20 15:44:14.735 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:15.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:15.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:15 np0005588919 nova_compute[225855]: 2026-01-20 15:44:15.986 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:44:16.466 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:44:16.466 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:44:16.466 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:44:17.177 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=89, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=88) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:44:17 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:44:17.178 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:44:17 np0005588919 nova_compute[225855]: 2026-01-20 15:44:17.187 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:17.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:17.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:19.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:19 np0005588919 nova_compute[225855]: 2026-01-20 15:44:19.736 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:19.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:20 np0005588919 nova_compute[225855]: 2026-01-20 15:44:20.988 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:44:21.180 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '89'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:44:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:21.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:21.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:23.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:23.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:24 np0005588919 nova_compute[225855]: 2026-01-20 15:44:24.738 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:25 np0005588919 nova_compute[225855]: 2026-01-20 15:44:25.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:25 np0005588919 nova_compute[225855]: 2026-01-20 15:44:25.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:44:25 np0005588919 nova_compute[225855]: 2026-01-20 15:44:25.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:44:25 np0005588919 nova_compute[225855]: 2026-01-20 15:44:25.358 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:44:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:25.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:25.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:25 np0005588919 nova_compute[225855]: 2026-01-20 15:44:25.990 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:27.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:27.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:28 np0005588919 nova_compute[225855]: 2026-01-20 15:44:28.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:28 np0005588919 nova_compute[225855]: 2026-01-20 15:44:28.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:44:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:29 np0005588919 nova_compute[225855]: 2026-01-20 15:44:29.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:29.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:29 np0005588919 nova_compute[225855]: 2026-01-20 15:44:29.740 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:29.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.364 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.365 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.365 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.365 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.366 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:44:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2706026692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.804 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.953 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.954 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4253MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.954 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.955 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:44:30 np0005588919 nova_compute[225855]: 2026-01-20 15:44:30.991 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.025 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.026 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.043 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.067 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.067 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.097 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.128 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.176 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:44:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:31.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:31 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:44:31 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1480609939' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.616 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.621 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.636 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.638 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:44:31 np0005588919 nova_compute[225855]: 2026-01-20 15:44:31.638 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:44:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:31.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:32 np0005588919 nova_compute[225855]: 2026-01-20 15:44:32.639 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:32 np0005588919 nova_compute[225855]: 2026-01-20 15:44:32.639 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:33 np0005588919 nova_compute[225855]: 2026-01-20 15:44:33.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:33 np0005588919 nova_compute[225855]: 2026-01-20 15:44:33.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:33.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:33.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:34 np0005588919 nova_compute[225855]: 2026-01-20 15:44:34.782 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:35.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:35.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:35 np0005588919 nova_compute[225855]: 2026-01-20 15:44:35.993 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:36 np0005588919 podman[330197]: 2026-01-20 15:44:36.297634409 +0000 UTC m=+0.086106824 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:44:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:37.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:37.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:38 np0005588919 nova_compute[225855]: 2026-01-20 15:44:38.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:39.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:39 np0005588919 nova_compute[225855]: 2026-01-20 15:44:39.784 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:39.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:40 np0005588919 nova_compute[225855]: 2026-01-20 15:44:40.995 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:41.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:41.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:43.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:44:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:43.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:44:44 np0005588919 podman[330228]: 2026-01-20 15:44:44.068205876 +0000 UTC m=+0.097430505 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:44:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:44 np0005588919 nova_compute[225855]: 2026-01-20 15:44:44.787 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:45.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:45.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:46 np0005588919 nova_compute[225855]: 2026-01-20 15:44:46.034 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:46 np0005588919 nova_compute[225855]: 2026-01-20 15:44:46.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:44:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:47.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:47.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:49.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:49 np0005588919 nova_compute[225855]: 2026-01-20 15:44:49.789 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:49.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:51 np0005588919 nova_compute[225855]: 2026-01-20 15:44:51.036 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:51.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:51.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:53.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:53.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:54 np0005588919 nova_compute[225855]: 2026-01-20 15:44:54.789 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:55.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:55.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:56 np0005588919 nova_compute[225855]: 2026-01-20 15:44:56.070 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:44:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:57.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:44:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:57.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:44:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:44:59.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:44:59 np0005588919 nova_compute[225855]: 2026-01-20 15:44:59.790 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:44:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:44:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:44:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:44:59.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:00 np0005588919 nova_compute[225855]: 2026-01-20 15:45:00.356 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:00 np0005588919 nova_compute[225855]: 2026-01-20 15:45:00.356 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:45:01 np0005588919 nova_compute[225855]: 2026-01-20 15:45:01.072 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:01.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:01.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:03.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:03.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:04 np0005588919 nova_compute[225855]: 2026-01-20 15:45:04.793 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:05.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:05.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:06 np0005588919 nova_compute[225855]: 2026-01-20 15:45:06.101 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:07 np0005588919 podman[330308]: 2026-01-20 15:45:07.070333368 +0000 UTC m=+0.107331544 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 20 10:45:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:07.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:07.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:09.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:09 np0005588919 nova_compute[225855]: 2026-01-20 15:45:09.795 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:09.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:11 np0005588919 nova_compute[225855]: 2026-01-20 15:45:11.139 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:11.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:11.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:12 np0005588919 nova_compute[225855]: 2026-01-20 15:45:12.361 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:12 np0005588919 nova_compute[225855]: 2026-01-20 15:45:12.362 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:45:12 np0005588919 nova_compute[225855]: 2026-01-20 15:45:12.378 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:45:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:45:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:45:12 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.047 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "464c0661-0ddb-4794-8959-db066827326c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.048 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.067 225859 DEBUG nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.159 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.160 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.167 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.167 225859 INFO nova.compute.claims [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.282 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:13.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:45:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4231837537' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.718 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.728 225859 DEBUG nova.compute.provider_tree [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.747 225859 DEBUG nova.scheduler.client.report [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.771 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.772 225859 DEBUG nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.877 225859 DEBUG nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.878 225859 DEBUG nova.network.neutron [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 20 10:45:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:13.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.925 225859 INFO nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 20 10:45:13 np0005588919 nova_compute[225855]: 2026-01-20 15:45:13.977 225859 DEBUG nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 20 10:45:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.216 225859 DEBUG nova.policy [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5985ef736503499a9f1d734cabc33ce5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.300 225859 DEBUG nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.301 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.302 225859 INFO nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Creating image(s)#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.330 225859 DEBUG nova.storage.rbd_utils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 464c0661-0ddb-4794-8959-db066827326c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.364 225859 DEBUG nova.storage.rbd_utils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 464c0661-0ddb-4794-8959-db066827326c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.399 225859 DEBUG nova.storage.rbd_utils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 464c0661-0ddb-4794-8959-db066827326c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.403 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:14 np0005588919 podman[330515]: 2026-01-20 15:45:14.423183709 +0000 UTC m=+0.081457173 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.465 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.466 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "82d5c1918fd7c974214c7a48c1793a7a82560462" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.467 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.467 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "82d5c1918fd7c974214c7a48c1793a7a82560462" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.491 225859 DEBUG nova.storage.rbd_utils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 464c0661-0ddb-4794-8959-db066827326c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.495 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 464c0661-0ddb-4794-8959-db066827326c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.754 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/82d5c1918fd7c974214c7a48c1793a7a82560462 464c0661-0ddb-4794-8959-db066827326c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.258s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.823 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.831 225859 DEBUG nova.storage.rbd_utils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] resizing rbd image 464c0661-0ddb-4794-8959-db066827326c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.940 225859 DEBUG nova.objects.instance [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'migration_context' on Instance uuid 464c0661-0ddb-4794-8959-db066827326c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.955 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.956 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Ensure instance console log exists: /var/lib/nova/instances/464c0661-0ddb-4794-8959-db066827326c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.956 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.956 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:14 np0005588919 nova_compute[225855]: 2026-01-20 15:45:14.957 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:15 np0005588919 nova_compute[225855]: 2026-01-20 15:45:15.586 225859 DEBUG nova.network.neutron [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Successfully created port: 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 20 10:45:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:15.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:15.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:16 np0005588919 nova_compute[225855]: 2026-01-20 15:45:16.182 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:16.467 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:16.468 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:16.469 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:16 np0005588919 nova_compute[225855]: 2026-01-20 15:45:16.543 225859 DEBUG nova.network.neutron [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Successfully updated port: 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 20 10:45:16 np0005588919 nova_compute[225855]: 2026-01-20 15:45:16.558 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:45:16 np0005588919 nova_compute[225855]: 2026-01-20 15:45:16.559 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquired lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:45:16 np0005588919 nova_compute[225855]: 2026-01-20 15:45:16.559 225859 DEBUG nova.network.neutron [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 20 10:45:16 np0005588919 nova_compute[225855]: 2026-01-20 15:45:16.639 225859 DEBUG nova.compute.manager [req-41268f81-84af-4d68-b440-89abcbfe12ee req-d5b71cc0-eeae-4bf0-89d6-f004f7f62779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received event network-changed-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:16 np0005588919 nova_compute[225855]: 2026-01-20 15:45:16.639 225859 DEBUG nova.compute.manager [req-41268f81-84af-4d68-b440-89abcbfe12ee req-d5b71cc0-eeae-4bf0-89d6-f004f7f62779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Refreshing instance network info cache due to event network-changed-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:45:16 np0005588919 nova_compute[225855]: 2026-01-20 15:45:16.640 225859 DEBUG oslo_concurrency.lockutils [req-41268f81-84af-4d68-b440-89abcbfe12ee req-d5b71cc0-eeae-4bf0-89d6-f004f7f62779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:45:16 np0005588919 nova_compute[225855]: 2026-01-20 15:45:16.717 225859 DEBUG nova.network.neutron [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.553 225859 DEBUG nova.network.neutron [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Updating instance_info_cache with network_info: [{"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.573 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Releasing lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.573 225859 DEBUG nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Instance network_info: |[{"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.574 225859 DEBUG oslo_concurrency.lockutils [req-41268f81-84af-4d68-b440-89abcbfe12ee req-d5b71cc0-eeae-4bf0-89d6-f004f7f62779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.574 225859 DEBUG nova.network.neutron [req-41268f81-84af-4d68-b440-89abcbfe12ee req-d5b71cc0-eeae-4bf0-89d6-f004f7f62779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Refreshing network info cache for port 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.576 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Start _get_guest_xml network_info=[{"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'a32b3e07-16d8-46fd-9a7b-c242c432fcf9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.583 225859 WARNING nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.591 225859 DEBUG nova.virt.libvirt.host [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 20 10:45:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:17.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.593 225859 DEBUG nova.virt.libvirt.host [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.597 225859 DEBUG nova.virt.libvirt.host [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.598 225859 DEBUG nova.virt.libvirt.host [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.599 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.600 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-20T14:21:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='522deaab-a741-4dbb-932d-d8b13a211c33',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-20T14:21:57Z,direct_url=<?>,disk_format='qcow2',id=a32b3e07-16d8-46fd-9a7b-c242c432fcf9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4e7b863e1a5b4a8bb85e8466fecb8db2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-20T14:22:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.601 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.601 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.602 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.602 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.602 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.603 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.603 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.604 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.604 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.604 225859 DEBUG nova.virt.hardware [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 20 10:45:17 np0005588919 nova_compute[225855]: 2026-01-20 15:45:17.609 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:17.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1748637888' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.101 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.129 225859 DEBUG nova.storage.rbd_utils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 464c0661-0ddb-4794-8959-db066827326c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.134 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/739713049' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.561 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.563 225859 DEBUG nova.virt.libvirt.vif [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:45:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-1-623022678',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-1-623022678',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-gen',id=222,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJW0RDNBQ7KP8Mqzxdg2i8X8upMhqABnnonEiTjmMv4W9RTdXxd1b3Z8QL9swZ0e0+6po4+8oM5PFrC0tn+WmJ7twYzqOI2QMeaFZC9+Q35AVwNQsKxl3WWPGvw1iSa1jA==',key_name='tempest-TestSecurityGroupsBasicOps-1661471182',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-5t8zkczs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:45:14Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=464c0661-0ddb-4794-8959-db066827326c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.564 225859 DEBUG nova.network.os_vif_util [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.565 225859 DEBUG nova.network.os_vif_util [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:b1:99,bridge_name='br-int',has_traffic_filtering=True,id=2d94aa0d-ed38-41aa-9f34-5ed2a83a7304,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d94aa0d-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.568 225859 DEBUG nova.objects.instance [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'pci_devices' on Instance uuid 464c0661-0ddb-4794-8959-db066827326c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.593 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] End _get_guest_xml xml=<domain type="kvm">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <uuid>464c0661-0ddb-4794-8959-db066827326c</uuid>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <name>instance-000000de</name>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <memory>131072</memory>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <vcpu>1</vcpu>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <metadata>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-1-623022678</nova:name>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <nova:creationTime>2026-01-20 15:45:17</nova:creationTime>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <nova:flavor name="m1.nano">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <nova:memory>128</nova:memory>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <nova:disk>1</nova:disk>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <nova:swap>0</nova:swap>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <nova:ephemeral>0</nova:ephemeral>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <nova:vcpus>1</nova:vcpus>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      </nova:flavor>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <nova:owner>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <nova:user uuid="5985ef736503499a9f1d734cabc33ce5">tempest-TestSecurityGroupsBasicOps-342561427-project-member</nova:user>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <nova:project uuid="728662ec7f654a3fb2e53a90b8707d7e">tempest-TestSecurityGroupsBasicOps-342561427</nova:project>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      </nova:owner>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <nova:root type="image" uuid="a32b3e07-16d8-46fd-9a7b-c242c432fcf9"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <nova:ports>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <nova:port uuid="2d94aa0d-ed38-41aa-9f34-5ed2a83a7304">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        </nova:port>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      </nova:ports>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    </nova:instance>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  </metadata>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <sysinfo type="smbios">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <system>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <entry name="manufacturer">RDO</entry>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <entry name="product">OpenStack Compute</entry>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <entry name="serial">464c0661-0ddb-4794-8959-db066827326c</entry>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <entry name="uuid">464c0661-0ddb-4794-8959-db066827326c</entry>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <entry name="family">Virtual Machine</entry>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    </system>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  </sysinfo>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <os>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <boot dev="hd"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <smbios mode="sysinfo"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  </os>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <features>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <acpi/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <apic/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <vmcoreinfo/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  </features>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <clock offset="utc">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <timer name="pit" tickpolicy="delay"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <timer name="hpet" present="no"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  </clock>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <cpu mode="custom" match="exact">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <model>Nehalem</model>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <topology sockets="1" cores="1" threads="1"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  </cpu>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  <devices>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <disk type="network" device="disk">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/464c0661-0ddb-4794-8959-db066827326c_disk">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <target dev="vda" bus="virtio"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <disk type="network" device="cdrom">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <driver type="raw" cache="none"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <source protocol="rbd" name="vms/464c0661-0ddb-4794-8959-db066827326c_disk.config">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.100" port="6789"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.102" port="6789"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <host name="192.168.122.101" port="6789"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      </source>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <auth username="openstack">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:        <secret type="ceph" uuid="e399cf45-e6b6-5393-99f1-75c601d3f188"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      </auth>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <target dev="sda" bus="sata"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    </disk>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <interface type="ethernet">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <mac address="fa:16:3e:87:b1:99"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <driver name="vhost" rx_queue_size="512"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <mtu size="1442"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <target dev="tap2d94aa0d-ed"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    </interface>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <serial type="pty">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <log file="/var/lib/nova/instances/464c0661-0ddb-4794-8959-db066827326c/console.log" append="off"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    </serial>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <video>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <model type="virtio"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    </video>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <input type="tablet" bus="usb"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <rng model="virtio">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <backend model="random">/dev/urandom</backend>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    </rng>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="pci" model="pcie-root-port"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <controller type="usb" index="0"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    <memballoon model="virtio">
Jan 20 10:45:18 np0005588919 nova_compute[225855]:      <stats period="10"/>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:    </memballoon>
Jan 20 10:45:18 np0005588919 nova_compute[225855]:  </devices>
Jan 20 10:45:18 np0005588919 nova_compute[225855]: </domain>
Jan 20 10:45:18 np0005588919 nova_compute[225855]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.595 225859 DEBUG nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Preparing to wait for external event network-vif-plugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.595 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "464c0661-0ddb-4794-8959-db066827326c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.595 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.596 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.596 225859 DEBUG nova.virt.libvirt.vif [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-20T15:45:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-1-623022678',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-1-623022678',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-gen',id=222,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJW0RDNBQ7KP8Mqzxdg2i8X8upMhqABnnonEiTjmMv4W9RTdXxd1b3Z8QL9swZ0e0+6po4+8oM5PFrC0tn+WmJ7twYzqOI2QMeaFZC9+Q35AVwNQsKxl3WWPGvw1iSa1jA==',key_name='tempest-TestSecurityGroupsBasicOps-1661471182',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-5t8zkczs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-20T15:45:14Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=464c0661-0ddb-4794-8959-db066827326c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.597 225859 DEBUG nova.network.os_vif_util [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.598 225859 DEBUG nova.network.os_vif_util [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:b1:99,bridge_name='br-int',has_traffic_filtering=True,id=2d94aa0d-ed38-41aa-9f34-5ed2a83a7304,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d94aa0d-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.598 225859 DEBUG os_vif [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:b1:99,bridge_name='br-int',has_traffic_filtering=True,id=2d94aa0d-ed38-41aa-9f34-5ed2a83a7304,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d94aa0d-ed') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.598 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.599 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.599 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.603 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.603 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2d94aa0d-ed, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.603 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2d94aa0d-ed, col_values=(('external_ids', {'iface-id': '2d94aa0d-ed38-41aa-9f34-5ed2a83a7304', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:b1:99', 'vm-uuid': '464c0661-0ddb-4794-8959-db066827326c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.605 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:18 np0005588919 NetworkManager[49104]: <info>  [1768923918.6060] manager: (tap2d94aa0d-ed): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/432)
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.607 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.614 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.615 225859 INFO os_vif [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:b1:99,bridge_name='br-int',has_traffic_filtering=True,id=2d94aa0d-ed38-41aa-9f34-5ed2a83a7304,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d94aa0d-ed')#033[00m
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #181. Immutable memtables: 0.
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.676396) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 115] Flushing memtable with next log file: 181
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918676519, "job": 115, "event": "flush_started", "num_memtables": 1, "num_entries": 2362, "num_deletes": 251, "total_data_size": 5877047, "memory_usage": 5943728, "flush_reason": "Manual Compaction"}
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 115] Level-0 flush table #182: started
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918705311, "cf_name": "default", "job": 115, "event": "table_file_creation", "file_number": 182, "file_size": 3835390, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 87149, "largest_seqno": 89505, "table_properties": {"data_size": 3825792, "index_size": 6091, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19644, "raw_average_key_size": 20, "raw_value_size": 3806783, "raw_average_value_size": 3953, "num_data_blocks": 265, "num_entries": 963, "num_filter_entries": 963, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923702, "oldest_key_time": 1768923702, "file_creation_time": 1768923918, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 182, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 115] Flush lasted 28974 microseconds, and 8272 cpu microseconds.
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.705377) [db/flush_job.cc:967] [default] [JOB 115] Level-0 flush table #182: 3835390 bytes OK
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.705406) [db/memtable_list.cc:519] [default] Level-0 commit table #182 started
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.707404) [db/memtable_list.cc:722] [default] Level-0 commit table #182: memtable #1 done
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.707420) EVENT_LOG_v1 {"time_micros": 1768923918707415, "job": 115, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.707443) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 115] Try to delete WAL files size 5866682, prev total WAL file size 5866682, number of live WAL files 2.
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000178.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.709081) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037373831' seq:72057594037927935, type:22 .. '7061786F730038303333' seq:0, type:0; will stop at (end)
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 116] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 115 Base level 0, inputs: [182(3745KB)], [180(12MB)]
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918709122, "job": 116, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [182], "files_L6": [180], "score": -1, "input_data_size": 16460285, "oldest_snapshot_seqno": -1}
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.746 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.746 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.747 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] No VIF found with MAC fa:16:3e:87:b1:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.747 225859 INFO nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Using config drive#033[00m
Jan 20 10:45:18 np0005588919 nova_compute[225855]: 2026-01-20 15:45:18.774 225859 DEBUG nova.storage.rbd_utils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 464c0661-0ddb-4794-8959-db066827326c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 116] Generated table #183: 11034 keys, 14482290 bytes, temperature: kUnknown
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918829995, "cf_name": "default", "job": 116, "event": "table_file_creation", "file_number": 183, "file_size": 14482290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14410941, "index_size": 42628, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27653, "raw_key_size": 290303, "raw_average_key_size": 26, "raw_value_size": 14217898, "raw_average_value_size": 1288, "num_data_blocks": 1625, "num_entries": 11034, "num_filter_entries": 11034, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768923918, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 183, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.830242) [db/compaction/compaction_job.cc:1663] [default] [JOB 116] Compacted 1@0 + 1@6 files to L6 => 14482290 bytes
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.831416) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.1 rd, 119.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 12.0 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(8.1) write-amplify(3.8) OK, records in: 11553, records dropped: 519 output_compression: NoCompression
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.831432) EVENT_LOG_v1 {"time_micros": 1768923918831424, "job": 116, "event": "compaction_finished", "compaction_time_micros": 120961, "compaction_time_cpu_micros": 37750, "output_level": 6, "num_output_files": 1, "total_output_size": 14482290, "num_input_records": 11553, "num_output_records": 11034, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000182.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918832317, "job": 116, "event": "table_file_deletion", "file_number": 182}
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000180.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923918834561, "job": 116, "event": "table_file_deletion", "file_number": 180}
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.708992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.834742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.834749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.834751) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.834753) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:18 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:45:18.834754) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:45:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:19.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:19 np0005588919 nova_compute[225855]: 2026-01-20 15:45:19.838 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:19.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.208 225859 INFO nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Creating config drive at /var/lib/nova/instances/464c0661-0ddb-4794-8959-db066827326c/disk.config#033[00m
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.213 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/464c0661-0ddb-4794-8959-db066827326c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0yndlp77 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.344 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/464c0661-0ddb-4794-8959-db066827326c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0yndlp77" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.381 225859 DEBUG nova.storage.rbd_utils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] rbd image 464c0661-0ddb-4794-8959-db066827326c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.385 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/464c0661-0ddb-4794-8959-db066827326c/disk.config 464c0661-0ddb-4794-8959-db066827326c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.572 225859 DEBUG oslo_concurrency.processutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/464c0661-0ddb-4794-8959-db066827326c/disk.config 464c0661-0ddb-4794-8959-db066827326c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.187s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.573 225859 INFO nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Deleting local config drive /var/lib/nova/instances/464c0661-0ddb-4794-8959-db066827326c/disk.config because it was imported into RBD.#033[00m
Jan 20 10:45:20 np0005588919 kernel: tap2d94aa0d-ed: entered promiscuous mode
Jan 20 10:45:20 np0005588919 NetworkManager[49104]: <info>  [1768923920.6297] manager: (tap2d94aa0d-ed): new Tun device (/org/freedesktop/NetworkManager/Devices/433)
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.631 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:45:20Z|01000|binding|INFO|Claiming lport 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 for this chassis.
Jan 20 10:45:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:45:20Z|01001|binding|INFO|2d94aa0d-ed38-41aa-9f34-5ed2a83a7304: Claiming fa:16:3e:87:b1:99 10.100.0.7
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.635 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.642 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:20 np0005588919 NetworkManager[49104]: <info>  [1768923920.6432] manager: (patch-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/434)
Jan 20 10:45:20 np0005588919 NetworkManager[49104]: <info>  [1768923920.6441] manager: (patch-br-int-to-provnet-b62c391b-f7a3-4a38-a0df-72ac0383ca74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/435)
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.649 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:b1:99 10.100.0.7'], port_security=['fa:16:3e:87:b1:99 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '464c0661-0ddb-4794-8959-db066827326c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '133c0593-3211-4540-bb4e-2efa6f05d67f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92670689-434c-4ed8-a2e4-6278a7d19616, chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=2d94aa0d-ed38-41aa-9f34-5ed2a83a7304) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.651 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 in datapath 6567de92-725d-4dcc-97c2-0fec6d9bda84 bound to our chassis#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.653 140354 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6567de92-725d-4dcc-97c2-0fec6d9bda84#033[00m
Jan 20 10:45:20 np0005588919 systemd-udevd[330913]: Network interface NamePolicy= disabled on kernel command line.
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.668 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8c52e5-16ee-4f9d-9db7-a38b5a8d72a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.669 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6567de92-71 in ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.670 229707 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6567de92-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.670 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[229fe9df-f40c-4ec2-a379-54a410698243]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 systemd-machined[194361]: New machine qemu-115-instance-000000de.
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.671 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e7bc705c-55c5-44f7-a90f-cadc2be9aae1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 NetworkManager[49104]: <info>  [1768923920.6861] device (tap2d94aa0d-ed): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 20 10:45:20 np0005588919 NetworkManager[49104]: <info>  [1768923920.6874] device (tap2d94aa0d-ed): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.689 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[374d504f-26c6-4593-ab86-c81d863833d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 systemd[1]: Started Virtual Machine qemu-115-instance-000000de.
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.717 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[22192256-50e4-475f-a715-56ae870367a8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.723 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.734 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:45:20Z|01002|binding|INFO|Setting lport 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 ovn-installed in OVS
Jan 20 10:45:20 np0005588919 ovn_controller[130490]: 2026-01-20T15:45:20Z|01003|binding|INFO|Setting lport 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 up in Southbound
Jan 20 10:45:20 np0005588919 nova_compute[225855]: 2026-01-20 15:45:20.746 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.755 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[853e0913-8921-47d1-9ff8-ebbe00aa4f10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 NetworkManager[49104]: <info>  [1768923920.7614] manager: (tap6567de92-70): new Veth device (/org/freedesktop/NetworkManager/Devices/436)
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.762 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[da3bcb1f-2c16-4eab-b7ac-843c8fa87237]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.804 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[9a176663-3340-440a-bc6a-b316be82439d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.807 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[8792c5de-ca16-4d18-aac3-7725ae1b3ee3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 NetworkManager[49104]: <info>  [1768923920.8300] device (tap6567de92-70): carrier: link connected
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.835 229764 DEBUG oslo.privsep.daemon [-] privsep: reply[4ccf7397-1a91-40be-805d-69b81a254dd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.850 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[3528d836-2fac-48b9-9de8-ae4f778728b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6567de92-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:26:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 895975, 'reachable_time': 24803, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 330946, 'error': None, 'target': 'ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.868 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed846bf-6c81-4399-a22e-f92b582915dd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0d:2665'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 895975, 'tstamp': 895975}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 330947, 'error': None, 'target': 'ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.884 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ad1f2273-3c87-4bd5-bf24-fb9604fd58df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6567de92-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:26:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 895975, 'reachable_time': 24803, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 330948, 'error': None, 'target': 'ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.918 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[1ded0c26-4f0e-43d5-82b4-11a283461098]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.981 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[faa0c7bc-55be-4dd8-93a6-a126970744fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.983 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6567de92-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.983 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 20 10:45:20 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:20.984 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6567de92-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.030 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:21 np0005588919 NetworkManager[49104]: <info>  [1768923921.0304] manager: (tap6567de92-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/437)
Jan 20 10:45:21 np0005588919 kernel: tap6567de92-70: entered promiscuous mode
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:21.034 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6567de92-70, col_values=(('external_ids', {'iface-id': '49b3575c-9e2a-4ac6-bd85-e7d639dfd6e3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.035 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:21 np0005588919 ovn_controller[130490]: 2026-01-20T15:45:21Z|01004|binding|INFO|Releasing lport 49b3575c-9e2a-4ac6-bd85-e7d639dfd6e3 from this chassis (sb_readonly=0)
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.049 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:21.049 140354 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6567de92-725d-4dcc-97c2-0fec6d9bda84.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6567de92-725d-4dcc-97c2-0fec6d9bda84.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:21.050 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc632bf-6794-4fb0-837a-d0ff810e8c4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:21.051 140354 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: global
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    log         /dev/log local0 debug
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    log-tag     haproxy-metadata-proxy-6567de92-725d-4dcc-97c2-0fec6d9bda84
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    user        root
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    group       root
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    maxconn     1024
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    pidfile     /var/lib/neutron/external/pids/6567de92-725d-4dcc-97c2-0fec6d9bda84.pid.haproxy
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    daemon
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: defaults
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    log global
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    mode http
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    option httplog
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    option dontlognull
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    option http-server-close
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    option forwardfor
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    retries                 3
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    timeout http-request    30s
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    timeout connect         30s
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    timeout client          32s
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    timeout server          32s
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    timeout http-keep-alive 30s
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: 
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: listen listener
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    bind 169.254.169.254:80
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    server metadata /var/lib/neutron/metadata_proxy
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]:    http-request add-header X-OVN-Network-ID 6567de92-725d-4dcc-97c2-0fec6d9bda84
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 20 10:45:21 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:21.051 140354 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'env', 'PROCESS_TAG=haproxy-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6567de92-725d-4dcc-97c2-0fec6d9bda84.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.151 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923921.150042, 464c0661-0ddb-4794-8959-db066827326c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.151 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] VM Started (Lifecycle Event)#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.182 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.186 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923921.1507576, 464c0661-0ddb-4794-8959-db066827326c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.186 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] VM Paused (Lifecycle Event)#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.210 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.213 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.234 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.249 225859 DEBUG nova.network.neutron [req-41268f81-84af-4d68-b440-89abcbfe12ee req-d5b71cc0-eeae-4bf0-89d6-f004f7f62779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Updated VIF entry in instance network info cache for port 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.249 225859 DEBUG nova.network.neutron [req-41268f81-84af-4d68-b440-89abcbfe12ee req-d5b71cc0-eeae-4bf0-89d6-f004f7f62779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Updating instance_info_cache with network_info: [{"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.274 225859 DEBUG oslo_concurrency.lockutils [req-41268f81-84af-4d68-b440-89abcbfe12ee req-d5b71cc0-eeae-4bf0-89d6-f004f7f62779 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:45:21 np0005588919 podman[331022]: 2026-01-20 15:45:21.387781278 +0000 UTC m=+0.049535240 container create 492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 20 10:45:21 np0005588919 systemd[1]: Started libpod-conmon-492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f.scope.
Jan 20 10:45:21 np0005588919 systemd[1]: Started libcrun container.
Jan 20 10:45:21 np0005588919 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4351c11d91150c48202686c6f66728d5abd4164b417977ca124c1d87a5582683/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 20 10:45:21 np0005588919 podman[331022]: 2026-01-20 15:45:21.358962604 +0000 UTC m=+0.020716606 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 20 10:45:21 np0005588919 podman[331022]: 2026-01-20 15:45:21.46709406 +0000 UTC m=+0.128848042 container init 492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:45:21 np0005588919 podman[331022]: 2026-01-20 15:45:21.475179988 +0000 UTC m=+0.136933950 container start 492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.493 225859 DEBUG nova.compute.manager [req-9a985f7a-329d-410f-a407-b7fa3b65256e req-c9cc979e-3a06-48c5-86d9-9728ab71172b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received event network-vif-plugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.493 225859 DEBUG oslo_concurrency.lockutils [req-9a985f7a-329d-410f-a407-b7fa3b65256e req-c9cc979e-3a06-48c5-86d9-9728ab71172b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "464c0661-0ddb-4794-8959-db066827326c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.494 225859 DEBUG oslo_concurrency.lockutils [req-9a985f7a-329d-410f-a407-b7fa3b65256e req-c9cc979e-3a06-48c5-86d9-9728ab71172b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.494 225859 DEBUG oslo_concurrency.lockutils [req-9a985f7a-329d-410f-a407-b7fa3b65256e req-c9cc979e-3a06-48c5-86d9-9728ab71172b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.495 225859 DEBUG nova.compute.manager [req-9a985f7a-329d-410f-a407-b7fa3b65256e req-c9cc979e-3a06-48c5-86d9-9728ab71172b 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Processing event network-vif-plugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.495 225859 DEBUG nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.500 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 20 10:45:21 np0005588919 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[331037]: [NOTICE]   (331041) : New worker (331043) forked
Jan 20 10:45:21 np0005588919 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[331037]: [NOTICE]   (331041) : Loading success.
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.511 225859 DEBUG nova.virt.driver [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] Emitting event <LifecycleEvent: 1768923921.5107138, 464c0661-0ddb-4794-8959-db066827326c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.512 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] VM Resumed (Lifecycle Event)#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.516 225859 INFO nova.virt.libvirt.driver [-] [instance: 464c0661-0ddb-4794-8959-db066827326c] Instance spawned successfully.#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.517 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.540 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.546 225859 DEBUG nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.549 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.549 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.550 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.550 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.551 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.551 225859 DEBUG nova.virt.libvirt.driver [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.581 225859 INFO nova.compute.manager [None req-400328df-8b8f-4375-8546-3040b3d2f21e - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 20 10:45:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:21.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.623 225859 INFO nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Took 7.32 seconds to spawn the instance on the hypervisor.#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.624 225859 DEBUG nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.684 225859 INFO nova.compute.manager [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Took 8.55 seconds to build instance.#033[00m
Jan 20 10:45:21 np0005588919 nova_compute[225855]: 2026-01-20 15:45:21.715 225859 DEBUG oslo_concurrency.lockutils [None req-240e01b4-a824-470a-a8e7-b30c4e45ae3f 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:21.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:22 np0005588919 nova_compute[225855]: 2026-01-20 15:45:22.506 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:22.505 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=90, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=89) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:45:22 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:22.507 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:45:23 np0005588919 nova_compute[225855]: 2026-01-20 15:45:23.573 225859 DEBUG nova.compute.manager [req-1b8e6d7f-fd58-475c-8f92-a8c609105f82 req-a714cb57-d29b-4c95-a226-0d8e89884282 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received event network-vif-plugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:23 np0005588919 nova_compute[225855]: 2026-01-20 15:45:23.573 225859 DEBUG oslo_concurrency.lockutils [req-1b8e6d7f-fd58-475c-8f92-a8c609105f82 req-a714cb57-d29b-4c95-a226-0d8e89884282 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "464c0661-0ddb-4794-8959-db066827326c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:23 np0005588919 nova_compute[225855]: 2026-01-20 15:45:23.573 225859 DEBUG oslo_concurrency.lockutils [req-1b8e6d7f-fd58-475c-8f92-a8c609105f82 req-a714cb57-d29b-4c95-a226-0d8e89884282 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:23 np0005588919 nova_compute[225855]: 2026-01-20 15:45:23.573 225859 DEBUG oslo_concurrency.lockutils [req-1b8e6d7f-fd58-475c-8f92-a8c609105f82 req-a714cb57-d29b-4c95-a226-0d8e89884282 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:23 np0005588919 nova_compute[225855]: 2026-01-20 15:45:23.574 225859 DEBUG nova.compute.manager [req-1b8e6d7f-fd58-475c-8f92-a8c609105f82 req-a714cb57-d29b-4c95-a226-0d8e89884282 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] No waiting events found dispatching network-vif-plugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:45:23 np0005588919 nova_compute[225855]: 2026-01-20 15:45:23.574 225859 WARNING nova.compute.manager [req-1b8e6d7f-fd58-475c-8f92-a8c609105f82 req-a714cb57-d29b-4c95-a226-0d8e89884282 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received unexpected event network-vif-plugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 for instance with vm_state active and task_state None.#033[00m
Jan 20 10:45:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:23.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:23 np0005588919 nova_compute[225855]: 2026-01-20 15:45:23.606 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:23.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:24 np0005588919 nova_compute[225855]: 2026-01-20 15:45:24.900 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:25.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:25.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:26 np0005588919 nova_compute[225855]: 2026-01-20 15:45:26.357 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:26 np0005588919 nova_compute[225855]: 2026-01-20 15:45:26.357 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:45:26 np0005588919 nova_compute[225855]: 2026-01-20 15:45:26.357 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:45:26 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:26.509 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '90'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:26 np0005588919 nova_compute[225855]: 2026-01-20 15:45:26.583 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:45:26 np0005588919 nova_compute[225855]: 2026-01-20 15:45:26.584 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquired lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:45:26 np0005588919 nova_compute[225855]: 2026-01-20 15:45:26.584 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 20 10:45:26 np0005588919 nova_compute[225855]: 2026-01-20 15:45:26.584 225859 DEBUG nova.objects.instance [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 464c0661-0ddb-4794-8959-db066827326c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:45:27 np0005588919 nova_compute[225855]: 2026-01-20 15:45:27.374 225859 DEBUG nova.compute.manager [req-40492224-b9c5-40f2-bfe7-4ce6d817ce37 req-e0f43cad-c23e-4d4a-803b-e112bbc2f7c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received event network-changed-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:27 np0005588919 nova_compute[225855]: 2026-01-20 15:45:27.374 225859 DEBUG nova.compute.manager [req-40492224-b9c5-40f2-bfe7-4ce6d817ce37 req-e0f43cad-c23e-4d4a-803b-e112bbc2f7c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Refreshing instance network info cache due to event network-changed-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:45:27 np0005588919 nova_compute[225855]: 2026-01-20 15:45:27.375 225859 DEBUG oslo_concurrency.lockutils [req-40492224-b9c5-40f2-bfe7-4ce6d817ce37 req-e0f43cad-c23e-4d4a-803b-e112bbc2f7c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:45:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:27.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:27.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:28 np0005588919 nova_compute[225855]: 2026-01-20 15:45:28.608 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:28 np0005588919 nova_compute[225855]: 2026-01-20 15:45:28.714 225859 DEBUG nova.network.neutron [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] Updating instance_info_cache with network_info: [{"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:45:28 np0005588919 nova_compute[225855]: 2026-01-20 15:45:28.737 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Releasing lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:45:28 np0005588919 nova_compute[225855]: 2026-01-20 15:45:28.738 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 20 10:45:28 np0005588919 nova_compute[225855]: 2026-01-20 15:45:28.739 225859 DEBUG oslo_concurrency.lockutils [req-40492224-b9c5-40f2-bfe7-4ce6d817ce37 req-e0f43cad-c23e-4d4a-803b-e112bbc2f7c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:45:28 np0005588919 nova_compute[225855]: 2026-01-20 15:45:28.739 225859 DEBUG nova.network.neutron [req-40492224-b9c5-40f2-bfe7-4ce6d817ce37 req-e0f43cad-c23e-4d4a-803b-e112bbc2f7c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Refreshing network info cache for port 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:45:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:29 np0005588919 nova_compute[225855]: 2026-01-20 15:45:29.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:29 np0005588919 nova_compute[225855]: 2026-01-20 15:45:29.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:45:29 np0005588919 nova_compute[225855]: 2026-01-20 15:45:29.456 225859 DEBUG nova.compute.manager [req-50306de1-5986-4c5d-865b-29e100f9905d req-a18d3ac1-e6b8-4177-8688-93a87c57bc15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received event network-changed-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:29 np0005588919 nova_compute[225855]: 2026-01-20 15:45:29.457 225859 DEBUG nova.compute.manager [req-50306de1-5986-4c5d-865b-29e100f9905d req-a18d3ac1-e6b8-4177-8688-93a87c57bc15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Refreshing instance network info cache due to event network-changed-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 20 10:45:29 np0005588919 nova_compute[225855]: 2026-01-20 15:45:29.457 225859 DEBUG oslo_concurrency.lockutils [req-50306de1-5986-4c5d-865b-29e100f9905d req-a18d3ac1-e6b8-4177-8688-93a87c57bc15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 20 10:45:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:29.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:29 np0005588919 nova_compute[225855]: 2026-01-20 15:45:29.903 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:29.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:30 np0005588919 nova_compute[225855]: 2026-01-20 15:45:30.643 225859 DEBUG nova.network.neutron [req-40492224-b9c5-40f2-bfe7-4ce6d817ce37 req-e0f43cad-c23e-4d4a-803b-e112bbc2f7c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Updated VIF entry in instance network info cache for port 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:45:30 np0005588919 nova_compute[225855]: 2026-01-20 15:45:30.644 225859 DEBUG nova.network.neutron [req-40492224-b9c5-40f2-bfe7-4ce6d817ce37 req-e0f43cad-c23e-4d4a-803b-e112bbc2f7c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Updating instance_info_cache with network_info: [{"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:45:30 np0005588919 nova_compute[225855]: 2026-01-20 15:45:30.661 225859 DEBUG oslo_concurrency.lockutils [req-40492224-b9c5-40f2-bfe7-4ce6d817ce37 req-e0f43cad-c23e-4d4a-803b-e112bbc2f7c2 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:45:30 np0005588919 nova_compute[225855]: 2026-01-20 15:45:30.663 225859 DEBUG oslo_concurrency.lockutils [req-50306de1-5986-4c5d-865b-29e100f9905d req-a18d3ac1-e6b8-4177-8688-93a87c57bc15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquired lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 20 10:45:30 np0005588919 nova_compute[225855]: 2026-01-20 15:45:30.663 225859 DEBUG nova.network.neutron [req-50306de1-5986-4c5d-865b-29e100f9905d req-a18d3ac1-e6b8-4177-8688-93a87c57bc15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Refreshing network info cache for port 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 20 10:45:31 np0005588919 nova_compute[225855]: 2026-01-20 15:45:31.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:31.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:31.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.364 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.365 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.365 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.365 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.366 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.609 225859 DEBUG nova.network.neutron [req-50306de1-5986-4c5d-865b-29e100f9905d req-a18d3ac1-e6b8-4177-8688-93a87c57bc15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Updated VIF entry in instance network info cache for port 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.610 225859 DEBUG nova.network.neutron [req-50306de1-5986-4c5d-865b-29e100f9905d req-a18d3ac1-e6b8-4177-8688-93a87c57bc15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Updating instance_info_cache with network_info: [{"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.626 225859 DEBUG oslo_concurrency.lockutils [req-50306de1-5986-4c5d-865b-29e100f9905d req-a18d3ac1-e6b8-4177-8688-93a87c57bc15 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Releasing lock "refresh_cache-464c0661-0ddb-4794-8959-db066827326c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 20 10:45:32 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:45:32 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2225520846' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.845 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.932 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000de as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:45:32 np0005588919 nova_compute[225855]: 2026-01-20 15:45:32.933 225859 DEBUG nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] skipping disk for instance-000000de as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.102 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.103 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3984MB free_disk=20.921802520751953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.104 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.104 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.206 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Instance 464c0661-0ddb-4794-8959-db066827326c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.207 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.207 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.238 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:33.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.611 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:33 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:45:33 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/740262342' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.705 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.711 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.743 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.761 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:45:33 np0005588919 nova_compute[225855]: 2026-01-20 15:45:33.761 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:33.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:34 np0005588919 nova_compute[225855]: 2026-01-20 15:45:34.761 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:34 np0005588919 nova_compute[225855]: 2026-01-20 15:45:34.761 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:34 np0005588919 nova_compute[225855]: 2026-01-20 15:45:34.761 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:34 np0005588919 nova_compute[225855]: 2026-01-20 15:45:34.942 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:45:35Z|00124|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:87:b1:99 10.100.0.7
Jan 20 10:45:35 np0005588919 ovn_controller[130490]: 2026-01-20T15:45:35Z|00125|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:87:b1:99 10.100.0.7
Jan 20 10:45:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:35.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:35.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:37.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:37.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:38 np0005588919 podman[331157]: 2026-01-20 15:45:38.060253885 +0000 UTC m=+0.092208347 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 20 10:45:38 np0005588919 nova_compute[225855]: 2026-01-20 15:45:38.669 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:39 np0005588919 nova_compute[225855]: 2026-01-20 15:45:39.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:39.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:39.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:39 np0005588919 nova_compute[225855]: 2026-01-20 15:45:39.949 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:41.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:41.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.332 225859 DEBUG oslo_concurrency.lockutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "464c0661-0ddb-4794-8959-db066827326c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.332 225859 DEBUG oslo_concurrency.lockutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.332 225859 DEBUG oslo_concurrency.lockutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "464c0661-0ddb-4794-8959-db066827326c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.333 225859 DEBUG oslo_concurrency.lockutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.333 225859 DEBUG oslo_concurrency.lockutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.334 225859 INFO nova.compute.manager [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Terminating instance#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.335 225859 DEBUG nova.compute.manager [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 20 10:45:42 np0005588919 kernel: tap2d94aa0d-ed (unregistering): left promiscuous mode
Jan 20 10:45:42 np0005588919 NetworkManager[49104]: <info>  [1768923942.3917] device (tap2d94aa0d-ed): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 20 10:45:42 np0005588919 ovn_controller[130490]: 2026-01-20T15:45:42Z|01005|binding|INFO|Releasing lport 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 from this chassis (sb_readonly=0)
Jan 20 10:45:42 np0005588919 ovn_controller[130490]: 2026-01-20T15:45:42Z|01006|binding|INFO|Setting lport 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 down in Southbound
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.402 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:42 np0005588919 ovn_controller[130490]: 2026-01-20T15:45:42Z|01007|binding|INFO|Removing iface tap2d94aa0d-ed ovn-installed in OVS
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.404 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.414 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:b1:99 10.100.0.7', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '464c0661-0ddb-4794-8959-db066827326c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '728662ec7f654a3fb2e53a90b8707d7e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92670689-434c-4ed8-a2e4-6278a7d19616, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>], logical_port=2d94aa0d-ed38-41aa-9f34-5ed2a83a7304) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb671582ac0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.415 140354 INFO neutron.agent.ovn.metadata.agent [-] Port 2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 in datapath 6567de92-725d-4dcc-97c2-0fec6d9bda84 unbound from our chassis#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.416 140354 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6567de92-725d-4dcc-97c2-0fec6d9bda84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.417 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cba7414d-7022-44f4-9538-5296240846ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.419 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.418 140354 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84 namespace which is not needed anymore#033[00m
Jan 20 10:45:42 np0005588919 systemd[1]: machine-qemu\x2d115\x2dinstance\x2d000000de.scope: Deactivated successfully.
Jan 20 10:45:42 np0005588919 systemd[1]: machine-qemu\x2d115\x2dinstance\x2d000000de.scope: Consumed 13.524s CPU time.
Jan 20 10:45:42 np0005588919 systemd-machined[194361]: Machine qemu-115-instance-000000de terminated.
Jan 20 10:45:42 np0005588919 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[331037]: [NOTICE]   (331041) : haproxy version is 2.8.14-c23fe91
Jan 20 10:45:42 np0005588919 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[331037]: [NOTICE]   (331041) : path to executable is /usr/sbin/haproxy
Jan 20 10:45:42 np0005588919 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[331037]: [WARNING]  (331041) : Exiting Master process...
Jan 20 10:45:42 np0005588919 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[331037]: [ALERT]    (331041) : Current worker (331043) exited with code 143 (Terminated)
Jan 20 10:45:42 np0005588919 neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84[331037]: [WARNING]  (331041) : All workers exited. Exiting... (0)
Jan 20 10:45:42 np0005588919 systemd[1]: libpod-492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f.scope: Deactivated successfully.
Jan 20 10:45:42 np0005588919 podman[331209]: 2026-01-20 15:45:42.547083229 +0000 UTC m=+0.045061704 container died 492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.557 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.563 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.569 225859 INFO nova.virt.libvirt.driver [-] [instance: 464c0661-0ddb-4794-8959-db066827326c] Instance destroyed successfully.#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.570 225859 DEBUG nova.objects.instance [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lazy-loading 'resources' on Instance uuid 464c0661-0ddb-4794-8959-db066827326c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 20 10:45:42 np0005588919 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f-userdata-shm.mount: Deactivated successfully.
Jan 20 10:45:42 np0005588919 systemd[1]: var-lib-containers-storage-overlay-4351c11d91150c48202686c6f66728d5abd4164b417977ca124c1d87a5582683-merged.mount: Deactivated successfully.
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.592 225859 DEBUG nova.virt.libvirt.vif [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-20T15:45:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-1-623022678',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-342561427-gen-1-623022678',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-342561427-gen',id=222,image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJW0RDNBQ7KP8Mqzxdg2i8X8upMhqABnnonEiTjmMv4W9RTdXxd1b3Z8QL9swZ0e0+6po4+8oM5PFrC0tn+WmJ7twYzqOI2QMeaFZC9+Q35AVwNQsKxl3WWPGvw1iSa1jA==',key_name='tempest-TestSecurityGroupsBasicOps-1661471182',keypairs=<?>,launch_index=0,launched_at=2026-01-20T15:45:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='728662ec7f654a3fb2e53a90b8707d7e',ramdisk_id='',reservation_id='r-5t8zkczs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a32b3e07-16d8-46fd-9a7b-c242c432fcf9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-342561427',owner_user_name='tempest-TestSecurityGroupsBasicOps-342561427-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-20T15:45:21Z,user_data=None,user_id='5985ef736503499a9f1d734cabc33ce5',uuid=464c0661-0ddb-4794-8959-db066827326c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 20 10:45:42 np0005588919 podman[331209]: 2026-01-20 15:45:42.593075879 +0000 UTC m=+0.091054344 container cleanup 492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.593 225859 DEBUG nova.network.os_vif_util [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converting VIF {"id": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "address": "fa:16:3e:87:b1:99", "network": {"id": "6567de92-725d-4dcc-97c2-0fec6d9bda84", "bridge": "br-int", "label": "tempest-network-smoke--623656421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "728662ec7f654a3fb2e53a90b8707d7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d94aa0d-ed", "ovs_interfaceid": "2d94aa0d-ed38-41aa-9f34-5ed2a83a7304", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.594 225859 DEBUG nova.network.os_vif_util [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:b1:99,bridge_name='br-int',has_traffic_filtering=True,id=2d94aa0d-ed38-41aa-9f34-5ed2a83a7304,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d94aa0d-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.594 225859 DEBUG os_vif [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:b1:99,bridge_name='br-int',has_traffic_filtering=True,id=2d94aa0d-ed38-41aa-9f34-5ed2a83a7304,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d94aa0d-ed') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.596 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.597 225859 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d94aa0d-ed, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.600 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:42 np0005588919 systemd[1]: libpod-conmon-492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f.scope: Deactivated successfully.
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.603 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.606 225859 INFO os_vif [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:b1:99,bridge_name='br-int',has_traffic_filtering=True,id=2d94aa0d-ed38-41aa-9f34-5ed2a83a7304,network=Network(6567de92-725d-4dcc-97c2-0fec6d9bda84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d94aa0d-ed')#033[00m
Jan 20 10:45:42 np0005588919 podman[331250]: 2026-01-20 15:45:42.670314902 +0000 UTC m=+0.049929252 container remove 492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.678 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[617dfa11-9e94-47a2-bfe6-90be4ae56dc1]: (4, ('Tue Jan 20 03:45:42 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84 (492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f)\n492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f\nTue Jan 20 03:45:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84 (492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f)\n492a6f3905f2e7ce4d211c13d3bef1132fb971d16c227b376e5f4514c80ff23f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.680 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4cf084-741e-49f3-bde6-fbcbee11a88d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.681 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6567de92-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.682 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:42 np0005588919 kernel: tap6567de92-70: left promiscuous mode
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.700 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.703 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[93f86377-8053-4e16-b7f2-c5b18706a24d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.717 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[a4cac1ba-84c0-435b-badc-dccd54b528c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.719 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[cbe5e427-e67f-436f-bb3d-8e23e86fd94a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.735 229707 DEBUG oslo.privsep.daemon [-] privsep: reply[e62ab6cd-0fe5-4adc-ae34-d60648b93021]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 895967, 'reachable_time': 41472, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 331283, 'error': None, 'target': 'ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.738 140466 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6567de92-725d-4dcc-97c2-0fec6d9bda84 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 20 10:45:42 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:45:42.738 140466 DEBUG oslo.privsep.daemon [-] privsep: reply[05697ccb-2a63-487b-8502-1f611ce134d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 20 10:45:42 np0005588919 systemd[1]: run-netns-ovnmeta\x2d6567de92\x2d725d\x2d4dcc\x2d97c2\x2d0fec6d9bda84.mount: Deactivated successfully.
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.986 225859 INFO nova.virt.libvirt.driver [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Deleting instance files /var/lib/nova/instances/464c0661-0ddb-4794-8959-db066827326c_del#033[00m
Jan 20 10:45:42 np0005588919 nova_compute[225855]: 2026-01-20 15:45:42.987 225859 INFO nova.virt.libvirt.driver [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Deletion of /var/lib/nova/instances/464c0661-0ddb-4794-8959-db066827326c_del complete#033[00m
Jan 20 10:45:43 np0005588919 nova_compute[225855]: 2026-01-20 15:45:43.059 225859 INFO nova.compute.manager [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Took 0.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 20 10:45:43 np0005588919 nova_compute[225855]: 2026-01-20 15:45:43.060 225859 DEBUG oslo.service.loopingcall [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 20 10:45:43 np0005588919 nova_compute[225855]: 2026-01-20 15:45:43.061 225859 DEBUG nova.compute.manager [-] [instance: 464c0661-0ddb-4794-8959-db066827326c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 20 10:45:43 np0005588919 nova_compute[225855]: 2026-01-20 15:45:43.061 225859 DEBUG nova.network.neutron [-] [instance: 464c0661-0ddb-4794-8959-db066827326c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 20 10:45:43 np0005588919 nova_compute[225855]: 2026-01-20 15:45:43.214 225859 DEBUG nova.compute.manager [req-d9604e77-d39d-40f5-8a23-ad474dc757ae req-8bdb0172-e6db-4fa6-b308-1e01b04074e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received event network-vif-unplugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:43 np0005588919 nova_compute[225855]: 2026-01-20 15:45:43.214 225859 DEBUG oslo_concurrency.lockutils [req-d9604e77-d39d-40f5-8a23-ad474dc757ae req-8bdb0172-e6db-4fa6-b308-1e01b04074e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "464c0661-0ddb-4794-8959-db066827326c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:43 np0005588919 nova_compute[225855]: 2026-01-20 15:45:43.215 225859 DEBUG oslo_concurrency.lockutils [req-d9604e77-d39d-40f5-8a23-ad474dc757ae req-8bdb0172-e6db-4fa6-b308-1e01b04074e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:43 np0005588919 nova_compute[225855]: 2026-01-20 15:45:43.215 225859 DEBUG oslo_concurrency.lockutils [req-d9604e77-d39d-40f5-8a23-ad474dc757ae req-8bdb0172-e6db-4fa6-b308-1e01b04074e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:43 np0005588919 nova_compute[225855]: 2026-01-20 15:45:43.215 225859 DEBUG nova.compute.manager [req-d9604e77-d39d-40f5-8a23-ad474dc757ae req-8bdb0172-e6db-4fa6-b308-1e01b04074e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] No waiting events found dispatching network-vif-unplugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:45:43 np0005588919 nova_compute[225855]: 2026-01-20 15:45:43.215 225859 DEBUG nova.compute.manager [req-d9604e77-d39d-40f5-8a23-ad474dc757ae req-8bdb0172-e6db-4fa6-b308-1e01b04074e1 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received event network-vif-unplugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 20 10:45:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:43.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:43.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.036 225859 DEBUG nova.network.neutron [-] [instance: 464c0661-0ddb-4794-8959-db066827326c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.053 225859 INFO nova.compute.manager [-] [instance: 464c0661-0ddb-4794-8959-db066827326c] Took 0.99 seconds to deallocate network for instance.#033[00m
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.108 225859 DEBUG oslo_concurrency.lockutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.108 225859 DEBUG oslo_concurrency.lockutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.111 225859 DEBUG nova.compute.manager [req-357eff8c-6f1f-4608-9502-70278042b0cb req-d92e821c-676d-46d2-ad19-a6dfba903a85 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received event network-vif-deleted-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.172 225859 DEBUG oslo_concurrency.processutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:45:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:45:44 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1327197156' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.614 225859 DEBUG oslo_concurrency.processutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.620 225859 DEBUG nova.compute.provider_tree [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.644 225859 DEBUG nova.scheduler.client.report [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.673 225859 DEBUG oslo_concurrency.lockutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.711 225859 INFO nova.scheduler.client.report [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Deleted allocations for instance 464c0661-0ddb-4794-8959-db066827326c#033[00m
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.785 225859 DEBUG oslo_concurrency.lockutils [None req-adaa13a9-ed09-47ae-af7e-ee4814c03d82 5985ef736503499a9f1d734cabc33ce5 728662ec7f654a3fb2e53a90b8707d7e - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.453s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:44 np0005588919 nova_compute[225855]: 2026-01-20 15:45:44.950 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:45 np0005588919 podman[331308]: 2026-01-20 15:45:45.01549313 +0000 UTC m=+0.053085331 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 20 10:45:45 np0005588919 nova_compute[225855]: 2026-01-20 15:45:45.322 225859 DEBUG nova.compute.manager [req-13b38445-d8db-4fb9-b18b-7e315ab564d9 req-49bed99b-bb7d-4b86-a6d6-1d9aa9175568 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received event network-vif-plugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 20 10:45:45 np0005588919 nova_compute[225855]: 2026-01-20 15:45:45.322 225859 DEBUG oslo_concurrency.lockutils [req-13b38445-d8db-4fb9-b18b-7e315ab564d9 req-49bed99b-bb7d-4b86-a6d6-1d9aa9175568 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Acquiring lock "464c0661-0ddb-4794-8959-db066827326c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:45:45 np0005588919 nova_compute[225855]: 2026-01-20 15:45:45.322 225859 DEBUG oslo_concurrency.lockutils [req-13b38445-d8db-4fb9-b18b-7e315ab564d9 req-49bed99b-bb7d-4b86-a6d6-1d9aa9175568 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:45:45 np0005588919 nova_compute[225855]: 2026-01-20 15:45:45.323 225859 DEBUG oslo_concurrency.lockutils [req-13b38445-d8db-4fb9-b18b-7e315ab564d9 req-49bed99b-bb7d-4b86-a6d6-1d9aa9175568 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] Lock "464c0661-0ddb-4794-8959-db066827326c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:45:45 np0005588919 nova_compute[225855]: 2026-01-20 15:45:45.323 225859 DEBUG nova.compute.manager [req-13b38445-d8db-4fb9-b18b-7e315ab564d9 req-49bed99b-bb7d-4b86-a6d6-1d9aa9175568 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] No waiting events found dispatching network-vif-plugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 20 10:45:45 np0005588919 nova_compute[225855]: 2026-01-20 15:45:45.323 225859 WARNING nova.compute.manager [req-13b38445-d8db-4fb9-b18b-7e315ab564d9 req-49bed99b-bb7d-4b86-a6d6-1d9aa9175568 15e2d293aecb44f4b8fadb4968d7c65b d5b132113da54ff6b616e719b9c45446 - - default default] [instance: 464c0661-0ddb-4794-8959-db066827326c] Received unexpected event network-vif-plugged-2d94aa0d-ed38-41aa-9f34-5ed2a83a7304 for instance with vm_state deleted and task_state None.#033[00m
Jan 20 10:45:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:45.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:45.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:47 np0005588919 nova_compute[225855]: 2026-01-20 15:45:47.600 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:47.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:47.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:49.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:49.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:49 np0005588919 nova_compute[225855]: 2026-01-20 15:45:49.952 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:51.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:51.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:52 np0005588919 nova_compute[225855]: 2026-01-20 15:45:52.601 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:53.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:53.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:54 np0005588919 nova_compute[225855]: 2026-01-20 15:45:54.954 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:55.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:55.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:55 np0005588919 nova_compute[225855]: 2026-01-20 15:45:55.968 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:55 np0005588919 nova_compute[225855]: 2026-01-20 15:45:55.985 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:57 np0005588919 nova_compute[225855]: 2026-01-20 15:45:57.568 225859 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768923942.5672007, 464c0661-0ddb-4794-8959-db066827326c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 20 10:45:57 np0005588919 nova_compute[225855]: 2026-01-20 15:45:57.568 225859 INFO nova.compute.manager [-] [instance: 464c0661-0ddb-4794-8959-db066827326c] VM Stopped (Lifecycle Event)#033[00m
Jan 20 10:45:57 np0005588919 nova_compute[225855]: 2026-01-20 15:45:57.602 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:45:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:57.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:45:57 np0005588919 nova_compute[225855]: 2026-01-20 15:45:57.852 225859 DEBUG nova.compute.manager [None req-6e006aaa-8dc1-4de1-be25-613d4a75da6a - - - - - -] [instance: 464c0661-0ddb-4794-8959-db066827326c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 20 10:45:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:45:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:57.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:45:58 np0005588919 nova_compute[225855]: 2026-01-20 15:45:58.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:45:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:45:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:45:59.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:45:59 np0005588919 nova_compute[225855]: 2026-01-20 15:45:59.955 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:45:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:45:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:45:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:45:59.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:01.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:01.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:02 np0005588919 nova_compute[225855]: 2026-01-20 15:46:02.604 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:03.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:03.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:04 np0005588919 nova_compute[225855]: 2026-01-20 15:46:04.957 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:05.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:05.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:07 np0005588919 nova_compute[225855]: 2026-01-20 15:46:07.606 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:07.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:07.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #184. Immutable memtables: 0.
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.556955) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 117] Flushing memtable with next log file: 184
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968557075, "job": 117, "event": "flush_started", "num_memtables": 1, "num_entries": 686, "num_deletes": 250, "total_data_size": 1245577, "memory_usage": 1266536, "flush_reason": "Manual Compaction"}
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 117] Level-0 flush table #185: started
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968566236, "cf_name": "default", "job": 117, "event": "table_file_creation", "file_number": 185, "file_size": 531405, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 89510, "largest_seqno": 90191, "table_properties": {"data_size": 528539, "index_size": 837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7711, "raw_average_key_size": 20, "raw_value_size": 522605, "raw_average_value_size": 1382, "num_data_blocks": 38, "num_entries": 378, "num_filter_entries": 378, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923919, "oldest_key_time": 1768923919, "file_creation_time": 1768923968, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 185, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 117] Flush lasted 9336 microseconds, and 5253 cpu microseconds.
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.566289) [db/flush_job.cc:967] [default] [JOB 117] Level-0 flush table #185: 531405 bytes OK
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.566316) [db/memtable_list.cc:519] [default] Level-0 commit table #185 started
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.567930) [db/memtable_list.cc:722] [default] Level-0 commit table #185: memtable #1 done
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.567951) EVENT_LOG_v1 {"time_micros": 1768923968567944, "job": 117, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.567975) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 117] Try to delete WAL files size 1241881, prev total WAL file size 1241881, number of live WAL files 2.
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000181.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.568855) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033303039' seq:72057594037927935, type:22 .. '6D6772737461740033323630' seq:0, type:0; will stop at (end)
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 118] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 117 Base level 0, inputs: [185(518KB)], [183(13MB)]
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968568941, "job": 118, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [185], "files_L6": [183], "score": -1, "input_data_size": 15013695, "oldest_snapshot_seqno": -1}
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 118] Generated table #186: 10922 keys, 11470784 bytes, temperature: kUnknown
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968702480, "cf_name": "default", "job": 118, "event": "table_file_creation", "file_number": 186, "file_size": 11470784, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11404453, "index_size": 37930, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27333, "raw_key_size": 288150, "raw_average_key_size": 26, "raw_value_size": 11217508, "raw_average_value_size": 1027, "num_data_blocks": 1430, "num_entries": 10922, "num_filter_entries": 10922, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768923968, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 186, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.702999) [db/compaction/compaction_job.cc:1663] [default] [JOB 118] Compacted 1@0 + 1@6 files to L6 => 11470784 bytes
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.704464) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 112.3 rd, 85.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 13.8 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(49.8) write-amplify(21.6) OK, records in: 11412, records dropped: 490 output_compression: NoCompression
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.704498) EVENT_LOG_v1 {"time_micros": 1768923968704484, "job": 118, "event": "compaction_finished", "compaction_time_micros": 133686, "compaction_time_cpu_micros": 54301, "output_level": 6, "num_output_files": 1, "total_output_size": 11470784, "num_input_records": 11412, "num_output_records": 10922, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000185.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968705162, "job": 118, "event": "table_file_deletion", "file_number": 185}
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000183.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768923968710695, "job": 118, "event": "table_file_deletion", "file_number": 183}
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.568758) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.710858) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.710882) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.710884) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.710887) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:46:08.710889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:46:09 np0005588919 podman[331391]: 2026-01-20 15:46:09.074997435 +0000 UTC m=+0.120098775 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 20 10:46:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:09.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:09 np0005588919 nova_compute[225855]: 2026-01-20 15:46:09.960 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:09.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:11.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:11.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:12 np0005588919 nova_compute[225855]: 2026-01-20 15:46:12.609 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:13.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:13.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:14 np0005588919 nova_compute[225855]: 2026-01-20 15:46:14.961 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:15.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:15.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:16 np0005588919 podman[331471]: 2026-01-20 15:46:16.011084678 +0000 UTC m=+0.058614598 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 20 10:46:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:46:16.467 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:46:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:46:16.468 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:46:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:46:16.468 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:46:17 np0005588919 nova_compute[225855]: 2026-01-20 15:46:17.610 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:17.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:17.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:46:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:46:19 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:46:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:19.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:19 np0005588919 nova_compute[225855]: 2026-01-20 15:46:19.964 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:20.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:21.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:22.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:22 np0005588919 nova_compute[225855]: 2026-01-20 15:46:22.611 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:23.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:24.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:24 np0005588919 nova_compute[225855]: 2026-01-20 15:46:24.966 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:25.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:26.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:26 np0005588919 nova_compute[225855]: 2026-01-20 15:46:26.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:46:26 np0005588919 nova_compute[225855]: 2026-01-20 15:46:26.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:46:26 np0005588919 nova_compute[225855]: 2026-01-20 15:46:26.339 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:46:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:46:26 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:46:27 np0005588919 ovn_controller[130490]: 2026-01-20T15:46:27Z|01008|memory_trim|INFO|Detected inactivity (last active 30019 ms ago): trimming memory
Jan 20 10:46:27 np0005588919 nova_compute[225855]: 2026-01-20 15:46:27.645 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:27.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:46:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:28.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:46:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:29.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:29 np0005588919 nova_compute[225855]: 2026-01-20 15:46:29.969 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:30.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:31.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:32.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:32 np0005588919 nova_compute[225855]: 2026-01-20 15:46:32.708 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:33.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:34.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:34 np0005588919 nova_compute[225855]: 2026-01-20 15:46:34.971 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:35.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:36.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:37.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:37 np0005588919 nova_compute[225855]: 2026-01-20 15:46:37.712 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:38.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:39.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:40 np0005588919 nova_compute[225855]: 2026-01-20 15:46:40.014 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:40.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:40 np0005588919 podman[331736]: 2026-01-20 15:46:40.085812584 +0000 UTC m=+0.132421603 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:46:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:41.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:42.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:42 np0005588919 nova_compute[225855]: 2026-01-20 15:46:42.717 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:43.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:44.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:45 np0005588919 nova_compute[225855]: 2026-01-20 15:46:45.016 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:45.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:46.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:47 np0005588919 podman[331766]: 2026-01-20 15:46:47.068652588 +0000 UTC m=+0.106014997 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 20 10:46:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:47.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:47 np0005588919 nova_compute[225855]: 2026-01-20 15:46:47.763 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:48.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:49.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:50 np0005588919 nova_compute[225855]: 2026-01-20 15:46:50.017 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:50.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:51.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:52.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:52 np0005588919 nova_compute[225855]: 2026-01-20 15:46:52.813 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:53.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:54.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:55 np0005588919 nova_compute[225855]: 2026-01-20 15:46:55.020 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:46:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:55.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:46:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:56.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:57.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:57 np0005588919 nova_compute[225855]: 2026-01-20 15:46:57.815 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:46:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:46:58.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:46:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:46:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:46:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:46:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:46:59.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:00 np0005588919 nova_compute[225855]: 2026-01-20 15:47:00.021 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:00.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:01.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:02.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:02 np0005588919 nova_compute[225855]: 2026-01-20 15:47:02.858 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:03.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:04.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:05 np0005588919 nova_compute[225855]: 2026-01-20 15:47:05.068 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:05.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:06.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:07.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:07 np0005588919 nova_compute[225855]: 2026-01-20 15:47:07.861 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:08.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #187. Immutable memtables: 0.
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.577481) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 119] Flushing memtable with next log file: 187
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028577571, "job": 119, "event": "flush_started", "num_memtables": 1, "num_entries": 805, "num_deletes": 254, "total_data_size": 1549312, "memory_usage": 1576960, "flush_reason": "Manual Compaction"}
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 119] Level-0 flush table #188: started
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028588691, "cf_name": "default", "job": 119, "event": "table_file_creation", "file_number": 188, "file_size": 1022693, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 90196, "largest_seqno": 90996, "table_properties": {"data_size": 1018873, "index_size": 1599, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8317, "raw_average_key_size": 18, "raw_value_size": 1011251, "raw_average_value_size": 2298, "num_data_blocks": 71, "num_entries": 440, "num_filter_entries": 440, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768923968, "oldest_key_time": 1768923968, "file_creation_time": 1768924028, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 188, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 119] Flush lasted 11242 microseconds, and 4441 cpu microseconds.
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.588737) [db/flush_job.cc:967] [default] [JOB 119] Level-0 flush table #188: 1022693 bytes OK
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.588756) [db/memtable_list.cc:519] [default] Level-0 commit table #188 started
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.590722) [db/memtable_list.cc:722] [default] Level-0 commit table #188: memtable #1 done
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.590772) EVENT_LOG_v1 {"time_micros": 1768924028590762, "job": 119, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.590801) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 119] Try to delete WAL files size 1545139, prev total WAL file size 1545139, number of live WAL files 2.
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000184.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.591623) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353233' seq:72057594037927935, type:22 .. '6C6F676D0033373734' seq:0, type:0; will stop at (end)
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 120] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 119 Base level 0, inputs: [188(998KB)], [186(10MB)]
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028591670, "job": 120, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [188], "files_L6": [186], "score": -1, "input_data_size": 12493477, "oldest_snapshot_seqno": -1}
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 120] Generated table #189: 10842 keys, 12373635 bytes, temperature: kUnknown
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028673182, "cf_name": "default", "job": 120, "event": "table_file_creation", "file_number": 189, "file_size": 12373635, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12306393, "index_size": 39033, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27141, "raw_key_size": 287390, "raw_average_key_size": 26, "raw_value_size": 12119494, "raw_average_value_size": 1117, "num_data_blocks": 1475, "num_entries": 10842, "num_filter_entries": 10842, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768924028, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 189, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.673435) [db/compaction/compaction_job.cc:1663] [default] [JOB 120] Compacted 1@0 + 1@6 files to L6 => 12373635 bytes
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.675083) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.1 rd, 151.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 10.9 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(24.3) write-amplify(12.1) OK, records in: 11362, records dropped: 520 output_compression: NoCompression
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.675100) EVENT_LOG_v1 {"time_micros": 1768924028675091, "job": 120, "event": "compaction_finished", "compaction_time_micros": 81586, "compaction_time_cpu_micros": 28746, "output_level": 6, "num_output_files": 1, "total_output_size": 12373635, "num_input_records": 11362, "num_output_records": 10842, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000188.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028675427, "job": 120, "event": "table_file_deletion", "file_number": 188}
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000186.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924028677287, "job": 120, "event": "table_file_deletion", "file_number": 186}
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.591558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.677416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.677423) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.677425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.677427) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:08 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:47:08.677429) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:47:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:09.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:10 np0005588919 nova_compute[225855]: 2026-01-20 15:47:10.071 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:10.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:11 np0005588919 podman[331847]: 2026-01-20 15:47:11.044590852 +0000 UTC m=+0.087283747 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 20 10:47:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:11.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:12.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:12 np0005588919 nova_compute[225855]: 2026-01-20 15:47:12.898 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:13.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:14.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:15 np0005588919 nova_compute[225855]: 2026-01-20 15:47:15.072 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:15.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:16.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:47:16.469 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:47:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:47:16.469 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:47:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:47:16.469 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.362 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.362 225859 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 60.83 sec#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.362 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.364 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.364 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.364 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.365 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.365 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.365 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.365 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.507 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.508 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.508 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.508 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.508 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:47:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:17.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:47:17 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2951300976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.928 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:17 np0005588919 nova_compute[225855]: 2026-01-20 15:47:17.947 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:47:18 np0005588919 podman[331951]: 2026-01-20 15:47:18.007962166 +0000 UTC m=+0.050782457 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 20 10:47:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:18.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:18 np0005588919 nova_compute[225855]: 2026-01-20 15:47:18.126 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:47:18 np0005588919 nova_compute[225855]: 2026-01-20 15:47:18.127 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4255MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:47:18 np0005588919 nova_compute[225855]: 2026-01-20 15:47:18.128 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:47:18 np0005588919 nova_compute[225855]: 2026-01-20 15:47:18.128 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:47:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:47:18.419 140354 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=91, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '12:bb:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '06:92:24:f7:15:56'}, ipsec=False) old=SB_Global(nb_cfg=90) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 20 10:47:18 np0005588919 nova_compute[225855]: 2026-01-20 15:47:18.420 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:18 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:47:18.420 140354 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 20 10:47:18 np0005588919 nova_compute[225855]: 2026-01-20 15:47:18.448 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:47:18 np0005588919 nova_compute[225855]: 2026-01-20 15:47:18.449 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:47:18 np0005588919 nova_compute[225855]: 2026-01-20 15:47:18.589 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:47:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:47:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2122862983' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:47:19 np0005588919 nova_compute[225855]: 2026-01-20 15:47:19.004 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:47:19 np0005588919 nova_compute[225855]: 2026-01-20 15:47:19.012 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:47:19 np0005588919 nova_compute[225855]: 2026-01-20 15:47:19.048 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:47:19 np0005588919 nova_compute[225855]: 2026-01-20 15:47:19.077 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:47:19 np0005588919 nova_compute[225855]: 2026-01-20 15:47:19.077 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.949s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:47:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:19.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:20 np0005588919 nova_compute[225855]: 2026-01-20 15:47:20.074 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:20.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:21.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:22.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:47:22 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 69K writes, 268K keys, 69K commit groups, 1.0 writes per commit group, ingest: 0.26 GB, 0.04 MB/s#012Cumulative WAL: 69K writes, 26K syncs, 2.63 writes per sync, written: 0.26 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2415 writes, 9981 keys, 2415 commit groups, 1.0 writes per commit group, ingest: 10.97 MB, 0.02 MB/s#012Interval WAL: 2415 writes, 949 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:47:22 np0005588919 nova_compute[225855]: 2026-01-20 15:47:22.978 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:23.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:24.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:25 np0005588919 nova_compute[225855]: 2026-01-20 15:47:25.075 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:25.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:26.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:47:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2959745194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:47:27 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:47:27.422 140354 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5ffd4ac3-9266-4927-98ad-20a17782c725, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '91'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 20 10:47:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:47:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:47:27 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:47:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:27.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:27 np0005588919 nova_compute[225855]: 2026-01-20 15:47:27.979 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:28.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:29.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:30 np0005588919 nova_compute[225855]: 2026-01-20 15:47:30.075 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:30.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:31.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:32 np0005588919 nova_compute[225855]: 2026-01-20 15:47:32.074 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:32 np0005588919 nova_compute[225855]: 2026-01-20 15:47:32.074 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:32 np0005588919 nova_compute[225855]: 2026-01-20 15:47:32.075 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:47:32 np0005588919 nova_compute[225855]: 2026-01-20 15:47:32.075 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:47:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:32.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:32 np0005588919 nova_compute[225855]: 2026-01-20 15:47:32.983 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:33.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:47:33 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:47:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:34.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.551 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.552 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.553 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.553 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.554 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.554 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.554 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.587 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.588 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.589 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.589 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:47:34 np0005588919 nova_compute[225855]: 2026-01-20 15:47:34.590 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:47:35 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:47:35 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3096368661' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:47:35 np0005588919 nova_compute[225855]: 2026-01-20 15:47:35.064 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:47:35 np0005588919 nova_compute[225855]: 2026-01-20 15:47:35.078 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:35 np0005588919 nova_compute[225855]: 2026-01-20 15:47:35.209 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:47:35 np0005588919 nova_compute[225855]: 2026-01-20 15:47:35.210 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4260MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:47:35 np0005588919 nova_compute[225855]: 2026-01-20 15:47:35.210 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:47:35 np0005588919 nova_compute[225855]: 2026-01-20 15:47:35.211 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:47:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:35.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:36.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:36 np0005588919 nova_compute[225855]: 2026-01-20 15:47:36.976 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:47:36 np0005588919 nova_compute[225855]: 2026-01-20 15:47:36.977 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:47:37 np0005588919 nova_compute[225855]: 2026-01-20 15:47:37.019 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:47:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:47:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2258595412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:47:37 np0005588919 nova_compute[225855]: 2026-01-20 15:47:37.709 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.690s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:47:37 np0005588919 nova_compute[225855]: 2026-01-20 15:47:37.714 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:47:37 np0005588919 nova_compute[225855]: 2026-01-20 15:47:37.731 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:47:37 np0005588919 nova_compute[225855]: 2026-01-20 15:47:37.733 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:47:37 np0005588919 nova_compute[225855]: 2026-01-20 15:47:37.733 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:47:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:37.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:37 np0005588919 nova_compute[225855]: 2026-01-20 15:47:37.986 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:38.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:39 np0005588919 nova_compute[225855]: 2026-01-20 15:47:39.519 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:39 np0005588919 nova_compute[225855]: 2026-01-20 15:47:39.520 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:39.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:40 np0005588919 nova_compute[225855]: 2026-01-20 15:47:40.079 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:40.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:41 np0005588919 nova_compute[225855]: 2026-01-20 15:47:41.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:41.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:42 np0005588919 podman[332280]: 2026-01-20 15:47:42.046900301 +0000 UTC m=+0.088145302 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:47:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:42.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:43 np0005588919 nova_compute[225855]: 2026-01-20 15:47:43.026 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:43.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:44.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:45 np0005588919 nova_compute[225855]: 2026-01-20 15:47:45.080 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:45.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:46.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:47.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:48 np0005588919 nova_compute[225855]: 2026-01-20 15:47:48.030 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:48.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:49 np0005588919 podman[332310]: 2026-01-20 15:47:49.02107224 +0000 UTC m=+0.067400566 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 20 10:47:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:49.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:50 np0005588919 nova_compute[225855]: 2026-01-20 15:47:50.081 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:47:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:50.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:47:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:47:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:51.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:47:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:52.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:53 np0005588919 nova_compute[225855]: 2026-01-20 15:47:53.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:53.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:54.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:47:54 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.0 total, 600.0 interval#012Cumulative writes: 18K writes, 91K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.18 GB, 0.03 MB/s#012Cumulative WAL: 18K writes, 18K syncs, 1.00 writes per sync, written: 0.18 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1462 writes, 7250 keys, 1462 commit groups, 1.0 writes per commit group, ingest: 15.21 MB, 0.03 MB/s#012Interval WAL: 1462 writes, 1462 syncs, 1.00 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     72.9      1.53              0.38        60    0.025       0      0       0.0       0.0#012  L6      1/0   11.80 MB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   5.5     96.5     83.0      7.38              1.96        59    0.125    474K    31K       0.0       0.0#012 Sum      1/0   11.80 MB   0.0      0.7     0.1      0.6       0.7      0.1       0.0   6.5     79.9     81.2      8.91              2.34       119    0.075    474K    31K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.8     81.1     80.8      0.99              0.25        12    0.082     67K   3081       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   0.0     96.5     83.0      7.38              1.96        59    0.125    474K    31K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     73.0      1.53              0.38        59    0.026       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6600.0 total, 600.0 interval#012Flush(GB): cumulative 0.109, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.71 GB write, 0.11 MB/s write, 0.70 GB read, 0.11 MB/s read, 8.9 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 1.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564d515a71f0#2 capacity: 304.00 MB usage: 76.40 MB table_size: 0 occupancy: 18446744073709551615 collections: 12 last_copies: 0 last_secs: 0.000442 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(4357,73.06 MB,24.0343%) FilterBlock(119,1.27 MB,0.417664%) IndexBlock(119,2.07 MB,0.681179%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 20 10:47:55 np0005588919 nova_compute[225855]: 2026-01-20 15:47:55.084 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:55.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:56.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:57.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:58 np0005588919 nova_compute[225855]: 2026-01-20 15:47:58.093 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:47:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:47:58.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:47:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:47:59 np0005588919 nova_compute[225855]: 2026-01-20 15:47:59.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:47:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:47:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:47:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:47:59.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:00 np0005588919 nova_compute[225855]: 2026-01-20 15:48:00.085 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:00.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:01.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:02.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:03 np0005588919 nova_compute[225855]: 2026-01-20 15:48:03.101 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:03.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:04.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:05 np0005588919 nova_compute[225855]: 2026-01-20 15:48:05.087 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:05.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:06.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:07.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:08 np0005588919 nova_compute[225855]: 2026-01-20 15:48:08.103 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:08.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:09.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:10 np0005588919 nova_compute[225855]: 2026-01-20 15:48:10.089 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:10.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:11.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:12.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:13 np0005588919 podman[332392]: 2026-01-20 15:48:13.020645949 +0000 UTC m=+0.072866420 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 20 10:48:13 np0005588919 nova_compute[225855]: 2026-01-20 15:48:13.105 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:13.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:14.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:15 np0005588919 nova_compute[225855]: 2026-01-20 15:48:15.090 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:15.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:16.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:48:16.470 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:48:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:48:16.470 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:48:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:48:16.471 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:48:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:17.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:18 np0005588919 nova_compute[225855]: 2026-01-20 15:48:18.107 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:18.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:19.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:20 np0005588919 podman[332472]: 2026-01-20 15:48:20.014498125 +0000 UTC m=+0.058397162 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:48:20 np0005588919 nova_compute[225855]: 2026-01-20 15:48:20.092 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:20.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:21.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:22.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:23 np0005588919 nova_compute[225855]: 2026-01-20 15:48:23.111 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:23.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:24.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:25 np0005588919 nova_compute[225855]: 2026-01-20 15:48:25.094 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:25.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:26.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:27.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:28 np0005588919 nova_compute[225855]: 2026-01-20 15:48:28.115 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:28.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:29 np0005588919 nova_compute[225855]: 2026-01-20 15:48:29.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:48:29 np0005588919 nova_compute[225855]: 2026-01-20 15:48:29.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:48:29 np0005588919 nova_compute[225855]: 2026-01-20 15:48:29.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:48:29 np0005588919 nova_compute[225855]: 2026-01-20 15:48:29.358 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:48:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:29.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:30 np0005588919 nova_compute[225855]: 2026-01-20 15:48:30.096 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:30.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:31.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:32.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:32 np0005588919 nova_compute[225855]: 2026-01-20 15:48:32.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:48:32 np0005588919 nova_compute[225855]: 2026-01-20 15:48:32.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:48:33 np0005588919 nova_compute[225855]: 2026-01-20 15:48:33.118 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:33 np0005588919 nova_compute[225855]: 2026-01-20 15:48:33.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:48:33 np0005588919 nova_compute[225855]: 2026-01-20 15:48:33.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:48:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:33.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:34.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:34 np0005588919 nova_compute[225855]: 2026-01-20 15:48:34.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:48:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:48:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:48:34 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:48:35 np0005588919 nova_compute[225855]: 2026-01-20 15:48:35.097 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:35.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:36.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:36 np0005588919 nova_compute[225855]: 2026-01-20 15:48:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:48:36 np0005588919 nova_compute[225855]: 2026-01-20 15:48:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:48:37 np0005588919 nova_compute[225855]: 2026-01-20 15:48:37.017 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:48:37 np0005588919 nova_compute[225855]: 2026-01-20 15:48:37.018 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:48:37 np0005588919 nova_compute[225855]: 2026-01-20 15:48:37.018 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:48:37 np0005588919 nova_compute[225855]: 2026-01-20 15:48:37.018 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:48:37 np0005588919 nova_compute[225855]: 2026-01-20 15:48:37.019 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:48:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:48:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/334353735' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:48:37 np0005588919 nova_compute[225855]: 2026-01-20 15:48:37.510 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:48:37 np0005588919 nova_compute[225855]: 2026-01-20 15:48:37.704 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:48:37 np0005588919 nova_compute[225855]: 2026-01-20 15:48:37.706 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4261MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:48:37 np0005588919 nova_compute[225855]: 2026-01-20 15:48:37.706 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:48:37 np0005588919 nova_compute[225855]: 2026-01-20 15:48:37.706 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:48:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:37.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:38 np0005588919 nova_compute[225855]: 2026-01-20 15:48:38.122 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:38.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:39.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:40 np0005588919 nova_compute[225855]: 2026-01-20 15:48:40.099 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:40 np0005588919 nova_compute[225855]: 2026-01-20 15:48:40.192 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:48:40 np0005588919 nova_compute[225855]: 2026-01-20 15:48:40.192 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:48:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:40.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:40 np0005588919 nova_compute[225855]: 2026-01-20 15:48:40.394 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:48:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:48:40 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3934550702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:48:40 np0005588919 nova_compute[225855]: 2026-01-20 15:48:40.877 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:48:40 np0005588919 nova_compute[225855]: 2026-01-20 15:48:40.883 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:48:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:48:40 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:48:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:41.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:42.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:43 np0005588919 nova_compute[225855]: 2026-01-20 15:48:43.126 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:43.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:43 np0005588919 nova_compute[225855]: 2026-01-20 15:48:43.915 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:48:43 np0005588919 nova_compute[225855]: 2026-01-20 15:48:43.918 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:48:43 np0005588919 nova_compute[225855]: 2026-01-20 15:48:43.918 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:48:44 np0005588919 podman[332779]: 2026-01-20 15:48:44.031664531 +0000 UTC m=+0.079479228 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible)
Jan 20 10:48:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:44.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:44 np0005588919 nova_compute[225855]: 2026-01-20 15:48:44.921 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:48:44 np0005588919 nova_compute[225855]: 2026-01-20 15:48:44.921 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:48:45 np0005588919 nova_compute[225855]: 2026-01-20 15:48:45.101 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:45.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:46.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:47.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:48 np0005588919 nova_compute[225855]: 2026-01-20 15:48:48.129 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:48.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:49.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:50 np0005588919 nova_compute[225855]: 2026-01-20 15:48:50.139 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:48:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:50.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:48:51 np0005588919 podman[332811]: 2026-01-20 15:48:51.018447587 +0000 UTC m=+0.062959830 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Jan 20 10:48:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:51.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:52.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:53 np0005588919 nova_compute[225855]: 2026-01-20 15:48:53.178 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:53.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:54.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:55 np0005588919 nova_compute[225855]: 2026-01-20 15:48:55.173 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:48:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:55.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:48:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:56.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:57.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:58 np0005588919 nova_compute[225855]: 2026-01-20 15:48:58.181 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:48:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:48:58.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:48:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:48:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:48:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:48:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:48:59.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:00 np0005588919 nova_compute[225855]: 2026-01-20 15:49:00.177 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:00.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:01.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:02.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:03 np0005588919 nova_compute[225855]: 2026-01-20 15:49:03.183 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #190. Immutable memtables: 0.
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.838206) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 121] Flushing memtable with next log file: 190
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143838296, "job": 121, "event": "flush_started", "num_memtables": 1, "num_entries": 1349, "num_deletes": 251, "total_data_size": 3020198, "memory_usage": 3057200, "flush_reason": "Manual Compaction"}
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 121] Level-0 flush table #191: started
Jan 20 10:49:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:03.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143862552, "cf_name": "default", "job": 121, "event": "table_file_creation", "file_number": 191, "file_size": 1982070, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 91001, "largest_seqno": 92345, "table_properties": {"data_size": 1976282, "index_size": 3118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12420, "raw_average_key_size": 19, "raw_value_size": 1964668, "raw_average_value_size": 3153, "num_data_blocks": 139, "num_entries": 623, "num_filter_entries": 623, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768924029, "oldest_key_time": 1768924029, "file_creation_time": 1768924143, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 191, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 121] Flush lasted 24465 microseconds, and 9735 cpu microseconds.
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.862685) [db/flush_job.cc:967] [default] [JOB 121] Level-0 flush table #191: 1982070 bytes OK
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.862729) [db/memtable_list.cc:519] [default] Level-0 commit table #191 started
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.864450) [db/memtable_list.cc:722] [default] Level-0 commit table #191: memtable #1 done
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.864463) EVENT_LOG_v1 {"time_micros": 1768924143864458, "job": 121, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.864483) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 121] Try to delete WAL files size 3013864, prev total WAL file size 3013864, number of live WAL files 2.
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000187.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.865637) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038303332' seq:72057594037927935, type:22 .. '7061786F730038323834' seq:0, type:0; will stop at (end)
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 122] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 121 Base level 0, inputs: [191(1935KB)], [189(11MB)]
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143865668, "job": 122, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [191], "files_L6": [189], "score": -1, "input_data_size": 14355705, "oldest_snapshot_seqno": -1}
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 122] Generated table #192: 10950 keys, 12391529 bytes, temperature: kUnknown
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143959928, "cf_name": "default", "job": 122, "event": "table_file_creation", "file_number": 192, "file_size": 12391529, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12323581, "index_size": 39483, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27397, "raw_key_size": 290348, "raw_average_key_size": 26, "raw_value_size": 12134572, "raw_average_value_size": 1108, "num_data_blocks": 1490, "num_entries": 10950, "num_filter_entries": 10950, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768924143, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 192, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.960235) [db/compaction/compaction_job.cc:1663] [default] [JOB 122] Compacted 1@0 + 1@6 files to L6 => 12391529 bytes
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.961616) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 152.1 rd, 131.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.8 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(13.5) write-amplify(6.3) OK, records in: 11465, records dropped: 515 output_compression: NoCompression
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.961636) EVENT_LOG_v1 {"time_micros": 1768924143961626, "job": 122, "event": "compaction_finished", "compaction_time_micros": 94375, "compaction_time_cpu_micros": 28004, "output_level": 6, "num_output_files": 1, "total_output_size": 12391529, "num_input_records": 11465, "num_output_records": 10950, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000191.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143962135, "job": 122, "event": "table_file_deletion", "file_number": 191}
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000189.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924143964294, "job": 122, "event": "table_file_deletion", "file_number": 189}
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.865549) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.964346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.964352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.964353) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.964355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:03 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:49:03.964356) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:49:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:04.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:05 np0005588919 nova_compute[225855]: 2026-01-20 15:49:05.178 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:05.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:06.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:07.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:08 np0005588919 nova_compute[225855]: 2026-01-20 15:49:08.187 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:08.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:09.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:10 np0005588919 nova_compute[225855]: 2026-01-20 15:49:10.180 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:10.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:11.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:12.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:13 np0005588919 nova_compute[225855]: 2026-01-20 15:49:13.190 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:13.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:14.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:15 np0005588919 podman[332893]: 2026-01-20 15:49:15.070601915 +0000 UTC m=+0.116539005 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 20 10:49:15 np0005588919 nova_compute[225855]: 2026-01-20 15:49:15.182 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:15.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:16.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:49:16.471 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:49:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:49:16.472 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:49:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:49:16.472 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:49:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:17.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:18 np0005588919 nova_compute[225855]: 2026-01-20 15:49:18.193 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:18.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:19.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:20 np0005588919 nova_compute[225855]: 2026-01-20 15:49:20.184 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:20.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:21.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:22 np0005588919 podman[332975]: 2026-01-20 15:49:22.014091288 +0000 UTC m=+0.056900039 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 20 10:49:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:22.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:23 np0005588919 nova_compute[225855]: 2026-01-20 15:49:23.196 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:23.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:24.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:25 np0005588919 nova_compute[225855]: 2026-01-20 15:49:25.187 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:25.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:26.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:27.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:28 np0005588919 nova_compute[225855]: 2026-01-20 15:49:28.198 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:28.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:29.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:30 np0005588919 nova_compute[225855]: 2026-01-20 15:49:30.189 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:30.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:30 np0005588919 nova_compute[225855]: 2026-01-20 15:49:30.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:30 np0005588919 nova_compute[225855]: 2026-01-20 15:49:30.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:49:30 np0005588919 nova_compute[225855]: 2026-01-20 15:49:30.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:49:30 np0005588919 nova_compute[225855]: 2026-01-20 15:49:30.373 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:49:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:31.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:32.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:33 np0005588919 nova_compute[225855]: 2026-01-20 15:49:33.202 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:33 np0005588919 nova_compute[225855]: 2026-01-20 15:49:33.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:33.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:34.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:34 np0005588919 nova_compute[225855]: 2026-01-20 15:49:34.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:34 np0005588919 nova_compute[225855]: 2026-01-20 15:49:34.341 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:49:35 np0005588919 nova_compute[225855]: 2026-01-20 15:49:35.190 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:35 np0005588919 nova_compute[225855]: 2026-01-20 15:49:35.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:35.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:49:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:36.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:49:36 np0005588919 nova_compute[225855]: 2026-01-20 15:49:36.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:36 np0005588919 nova_compute[225855]: 2026-01-20 15:49:36.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:36 np0005588919 nova_compute[225855]: 2026-01-20 15:49:36.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:36 np0005588919 nova_compute[225855]: 2026-01-20 15:49:36.406 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:49:36 np0005588919 nova_compute[225855]: 2026-01-20 15:49:36.406 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:49:36 np0005588919 nova_compute[225855]: 2026-01-20 15:49:36.406 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:49:36 np0005588919 nova_compute[225855]: 2026-01-20 15:49:36.407 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:49:36 np0005588919 nova_compute[225855]: 2026-01-20 15:49:36.407 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:49:37 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:49:37 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4206473289' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.324 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.917s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.489 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.490 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4260MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.491 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.491 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.620 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.621 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.713 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing inventories for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.742 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating ProviderTree inventory for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.743 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Updating inventory in ProviderTree for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.767 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing aggregate associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.805 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Refreshing trait associations for resource provider bbb02880-a710-4ac1-8b2c-5c09765848d1, traits: COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 20 10:49:37 np0005588919 nova_compute[225855]: 2026-01-20 15:49:37.831 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:49:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:37.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:38 np0005588919 nova_compute[225855]: 2026-01-20 15:49:38.205 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:49:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2377546314' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:49:38 np0005588919 nova_compute[225855]: 2026-01-20 15:49:38.281 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:49:38 np0005588919 nova_compute[225855]: 2026-01-20 15:49:38.287 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:49:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:38.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:38 np0005588919 nova_compute[225855]: 2026-01-20 15:49:38.308 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:49:38 np0005588919 nova_compute[225855]: 2026-01-20 15:49:38.309 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:49:38 np0005588919 nova_compute[225855]: 2026-01-20 15:49:38.310 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.819s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:49:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:39 np0005588919 nova_compute[225855]: 2026-01-20 15:49:39.310 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:39.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:40 np0005588919 nova_compute[225855]: 2026-01-20 15:49:40.192 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:40.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:41.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:49:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:49:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:49:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:49:42 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:49:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:42.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:42 np0005588919 nova_compute[225855]: 2026-01-20 15:49:42.336 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:43 np0005588919 nova_compute[225855]: 2026-01-20 15:49:43.210 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:43.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:44.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:45 np0005588919 nova_compute[225855]: 2026-01-20 15:49:45.194 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:45.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:46 np0005588919 podman[333230]: 2026-01-20 15:49:46.037972496 +0000 UTC m=+0.083437289 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:49:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:46.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:49:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:47.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:49:48 np0005588919 nova_compute[225855]: 2026-01-20 15:49:48.212 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:48.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:49:49 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:49:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:49.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:50 np0005588919 nova_compute[225855]: 2026-01-20 15:49:50.195 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:50.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:51.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:52.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:52 np0005588919 podman[333310]: 2026-01-20 15:49:52.996994098 +0000 UTC m=+0.046881876 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:49:53 np0005588919 nova_compute[225855]: 2026-01-20 15:49:53.215 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:53.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:54.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:55 np0005588919 nova_compute[225855]: 2026-01-20 15:49:55.198 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:55.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:56.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:57.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:58 np0005588919 nova_compute[225855]: 2026-01-20 15:49:58.219 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:49:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:49:58.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:49:58 np0005588919 nova_compute[225855]: 2026-01-20 15:49:58.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:49:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:49:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:49:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:49:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:49:59.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:00 np0005588919 nova_compute[225855]: 2026-01-20 15:50:00.200 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:00.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:00 np0005588919 nova_compute[225855]: 2026-01-20 15:50:00.351 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:00 np0005588919 ceph-mon[81775]: overall HEALTH_OK
Jan 20 10:50:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:01.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:02.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:03 np0005588919 nova_compute[225855]: 2026-01-20 15:50:03.222 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:03.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:04.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:05 np0005588919 nova_compute[225855]: 2026-01-20 15:50:05.202 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:05.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:06.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:07.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:08 np0005588919 nova_compute[225855]: 2026-01-20 15:50:08.225 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:08 np0005588919 nova_compute[225855]: 2026-01-20 15:50:08.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:08 np0005588919 nova_compute[225855]: 2026-01-20 15:50:08.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 20 10:50:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:08.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:09.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:10 np0005588919 nova_compute[225855]: 2026-01-20 15:50:10.203 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:10.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:11.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:12.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:13 np0005588919 nova_compute[225855]: 2026-01-20 15:50:13.227 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:13 np0005588919 nova_compute[225855]: 2026-01-20 15:50:13.360 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:13 np0005588919 nova_compute[225855]: 2026-01-20 15:50:13.360 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 20 10:50:13 np0005588919 nova_compute[225855]: 2026-01-20 15:50:13.396 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 20 10:50:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:13.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:14.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:15 np0005588919 nova_compute[225855]: 2026-01-20 15:50:15.206 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:15.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:16.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:50:16.473 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:50:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:50:16.473 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:50:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:50:16.473 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:50:16 np0005588919 podman[333415]: 2026-01-20 15:50:16.803326107 +0000 UTC m=+0.078083798 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 20 10:50:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:17.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:18 np0005588919 nova_compute[225855]: 2026-01-20 15:50:18.230 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:18.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:19 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:19 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:19 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:19.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:20 np0005588919 nova_compute[225855]: 2026-01-20 15:50:20.208 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:20.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:21 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:21 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:21 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:21.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:22.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:23 np0005588919 nova_compute[225855]: 2026-01-20 15:50:23.233 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:23 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:23 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:23 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:23.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:24 np0005588919 podman[333471]: 2026-01-20 15:50:24.018454057 +0000 UTC m=+0.067194020 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 20 10:50:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:24.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:25 np0005588919 nova_compute[225855]: 2026-01-20 15:50:25.211 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:25 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:25 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:25 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:25.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:26.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:27 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:27 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:27 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:27.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:28 np0005588919 nova_compute[225855]: 2026-01-20 15:50:28.237 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:28.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:29 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:29 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:29 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:29.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:30 np0005588919 nova_compute[225855]: 2026-01-20 15:50:30.211 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:30.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:31 np0005588919 nova_compute[225855]: 2026-01-20 15:50:31.377 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:31 np0005588919 nova_compute[225855]: 2026-01-20 15:50:31.377 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:50:31 np0005588919 nova_compute[225855]: 2026-01-20 15:50:31.378 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:50:31 np0005588919 nova_compute[225855]: 2026-01-20 15:50:31.391 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:50:31 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:31 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:31 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:31.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:32.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:33 np0005588919 nova_compute[225855]: 2026-01-20 15:50:33.266 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:33 np0005588919 nova_compute[225855]: 2026-01-20 15:50:33.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:33 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:33 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:33 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:33.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:34.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:35 np0005588919 nova_compute[225855]: 2026-01-20 15:50:35.213 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:35 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:35 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:35 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:35.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:36 np0005588919 nova_compute[225855]: 2026-01-20 15:50:36.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:36 np0005588919 nova_compute[225855]: 2026-01-20 15:50:36.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:36 np0005588919 nova_compute[225855]: 2026-01-20 15:50:36.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:50:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:36.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:37 np0005588919 nova_compute[225855]: 2026-01-20 15:50:37.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:37 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:37 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:37 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:37.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:38 np0005588919 nova_compute[225855]: 2026-01-20 15:50:38.270 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:38 np0005588919 nova_compute[225855]: 2026-01-20 15:50:38.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:38 np0005588919 nova_compute[225855]: 2026-01-20 15:50:38.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:38 np0005588919 nova_compute[225855]: 2026-01-20 15:50:38.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:38 np0005588919 nova_compute[225855]: 2026-01-20 15:50:38.373 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:50:38 np0005588919 nova_compute[225855]: 2026-01-20 15:50:38.373 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:50:38 np0005588919 nova_compute[225855]: 2026-01-20 15:50:38.374 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:50:38 np0005588919 nova_compute[225855]: 2026-01-20 15:50:38.374 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:50:38 np0005588919 nova_compute[225855]: 2026-01-20 15:50:38.375 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:50:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:38.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:38 np0005588919 nova_compute[225855]: 2026-01-20 15:50:38.858 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.077 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.079 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4272MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.079 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.079 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.149 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.149 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.184 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:50:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:50:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1405845245' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.638 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.644 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.657 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.658 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:50:39 np0005588919 nova_compute[225855]: 2026-01-20 15:50:39.659 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:50:39 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:39 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:39 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:39.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:40 np0005588919 nova_compute[225855]: 2026-01-20 15:50:40.215 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:40.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:41 np0005588919 nova_compute[225855]: 2026-01-20 15:50:41.208 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:41 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:41 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:41 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:41.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:42.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:43 np0005588919 nova_compute[225855]: 2026-01-20 15:50:43.319 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:43 np0005588919 nova_compute[225855]: 2026-01-20 15:50:43.360 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:43 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:43 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:43 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:43.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:44.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:45 np0005588919 nova_compute[225855]: 2026-01-20 15:50:45.217 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:45 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:45 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:45 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:45.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:46.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:47 np0005588919 podman[333596]: 2026-01-20 15:50:47.016724951 +0000 UTC m=+0.064848273 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 20 10:50:47 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:47 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:47 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:47.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:48 np0005588919 nova_compute[225855]: 2026-01-20 15:50:48.338 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:48.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:49 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:49 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:49 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:49.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:50 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 20 10:50:50 np0005588919 nova_compute[225855]: 2026-01-20 15:50:50.219 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:50.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:50 np0005588919 nova_compute[225855]: 2026-01-20 15:50:50.947 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:50:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 20 10:50:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:50:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:51 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:50:51 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:51 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:51 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:51.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:50:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:52.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:50:53 np0005588919 nova_compute[225855]: 2026-01-20 15:50:53.376 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:53 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:53 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:53 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:53.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:50:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:54.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:50:55 np0005588919 podman[333875]: 2026-01-20 15:50:55.034422661 +0000 UTC m=+0.085265780 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 20 10:50:55 np0005588919 nova_compute[225855]: 2026-01-20 15:50:55.222 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:55 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:55 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:55 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:55.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:56.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:56 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:50:57 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:57 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:57 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:57.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:58 np0005588919 nova_compute[225855]: 2026-01-20 15:50:58.410 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:50:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:50:58.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:50:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:50:59 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:50:59 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:50:59 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:50:59.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:00 np0005588919 nova_compute[225855]: 2026-01-20 15:51:00.225 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:00.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:01 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:01 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:01 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:01.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:02.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:03 np0005588919 nova_compute[225855]: 2026-01-20 15:51:03.413 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:03 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:03 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:51:03 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:03.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:51:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:04.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:05 np0005588919 nova_compute[225855]: 2026-01-20 15:51:05.227 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:05 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:05 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:05 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:05.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:06.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:07 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:07 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:07 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:07.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:08.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:08 np0005588919 nova_compute[225855]: 2026-01-20 15:51:08.453 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:09 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:09 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:09 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:09.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:10 np0005588919 nova_compute[225855]: 2026-01-20 15:51:10.229 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:10.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:11 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:11 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:11 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:11.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:12.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:13 np0005588919 nova_compute[225855]: 2026-01-20 15:51:13.510 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 20 10:51:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3537550131' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 20 10:51:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 20 10:51:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3537550131' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 20 10:51:13 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:13 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:13 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:13.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:51:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:14.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:51:15 np0005588919 nova_compute[225855]: 2026-01-20 15:51:15.231 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:15 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:15 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:15 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:15.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:16.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:51:16.474 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:51:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:51:16.474 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:51:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:51:16.474 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:51:17 np0005588919 podman[334028]: 2026-01-20 15:51:17.42758877 +0000 UTC m=+0.109293958 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 20 10:51:17 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:17 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:17 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:17.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:18.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:18 np0005588919 nova_compute[225855]: 2026-01-20 15:51:18.512 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:19.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:20 np0005588919 nova_compute[225855]: 2026-01-20 15:51:20.233 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:20.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:51:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:22.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:51:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:22.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:23 np0005588919 nova_compute[225855]: 2026-01-20 15:51:23.546 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:24.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:24.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:25 np0005588919 nova_compute[225855]: 2026-01-20 15:51:25.236 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:26.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:26 np0005588919 podman[334085]: 2026-01-20 15:51:26.053805609 +0000 UTC m=+0.085213019 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 20 10:51:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:26.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:28.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:28.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:28 np0005588919 nova_compute[225855]: 2026-01-20 15:51:28.549 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:30.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:30 np0005588919 nova_compute[225855]: 2026-01-20 15:51:30.237 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:30.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:32.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:32.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:33 np0005588919 nova_compute[225855]: 2026-01-20 15:51:33.343 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:33 np0005588919 nova_compute[225855]: 2026-01-20 15:51:33.344 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:51:33 np0005588919 nova_compute[225855]: 2026-01-20 15:51:33.344 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:51:33 np0005588919 nova_compute[225855]: 2026-01-20 15:51:33.359 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:51:33 np0005588919 nova_compute[225855]: 2026-01-20 15:51:33.359 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:33 np0005588919 nova_compute[225855]: 2026-01-20 15:51:33.552 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:34.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:34.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:35 np0005588919 nova_compute[225855]: 2026-01-20 15:51:35.240 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:36.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:36.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:38.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.434 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.435 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.435 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.435 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.436 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:51:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:38.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.555 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:38 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:51:38 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/701871106' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:51:38 np0005588919 nova_compute[225855]: 2026-01-20 15:51:38.873 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.068 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.069 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4259MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.070 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.070 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:51:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.230 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.231 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.255 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:51:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:51:39 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1389155706' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.686 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.693 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.851 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.854 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:51:39 np0005588919 nova_compute[225855]: 2026-01-20 15:51:39.854 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:51:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:40.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:40 np0005588919 nova_compute[225855]: 2026-01-20 15:51:40.241 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:40.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:40 np0005588919 nova_compute[225855]: 2026-01-20 15:51:40.854 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:40 np0005588919 nova_compute[225855]: 2026-01-20 15:51:40.855 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:42.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:42.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:43 np0005588919 nova_compute[225855]: 2026-01-20 15:51:43.559 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:44.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:51:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:44.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:51:45 np0005588919 nova_compute[225855]: 2026-01-20 15:51:45.243 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:45 np0005588919 nova_compute[225855]: 2026-01-20 15:51:45.335 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:51:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 20 10:51:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:46.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 20 10:51:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:46.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:48.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:48 np0005588919 podman[334211]: 2026-01-20 15:51:48.091969576 +0000 UTC m=+0.129221023 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 20 10:51:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:48.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:48 np0005588919 nova_compute[225855]: 2026-01-20 15:51:48.560 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:50.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:50 np0005588919 nova_compute[225855]: 2026-01-20 15:51:50.245 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:50.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:52.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:52.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:53 np0005588919 nova_compute[225855]: 2026-01-20 15:51:53.563 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:54.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:51:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:54.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:55 np0005588919 nova_compute[225855]: 2026-01-20 15:51:55.247 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:51:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:56.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:51:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:51:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:56.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:51:57 np0005588919 podman[334241]: 2026-01-20 15:51:57.020718714 +0000 UTC m=+0.065244265 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:51:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:51:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:51:58.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:51:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:51:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:51:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:51:58.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:51:58 np0005588919 nova_compute[225855]: 2026-01-20 15:51:58.568 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:51:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 20 10:51:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:51:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:51:58 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:51:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:00.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:00 np0005588919 nova_compute[225855]: 2026-01-20 15:52:00.251 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:00.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:02.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:02.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:03 np0005588919 nova_compute[225855]: 2026-01-20 15:52:03.583 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:04.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:04.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:52:04 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:52:05 np0005588919 nova_compute[225855]: 2026-01-20 15:52:05.253 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:05 np0005588919 nova_compute[225855]: 2026-01-20 15:52:05.334 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:06.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:06.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:08.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:08.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:08 np0005588919 nova_compute[225855]: 2026-01-20 15:52:08.585 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:10.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:10 np0005588919 nova_compute[225855]: 2026-01-20 15:52:10.254 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:10.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:12.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:12.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:13 np0005588919 nova_compute[225855]: 2026-01-20 15:52:13.588 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:14.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:14.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:15 np0005588919 nova_compute[225855]: 2026-01-20 15:52:15.257 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:16.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:52:16.475 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:52:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:52:16.475 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:52:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:52:16.476 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:52:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:16.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:18.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:18.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:18 np0005588919 nova_compute[225855]: 2026-01-20 15:52:18.591 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:19 np0005588919 podman[334552]: 2026-01-20 15:52:19.089830177 +0000 UTC m=+0.126619399 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 20 10:52:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:20.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:20 np0005588919 nova_compute[225855]: 2026-01-20 15:52:20.259 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:20.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:22.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:22.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:23 np0005588919 nova_compute[225855]: 2026-01-20 15:52:23.594 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:24.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:24.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:25 np0005588919 nova_compute[225855]: 2026-01-20 15:52:25.262 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:26.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:26.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:28 np0005588919 podman[334585]: 2026-01-20 15:52:28.009341635 +0000 UTC m=+0.055661044 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 20 10:52:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:28.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:28.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:28 np0005588919 nova_compute[225855]: 2026-01-20 15:52:28.598 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:30.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:30 np0005588919 nova_compute[225855]: 2026-01-20 15:52:30.263 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:30.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:32.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:32 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:32 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 20 10:52:32 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:32.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 20 10:52:33 np0005588919 nova_compute[225855]: 2026-01-20 15:52:33.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:33 np0005588919 nova_compute[225855]: 2026-01-20 15:52:33.601 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:34.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:34 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:34 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:34 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:34 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:34.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:35 np0005588919 nova_compute[225855]: 2026-01-20 15:52:35.299 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:35 np0005588919 nova_compute[225855]: 2026-01-20 15:52:35.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:35 np0005588919 nova_compute[225855]: 2026-01-20 15:52:35.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 20 10:52:35 np0005588919 nova_compute[225855]: 2026-01-20 15:52:35.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 20 10:52:35 np0005588919 nova_compute[225855]: 2026-01-20 15:52:35.362 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 20 10:52:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:36.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:36 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:36 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:36 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:36.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:38.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:38 np0005588919 nova_compute[225855]: 2026-01-20 15:52:38.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:38 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:38 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:38 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:38.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:38 np0005588919 nova_compute[225855]: 2026-01-20 15:52:38.605 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:39 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:39 np0005588919 nova_compute[225855]: 2026-01-20 15:52:39.340 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:40.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:40 np0005588919 nova_compute[225855]: 2026-01-20 15:52:40.300 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:40 np0005588919 nova_compute[225855]: 2026-01-20 15:52:40.339 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:40 np0005588919 nova_compute[225855]: 2026-01-20 15:52:40.340 225859 DEBUG nova.compute.manager [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 20 10:52:40 np0005588919 nova_compute[225855]: 2026-01-20 15:52:40.341 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:40 np0005588919 nova_compute[225855]: 2026-01-20 15:52:40.374 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:52:40 np0005588919 nova_compute[225855]: 2026-01-20 15:52:40.374 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:52:40 np0005588919 nova_compute[225855]: 2026-01-20 15:52:40.375 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:52:40 np0005588919 nova_compute[225855]: 2026-01-20 15:52:40.375 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 20 10:52:40 np0005588919 nova_compute[225855]: 2026-01-20 15:52:40.375 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:52:40 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:40 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:40 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:40.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:40 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:52:40 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1369305377' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:52:40 np0005588919 nova_compute[225855]: 2026-01-20 15:52:40.852 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.027 225859 WARNING nova.virt.libvirt.driver [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.029 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4251MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.030 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.031 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.089 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.090 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.115 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 20 10:52:41 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 20 10:52:41 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/405040981' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.681 225859 DEBUG oslo_concurrency.processutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.566s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.688 225859 DEBUG nova.compute.provider_tree [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed in ProviderTree for provider: bbb02880-a710-4ac1-8b2c-5c09765848d1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.845 225859 DEBUG nova.scheduler.client.report [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Inventory has not changed for provider bbb02880-a710-4ac1-8b2c-5c09765848d1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.847 225859 DEBUG nova.compute.resource_tracker [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 20 10:52:41 np0005588919 nova_compute[225855]: 2026-01-20 15:52:41.847 225859 DEBUG oslo_concurrency.lockutils [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:52:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:42.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:42 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:42 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:42 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:42.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:42 np0005588919 nova_compute[225855]: 2026-01-20 15:52:42.848 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:42 np0005588919 nova_compute[225855]: 2026-01-20 15:52:42.849 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:43 np0005588919 nova_compute[225855]: 2026-01-20 15:52:43.609 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:44.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #193. Immutable memtables: 0.
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.459587) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:856] [default] [JOB 123] Flushing memtable with next log file: 193
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364459682, "job": 123, "event": "flush_started", "num_memtables": 1, "num_entries": 2347, "num_deletes": 251, "total_data_size": 5826467, "memory_usage": 5894800, "flush_reason": "Manual Compaction"}
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:885] [default] [JOB 123] Level-0 flush table #194: started
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364498538, "cf_name": "default", "job": 123, "event": "table_file_creation", "file_number": 194, "file_size": 3812952, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 92350, "largest_seqno": 94692, "table_properties": {"data_size": 3803442, "index_size": 6003, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19248, "raw_average_key_size": 20, "raw_value_size": 3784588, "raw_average_value_size": 3992, "num_data_blocks": 262, "num_entries": 948, "num_filter_entries": 948, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768924145, "oldest_key_time": 1768924145, "file_creation_time": 1768924364, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 194, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 123] Flush lasted 39001 microseconds, and 7615 cpu microseconds.
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.498581) [db/flush_job.cc:967] [default] [JOB 123] Level-0 flush table #194: 3812952 bytes OK
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.498606) [db/memtable_list.cc:519] [default] Level-0 commit table #194 started
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.500048) [db/memtable_list.cc:722] [default] Level-0 commit table #194: memtable #1 done
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.500060) EVENT_LOG_v1 {"time_micros": 1768924364500056, "job": 123, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.500080) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 123] Try to delete WAL files size 5816207, prev total WAL file size 5816207, number of live WAL files 2.
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000190.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.501399) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038323833' seq:72057594037927935, type:22 .. '7061786F730038353335' seq:0, type:0; will stop at (end)
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 124] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 123 Base level 0, inputs: [194(3723KB)], [192(11MB)]
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364501460, "job": 124, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [194], "files_L6": [192], "score": -1, "input_data_size": 16204481, "oldest_snapshot_seqno": -1}
Jan 20 10:52:44 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:44 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:52:44 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:44.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 124] Generated table #195: 11379 keys, 14207870 bytes, temperature: kUnknown
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364645302, "cf_name": "default", "job": 124, "event": "table_file_creation", "file_number": 195, "file_size": 14207870, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14135506, "index_size": 42811, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28485, "raw_key_size": 299918, "raw_average_key_size": 26, "raw_value_size": 13937506, "raw_average_value_size": 1224, "num_data_blocks": 1628, "num_entries": 11379, "num_filter_entries": 11379, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1768917474, "oldest_key_time": 0, "file_creation_time": 1768924364, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1539d774-8a6f-4e48-b253-137c44586344", "db_session_id": "LFF7G2OZDOU7TKQ8MKAH", "orig_file_number": 195, "seqno_to_time_mapping": "N/A"}}
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.645558) [db/compaction/compaction_job.cc:1663] [default] [JOB 124] Compacted 1@0 + 1@6 files to L6 => 14207870 bytes
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.668665) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 112.6 rd, 98.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 11.8 +0.0 blob) out(13.5 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 11898, records dropped: 519 output_compression: NoCompression
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.668709) EVENT_LOG_v1 {"time_micros": 1768924364668692, "job": 124, "event": "compaction_finished", "compaction_time_micros": 143931, "compaction_time_cpu_micros": 32264, "output_level": 6, "num_output_files": 1, "total_output_size": 14207870, "num_input_records": 11898, "num_output_records": 11379, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000194.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364669764, "job": 124, "event": "table_file_deletion", "file_number": 194}
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000192.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: EVENT_LOG_v1 {"time_micros": 1768924364672596, "job": 124, "event": "table_file_deletion", "file_number": 192}
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.501248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.672672) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.672677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.672678) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.672680) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:44 np0005588919 ceph-mon[81775]: rocksdb: (Original Log Time 2026/01/20-15:52:44.672681) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 20 10:52:45 np0005588919 nova_compute[225855]: 2026-01-20 15:52:45.302 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:46.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:46 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:46 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:46 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:46.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:47 np0005588919 nova_compute[225855]: 2026-01-20 15:52:47.337 225859 DEBUG oslo_service.periodic_task [None req-7a792283-28fa-4bd6-942f-00a63e52f6c5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 20 10:52:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:48.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:48 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:48 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:48 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:48.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:48 np0005588919 nova_compute[225855]: 2026-01-20 15:52:48.614 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:49 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:50 np0005588919 podman[334710]: 2026-01-20 15:52:50.101103859 +0000 UTC m=+0.142437837 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 20 10:52:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:50.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:50 np0005588919 nova_compute[225855]: 2026-01-20 15:52:50.304 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:50 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:50 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:50 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:50.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:52.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:52 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:52 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:52 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:52.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:53 np0005588919 nova_compute[225855]: 2026-01-20 15:52:53.616 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:54.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:54 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:52:54 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:54 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:54 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:54.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:55 np0005588919 nova_compute[225855]: 2026-01-20 15:52:55.306 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:56.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:56 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:56 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:56 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:56.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:52:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:52:58.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:52:58 np0005588919 podman[334766]: 2026-01-20 15:52:58.210729817 +0000 UTC m=+0.076468562 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 20 10:52:58 np0005588919 nova_compute[225855]: 2026-01-20 15:52:58.620 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:52:58 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:52:58 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:52:58 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:52:58.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:52:59 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:00.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:00 np0005588919 nova_compute[225855]: 2026-01-20 15:53:00.309 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:00 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:00 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:00 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:00.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:01 np0005588919 systemd-logind[783]: New session 75 of user zuul.
Jan 20 10:53:01 np0005588919 systemd[1]: Started Session 75 of User zuul.
Jan 20 10:53:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:02.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:02 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:02 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:02 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:02.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:03 np0005588919 nova_compute[225855]: 2026-01-20 15:53:03.623 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:04.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:04 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:04 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:04 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:04 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:04.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:05 np0005588919 podman[335162]: 2026-01-20 15:53:05.010336572 +0000 UTC m=+0.081529795 container exec 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 20 10:53:05 np0005588919 podman[335162]: 2026-01-20 15:53:05.130488858 +0000 UTC m=+0.201682111 container exec_died 718ebba7a543e42aad7051248d2c7dc014068c35c89c5b87f27b82d4de39c009 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 20 10:53:05 np0005588919 nova_compute[225855]: 2026-01-20 15:53:05.310 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:05 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 20 10:53:05 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1223251623' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 20 10:53:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:53:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:06.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:53:06 np0005588919 podman[335371]: 2026-01-20 15:53:06.172790835 +0000 UTC m=+0.480730237 container exec 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:53:06 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:06 np0005588919 podman[335371]: 2026-01-20 15:53:06.212365144 +0000 UTC m=+0.520304496 container exec_died 25e2c3387bc15944c21038272559da5fbf75910d8dd4add0faa995fb4e0f7788 (image=quay.io/ceph/haproxy:2.3, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-haproxy-rgw-default-compute-1-uyeocq)
Jan 20 10:53:06 np0005588919 podman[335462]: 2026-01-20 15:53:06.494857767 +0000 UTC m=+0.063582388 container exec e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, com.redhat.component=keepalived-container, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, vcs-type=git, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 20 10:53:06 np0005588919 podman[335462]: 2026-01-20 15:53:06.510037106 +0000 UTC m=+0.078761687 container exec_died e27b69e4cc956b06482c80498336e112a56122514cd7345d3d4b39a4d206f962 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e399cf45-e6b6-5393-99f1-75c601d3f188-keepalived-rgw-default-compute-1-cevitz, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, architecture=x86_64, version=2.2.4, vcs-type=git, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.buildah.version=1.28.2, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived)
Jan 20 10:53:06 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:06 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:06 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:06.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:07 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:08.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 20 10:53:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:08 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 20 10:53:08 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:08 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:08 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:08.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:08 np0005588919 nova_compute[225855]: 2026-01-20 15:53:08.668 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:09 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:09 np0005588919 ovs-vsctl[335664]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 20 10:53:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:10.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:10 np0005588919 nova_compute[225855]: 2026-01-20 15:53:10.311 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:10 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:10 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:10 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:10.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:11 np0005588919 virtqemud[225396]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 20 10:53:11 np0005588919 virtqemud[225396]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 20 10:53:11 np0005588919 virtqemud[225396]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 20 10:53:11 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: cache status {prefix=cache status} (starting...)
Jan 20 10:53:11 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:11 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: client ls {prefix=client ls} (starting...)
Jan 20 10:53:11 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:11 np0005588919 lvm[335984]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 20 10:53:11 np0005588919 lvm[335984]: VG ceph_vg0 finished
Jan 20 10:53:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:12.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: damage ls {prefix=damage ls} (starting...)
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: dump loads {prefix=dump loads} (starting...)
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:12 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 20 10:53:12 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3384103310' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 20 10:53:12 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:12 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:12 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:12.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 20 10:53:12 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 20 10:53:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/620047485' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 20 10:53:13 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 20 10:53:13 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:13 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 20 10:53:13 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 20 10:53:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3593184449' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 20 10:53:13 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: ops {prefix=ops} (starting...)
Jan 20 10:53:13 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:13 np0005588919 nova_compute[225855]: 2026-01-20 15:53:13.676 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 20 10:53:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1882969061' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 20 10:53:13 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:13 np0005588919 ceph-mon[81775]: from='mgr.14132 192.168.122.100:0/3912305496' entity='mgr.compute-0.wookjv' 
Jan 20 10:53:13 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 20 10:53:13 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4139919592' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 20 10:53:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:14.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 20 10:53:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/817991170' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 20 10:53:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:14 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: session ls {prefix=session ls} (starting...)
Jan 20 10:53:14 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx Can't run that command on an inactive MDS!
Jan 20 10:53:14 np0005588919 ceph-mds[84722]: mds.cephfs.compute-1.rtofcx asok_command: status {prefix=status} (starting...)
Jan 20 10:53:14 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 20 10:53:14 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2827410809' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 20 10:53:14 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:14 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:14 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:14.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 20 10:53:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/893498978' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 20 10:53:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 20 10:53:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2579756407' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 20 10:53:15 np0005588919 nova_compute[225855]: 2026-01-20 15:53:15.311 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 20 10:53:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/22520359' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 20 10:53:15 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Jan 20 10:53:15 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3071146643' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 20 10:53:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 20 10:53:16 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1908376492' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 20 10:53:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:16.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:53:16.476 140354 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 20 10:53:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:53:16.477 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 20 10:53:16 np0005588919 ovn_metadata_agent[140349]: 2026-01-20 15:53:16.477 140354 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 20 10:53:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 20 10:53:16 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/693312015' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 20 10:53:16 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 20 10:53:16 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2432965860' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 20 10:53:16 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:16 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:16 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:16.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:17 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 20 10:53:17 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/808794295' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447578112 unmapped: 72196096 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447578112 unmapped: 72196096 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5051104 data_alloc: 234881024 data_used: 27340800
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 415 heartbeat osd_stat(store_statfs(0x19f901000/0x0/0x1bfc00000, data 0x5698bfa/0x58bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1aa3f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447578112 unmapped: 72196096 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 415 heartbeat osd_stat(store_statfs(0x19f901000/0x0/0x1bfc00000, data 0x5698bfa/0x58bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1aa3f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447586304 unmapped: 72187904 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447586304 unmapped: 72187904 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.583354950s of 13.874148369s, submitted: 116
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447586304 unmapped: 72187904 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447586304 unmapped: 72187904 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5051172 data_alloc: 234881024 data_used: 27344896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 415 ms_handle_reset con 0x557dc20ef400 session 0x557dc0121e00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 415 ms_handle_reset con 0x557dc2367c00 session 0x557dc308c960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447586304 unmapped: 72187904 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447594496 unmapped: 72179712 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 415 heartbeat osd_stat(store_statfs(0x19f900000/0x0/0x1bfc00000, data 0x5699bfa/0x58be000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1aa3f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447594496 unmapped: 72179712 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447594496 unmapped: 72179712 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447594496 unmapped: 72179712 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5051184 data_alloc: 234881024 data_used: 27344896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447594496 unmapped: 72179712 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447594496 unmapped: 72179712 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 415 heartbeat osd_stat(store_statfs(0x19f8fd000/0x0/0x1bfc00000, data 0x569cbfa/0x58c1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1aa3f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447594496 unmapped: 72179712 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.923791885s of 10.021304131s, submitted: 7
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 415 heartbeat osd_stat(store_statfs(0x19f8fd000/0x0/0x1bfc00000, data 0x569cbfa/0x58c1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1aa3f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 415 ms_handle_reset con 0x557dcb8db800 session 0x557dc208a3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 415 ms_handle_reset con 0x557dc2c47400 session 0x557dc20954a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 415 handle_osd_map epochs [415,416], i have 415, src has [1,416]
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc30ea780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 ms_handle_reset con 0x557dc20ef400 session 0x557dc0734960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 heartbeat osd_stat(store_statfs(0x19f8fd000/0x0/0x1bfc00000, data 0x569cbfa/0x58c1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1aa3f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 ms_handle_reset con 0x557dc2367c00 session 0x557dc01214a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447602688 unmapped: 72171520 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 ms_handle_reset con 0x557dcb8db800 session 0x557dc30fc000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447610880 unmapped: 72163328 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 ms_handle_reset con 0x557dc10ab400 session 0x557dc01a5680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5056788 data_alloc: 234881024 data_used: 27353088
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447610880 unmapped: 72163328 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447610880 unmapped: 72163328 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447619072 unmapped: 72155136 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447627264 unmapped: 72146944 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 heartbeat osd_stat(store_statfs(0x19f8f4000/0x0/0x1bfc00000, data 0x56a38b5/0x58ca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1aa3f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 ms_handle_reset con 0x557dc25b1c00 session 0x557dc0eddc20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 ms_handle_reset con 0x557dc5058400 session 0x557dc13f63c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447627264 unmapped: 72146944 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5055592 data_alloc: 234881024 data_used: 27353088
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 ms_handle_reset con 0x557dc20ef400 session 0x557dc138a3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447660032 unmapped: 72114176 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447660032 unmapped: 72114176 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 heartbeat osd_stat(store_statfs(0x19f6b0000/0x0/0x1bfc00000, data 0x54d9895/0x56fe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 ms_handle_reset con 0x557dc2367c00 session 0x557dc30ea1e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 ms_handle_reset con 0x557dcb8db800 session 0x557dc3156960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447660032 unmapped: 72114176 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 416 handle_osd_map epochs [417,417], i have 416, src has [1,417]
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447668224 unmapped: 72105984 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.301989555s of 11.480295181s, submitted: 61
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 417 ms_handle_reset con 0x557dc20ef400 session 0x557dc0e54960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447676416 unmapped: 72097792 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5038646 data_alloc: 234881024 data_used: 27234304
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447676416 unmapped: 72097792 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 417 ms_handle_reset con 0x557dc2367c00 session 0x557dc106af00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 417 heartbeat osd_stat(store_statfs(0x19f6ad000/0x0/0x1bfc00000, data 0x54db4e0/0x5700000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 417 ms_handle_reset con 0x557dc25b1c00 session 0x557dc01a5c20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447692800 unmapped: 72081408 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 417 ms_handle_reset con 0x557dc5058400 session 0x557dc01a5680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 417 ms_handle_reset con 0x557dc2c47400 session 0x557dc0734960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447709184 unmapped: 72065024 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447717376 unmapped: 72056832 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447717376 unmapped: 72056832 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5042789 data_alloc: 234881024 data_used: 28385280
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447717376 unmapped: 72056832 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 417 heartbeat osd_stat(store_statfs(0x19f6af000/0x0/0x1bfc00000, data 0x54db4d1/0x56ff000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447717376 unmapped: 72056832 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447717376 unmapped: 72056832 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447717376 unmapped: 72056832 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 417 handle_osd_map epochs [417,418], i have 417, src has [1,418]
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f6aa000/0x0/0x1bfc00000, data 0x54e04d1/0x5704000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447733760 unmapped: 72040448 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5049123 data_alloc: 234881024 data_used: 28393472
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447741952 unmapped: 72032256 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 447741952 unmapped: 72032256 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc31572c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc2082960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.068335533s of 13.254554749s, submitted: 70
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445775872 unmapped: 73998336 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc30eb0e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445784064 unmapped: 73990144 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445784064 unmapped: 73990144 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0248000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4921289 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445784064 unmapped: 73990144 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0248000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445784064 unmapped: 73990144 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445784064 unmapped: 73990144 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445784064 unmapped: 73990144 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445784064 unmapped: 73990144 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4921289 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445784064 unmapped: 73990144 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445792256 unmapped: 73981952 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0248000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0248000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445792256 unmapped: 73981952 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445792256 unmapped: 73981952 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0248000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445792256 unmapped: 73981952 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4921289 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445792256 unmapped: 73981952 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445792256 unmapped: 73981952 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445792256 unmapped: 73981952 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0248000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445800448 unmapped: 73973760 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445800448 unmapped: 73973760 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4921289 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445800448 unmapped: 73973760 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0248000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445800448 unmapped: 73973760 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445800448 unmapped: 73973760 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0248000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445800448 unmapped: 73973760 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445808640 unmapped: 73965568 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4921289 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445808640 unmapped: 73965568 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445808640 unmapped: 73965568 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445816832 unmapped: 73957376 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445816832 unmapped: 73957376 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0248000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445816832 unmapped: 73957376 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4921289 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc2083a40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25b1c00 session 0x557dc0145a40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc01203c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 445816832 unmapped: 73957376 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc30ebe00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 28.219793320s of 28.305349350s, submitted: 17
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc138a000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc22ce1e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc5058400 session 0x557dc01b90e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc5058400 session 0x557dc2083860
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc2330d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fe9a000/0x0/0x1bfc00000, data 0x4cef010/0x4f14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fe9a000/0x0/0x1bfc00000, data 0x4cef010/0x4f14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fe9a000/0x0/0x1bfc00000, data 0x4cef010/0x4f14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4959010 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fe9a000/0x0/0x1bfc00000, data 0x4cef010/0x4f14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fe9a000/0x0/0x1bfc00000, data 0x4cef010/0x4f14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fe9a000/0x0/0x1bfc00000, data 0x4cef010/0x4f14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4959010 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc30fd0e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fe9a000/0x0/0x1bfc00000, data 0x4cef010/0x4f14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446021632 unmapped: 73752576 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446029824 unmapped: 73744384 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fe9a000/0x0/0x1bfc00000, data 0x4cef010/0x4f14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446029824 unmapped: 73744384 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446029824 unmapped: 73744384 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4977862 data_alloc: 234881024 data_used: 26775552
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446029824 unmapped: 73744384 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446029824 unmapped: 73744384 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fe9a000/0x0/0x1bfc00000, data 0x4cef010/0x4f14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446029824 unmapped: 73744384 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446038016 unmapped: 73736192 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446038016 unmapped: 73736192 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4977862 data_alloc: 234881024 data_used: 26775552
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446038016 unmapped: 73736192 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446038016 unmapped: 73736192 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446038016 unmapped: 73736192 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fe9a000/0x0/0x1bfc00000, data 0x4cef010/0x4f14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 446038016 unmapped: 73736192 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.273998260s of 23.385925293s, submitted: 23
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448405504 unmapped: 71368704 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5060854 data_alloc: 234881024 data_used: 27189248
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448421888 unmapped: 71352320 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448421888 unmapped: 71352320 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448421888 unmapped: 71352320 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f40d000/0x0/0x1bfc00000, data 0x576d010/0x5992000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448421888 unmapped: 71352320 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f40d000/0x0/0x1bfc00000, data 0x576d010/0x5992000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448421888 unmapped: 71352320 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5073154 data_alloc: 234881024 data_used: 27004928
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448421888 unmapped: 71352320 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448176128 unmapped: 71598080 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448176128 unmapped: 71598080 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448176128 unmapped: 71598080 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.343052864s of 10.078209877s, submitted: 109
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0edd0e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc30f0000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f416000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448184320 unmapped: 71589888 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5066250 data_alloc: 234881024 data_used: 26992640
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448184320 unmapped: 71589888 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448184320 unmapped: 71589888 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 20 10:53:18 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3140472892' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448192512 unmapped: 71581696 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448192512 unmapped: 71581696 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448192512 unmapped: 71581696 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5066250 data_alloc: 234881024 data_used: 26992640
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc0edc000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448192512 unmapped: 71581696 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448192512 unmapped: 71581696 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448176128 unmapped: 71598080 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448135168 unmapped: 71639040 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448135168 unmapped: 71639040 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5067822 data_alloc: 234881024 data_used: 27152384
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448135168 unmapped: 71639040 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448135168 unmapped: 71639040 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448135168 unmapped: 71639040 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448135168 unmapped: 71639040 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448135168 unmapped: 71639040 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5067822 data_alloc: 234881024 data_used: 27152384
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448135168 unmapped: 71639040 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448135168 unmapped: 71639040 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.337461472s of 19.394742966s, submitted: 3
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448192512 unmapped: 71581696 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448200704 unmapped: 71573504 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5078282 data_alloc: 234881024 data_used: 28155904
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448200704 unmapped: 71573504 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448200704 unmapped: 71573504 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448200704 unmapped: 71573504 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448200704 unmapped: 71573504 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448200704 unmapped: 71573504 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5078282 data_alloc: 234881024 data_used: 28155904
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448200704 unmapped: 71573504 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f417000/0x0/0x1bfc00000, data 0x5772010/0x5997000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ae4f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448200704 unmapped: 71573504 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448200704 unmapped: 71573504 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0fe6b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc22ebc20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eff1000/0x0/0x1bfc00000, data 0x5778010/0x599d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b26f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.055052757s of 10.106260300s, submitted: 23
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5078616 data_alloc: 234881024 data_used: 28155904
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc2330780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1288000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4933724 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1288000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1288000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4933724 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448143360 unmapped: 71630848 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448151552 unmapped: 71622656 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1288000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448159744 unmapped: 71614464 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448159744 unmapped: 71614464 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448159744 unmapped: 71614464 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4933724 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448159744 unmapped: 71614464 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1288000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448159744 unmapped: 71614464 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448159744 unmapped: 71614464 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448159744 unmapped: 71614464 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448167936 unmapped: 71606272 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4933724 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1288000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448167936 unmapped: 71606272 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1288000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448167936 unmapped: 71606272 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448167936 unmapped: 71606272 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448167936 unmapped: 71606272 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1288000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448167936 unmapped: 71606272 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4933724 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448167936 unmapped: 71606272 heap: 519774208 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 26.636264801s of 27.555721283s, submitted: 34
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc5058400 session 0x557dc05d9860
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc30f0f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc33ca960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc13f6f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc150e1e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448266240 unmapped: 75710464 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448266240 unmapped: 75710464 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a06ec000/0x0/0x1bfc00000, data 0x54de000/0x5702000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448274432 unmapped: 75702272 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a06ec000/0x0/0x1bfc00000, data 0x54de000/0x5702000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448274432 unmapped: 75702272 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5023552 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448274432 unmapped: 75702272 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a06ec000/0x0/0x1bfc00000, data 0x54de000/0x5702000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448274432 unmapped: 75702272 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2418000 session 0x557dc011e960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc13f74a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448274432 unmapped: 75702272 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc30f10e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc1a56000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448274432 unmapped: 75702272 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448274432 unmapped: 75702272 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5030803 data_alloc: 218103808 data_used: 24649728
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a06ea000/0x0/0x1bfc00000, data 0x54de033/0x5704000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104883 data_alloc: 234881024 data_used: 34652160
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a06ea000/0x0/0x1bfc00000, data 0x54de033/0x5704000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104883 data_alloc: 234881024 data_used: 34652160
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 448290816 unmapped: 75685888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.314420700s of 20.434036255s, submitted: 19
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450478080 unmapped: 73498624 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a06ea000/0x0/0x1bfc00000, data 0x54de033/0x5704000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450494464 unmapped: 73482240 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450584576 unmapped: 73392128 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0008000/0x0/0x1bfc00000, data 0x5bb1033/0x5dd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450584576 unmapped: 73392128 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5168343 data_alloc: 234881024 data_used: 34938880
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450584576 unmapped: 73392128 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0008000/0x0/0x1bfc00000, data 0x5bb1033/0x5dd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450584576 unmapped: 73392128 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0008000/0x0/0x1bfc00000, data 0x5bb1033/0x5dd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450584576 unmapped: 73392128 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450584576 unmapped: 73392128 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0008000/0x0/0x1bfc00000, data 0x5bb1033/0x5dd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450584576 unmapped: 73392128 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5168343 data_alloc: 234881024 data_used: 34938880
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450584576 unmapped: 73392128 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc234be00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc26ca800 session 0x557dc308de00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0008000/0x0/0x1bfc00000, data 0x5bb1033/0x5dd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450592768 unmapped: 73383936 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450592768 unmapped: 73383936 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450592768 unmapped: 73383936 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450592768 unmapped: 73383936 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5168343 data_alloc: 234881024 data_used: 34938880
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450592768 unmapped: 73383936 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0008000/0x0/0x1bfc00000, data 0x5bb1033/0x5dd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450592768 unmapped: 73383936 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450592768 unmapped: 73383936 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0008000/0x0/0x1bfc00000, data 0x5bb1033/0x5dd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450592768 unmapped: 73383936 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc01b90e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc2082960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450600960 unmapped: 73375744 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5168503 data_alloc: 234881024 data_used: 34942976
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.522113800s of 18.740686417s, submitted: 73
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc30f01e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450609152 unmapped: 73367552 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450609152 unmapped: 73367552 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450609152 unmapped: 73367552 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450617344 unmapped: 73359360 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450617344 unmapped: 73359360 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4944778 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450617344 unmapped: 73359360 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450617344 unmapped: 73359360 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450625536 unmapped: 73351168 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450625536 unmapped: 73351168 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450625536 unmapped: 73351168 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4944778 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450633728 unmapped: 73342976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450633728 unmapped: 73342976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450633728 unmapped: 73342976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450633728 unmapped: 73342976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450633728 unmapped: 73342976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4944778 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450633728 unmapped: 73342976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450633728 unmapped: 73342976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450633728 unmapped: 73342976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450641920 unmapped: 73334784 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450641920 unmapped: 73334784 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4944778 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450641920 unmapped: 73334784 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450641920 unmapped: 73334784 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450641920 unmapped: 73334784 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450641920 unmapped: 73334784 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450641920 unmapped: 73334784 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4944778 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450641920 unmapped: 73334784 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450650112 unmapped: 73326592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450650112 unmapped: 73326592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450650112 unmapped: 73326592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450658304 unmapped: 73318400 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4944778 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450658304 unmapped: 73318400 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450658304 unmapped: 73318400 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450658304 unmapped: 73318400 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450666496 unmapped: 73310208 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450674688 unmapped: 73302016 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4944778 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450674688 unmapped: 73302016 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450674688 unmapped: 73302016 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450674688 unmapped: 73302016 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450674688 unmapped: 73302016 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450674688 unmapped: 73302016 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4944778 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450674688 unmapped: 73302016 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450674688 unmapped: 73302016 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1287000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc308d4a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc308c960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 450682880 unmapped: 73293824 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc01a52c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc011e1e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 42.948684692s of 43.016662598s, submitted: 27
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc07352c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc22ceb40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc2330780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc0edd0e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0d3e3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 451747840 unmapped: 72228864 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 451747840 unmapped: 72228864 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4982133 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 451747840 unmapped: 72228864 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 451747840 unmapped: 72228864 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0eb7000/0x0/0x1bfc00000, data 0x4d12062/0x4f37000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 451747840 unmapped: 72228864 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0eb7000/0x0/0x1bfc00000, data 0x4d12062/0x4f37000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 451747840 unmapped: 72228864 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 451747840 unmapped: 72228864 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc20c61e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4982266 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 451756032 unmapped: 72220672 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 451756032 unmapped: 72220672 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453722112 unmapped: 70254592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0eb6000/0x0/0x1bfc00000, data 0x4d12085/0x4f38000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453722112 unmapped: 70254592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453722112 unmapped: 70254592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5009786 data_alloc: 234881024 data_used: 27885568
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453722112 unmapped: 70254592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453722112 unmapped: 70254592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453722112 unmapped: 70254592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0eb6000/0x0/0x1bfc00000, data 0x4d12085/0x4f38000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453722112 unmapped: 70254592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453722112 unmapped: 70254592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5009786 data_alloc: 234881024 data_used: 27885568
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453722112 unmapped: 70254592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453722112 unmapped: 70254592 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.696350098s of 19.785596848s, submitted: 33
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 453976064 unmapped: 70000640 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 456605696 unmapped: 67371008 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a04b8000/0x0/0x1bfc00000, data 0x5708085/0x592e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457121792 unmapped: 66854912 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5100816 data_alloc: 234881024 data_used: 28143616
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457121792 unmapped: 66854912 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a046b000/0x0/0x1bfc00000, data 0x574f085/0x5975000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457121792 unmapped: 66854912 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457121792 unmapped: 66854912 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457121792 unmapped: 66854912 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457129984 unmapped: 66846720 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5092420 data_alloc: 234881024 data_used: 28147712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0458000/0x0/0x1bfc00000, data 0x5770085/0x5996000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457129984 unmapped: 66846720 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457129984 unmapped: 66846720 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0458000/0x0/0x1bfc00000, data 0x5770085/0x5996000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457129984 unmapped: 66846720 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457129984 unmapped: 66846720 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457138176 unmapped: 66838528 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5092420 data_alloc: 234881024 data_used: 28147712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457138176 unmapped: 66838528 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.340342522s of 13.752699852s, submitted: 111
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0458000/0x0/0x1bfc00000, data 0x5770085/0x5996000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457138176 unmapped: 66838528 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457138176 unmapped: 66838528 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457138176 unmapped: 66838528 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a044f000/0x0/0x1bfc00000, data 0x5779085/0x599f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457138176 unmapped: 66838528 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5092664 data_alloc: 234881024 data_used: 28147712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457138176 unmapped: 66838528 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457138176 unmapped: 66838528 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a044f000/0x0/0x1bfc00000, data 0x5779085/0x599f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457146368 unmapped: 66830336 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a044f000/0x0/0x1bfc00000, data 0x5779085/0x599f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457146368 unmapped: 66830336 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457146368 unmapped: 66830336 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5092664 data_alloc: 234881024 data_used: 28147712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457146368 unmapped: 66830336 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457146368 unmapped: 66830336 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457154560 unmapped: 66822144 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a044f000/0x0/0x1bfc00000, data 0x5779085/0x599f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457154560 unmapped: 66822144 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457154560 unmapped: 66822144 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5092664 data_alloc: 234881024 data_used: 28147712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457162752 unmapped: 66813952 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457162752 unmapped: 66813952 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457162752 unmapped: 66813952 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a044f000/0x0/0x1bfc00000, data 0x5779085/0x599f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457162752 unmapped: 66813952 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.195745468s of 18.203834534s, submitted: 2
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457162752 unmapped: 66813952 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5092664 data_alloc: 234881024 data_used: 28147712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a044f000/0x0/0x1bfc00000, data 0x5779085/0x599f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457162752 unmapped: 66813952 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457162752 unmapped: 66813952 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457162752 unmapped: 66813952 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457170944 unmapped: 66805760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a044f000/0x0/0x1bfc00000, data 0x5779085/0x599f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242dc00 session 0x557dc31561e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc4082000 session 0x557dc30ebe00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc2320d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc2094000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc22cf680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 65445888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5114082 data_alloc: 234881024 data_used: 28147712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 65445888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 65445888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 65445888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 65445888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0260000/0x0/0x1bfc00000, data 0x5968085/0x5b8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 65445888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5114082 data_alloc: 234881024 data_used: 28147712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 65445888 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458539008 unmapped: 65437696 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458547200 unmapped: 65429504 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458547200 unmapped: 65429504 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0260000/0x0/0x1bfc00000, data 0x5968085/0x5b8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458547200 unmapped: 65429504 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5114082 data_alloc: 234881024 data_used: 28147712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458547200 unmapped: 65429504 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458547200 unmapped: 65429504 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242dc00 session 0x557dc30f03c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458547200 unmapped: 65429504 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458547200 unmapped: 65429504 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458555392 unmapped: 65421312 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0260000/0x0/0x1bfc00000, data 0x5968085/0x5b8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5114082 data_alloc: 234881024 data_used: 28147712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458596352 unmapped: 65380352 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458596352 unmapped: 65380352 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0260000/0x0/0x1bfc00000, data 0x5968085/0x5b8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458596352 unmapped: 65380352 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0260000/0x0/0x1bfc00000, data 0x5968085/0x5b8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458596352 unmapped: 65380352 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458596352 unmapped: 65380352 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5120162 data_alloc: 234881024 data_used: 28934144
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458596352 unmapped: 65380352 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a0260000/0x0/0x1bfc00000, data 0x5968085/0x5b8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458596352 unmapped: 65380352 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458604544 unmapped: 65372160 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 28.897705078s of 28.928741455s, submitted: 14
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458604544 unmapped: 65372160 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458604544 unmapped: 65372160 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5119710 data_alloc: 234881024 data_used: 28934144
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458612736 unmapped: 65363968 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459366400 unmapped: 64610304 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e85a000/0x0/0x1bfc00000, data 0x61c8085/0x63ee000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 63135744 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 63135744 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 63135744 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5187954 data_alloc: 234881024 data_used: 30027776
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 63135744 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 63135744 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e841000/0x0/0x1bfc00000, data 0x61e7085/0x640d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 63135744 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 63135744 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 63135744 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5187954 data_alloc: 234881024 data_used: 30027776
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.731851578s of 12.032382011s, submitted: 91
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 63135744 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e841000/0x0/0x1bfc00000, data 0x61e7085/0x640d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460849152 unmapped: 63127552 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 64K writes, 249K keys, 64K commit groups, 1.0 writes per commit group, ingest: 0.24 GB, 0.04 MB/s#012Cumulative WAL: 64K writes, 24K syncs, 2.64 writes per sync, written: 0.24 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4629 writes, 15K keys, 4629 commit groups, 1.0 writes per commit group, ingest: 15.94 MB, 0.03 MB/s#012Interval WAL: 4629 writes, 1847 syncs, 2.51 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460849152 unmapped: 63127552 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc72e4800 session 0x557dc0edd2c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc0121680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460849152 unmapped: 63127552 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc3122f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460865536 unmapped: 63111168 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2a6000/0x0/0x1bfc00000, data 0x5782085/0x59a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5099508 data_alloc: 234881024 data_used: 28135424
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2a6000/0x0/0x1bfc00000, data 0x5782085/0x59a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460865536 unmapped: 63111168 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2a6000/0x0/0x1bfc00000, data 0x5782085/0x59a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460865536 unmapped: 63111168 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460865536 unmapped: 63111168 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460865536 unmapped: 63111168 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2a6000/0x0/0x1bfc00000, data 0x5782085/0x59a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460865536 unmapped: 63111168 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5099508 data_alloc: 234881024 data_used: 28135424
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460873728 unmapped: 63102976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc011b400 session 0x557dc13dd2c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: mgrc ms_handle_reset ms_handle_reset con 0x557dc33fc400
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/2542147622
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/2542147622,v1:192.168.122.100:6801/2542147622]
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: mgrc handle_mgr_configure stats_period=5
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460873728 unmapped: 63102976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc239c000 session 0x557dc20c70e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc217e400 session 0x557dc301b680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2a6000/0x0/0x1bfc00000, data 0x5782085/0x59a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460873728 unmapped: 63102976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460873728 unmapped: 63102976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460873728 unmapped: 63102976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5099508 data_alloc: 234881024 data_used: 28135424
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.522294044s of 14.739879608s, submitted: 17
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc23303c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc01a43c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460873728 unmapped: 63102976 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2c46800 session 0x557dc22ea960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e1000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e1000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4963340 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e1000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4963340 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e1000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e1000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4963340 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e1000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4963340 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e1000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4963340 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e1000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e1000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e1000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4963340 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455122944 unmapped: 68853760 heap: 523976704 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2c46800 session 0x557dc13dd2c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0f1c3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc3122f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc0edd2c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 33.626598358s of 33.775032043s, submitted: 37
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc30f03c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455041024 unmapped: 72089600 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc0d3e3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0edd0e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f8b2000/0x0/0x1bfc00000, data 0x5176039/0x539c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc22ceb40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc011e1e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455041024 unmapped: 72089600 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5030023 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455049216 unmapped: 72081408 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455049216 unmapped: 72081408 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2c46800 session 0x557dc308c960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2c46800 session 0x557dc308d4a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455049216 unmapped: 72081408 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc2082960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc01b90e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454991872 unmapped: 72138752 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f8b0000/0x0/0x1bfc00000, data 0x51760a5/0x539e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5094631 data_alloc: 234881024 data_used: 32604160
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f8b0000/0x0/0x1bfc00000, data 0x51760a5/0x539e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5094631 data_alloc: 234881024 data_used: 32604160
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f8b0000/0x0/0x1bfc00000, data 0x51760a5/0x539e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 454803456 unmapped: 72327168 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5094631 data_alloc: 234881024 data_used: 32604160
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.654638290s of 16.778350830s, submitted: 40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457662464 unmapped: 69468160 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 457662464 unmapped: 69468160 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458964992 unmapped: 68165632 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2e9000/0x0/0x1bfc00000, data 0x57270a5/0x594f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458973184 unmapped: 68157440 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2e9000/0x0/0x1bfc00000, data 0x57270a5/0x594f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458973184 unmapped: 68157440 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5152737 data_alloc: 234881024 data_used: 32755712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458973184 unmapped: 68157440 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458973184 unmapped: 68157440 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2e9000/0x0/0x1bfc00000, data 0x57270a5/0x594f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458301440 unmapped: 68829184 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458301440 unmapped: 68829184 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458301440 unmapped: 68829184 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5144273 data_alloc: 234881024 data_used: 32759808
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458301440 unmapped: 68829184 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458301440 unmapped: 68829184 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2fc000/0x0/0x1bfc00000, data 0x572a0a5/0x5952000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458301440 unmapped: 68829184 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458301440 unmapped: 68829184 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458301440 unmapped: 68829184 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5144273 data_alloc: 234881024 data_used: 32759808
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2fc000/0x0/0x1bfc00000, data 0x572a0a5/0x5952000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458301440 unmapped: 68829184 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2fc000/0x0/0x1bfc00000, data 0x572a0a5/0x5952000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458309632 unmapped: 68820992 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458309632 unmapped: 68820992 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458309632 unmapped: 68820992 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458309632 unmapped: 68820992 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5144273 data_alloc: 234881024 data_used: 32759808
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.573434830s of 19.795026779s, submitted: 93
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458309632 unmapped: 68820992 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc239d400 session 0x557dc33cad20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f2fb000/0x0/0x1bfc00000, data 0x572b0a5/0x5953000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458309632 unmapped: 68820992 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458326016 unmapped: 68804608 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458326016 unmapped: 68804608 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc011e960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc3156780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458358784 unmapped: 68771840 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc0fe7e00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4975866 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458366976 unmapped: 68763648 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19fd39000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [1,0,1])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 67674112 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459522048 unmapped: 67608576 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459522048 unmapped: 67608576 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459522048 unmapped: 67608576 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4975514 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e8000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459522048 unmapped: 67608576 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459522048 unmapped: 67608576 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459522048 unmapped: 67608576 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e8000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459522048 unmapped: 67608576 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459522048 unmapped: 67608576 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4975514 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459522048 unmapped: 67608576 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459522048 unmapped: 67608576 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e8000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4975514 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e8000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e8000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4975514 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e8000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e8000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e8000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4975514 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459530240 unmapped: 67600384 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e8000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459538432 unmapped: 67592192 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a00e8000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459538432 unmapped: 67592192 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4975514 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459538432 unmapped: 67592192 heap: 527130624 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 35.319103241s of 36.174694061s, submitted: 289
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc2304f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0121680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc01214a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2c46800 session 0x557dc20b52c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2c46800 session 0x557dc20b5680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459866112 unmapped: 74612736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f63a000/0x0/0x1bfc00000, data 0x53f0000/0x5614000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459866112 unmapped: 74612736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459866112 unmapped: 74612736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f63a000/0x0/0x1bfc00000, data 0x53f0000/0x5614000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f63a000/0x0/0x1bfc00000, data 0x53f0000/0x5614000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459874304 unmapped: 74604544 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5066141 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459882496 unmapped: 74596352 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc07450e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc308de00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459882496 unmapped: 74596352 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc01b2d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc13dcd20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459882496 unmapped: 74596352 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459882496 unmapped: 74596352 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 72056832 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5147845 data_alloc: 234881024 data_used: 35192832
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f616000/0x0/0x1bfc00000, data 0x5414000/0x5638000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 72056832 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f616000/0x0/0x1bfc00000, data 0x5414000/0x5638000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 72056832 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 72056832 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 72056832 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 72056832 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5147845 data_alloc: 234881024 data_used: 35192832
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 72056832 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 72056832 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19f616000/0x0/0x1bfc00000, data 0x5414000/0x5638000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 72056832 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462430208 unmapped: 72048640 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462430208 unmapped: 72048640 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5149125 data_alloc: 234881024 data_used: 35225600
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.860935211s of 18.978137970s, submitted: 28
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466042880 unmapped: 68435968 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d7ba000/0x0/0x1bfc00000, data 0x60d0000/0x62f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5267683 data_alloc: 234881024 data_used: 37605376
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d7ba000/0x0/0x1bfc00000, data 0x60d0000/0x62f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5268003 data_alloc: 234881024 data_used: 37613568
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.521827698s of 10.745787621s, submitted: 103
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc13f6d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc31232c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466100224 unmapped: 68378624 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc23310e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ef24000/0x0/0x1bfc00000, data 0x4966000/0x4b8a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4988387 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ef48000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4988387 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ef48000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4988387 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ef48000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ef48000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4988387 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ef48000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4988387 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ef48000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4988387 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ef48000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc30fd0e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2c46800 session 0x557dc3123c20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc1370000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0edcf00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.576032639s of 32.002433777s, submitted: 36
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc30fd0e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc13dcd20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc01b2d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc308de00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc07450e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e539000/0x0/0x1bfc00000, data 0x5351000/0x5575000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5068817 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455770112 unmapped: 78708736 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc20b52c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e539000/0x0/0x1bfc00000, data 0x5351000/0x5575000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc01214a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455778304 unmapped: 78700544 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455778304 unmapped: 78700544 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc0121680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc2304f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455778304 unmapped: 78700544 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e538000/0x0/0x1bfc00000, data 0x5351023/0x5576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 455778304 unmapped: 78700544 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5072547 data_alloc: 218103808 data_used: 24174592
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 75849728 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 75849728 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 75849728 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 75849728 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e538000/0x0/0x1bfc00000, data 0x5351023/0x5576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 75849728 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5146467 data_alloc: 218103808 data_used: 34623488
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 75849728 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 75849728 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 75849728 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 75849728 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 75849728 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5146467 data_alloc: 218103808 data_used: 34623488
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e538000/0x0/0x1bfc00000, data 0x5351023/0x5576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.228843689s of 18.303384781s, submitted: 21
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460292096 unmapped: 74186752 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc6c000/0x0/0x1bfc00000, data 0x5c1d023/0x5e42000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462307328 unmapped: 72171520 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc4f000/0x0/0x1bfc00000, data 0x5c3a023/0x5e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c14f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5230717 data_alloc: 234881024 data_used: 35557376
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d835000/0x0/0x1bfc00000, data 0x5c44023/0x5e69000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5228541 data_alloc: 234881024 data_used: 35627008
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d832000/0x0/0x1bfc00000, data 0x5c47023/0x5e6c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.729228973s of 13.922365189s, submitted: 80
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5228769 data_alloc: 234881024 data_used: 35627008
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d831000/0x0/0x1bfc00000, data 0x5c48023/0x5e6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462315520 unmapped: 72163328 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462323712 unmapped: 72155136 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462323712 unmapped: 72155136 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d831000/0x0/0x1bfc00000, data 0x5c48023/0x5e6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d831000/0x0/0x1bfc00000, data 0x5c48023/0x5e6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462323712 unmapped: 72155136 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5228769 data_alloc: 234881024 data_used: 35627008
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462331904 unmapped: 72146944 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc20c7680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc30fc960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c6800 session 0x557dc30f0b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dcb9f5c00 session 0x557dc01a4b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc3122b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462356480 unmapped: 72122368 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462356480 unmapped: 72122368 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ce82000/0x0/0x1bfc00000, data 0x65f7023/0x681c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ce82000/0x0/0x1bfc00000, data 0x65f7023/0x681c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462364672 unmapped: 72114176 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462364672 unmapped: 72114176 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5306037 data_alloc: 234881024 data_used: 35627008
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ce82000/0x0/0x1bfc00000, data 0x65f7023/0x681c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462364672 unmapped: 72114176 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462364672 unmapped: 72114176 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc138b4a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ce82000/0x0/0x1bfc00000, data 0x65f7023/0x681c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462372864 unmapped: 72105984 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 72097792 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467673088 unmapped: 66805760 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5377397 data_alloc: 234881024 data_used: 45633536
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467673088 unmapped: 66805760 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ce82000/0x0/0x1bfc00000, data 0x65f7023/0x681c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.127248764s of 18.263854980s, submitted: 14
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ce82000/0x0/0x1bfc00000, data 0x65f7023/0x681c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5377573 data_alloc: 234881024 data_used: 45633536
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ce82000/0x0/0x1bfc00000, data 0x65f7023/0x681c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ce82000/0x0/0x1bfc00000, data 0x65f7023/0x681c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ce82000/0x0/0x1bfc00000, data 0x65f7023/0x681c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19ce82000/0x0/0x1bfc00000, data 0x65f7023/0x681c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467689472 unmapped: 66789376 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5377573 data_alloc: 234881024 data_used: 45633536
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472375296 unmapped: 62103552 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470654976 unmapped: 63823872 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c4d1000/0x0/0x1bfc00000, data 0x6fa2023/0x71c7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472195072 unmapped: 62283776 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472195072 unmapped: 62283776 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472195072 unmapped: 62283776 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5475871 data_alloc: 234881024 data_used: 47382528
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472375296 unmapped: 62103552 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c4b1000/0x0/0x1bfc00000, data 0x6fba023/0x71df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472375296 unmapped: 62103552 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472375296 unmapped: 62103552 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc150e5a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c6800 session 0x557dc13dc780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.607994080s of 15.813341141s, submitted: 89
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 62095360 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fd400 session 0x557dc0e8be00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 62095360 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5238241 data_alloc: 234881024 data_used: 35627008
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 62095360 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 62095360 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d831000/0x0/0x1bfc00000, data 0x5c48023/0x5e6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 62095360 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc01b90e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc2082960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466550784 unmapped: 67928064 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc33cb4a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466567168 unmapped: 67911680 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5009345 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466567168 unmapped: 67911680 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466567168 unmapped: 67911680 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466567168 unmapped: 67911680 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466567168 unmapped: 67911680 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466567168 unmapped: 67911680 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5009345 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466567168 unmapped: 67911680 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5009345 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5009345 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5009345 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc011e960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc30ea780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc3156f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc138a1e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 29.595907211s of 29.741331100s, submitted: 59
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0e554a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc30eb680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc22ce960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc05d8b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc05d81e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e717000/0x0/0x1bfc00000, data 0x4d62010/0x4f87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466575360 unmapped: 67903488 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5047689 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e717000/0x0/0x1bfc00000, data 0x4d62010/0x4f87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466583552 unmapped: 67895296 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466583552 unmapped: 67895296 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466583552 unmapped: 67895296 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466583552 unmapped: 67895296 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e717000/0x0/0x1bfc00000, data 0x4d62010/0x4f87000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc22ce960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466583552 unmapped: 67895296 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5047689 data_alloc: 218103808 data_used: 24088576
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc0e554a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466583552 unmapped: 67895296 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc3156f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc30ea780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466886656 unmapped: 67592192 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466886656 unmapped: 67592192 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e6f2000/0x0/0x1bfc00000, data 0x4d86020/0x4fac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5084440 data_alloc: 218103808 data_used: 28332032
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e6f2000/0x0/0x1bfc00000, data 0x4d86020/0x4fac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e6f2000/0x0/0x1bfc00000, data 0x4d86020/0x4fac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e6f2000/0x0/0x1bfc00000, data 0x4d86020/0x4fac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5084440 data_alloc: 218103808 data_used: 28332032
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e6f2000/0x0/0x1bfc00000, data 0x4d86020/0x4fac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466894848 unmapped: 67584000 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e6f2000/0x0/0x1bfc00000, data 0x4d86020/0x4fac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 21.227991104s of 21.312829971s, submitted: 14
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468475904 unmapped: 66002944 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5129090 data_alloc: 218103808 data_used: 28553216
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e177000/0x0/0x1bfc00000, data 0x5301020/0x5527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5132844 data_alloc: 218103808 data_used: 28835840
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e177000/0x0/0x1bfc00000, data 0x5301020/0x5527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e177000/0x0/0x1bfc00000, data 0x5301020/0x5527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5132860 data_alloc: 218103808 data_used: 28835840
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467681280 unmapped: 66797568 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467689472 unmapped: 66789376 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467689472 unmapped: 66789376 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e177000/0x0/0x1bfc00000, data 0x5301020/0x5527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467697664 unmapped: 66781184 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e177000/0x0/0x1bfc00000, data 0x5301020/0x5527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467697664 unmapped: 66781184 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5132860 data_alloc: 218103808 data_used: 28835840
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e177000/0x0/0x1bfc00000, data 0x5301020/0x5527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467697664 unmapped: 66781184 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e177000/0x0/0x1bfc00000, data 0x5301020/0x5527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467697664 unmapped: 66781184 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467697664 unmapped: 66781184 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467697664 unmapped: 66781184 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e177000/0x0/0x1bfc00000, data 0x5301020/0x5527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467697664 unmapped: 66781184 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5132860 data_alloc: 218103808 data_used: 28835840
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467697664 unmapped: 66781184 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467705856 unmapped: 66772992 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e177000/0x0/0x1bfc00000, data 0x5301020/0x5527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467705856 unmapped: 66772992 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467705856 unmapped: 66772992 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467705856 unmapped: 66772992 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5132860 data_alloc: 218103808 data_used: 28835840
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467705856 unmapped: 66772992 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467705856 unmapped: 66772992 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e177000/0x0/0x1bfc00000, data 0x5301020/0x5527000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467714048 unmapped: 66764800 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 28.342596054s of 28.506412506s, submitted: 48
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc308c960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc0120780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c6800 session 0x557dc20c70e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc72e4c00 session 0x557dc13dc3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc2304f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467755008 unmapped: 66723840 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467755008 unmapped: 66723840 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5174592 data_alloc: 218103808 data_used: 28835840
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc6c000/0x0/0x1bfc00000, data 0x580b082/0x5a32000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc05d8960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc301b0e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c6800 session 0x557dc0e8a3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc7bf6c00 session 0x557dc0734d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc6b000/0x0/0x1bfc00000, data 0x580c082/0x5a33000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5201765 data_alloc: 218103808 data_used: 32505856
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc6b000/0x0/0x1bfc00000, data 0x580c082/0x5a33000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5213125 data_alloc: 218103808 data_used: 34119680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc6b000/0x0/0x1bfc00000, data 0x580c082/0x5a33000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467763200 unmapped: 66715648 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5213125 data_alloc: 218103808 data_used: 34119680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.470623016s of 17.634029388s, submitted: 41
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471228416 unmapped: 63250432 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471007232 unmapped: 63471616 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d5a7000/0x0/0x1bfc00000, data 0x5ed0082/0x60f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471072768 unmapped: 63406080 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471072768 unmapped: 63406080 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471072768 unmapped: 63406080 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5280329 data_alloc: 218103808 data_used: 34820096
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471138304 unmapped: 63340544 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d5a7000/0x0/0x1bfc00000, data 0x5ed0082/0x60f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471138304 unmapped: 63340544 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471138304 unmapped: 63340544 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471154688 unmapped: 63324160 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471154688 unmapped: 63324160 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5276949 data_alloc: 218103808 data_used: 34824192
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d5a5000/0x0/0x1bfc00000, data 0x5ed2082/0x60f9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471154688 unmapped: 63324160 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471154688 unmapped: 63324160 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471154688 unmapped: 63324160 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471154688 unmapped: 63324160 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.511910439s of 14.040276527s, submitted: 72
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471154688 unmapped: 63324160 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d5a4000/0x0/0x1bfc00000, data 0x5ed3082/0x60fa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5277177 data_alloc: 218103808 data_used: 34824192
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471154688 unmapped: 63324160 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc011e5a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc0121e00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471162880 unmapped: 63315968 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc208a3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5141183 data_alloc: 218103808 data_used: 28835840
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19de30000/0x0/0x1bfc00000, data 0x5302020/0x5528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19de30000/0x0/0x1bfc00000, data 0x5302020/0x5528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5141183 data_alloc: 218103808 data_used: 28835840
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19de30000/0x0/0x1bfc00000, data 0x5302020/0x5528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc13dc780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.882023811s of 12.038696289s, submitted: 56
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc138b4a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0120d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026516 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026516 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026516 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026516 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468582400 unmapped: 65896448 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468590592 unmapped: 65888256 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468590592 unmapped: 65888256 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026516 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468590592 unmapped: 65888256 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468590592 unmapped: 65888256 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468590592 unmapped: 65888256 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468598784 unmapped: 65880064 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468598784 unmapped: 65880064 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026516 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468598784 unmapped: 65880064 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468598784 unmapped: 65880064 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468598784 unmapped: 65880064 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468598784 unmapped: 65880064 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468598784 unmapped: 65880064 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5026516 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468598784 unmapped: 65880064 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468598784 unmapped: 65880064 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468606976 unmapped: 65871872 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 37.292110443s of 37.377944946s, submitted: 31
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469270528 unmapped: 65208320 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc31565a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc208ad20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc05d9a40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc05d8000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc07352c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469278720 unmapped: 65200128 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5065424 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469278720 unmapped: 65200128 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469278720 unmapped: 65200128 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:18.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc2095860
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469278720 unmapped: 65200128 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc22cf680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469278720 unmapped: 65200128 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc0e552c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc33cb2c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e76b000/0x0/0x1bfc00000, data 0x4d0f000/0x4f33000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469213184 unmapped: 65265664 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5069474 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469213184 unmapped: 65265664 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e746000/0x0/0x1bfc00000, data 0x4d33010/0x4f58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469213184 unmapped: 65265664 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e746000/0x0/0x1bfc00000, data 0x4d33010/0x4f58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 65257472 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 65257472 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 65257472 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5096514 data_alloc: 218103808 data_used: 27635712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 65257472 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 65257472 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 65257472 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e746000/0x0/0x1bfc00000, data 0x4d33010/0x4f58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 65257472 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469221376 unmapped: 65257472 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc13f74a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0745680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5096514 data_alloc: 218103808 data_used: 27635712
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.558765411s of 16.670768738s, submitted: 15
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0145680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5033069 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5033069 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466747392 unmapped: 67731456 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466755584 unmapped: 67723264 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466755584 unmapped: 67723264 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5033069 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466755584 unmapped: 67723264 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466755584 unmapped: 67723264 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466755584 unmapped: 67723264 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc30f1c20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc011e000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc011ed20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc138a1e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.058893204s of 18.565244675s, submitted: 15
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466763776 unmapped: 67715072 heap: 534478848 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e070000/0x0/0x1bfc00000, data 0x5409029/0x562e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,7])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc01b25a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc0eddc20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc150e5a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467001344 unmapped: 71680000 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5126150 data_alloc: 218103808 data_used: 24023040
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e070000/0x0/0x1bfc00000, data 0x5409029/0x562e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc1371e00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0e8a780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467001344 unmapped: 71680000 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467001344 unmapped: 71680000 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467001344 unmapped: 71680000 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e026000/0x0/0x1bfc00000, data 0x5453062/0x5678000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467001344 unmapped: 71680000 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc30fcb40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467009536 unmapped: 71671808 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5127446 data_alloc: 218103808 data_used: 24150016
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467017728 unmapped: 71663616 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e026000/0x0/0x1bfc00000, data 0x5453062/0x5678000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467017728 unmapped: 71663616 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc01b90e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc3157e00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc1a563c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5040354 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5040354 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5040354 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5040354 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5040354 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5040354 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5040354 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5040354 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459399168 unmapped: 79282176 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459407360 unmapped: 79273984 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5040354 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459407360 unmapped: 79273984 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc01a43c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0121680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459407360 unmapped: 79273984 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc1a56f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc2320780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 52.363605499s of 53.645584106s, submitted: 61
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc13dda40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0edd860
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc30ea1e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459497472 unmapped: 79183872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc22ebe00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25ad000 session 0x557dc138b2c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459497472 unmapped: 79183872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459497472 unmapped: 79183872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5080213 data_alloc: 218103808 data_used: 23830528
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459497472 unmapped: 79183872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459497472 unmapped: 79183872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e6a3000/0x0/0x1bfc00000, data 0x4dd6010/0x4ffb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459497472 unmapped: 79183872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc05d8b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc30f01e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459497472 unmapped: 79183872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc2304b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc01443c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459497472 unmapped: 79183872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5083129 data_alloc: 218103808 data_used: 23830528
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459497472 unmapped: 79183872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 459497472 unmapped: 79183872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460480512 unmapped: 78200832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e67f000/0x0/0x1bfc00000, data 0x4dfa010/0x501f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460480512 unmapped: 78200832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460480512 unmapped: 78200832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5109501 data_alloc: 218103808 data_used: 27508736
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e67f000/0x0/0x1bfc00000, data 0x4dfa010/0x501f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460480512 unmapped: 78200832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460480512 unmapped: 78200832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460480512 unmapped: 78200832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460488704 unmapped: 78192640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460488704 unmapped: 78192640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5109501 data_alloc: 218103808 data_used: 27508736
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e67f000/0x0/0x1bfc00000, data 0x4dfa010/0x501f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460488704 unmapped: 78192640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460488704 unmapped: 78192640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.432371140s of 20.581054688s, submitted: 20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 77922304 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461283328 unmapped: 77398016 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e37a000/0x0/0x1bfc00000, data 0x50ff010/0x5324000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 76300288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5189025 data_alloc: 218103808 data_used: 27648000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 76300288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 76300288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 76300288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 76300288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dd8d000/0x0/0x1bfc00000, data 0x56eb010/0x5910000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 76300288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5189025 data_alloc: 218103808 data_used: 27648000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 76300288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 76300288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 76300288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462381056 unmapped: 76300288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dd8d000/0x0/0x1bfc00000, data 0x56eb010/0x5910000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462389248 unmapped: 76292096 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5189025 data_alloc: 218103808 data_used: 27648000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462389248 unmapped: 76292096 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dd8d000/0x0/0x1bfc00000, data 0x56eb010/0x5910000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462389248 unmapped: 76292096 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.622055054s of 15.069800377s, submitted: 37
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c6800 session 0x557dc0edd0e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc4082800 session 0x557dc0e8be00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462389248 unmapped: 76292096 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc3156f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462397440 unmapped: 76283904 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462397440 unmapped: 76283904 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5052885 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462397440 unmapped: 76283904 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462397440 unmapped: 76283904 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462397440 unmapped: 76283904 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462397440 unmapped: 76283904 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462405632 unmapped: 76275712 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5052885 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462405632 unmapped: 76275712 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462405632 unmapped: 76275712 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462405632 unmapped: 76275712 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462405632 unmapped: 76275712 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462405632 unmapped: 76275712 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5052885 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462405632 unmapped: 76275712 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462405632 unmapped: 76275712 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462413824 unmapped: 76267520 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462413824 unmapped: 76267520 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462413824 unmapped: 76267520 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5052885 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 76259328 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462422016 unmapped: 76259328 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462430208 unmapped: 76251136 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462430208 unmapped: 76251136 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462430208 unmapped: 76251136 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5052885 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462430208 unmapped: 76251136 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462430208 unmapped: 76251136 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462430208 unmapped: 76251136 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462430208 unmapped: 76251136 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462430208 unmapped: 76251136 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5052885 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462438400 unmapped: 76242944 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462438400 unmapped: 76242944 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462446592 unmapped: 76234752 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 30.448078156s of 30.543226242s, submitted: 36
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc23043c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc23303c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242c800 session 0x557dc31223c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc20b5680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0edc960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462561280 unmapped: 76120064 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462561280 unmapped: 76120064 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5089967 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462561280 unmapped: 76120064 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462561280 unmapped: 76120064 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e6a4000/0x0/0x1bfc00000, data 0x4dd6000/0x4ffa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462561280 unmapped: 76120064 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e6a4000/0x0/0x1bfc00000, data 0x4dd6000/0x4ffa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462569472 unmapped: 76111872 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc301ba40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462716928 unmapped: 75964416 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5092355 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462716928 unmapped: 75964416 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463077376 unmapped: 75603968 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463077376 unmapped: 75603968 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e680000/0x0/0x1bfc00000, data 0x4dfa000/0x501e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463077376 unmapped: 75603968 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463077376 unmapped: 75603968 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5118595 data_alloc: 218103808 data_used: 27504640
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e680000/0x0/0x1bfc00000, data 0x4dfa000/0x501e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463077376 unmapped: 75603968 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463077376 unmapped: 75603968 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463077376 unmapped: 75603968 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463077376 unmapped: 75603968 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463077376 unmapped: 75603968 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e680000/0x0/0x1bfc00000, data 0x4dfa000/0x501e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5118595 data_alloc: 218103808 data_used: 27504640
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463077376 unmapped: 75603968 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.218767166s of 18.299646378s, submitted: 8
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2513c00 session 0x557dc308cb40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2184000 session 0x557dc308cf00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc138a3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc05d8780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463110144 unmapped: 75571200 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc20821e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d98d000/0x0/0x1bfc00000, data 0x5aeb062/0x5d10000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463134720 unmapped: 75546624 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463134720 unmapped: 75546624 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463134720 unmapped: 75546624 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5228936 data_alloc: 218103808 data_used: 27914240
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463134720 unmapped: 75546624 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463134720 unmapped: 75546624 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d987000/0x0/0x1bfc00000, data 0x5af1062/0x5d16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463142912 unmapped: 75538432 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2513c00 session 0x557dc2320d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463142912 unmapped: 75538432 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463183872 unmapped: 75497472 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5233416 data_alloc: 218103808 data_used: 28471296
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464429056 unmapped: 74252288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464429056 unmapped: 74252288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d987000/0x0/0x1bfc00000, data 0x5af1062/0x5d16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464429056 unmapped: 74252288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464429056 unmapped: 74252288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464429056 unmapped: 74252288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5277096 data_alloc: 218103808 data_used: 34607104
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464429056 unmapped: 74252288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464429056 unmapped: 74252288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d987000/0x0/0x1bfc00000, data 0x5af1062/0x5d16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464429056 unmapped: 74252288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464429056 unmapped: 74252288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464429056 unmapped: 74252288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5277096 data_alloc: 218103808 data_used: 34607104
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6000.1 total, 600.0 interval
Cumulative writes: 67K writes, 258K keys, 67K commit groups, 1.0 writes per commit group, ingest: 0.24 GB, 0.04 MB/s
Cumulative WAL: 67K writes, 25K syncs, 2.63 writes per sync, written: 0.24 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2415 writes, 9201 keys, 2415 commit groups, 1.0 writes per commit group, ingest: 8.90 MB, 0.01 MB/s
Interval WAL: 2415 writes, 999 syncs, 2.42 writes per sync, written: 0.01 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.008       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 0.000216 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557dbecdef30#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 0.000216 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction,
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464461824 unmapped: 74219520 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.664016724s of 19.928941727s, submitted: 58
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465453056 unmapped: 73228288 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466599936 unmapped: 72081408 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d244000/0x0/0x1bfc00000, data 0x622f062/0x6454000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466599936 unmapped: 72081408 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5348170 data_alloc: 234881024 data_used: 35033088
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d223000/0x0/0x1bfc00000, data 0x6248062/0x646d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d223000/0x0/0x1bfc00000, data 0x6248062/0x646d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d223000/0x0/0x1bfc00000, data 0x6248062/0x646d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5348170 data_alloc: 234881024 data_used: 35033088
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d223000/0x0/0x1bfc00000, data 0x6248062/0x646d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5348170 data_alloc: 234881024 data_used: 35033088
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466632704 unmapped: 72048640 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d223000/0x0/0x1bfc00000, data 0x6248062/0x646d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.767245293s of 14.978918076s, submitted: 83
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc75a1400 session 0x557dc3156780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc33ca960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc05d92c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465600512 unmapped: 73080832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465600512 unmapped: 73080832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465600512 unmapped: 73080832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465600512 unmapped: 73080832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5186456 data_alloc: 218103808 data_used: 27914240
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc60000/0x0/0x1bfc00000, data 0x548f000/0x56b3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465600512 unmapped: 73080832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc4082800 session 0x557dc30f03c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3f24000 session 0x557dc0fe61e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465600512 unmapped: 73080832 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc138a000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461627392 unmapped: 77053952 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461635584 unmapped: 77045760 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461635584 unmapped: 77045760 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461635584 unmapped: 77045760 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461635584 unmapped: 77045760 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461635584 unmapped: 77045760 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461635584 unmapped: 77045760 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461635584 unmapped: 77045760 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461643776 unmapped: 77037568 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461643776 unmapped: 77037568 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461643776 unmapped: 77037568 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461643776 unmapped: 77037568 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461651968 unmapped: 77029376 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461651968 unmapped: 77029376 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461651968 unmapped: 77029376 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461651968 unmapped: 77029376 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461651968 unmapped: 77029376 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461651968 unmapped: 77029376 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb38000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461651968 unmapped: 77029376 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461651968 unmapped: 77029376 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461660160 unmapped: 77021184 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461660160 unmapped: 77021184 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461660160 unmapped: 77021184 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070490 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc33ca5a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0e8be00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc1a56b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3f24000 session 0x557dc13f6b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 59.841094971s of 59.972972870s, submitted: 43
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461660160 unmapped: 77021184 heap: 538681344 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942010/0x4b67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc4082800 session 0x557dc1371e00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0e54b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc13dc780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc011e5a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3f24000 session 0x557dc0734780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461668352 unmapped: 81215488 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461668352 unmapped: 81215488 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461668352 unmapped: 81215488 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461668352 unmapped: 81215488 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157747 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e034000/0x0/0x1bfc00000, data 0x5444072/0x566a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461676544 unmapped: 81207296 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461676544 unmapped: 81207296 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc3122d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 461651968 unmapped: 81231872 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462372864 unmapped: 80510976 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e010000/0x0/0x1bfc00000, data 0x5468072/0x568e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e010000/0x0/0x1bfc00000, data 0x5468072/0x568e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463224832 unmapped: 79659008 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5242069 data_alloc: 234881024 data_used: 35377152
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e010000/0x0/0x1bfc00000, data 0x5468072/0x568e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463224832 unmapped: 79659008 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463224832 unmapped: 79659008 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463224832 unmapped: 79659008 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.630780220s of 12.218190193s, submitted: 30
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e010000/0x0/0x1bfc00000, data 0x5468072/0x568e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463224832 unmapped: 79659008 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e010000/0x0/0x1bfc00000, data 0x5468072/0x568e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 463224832 unmapped: 79659008 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5242141 data_alloc: 234881024 data_used: 35377152
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464347136 unmapped: 78536704 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464371712 unmapped: 78512128 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464371712 unmapped: 78512128 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464371712 unmapped: 78512128 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 464617472 unmapped: 78266368 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dee7000/0x0/0x1bfc00000, data 0x5591072/0x57b7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,1,3])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5280509 data_alloc: 234881024 data_used: 35397632
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465010688 unmapped: 77873152 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465084416 unmapped: 77799424 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465092608 unmapped: 77791232 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465092608 unmapped: 77791232 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d83f000/0x0/0x1bfc00000, data 0x5c39072/0x5e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465100800 unmapped: 77783040 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5313851 data_alloc: 234881024 data_used: 36397056
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465100800 unmapped: 77783040 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465100800 unmapped: 77783040 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465100800 unmapped: 77783040 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d83f000/0x0/0x1bfc00000, data 0x5c39072/0x5e5f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465100800 unmapped: 77783040 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465100800 unmapped: 77783040 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5313851 data_alloc: 234881024 data_used: 36397056
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465100800 unmapped: 77783040 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.592128754s of 18.461534500s, submitted: 292
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc0fe7e00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0fe6960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc011fe00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d840000/0x0/0x1bfc00000, data 0x5c39062/0x5e5e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082456 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082456 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082456 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082456 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465108992 unmapped: 77774848 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465117184 unmapped: 77766656 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465125376 unmapped: 77758464 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082456 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465125376 unmapped: 77758464 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465125376 unmapped: 77758464 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465125376 unmapped: 77758464 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465125376 unmapped: 77758464 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465125376 unmapped: 77758464 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082456 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465133568 unmapped: 77750272 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465133568 unmapped: 77750272 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465133568 unmapped: 77750272 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465133568 unmapped: 77750272 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465133568 unmapped: 77750272 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5082456 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465133568 unmapped: 77750272 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19eb37000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465133568 unmapped: 77750272 heap: 542883840 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 35.520748138s of 35.597747803s, submitted: 29
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc011f0e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3f24000 session 0x557dc2082d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc011fc20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc20954a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc30f1c20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465149952 unmapped: 81403904 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465149952 unmapped: 81403904 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc53000/0x0/0x1bfc00000, data 0x5826062/0x5a4b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465149952 unmapped: 81403904 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5195204 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465149952 unmapped: 81403904 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465158144 unmapped: 81395712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465158144 unmapped: 81395712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465158144 unmapped: 81395712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc53000/0x0/0x1bfc00000, data 0x5826062/0x5a4b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465158144 unmapped: 81395712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5195204 data_alloc: 218103808 data_used: 23826432
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc2082d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2513c00 session 0x557dc011fe00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc53000/0x0/0x1bfc00000, data 0x5826062/0x5a4b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465158144 unmapped: 81395712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2513c00 session 0x557dc0fe6960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0fe7e00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465158144 unmapped: 81395712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.223106384s of 10.457739830s, submitted: 40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465174528 unmapped: 81379328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 465190912 unmapped: 81362944 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466018304 unmapped: 80535552 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5309994 data_alloc: 234881024 data_used: 38117376
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc2e000/0x0/0x1bfc00000, data 0x584a072/0x5a70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466018304 unmapped: 80535552 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466018304 unmapped: 80535552 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466018304 unmapped: 80535552 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466026496 unmapped: 80527360 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466034688 unmapped: 80519168 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5309994 data_alloc: 234881024 data_used: 38117376
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc2e000/0x0/0x1bfc00000, data 0x584a072/0x5a70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466034688 unmapped: 80519168 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466034688 unmapped: 80519168 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19dc2e000/0x0/0x1bfc00000, data 0x584a072/0x5a70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466042880 unmapped: 80510976 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466042880 unmapped: 80510976 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 466042880 unmapped: 80510976 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5310794 data_alloc: 234881024 data_used: 38137856
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.728159904s of 12.743644714s, submitted: 1
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468926464 unmapped: 77627392 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2ee000/0x0/0x1bfc00000, data 0x618a072/0x63b0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468934656 unmapped: 77619200 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468934656 unmapped: 77619200 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468934656 unmapped: 77619200 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468934656 unmapped: 77619200 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5401744 data_alloc: 234881024 data_used: 39141376
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468934656 unmapped: 77619200 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468934656 unmapped: 77619200 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5402064 data_alloc: 234881024 data_used: 39149568
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5402064 data_alloc: 234881024 data_used: 39149568
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468942848 unmapped: 77611008 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468951040 unmapped: 77602816 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468951040 unmapped: 77602816 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5402064 data_alloc: 234881024 data_used: 39149568
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468951040 unmapped: 77602816 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468951040 unmapped: 77602816 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468951040 unmapped: 77602816 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468951040 unmapped: 77602816 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468959232 unmapped: 77594624 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5402064 data_alloc: 234881024 data_used: 39149568
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468959232 unmapped: 77594624 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 26.165599823s of 26.330036163s, submitted: 72
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc301a3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc72e5000 session 0x557dc23303c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc6f24000 session 0x557dc20950e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0734f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2513c00 session 0x557dc138b2c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468983808 unmapped: 77570048 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468983808 unmapped: 77570048 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468983808 unmapped: 77570048 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cd90000/0x0/0x1bfc00000, data 0x66e8072/0x690e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468983808 unmapped: 77570048 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5438075 data_alloc: 234881024 data_used: 39153664
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468983808 unmapped: 77570048 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cd90000/0x0/0x1bfc00000, data 0x66e8072/0x690e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468983808 unmapped: 77570048 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468983808 unmapped: 77570048 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cd90000/0x0/0x1bfc00000, data 0x66e8072/0x690e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468983808 unmapped: 77570048 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cd90000/0x0/0x1bfc00000, data 0x66e8072/0x690e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468992000 unmapped: 77561856 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5438075 data_alloc: 234881024 data_used: 39153664
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469000192 unmapped: 77553664 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.135008812s of 10.212894440s, submitted: 14
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc05d8960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469000192 unmapped: 77553664 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469000192 unmapped: 77553664 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cd6c000/0x0/0x1bfc00000, data 0x670c072/0x6932000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469008384 unmapped: 77545472 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5480787 data_alloc: 234881024 data_used: 44744704
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cd6c000/0x0/0x1bfc00000, data 0x670c072/0x6932000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5480787 data_alloc: 234881024 data_used: 44744704
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cd6c000/0x0/0x1bfc00000, data 0x670c072/0x6932000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.110620499s of 11.345820427s, submitted: 2
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 74948608 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5481679 data_alloc: 234881024 data_used: 44765184
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473464832 unmapped: 73089024 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cb46000/0x0/0x1bfc00000, data 0x6932072/0x6b58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5521285 data_alloc: 234881024 data_used: 45551616
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cb46000/0x0/0x1bfc00000, data 0x6932072/0x6b58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cb46000/0x0/0x1bfc00000, data 0x6932072/0x6b58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5521285 data_alloc: 234881024 data_used: 45551616
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cb46000/0x0/0x1bfc00000, data 0x6932072/0x6b58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.284305573s of 14.851020813s, submitted: 25
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473497600 unmapped: 73056256 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473505792 unmapped: 73048064 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473505792 unmapped: 73048064 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5522885 data_alloc: 234881024 data_used: 45682688
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cb46000/0x0/0x1bfc00000, data 0x6932072/0x6b58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473505792 unmapped: 73048064 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc72e5000 session 0x557dc3157c20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3410000 session 0x557dc13dcf00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3410000 session 0x557dc0edc000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5406801 data_alloc: 234881024 data_used: 39215104
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5406801 data_alloc: 234881024 data_used: 39215104
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d2e4000/0x0/0x1bfc00000, data 0x6194072/0x63ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.167324066s of 17.263990402s, submitted: 33
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0734780
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc0e54b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473522176 unmapped: 73031680 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5406729 data_alloc: 234881024 data_used: 39215104
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc33ca1e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462045184 unmapped: 84508672 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462045184 unmapped: 84508672 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462045184 unmapped: 84508672 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462045184 unmapped: 84508672 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104768 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104768 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104768 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104768 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462053376 unmapped: 84500480 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462069760 unmapped: 84484096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462069760 unmapped: 84484096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462069760 unmapped: 84484096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462069760 unmapped: 84484096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5104768 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462069760 unmapped: 84484096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 26.696624756s of 26.810197830s, submitted: 46
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2513c00 session 0x557dc011e000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e13e000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc2094b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc208b4a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dbfac3e00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3410000 session 0x557dc31561e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462069760 unmapped: 84484096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e10e000/0x0/0x1bfc00000, data 0x536b062/0x5590000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462069760 unmapped: 84484096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e10e000/0x0/0x1bfc00000, data 0x536b062/0x5590000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e10e000/0x0/0x1bfc00000, data 0x536b062/0x5590000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462069760 unmapped: 84484096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462069760 unmapped: 84484096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5188273 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462069760 unmapped: 84484096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462077952 unmapped: 84475904 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462077952 unmapped: 84475904 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462077952 unmapped: 84475904 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e10e000/0x0/0x1bfc00000, data 0x536b062/0x5590000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462077952 unmapped: 84475904 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5188273 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc01443c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462077952 unmapped: 84475904 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc2082b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc01b9860
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462086144 unmapped: 84467712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.539037704s of 10.654681206s, submitted: 35
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc30f0b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462094336 unmapped: 84459520 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462094336 unmapped: 84459520 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e0e9000/0x0/0x1bfc00000, data 0x538f072/0x55b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5258851 data_alloc: 218103808 data_used: 33280000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e0e9000/0x0/0x1bfc00000, data 0x538f072/0x55b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5258851 data_alloc: 218103808 data_used: 33280000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e0e9000/0x0/0x1bfc00000, data 0x538f072/0x55b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e0e9000/0x0/0x1bfc00000, data 0x538f072/0x55b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 462102528 unmapped: 84451328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5259331 data_alloc: 218103808 data_used: 33292288
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.320911407s of 13.332280159s, submitted: 2
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 467959808 unmapped: 78594048 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cece000/0x0/0x1bfc00000, data 0x6199072/0x63bf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cecb000/0x0/0x1bfc00000, data 0x619d072/0x63c3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5376943 data_alloc: 218103808 data_used: 34361344
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cecb000/0x0/0x1bfc00000, data 0x619d072/0x63c3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5376923 data_alloc: 218103808 data_used: 34365440
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cec9000/0x0/0x1bfc00000, data 0x619f072/0x63c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cec9000/0x0/0x1bfc00000, data 0x619f072/0x63c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc10ab400 session 0x557dc3123e00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.527844429s of 12.793952942s, submitted: 111
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2367c00 session 0x557dc0e550e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc242dc00 session 0x557dc0f1d2c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5377151 data_alloc: 218103808 data_used: 34365440
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cec8000/0x0/0x1bfc00000, data 0x61a0072/0x63c6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468205568 unmapped: 78348288 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468213760 unmapped: 78340096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5377151 data_alloc: 218103808 data_used: 34365440
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468213760 unmapped: 78340096 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc7488c00 session 0x557dc308cb40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c7800 session 0x557dc234a1e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc3156b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0fe6b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc01b90e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cec8000/0x0/0x1bfc00000, data 0x61a0072/0x63c6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468353024 unmapped: 78200832 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468353024 unmapped: 78200832 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468353024 unmapped: 78200832 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468353024 unmapped: 78200832 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5426809 data_alloc: 218103808 data_used: 34365440
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468353024 unmapped: 78200832 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.988913536s of 13.046833992s, submitted: 13
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc7488c00 session 0x557dc3122000
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468353024 unmapped: 78200832 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c90c000/0x0/0x1bfc00000, data 0x675b095/0x6982000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468451328 unmapped: 78102528 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468484096 unmapped: 78069760 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468484096 unmapped: 78069760 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5462955 data_alloc: 234881024 data_used: 38064128
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468484096 unmapped: 78069760 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468484096 unmapped: 78069760 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c90c000/0x0/0x1bfc00000, data 0x675b095/0x6982000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c90c000/0x0/0x1bfc00000, data 0x675b095/0x6982000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468484096 unmapped: 78069760 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468484096 unmapped: 78069760 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c90c000/0x0/0x1bfc00000, data 0x675b095/0x6982000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468484096 unmapped: 78069760 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5463131 data_alloc: 234881024 data_used: 38064128
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468484096 unmapped: 78069760 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468484096 unmapped: 78069760 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c90b000/0x0/0x1bfc00000, data 0x675c095/0x6983000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 468484096 unmapped: 78069760 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.383593559s of 12.423521042s, submitted: 12
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469630976 unmapped: 76922880 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c7ed000/0x0/0x1bfc00000, data 0x686c095/0x6a93000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c7ed000/0x0/0x1bfc00000, data 0x686c095/0x6a93000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469630976 unmapped: 76922880 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5478097 data_alloc: 234881024 data_used: 38313984
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469630976 unmapped: 76922880 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469630976 unmapped: 76922880 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469630976 unmapped: 76922880 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469639168 unmapped: 76914688 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c7ce000/0x0/0x1bfc00000, data 0x6882095/0x6aa9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469639168 unmapped: 76914688 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5485747 data_alloc: 234881024 data_used: 38129664
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c7ce000/0x0/0x1bfc00000, data 0x6882095/0x6aa9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469639168 unmapped: 76914688 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469639168 unmapped: 76914688 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c6000 session 0x557dc011e3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc34c0000 session 0x557dc13dcd20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc0e552c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469008384 unmapped: 77545472 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cec6000/0x0/0x1bfc00000, data 0x61a1072/0x63c7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469008384 unmapped: 77545472 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469008384 unmapped: 77545472 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5385274 data_alloc: 218103808 data_used: 33284096
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469008384 unmapped: 77545472 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469008384 unmapped: 77545472 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cec6000/0x0/0x1bfc00000, data 0x61a1072/0x63c7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469008384 unmapped: 77545472 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.639362335s of 14.880970955s, submitted: 82
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc13f6f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3410000 session 0x557dc138a3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc308cd20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5127069 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e044000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5127069 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e044000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469032960 unmapped: 77520896 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469041152 unmapped: 77512704 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5127069 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469041152 unmapped: 77512704 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e044000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469049344 unmapped: 77504512 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469049344 unmapped: 77504512 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469049344 unmapped: 77504512 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469049344 unmapped: 77504512 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5127069 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469049344 unmapped: 77504512 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e044000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469049344 unmapped: 77504512 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 77496320 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 77496320 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 77496320 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5127069 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e044000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 77496320 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 77496320 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 77496320 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 77496320 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e044000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 77496320 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5127069 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc2304f00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc2304b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc07352c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3410000 session 0x557dc31223c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 27.155056000s of 27.264543533s, submitted: 31
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc34c0000 session 0x557dc0f1c3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc34c0000 session 0x557dc33cb4a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dbfb2fc00 session 0x557dc3123680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc20c63c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc2320d20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470278144 unmapped: 76275712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470278144 unmapped: 76275712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470278144 unmapped: 76275712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470278144 unmapped: 76275712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e605000/0x0/0x1bfc00000, data 0x4a65000/0x4c89000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470278144 unmapped: 76275712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5143497 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470278144 unmapped: 76275712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470278144 unmapped: 76275712 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e605000/0x0/0x1bfc00000, data 0x4a65000/0x4c89000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470286336 unmapped: 76267520 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc011bc00 session 0x557dc05d9860
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470286336 unmapped: 76267520 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e605000/0x0/0x1bfc00000, data 0x4a65000/0x4c89000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc13f72c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470286336 unmapped: 76267520 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5147362 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470294528 unmapped: 76259328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470294528 unmapped: 76259328 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470302720 unmapped: 76251136 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e5e0000/0x0/0x1bfc00000, data 0x4a89023/0x4cae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470302720 unmapped: 76251136 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470302720 unmapped: 76251136 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5154882 data_alloc: 218103808 data_used: 24670208
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470302720 unmapped: 76251136 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e5e0000/0x0/0x1bfc00000, data 0x4a89023/0x4cae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470302720 unmapped: 76251136 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470302720 unmapped: 76251136 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470302720 unmapped: 76251136 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5154882 data_alloc: 218103808 data_used: 24670208
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470302720 unmapped: 76251136 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470302720 unmapped: 76251136 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e5e0000/0x0/0x1bfc00000, data 0x4a89023/0x4cae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 470302720 unmapped: 76251136 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 21.634801865s of 21.770465851s, submitted: 16
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474480640 unmapped: 72073216 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475955200 unmapped: 70598656 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5263596 data_alloc: 218103808 data_used: 25063424
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475955200 unmapped: 70598656 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475955200 unmapped: 70598656 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d9e9000/0x0/0x1bfc00000, data 0x5672023/0x5897000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475955200 unmapped: 70598656 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476020736 unmapped: 70533120 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476020736 unmapped: 70533120 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5252004 data_alloc: 218103808 data_used: 25063424
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 71811072 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 71811072 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 71811072 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d9d6000/0x0/0x1bfc00000, data 0x5693023/0x58b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 71811072 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474742784 unmapped: 71811072 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.431081772s of 12.760804176s, submitted: 122
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5252140 data_alloc: 218103808 data_used: 25063424
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474701824 unmapped: 71852032 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474701824 unmapped: 71852032 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc30fc3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc34c0000 session 0x557dc30eb860
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d9c8000/0x0/0x1bfc00000, data 0x56a1023/0x58c6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,1])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3410000 session 0x557dc3157860
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5137658 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e727000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e727000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5137658 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e727000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474710016 unmapped: 71843840 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474718208 unmapped: 71835648 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474718208 unmapped: 71835648 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e727000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5137658 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474718208 unmapped: 71835648 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474718208 unmapped: 71835648 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474718208 unmapped: 71835648 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474718208 unmapped: 71835648 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474718208 unmapped: 71835648 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e727000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5137658 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474718208 unmapped: 71835648 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e727000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e727000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474734592 unmapped: 71819264 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474734592 unmapped: 71819264 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19e727000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474734592 unmapped: 71819264 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474734592 unmapped: 71819264 heap: 546553856 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 24.476037979s of 24.546947479s, submitted: 30
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c6000 session 0x557dc22cf2c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc30f14a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc0fe74a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c6000 session 0x557dc234ad20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3410000 session 0x557dc301b4a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5245776 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475971584 unmapped: 74260480 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475971584 unmapped: 74260480 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475971584 unmapped: 74260480 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d9b5000/0x0/0x1bfc00000, data 0x56b5000/0x58d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475971584 unmapped: 74260480 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475979776 unmapped: 74252288 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5245776 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475979776 unmapped: 74252288 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d9b5000/0x0/0x1bfc00000, data 0x56b5000/0x58d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475979776 unmapped: 74252288 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc34c0000 session 0x557dc0fe61e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc0735860
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475979776 unmapped: 74252288 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475979776 unmapped: 74252288 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc20943c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c6000 session 0x557dc0d3f680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d9b5000/0x0/0x1bfc00000, data 0x56b5000/0x58d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475987968 unmapped: 74244096 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5245776 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475987968 unmapped: 74244096 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475987968 unmapped: 74244096 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475922432 unmapped: 74309632 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476225536 unmapped: 74006528 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d9b5000/0x0/0x1bfc00000, data 0x56b5000/0x58d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476225536 unmapped: 74006528 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5343376 data_alloc: 234881024 data_used: 37441536
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476225536 unmapped: 74006528 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476225536 unmapped: 74006528 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476225536 unmapped: 74006528 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476225536 unmapped: 74006528 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d9b5000/0x0/0x1bfc00000, data 0x56b5000/0x58d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476225536 unmapped: 74006528 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5343376 data_alloc: 234881024 data_used: 37441536
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476225536 unmapped: 74006528 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476225536 unmapped: 74006528 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 476225536 unmapped: 74006528 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.191040039s of 23.264764786s, submitted: 20
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 478543872 unmapped: 71688192 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479035392 unmapped: 71196672 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cf05000/0x0/0x1bfc00000, data 0x6165000/0x6389000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5441608 data_alloc: 234881024 data_used: 39387136
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479043584 unmapped: 71188480 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479043584 unmapped: 71188480 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479043584 unmapped: 71188480 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cf05000/0x0/0x1bfc00000, data 0x6165000/0x6389000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479141888 unmapped: 71090176 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479141888 unmapped: 71090176 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5432300 data_alloc: 234881024 data_used: 39387136
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479141888 unmapped: 71090176 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479141888 unmapped: 71090176 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479141888 unmapped: 71090176 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cee4000/0x0/0x1bfc00000, data 0x6186000/0x63aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479141888 unmapped: 71090176 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479141888 unmapped: 71090176 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.025492668s of 12.276173592s, submitted: 110
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5432856 data_alloc: 234881024 data_used: 39395328
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480190464 unmapped: 70041600 heap: 550232064 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19cedd000/0x0/0x1bfc00000, data 0x618d000/0x63b1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc150f680
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc7488c00 session 0x557dc22eb4a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc30ea3c0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc208b4a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc25c6000 session 0x557dc308c5a0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c410000/0x0/0x1bfc00000, data 0x6c5a000/0x6e7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c410000/0x0/0x1bfc00000, data 0x6c5a000/0x6e7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5521156 data_alloc: 234881024 data_used: 39395328
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc33fdc00 session 0x557dc33ca960
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c410000/0x0/0x1bfc00000, data 0x6c5a000/0x6e7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c40d000/0x0/0x1bfc00000, data 0x6c5d000/0x6e81000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485171200 unmapped: 69263360 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5599248 data_alloc: 234881024 data_used: 50343936
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485171200 unmapped: 69263360 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485171200 unmapped: 69263360 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c40d000/0x0/0x1bfc00000, data 0x6c5d000/0x6e81000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485171200 unmapped: 69263360 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485171200 unmapped: 69263360 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485212160 unmapped: 69222400 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5599888 data_alloc: 234881024 data_used: 50405376
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485212160 unmapped: 69222400 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485212160 unmapped: 69222400 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.097406387s of 17.195215225s, submitted: 19
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19c40d000/0x0/0x1bfc00000, data 0x6c5d000/0x6e81000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485212160 unmapped: 69222400 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485212160 unmapped: 69222400 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 485212160 unmapped: 69222400 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19bc56000/0x0/0x1bfc00000, data 0x740c000/0x7630000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5665978 data_alloc: 234881024 data_used: 51638272
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489594880 unmapped: 64839680 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489594880 unmapped: 64839680 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489897984 unmapped: 64536576 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19aa2c000/0x0/0x1bfc00000, data 0x7496000/0x76ba000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 490004480 unmapped: 64430080 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 65101824 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5669434 data_alloc: 234881024 data_used: 51773440
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 65101824 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 65101824 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 65101824 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19aa10000/0x0/0x1bfc00000, data 0x74ba000/0x76de000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489332736 unmapped: 65101824 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19aa0f000/0x0/0x1bfc00000, data 0x74bb000/0x76df000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3f25c00 session 0x557dc0744b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2652c00 session 0x557dc234be00
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.671845436s of 12.190391541s, submitted: 115
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489242624 unmapped: 65191936 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc20ef400 session 0x557dc011eb40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5444956 data_alloc: 234881024 data_used: 39456768
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489242624 unmapped: 65191936 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489242624 unmapped: 65191936 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19bd2e000/0x0/0x1bfc00000, data 0x619c000/0x63c0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489242624 unmapped: 65191936 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489242624 unmapped: 65191936 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc3410000 session 0x557dc1a56b40
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc34c0000 session 0x557dc07441e0
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 489250816 unmapped: 65183744 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 ms_handle_reset con 0x557dc2419c00 session 0x557dc301b860
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477487104 unmapped: 76947456 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477495296 unmapped: 76939264 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477503488 unmapped: 76931072 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477503488 unmapped: 76931072 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477503488 unmapped: 76931072 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477503488 unmapped: 76931072 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477503488 unmapped: 76931072 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477503488 unmapped: 76931072 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477503488 unmapped: 76931072 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477503488 unmapped: 76931072 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477511680 unmapped: 76922880 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477511680 unmapped: 76922880 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477511680 unmapped: 76922880 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477511680 unmapped: 76922880 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477511680 unmapped: 76922880 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477511680 unmapped: 76922880 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477511680 unmapped: 76922880 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477511680 unmapped: 76922880 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477519872 unmapped: 76914688 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477519872 unmapped: 76914688 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477519872 unmapped: 76914688 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477519872 unmapped: 76914688 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477519872 unmapped: 76914688 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477519872 unmapped: 76914688 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477519872 unmapped: 76914688 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477528064 unmapped: 76906496 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477528064 unmapped: 76906496 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477528064 unmapped: 76906496 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477528064 unmapped: 76906496 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477528064 unmapped: 76906496 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477528064 unmapped: 76906496 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477536256 unmapped: 76898304 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477536256 unmapped: 76898304 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477536256 unmapped: 76898304 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477536256 unmapped: 76898304 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477544448 unmapped: 76890112 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477544448 unmapped: 76890112 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477544448 unmapped: 76890112 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477544448 unmapped: 76890112 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477544448 unmapped: 76890112 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477544448 unmapped: 76890112 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 69K writes, 268K keys, 69K commit groups, 1.0 writes per commit group, ingest: 0.26 GB, 0.04 MB/s#012Cumulative WAL: 69K writes, 26K syncs, 2.63 writes per sync, written: 0.26 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2415 writes, 9981 keys, 2415 commit groups, 1.0 writes per commit group, ingest: 10.97 MB, 0.02 MB/s#012Interval WAL: 2415 writes, 949 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477544448 unmapped: 76890112 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477552640 unmapped: 76881920 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477552640 unmapped: 76881920 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477552640 unmapped: 76881920 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477552640 unmapped: 76881920 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477552640 unmapped: 76881920 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477552640 unmapped: 76881920 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477552640 unmapped: 76881920 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477552640 unmapped: 76881920 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477552640 unmapped: 76881920 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477560832 unmapped: 76873728 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477560832 unmapped: 76873728 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477560832 unmapped: 76873728 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477560832 unmapped: 76873728 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477569024 unmapped: 76865536 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477569024 unmapped: 76865536 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477577216 unmapped: 76857344 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477577216 unmapped: 76857344 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477577216 unmapped: 76857344 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477577216 unmapped: 76857344 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477577216 unmapped: 76857344 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477577216 unmapped: 76857344 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477577216 unmapped: 76857344 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477577216 unmapped: 76857344 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477593600 unmapped: 76840960 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477593600 unmapped: 76840960 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477593600 unmapped: 76840960 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477593600 unmapped: 76840960 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477593600 unmapped: 76840960 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477593600 unmapped: 76840960 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477593600 unmapped: 76840960 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477593600 unmapped: 76840960 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477601792 unmapped: 76832768 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477601792 unmapped: 76832768 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477601792 unmapped: 76832768 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477618176 unmapped: 76816384 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477626368 unmapped: 76808192 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477626368 unmapped: 76808192 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477626368 unmapped: 76808192 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477626368 unmapped: 76808192 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477634560 unmapped: 76800000 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477634560 unmapped: 76800000 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477634560 unmapped: 76800000 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477634560 unmapped: 76800000 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477634560 unmapped: 76800000 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477634560 unmapped: 76800000 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477634560 unmapped: 76800000 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477634560 unmapped: 76800000 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477650944 unmapped: 76783616 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477650944 unmapped: 76783616 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477650944 unmapped: 76783616 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477650944 unmapped: 76783616 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477650944 unmapped: 76783616 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477650944 unmapped: 76783616 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477659136 unmapped: 76775424 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477667328 unmapped: 76767232 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477675520 unmapped: 76759040 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477675520 unmapped: 76759040 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477675520 unmapped: 76759040 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477675520 unmapped: 76759040 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477675520 unmapped: 76759040 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477675520 unmapped: 76759040 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477675520 unmapped: 76759040 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477675520 unmapped: 76759040 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477683712 unmapped: 76750848 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477691904 unmapped: 76742656 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477691904 unmapped: 76742656 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477700096 unmapped: 76734464 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477700096 unmapped: 76734464 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477700096 unmapped: 76734464 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477700096 unmapped: 76734464 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477700096 unmapped: 76734464 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477716480 unmapped: 76718080 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477716480 unmapped: 76718080 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477716480 unmapped: 76718080 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477716480 unmapped: 76718080 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477724672 unmapped: 76709888 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477724672 unmapped: 76709888 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477724672 unmapped: 76709888 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477724672 unmapped: 76709888 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477732864 unmapped: 76701696 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477732864 unmapped: 76701696 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477732864 unmapped: 76701696 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477749248 unmapped: 76685312 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477749248 unmapped: 76685312 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477749248 unmapped: 76685312 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477749248 unmapped: 76685312 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477749248 unmapped: 76685312 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 187.584411621s of 187.672454834s, submitted: 38
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477749248 unmapped: 76685312 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 477757440 unmapped: 76677120 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 478830592 unmapped: 75603968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 479985664 unmapped: 74448896 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480002048 unmapped: 74432512 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480010240 unmapped: 74424320 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480010240 unmapped: 74424320 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480010240 unmapped: 74424320 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480010240 unmapped: 74424320 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480010240 unmapped: 74424320 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480010240 unmapped: 74424320 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480010240 unmapped: 74424320 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480010240 unmapped: 74424320 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480018432 unmapped: 74416128 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480018432 unmapped: 74416128 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480018432 unmapped: 74416128 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480018432 unmapped: 74416128 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480018432 unmapped: 74416128 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480018432 unmapped: 74416128 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480018432 unmapped: 74416128 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480026624 unmapped: 74407936 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480026624 unmapped: 74407936 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480026624 unmapped: 74407936 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480026624 unmapped: 74407936 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480026624 unmapped: 74407936 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480034816 unmapped: 74399744 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480034816 unmapped: 74399744 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480034816 unmapped: 74399744 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480043008 unmapped: 74391552 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480043008 unmapped: 74391552 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480043008 unmapped: 74391552 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480043008 unmapped: 74391552 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480043008 unmapped: 74391552 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480043008 unmapped: 74391552 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480043008 unmapped: 74391552 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480051200 unmapped: 74383360 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480059392 unmapped: 74375168 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480059392 unmapped: 74375168 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480059392 unmapped: 74375168 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480059392 unmapped: 74375168 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480059392 unmapped: 74375168 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480059392 unmapped: 74375168 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480059392 unmapped: 74375168 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480059392 unmapped: 74375168 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480067584 unmapped: 74366976 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480067584 unmapped: 74366976 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480067584 unmapped: 74366976 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480075776 unmapped: 74358784 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480083968 unmapped: 74350592 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480083968 unmapped: 74350592 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480083968 unmapped: 74350592 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480083968 unmapped: 74350592 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480083968 unmapped: 74350592 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480083968 unmapped: 74350592 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480083968 unmapped: 74350592 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480083968 unmapped: 74350592 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480083968 unmapped: 74350592 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480092160 unmapped: 74342400 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480092160 unmapped: 74342400 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480092160 unmapped: 74342400 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480100352 unmapped: 74334208 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480100352 unmapped: 74334208 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480100352 unmapped: 74334208 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480100352 unmapped: 74334208 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480100352 unmapped: 74334208 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480100352 unmapped: 74334208 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480100352 unmapped: 74334208 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480100352 unmapped: 74334208 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480108544 unmapped: 74326016 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480108544 unmapped: 74326016 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480108544 unmapped: 74326016 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480108544 unmapped: 74326016 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480108544 unmapped: 74326016 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480108544 unmapped: 74326016 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480116736 unmapped: 74317824 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480116736 unmapped: 74317824 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480124928 unmapped: 74309632 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480124928 unmapped: 74309632 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480124928 unmapped: 74309632 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480124928 unmapped: 74309632 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480133120 unmapped: 74301440 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480133120 unmapped: 74301440 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480133120 unmapped: 74301440 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480133120 unmapped: 74301440 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480149504 unmapped: 74285056 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480149504 unmapped: 74285056 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480149504 unmapped: 74285056 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480149504 unmapped: 74285056 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480149504 unmapped: 74285056 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480149504 unmapped: 74285056 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480149504 unmapped: 74285056 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480149504 unmapped: 74285056 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480165888 unmapped: 74268672 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480174080 unmapped: 74260480 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480174080 unmapped: 74260480 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480174080 unmapped: 74260480 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480174080 unmapped: 74260480 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480174080 unmapped: 74260480 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480174080 unmapped: 74260480 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480174080 unmapped: 74260480 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480182272 unmapped: 74252288 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480182272 unmapped: 74252288 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480182272 unmapped: 74252288 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480182272 unmapped: 74252288 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480182272 unmapped: 74252288 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480182272 unmapped: 74252288 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480182272 unmapped: 74252288 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480190464 unmapped: 74244096 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480198656 unmapped: 74235904 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480198656 unmapped: 74235904 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480198656 unmapped: 74235904 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480198656 unmapped: 74235904 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480198656 unmapped: 74235904 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480206848 unmapped: 74227712 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480215040 unmapped: 74219520 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480215040 unmapped: 74219520 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480223232 unmapped: 74211328 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480223232 unmapped: 74211328 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480223232 unmapped: 74211328 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480223232 unmapped: 74211328 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480223232 unmapped: 74211328 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480231424 unmapped: 74203136 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480231424 unmapped: 74203136 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480231424 unmapped: 74203136 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480239616 unmapped: 74194944 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480239616 unmapped: 74194944 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480239616 unmapped: 74194944 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480239616 unmapped: 74194944 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480239616 unmapped: 74194944 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480239616 unmapped: 74194944 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480239616 unmapped: 74194944 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480239616 unmapped: 74194944 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480256000 unmapped: 74178560 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480256000 unmapped: 74178560 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480264192 unmapped: 74170368 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480264192 unmapped: 74170368 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480264192 unmapped: 74170368 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480264192 unmapped: 74170368 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480264192 unmapped: 74170368 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480264192 unmapped: 74170368 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480280576 unmapped: 74153984 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480280576 unmapped: 74153984 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480280576 unmapped: 74153984 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480280576 unmapped: 74153984 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480280576 unmapped: 74153984 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480280576 unmapped: 74153984 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480280576 unmapped: 74153984 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480280576 unmapped: 74153984 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480288768 unmapped: 74145792 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480288768 unmapped: 74145792 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480288768 unmapped: 74145792 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480288768 unmapped: 74145792 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480288768 unmapped: 74145792 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480288768 unmapped: 74145792 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480296960 unmapped: 74137600 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480296960 unmapped: 74137600 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480305152 unmapped: 74129408 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480313344 unmapped: 74121216 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480313344 unmapped: 74121216 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480313344 unmapped: 74121216 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480313344 unmapped: 74121216 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480313344 unmapped: 74121216 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480313344 unmapped: 74121216 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480329728 unmapped: 74104832 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480329728 unmapped: 74104832 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480337920 unmapped: 74096640 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480337920 unmapped: 74096640 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480337920 unmapped: 74096640 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480337920 unmapped: 74096640 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480337920 unmapped: 74096640 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480337920 unmapped: 74096640 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480346112 unmapped: 74088448 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480346112 unmapped: 74088448 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480346112 unmapped: 74088448 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480354304 unmapped: 74080256 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480362496 unmapped: 74072064 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480362496 unmapped: 74072064 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480362496 unmapped: 74072064 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480362496 unmapped: 74072064 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480370688 unmapped: 74063872 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480370688 unmapped: 74063872 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480370688 unmapped: 74063872 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480370688 unmapped: 74063872 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480370688 unmapped: 74063872 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480378880 unmapped: 74055680 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480387072 unmapped: 74047488 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480387072 unmapped: 74047488 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480387072 unmapped: 74047488 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480395264 unmapped: 74039296 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480395264 unmapped: 74039296 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480395264 unmapped: 74039296 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480395264 unmapped: 74039296 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480395264 unmapped: 74039296 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480403456 unmapped: 74031104 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480403456 unmapped: 74031104 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480411648 unmapped: 74022912 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480411648 unmapped: 74022912 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480411648 unmapped: 74022912 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480411648 unmapped: 74022912 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480411648 unmapped: 74022912 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480411648 unmapped: 74022912 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480411648 unmapped: 74022912 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480411648 unmapped: 74022912 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480419840 unmapped: 74014720 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480428032 unmapped: 74006528 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480428032 unmapped: 74006528 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480436224 unmapped: 73998336 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480436224 unmapped: 73998336 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480436224 unmapped: 73998336 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480436224 unmapped: 73998336 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480436224 unmapped: 73998336 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480436224 unmapped: 73998336 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480444416 unmapped: 73990144 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480452608 unmapped: 73981952 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480468992 unmapped: 73965568 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480468992 unmapped: 73965568 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480468992 unmapped: 73965568 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480468992 unmapped: 73965568 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480468992 unmapped: 73965568 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480468992 unmapped: 73965568 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480468992 unmapped: 73965568 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480468992 unmapped: 73965568 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480485376 unmapped: 73949184 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480485376 unmapped: 73949184 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 480485376 unmapped: 73949184 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475758592 unmapped: 78675968 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475766784 unmapped: 78667776 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: do_command 'config diff' '{prefix=config diff}'
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: bluestore.MempoolThread(0x557dbedbdb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5157674 data_alloc: 218103808 data_used: 23760896
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: do_command 'config show' '{prefix=config show}'
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 475308032 unmapped: 79126528 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: do_command 'counter dump' '{prefix=counter dump}'
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: do_command 'counter schema' '{prefix=counter schema}'
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: osd.1 418 heartbeat osd_stat(store_statfs(0x19d588000/0x0/0x1bfc00000, data 0x4942000/0x4b66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 474652672 unmapped: 79781888 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: prioritycache tune_memory target: 4294967296 mapped: 473948160 unmapped: 80486400 heap: 554434560 old mem: 2845415833 new mem: 2845415833
Jan 20 10:53:18 np0005588919 ceph-osd[79119]: do_command 'log dump' '{prefix=log dump}'
Jan 20 10:53:18 np0005588919 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 20 10:53:18 np0005588919 nova_compute[225855]: 2026-01-20 15:53:18.681 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:18 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:18 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:18 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:18.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 20 10:53:19 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3089804160' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 20 10:53:19 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 20 10:53:19 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3608108182' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 20 10:53:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:20.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:20 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Jan 20 10:53:20 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/858049734' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 20 10:53:20 np0005588919 nova_compute[225855]: 2026-01-20 15:53:20.315 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:20 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:20 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:20 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:20.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:21 np0005588919 podman[337517]: 2026-01-20 15:53:21.085918731 +0000 UTC m=+0.120833706 container health_status 72be95c12e041eb2cd1a16ad70dfa1391d9f03a5f3180a614ead9459dcfcd46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 20 10:53:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Jan 20 10:53:21 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4097280376' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 20 10:53:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Jan 20 10:53:21 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2052119312' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 20 10:53:21 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Jan 20 10:53:21 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/502472852' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 20 10:53:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 20 10:53:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1585738903' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 20 10:53:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:22.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Jan 20 10:53:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2813367287' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 20 10:53:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 20 10:53:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1825010125' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 20 10:53:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Jan 20 10:53:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3075812643' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 20 10:53:22 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:22 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:53:22 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:22.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:53:22 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Jan 20 10:53:22 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1414603985' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 20 10:53:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Jan 20 10:53:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1188902209' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 20 10:53:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Jan 20 10:53:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1107210386' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 20 10:53:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 20 10:53:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1482769606' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 20 10:53:23 np0005588919 nova_compute[225855]: 2026-01-20 15:53:23.686 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:23 np0005588919 systemd[1]: Starting Hostname Service...
Jan 20 10:53:23 np0005588919 systemd[1]: Started Hostname Service.
Jan 20 10:53:23 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Jan 20 10:53:23 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3218920685' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 20 10:53:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:24.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 20 10:53:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2496834440' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 20 10:53:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Jan 20 10:53:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3119005333' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 20 10:53:24 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:24 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:24 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:24.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:24 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Jan 20 10:53:24 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1406159115' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 20 10:53:25 np0005588919 nova_compute[225855]: 2026-01-20 15:53:25.318 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 20 10:53:25 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 20 10:53:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 20 10:53:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 20 10:53:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:53:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:26.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:53:26 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Jan 20 10:53:26 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/921537409' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 20 10:53:26 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:26 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:53:26 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:26.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:53:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Jan 20 10:53:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1876167769' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 20 10:53:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 20 10:53:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 20 10:53:27 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 20 10:53:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1931971985' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 20 10:53:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 20 10:53:27 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 20 10:53:28 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Jan 20 10:53:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1807707528' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 20 10:53:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 20 10:53:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:28.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 20 10:53:28 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:28 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 20 10:53:28 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.100 - anonymous [20/Jan/2026:15:53:28.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 20 10:53:28 np0005588919 nova_compute[225855]: 2026-01-20 15:53:28.721 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 20 10:53:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 20 10:53:28 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 20 10:53:29 np0005588919 podman[338630]: 2026-01-20 15:53:29.08232334 +0000 UTC m=+0.099364219 container health_status 533cd8e0c3b8b7f910f15cef49e7f879ab46c12ce646a35628c15f13725eb500 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '4a32417983ff32267599655c6e45254baefd9d4970135e23c41405384e1081af-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-74c9debe5f9e04ff9ee42a3aa155432affc8a9cb0fab2a60d92b56b70743714b-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 20 10:53:29 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 20 10:53:29 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 20 10:53:29 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 20 10:53:30 np0005588919 radosgw[83787]: ====== starting new request req=0x7f09c6ece6f0 =====
Jan 20 10:53:30 np0005588919 radosgw[83787]: ====== req done req=0x7f09c6ece6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 20 10:53:30 np0005588919 radosgw[83787]: beast: 0x7f09c6ece6f0: 192.168.122.101 - anonymous [20/Jan/2026:15:53:30.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 20 10:53:30 np0005588919 ceph-mon[81775]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0) v1
Jan 20 10:53:30 np0005588919 ceph-mon[81775]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/608709473' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 20 10:53:30 np0005588919 nova_compute[225855]: 2026-01-20 15:53:30.321 225859 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
